WO2014004075A2 - Enabling and disabling features of a headset computer based on real-time image analysis - Google Patents

Enabling and disabling features of a headset computer based on real-time image analysis

Info

Publication number
WO2014004075A2
WO2014004075A2 (PCT Application No. PCT/US2013/045152)
Authority
WO
WIPO (PCT)
Prior art keywords
headset computer
computer
headset
features
vehicle
Prior art date
Application number
PCT/US2013/045152
Other languages
French (fr)
Other versions
WO2014004075A3 (en)
Inventor
Stephen A. Pombo
Jeffrey J. Jacobsen
Christopher Parkinson
Original Assignee
Kopin Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kopin Corporation filed Critical Kopin Corporation
Priority to JP2015520240A, published as JP2015523026A
Priority to EP13732732.6A, published as EP2867741A2
Priority to CN201380034874.2A, published as CN104428729A
Publication of WO2014004075A2
Publication of WO2014004075A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/025 Services making use of location information using location based information parameters
    • H04W4/027 Services making use of location information using location based information parameters using movement velocity, acceleration information
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems

Definitions

  • FIGs. 3A and 3B illustrate image data representing typical scene data 2300 that can be stored in the HSC 100 and that is representative of an image captured by camera 2120.
  • FIG. 3A is a scene 3000 of the components inside a vehicle taken from the perspective of a driver.
  • The primary recognizable element or image feature of the scene 3000 is a steering wheel 3010.
  • Other elements or image features of the scene 3000 can be useful in scene analysis 2360 and can include manufacturer logos 3012 (in the center of the steering wheel 3010), speedometer 3014, tachometer 3016, fuel level 3018 and other gauges, operator controls such as a stick shift 3021, heating/air conditioning vents 3023, the relative orientation of the windshield 3025 and side windows 3027, the presence of car doors 3029 and floors 3031, and other instruments located to the side of the dashboard, such as navigation systems 3033.
  • Image features that specify the relative orientation of doors 3029, windshields 3025, and side windows 3027 for both left-hand and right-hand drive automobiles can be included in image templates and scene data 2300.
  • Stored scene data 2300 or template images can include data for both right-hand and left-hand drive vehicles. Further, such stored scene data 2300 can include jurisdictional data.
  • The jurisdictional data can include the geographical locations of jurisdictions and whether each is a left-hand drive or right-hand drive jurisdiction.
  • An HSC 100 with GPS can provide location information, which can then be used to determine the jurisdiction in which the HSC 100 is located.
  • Jurisdictional information can be used to prioritize scene analysis for left-hand drive or right-hand drive vehicles. For example, if the GPS determines the HSC 100 is located in Canada, then scene analysis for a left-hand drive vehicle can be prioritized.
  • The stored scene elements 2300 can also account for the possible zoom settings of the camera 2120. For example, on some zoom settings only a portion of the dashboard may be visible (such as only a portion of the wheel 3010 and a few gauges 3018), whereas on other zoom settings the windshield 3025, side windows 3027, doors 3029, and even portions of the floor 3031 may be visible. Such various possibilities can be accounted for by storing the scene data in particularly efficient ways, for example, by storing multiple versions of a given scene for different zoom levels or by using hierarchical scene element models.
  • Stored scene data 2300 may also include representations of vehicle occupant scenes such as scene 3100 of FIG. 3B, which is typical of what is viewed by a passenger in the front seat. While some elements remain the same (such as the presence of a navigation system 3033 and stick shift 3021), here they are located on the opposite side of the view or scene 3100 compared to the driver scene 3000 of FIG. 3A. Most prominently, however, the scene 3100 is missing the steering wheel 3010 and gauges 3018, and includes the presence of other indicative items, such as a glovebox 3110.
  • Any convenient known scene analysis (image recognition) algorithm may be used by scene analysis 2360 to compare the images obtained by image capture 2350 against the scene data templates 2300.
  • Such algorithms should preferably be relatively high-speed, since the user's access to the device or device features is being controlled.
  • The algorithms are preferably carried out in real time and, therefore, can be embodied as high-priority operating system calls, interrupts, or even embedded in the operating system kernel, depending on the processor type and operating system selected for implementation.
  • The processor 2100 can also execute stored instructions 2510 to perform image capture 2350, upload the scene data to a host 200 for cloud-based scene analysis, and receive a scene analysis decision.
  • A cloud-based scene analysis can perform a more computationally intense scene analysis than the scene analysis 2360 that is performed on board the HSC 100 (i.e., locally).
  • Cloud-based scene analysis can have access to a vast library of vehicle scenes that may be impractical to store in the local memory 2102 due to resource limitations.
  • Cloud-based scene analysis in coordination with an appropriate scene analysis (image recognition) algorithm - a design decision that enables sufficiently quick processing and decision making - can also be used to limit the user's access to operational features of the HSC 100.
  • Such cloud-based analysis can be useful to unburden the HSC 100 by off-loading some of its memory-intensive and computationally intensive processes.
  • FIG. 4 is a scene 4000 typical of the operator of a motorcycle.
  • Elements such as handlebars 4010, gas tanks and gauges 4014, mirrors 4028, and shifter 4021 can be included in the scene data templates 2300.
  • FIG. 5 is a scene 5000 from the perspective of an operator of an antique tractor.
  • The operator may be sitting very close to a very large steering wheel 5010 and, therefore, only a few portions 5012 of the steering wheel 5010 are visible.
  • Other elements that can be extracted as image features for recognition of scene 5000 may include gauge(s) 5018, levers 5021, and the hood section 5033 of the tractor.
  • FIG. 6 is a flow diagram of a process 6000 that can be executed by the processor 2100 to implement control over the HSC 100 using the speed sensor 2150 and scene analysis 2360 (a pseudocode sketch of this flow is given after this list).
  • First, a speed and/or acceleration is determined and compared to a threshold.
  • For example, the accelerometer 2150 or GPS 2200 may indicate rapid acceleration or a constant speed above a certain amount, such as 4 miles per hour (MPH).
  • If the threshold is exceeded, stage 602 is entered. At stage 602, one or more images are captured using the camera 2120. The images captured in stage 602 are then processed by scene analysis 2360 in stage 604. The scene analysis stage 604 may make use of various scene data templates 606, accessed via either the memory 2102 or storage 2104.
  • The scene data templates 606 (or 2300) can be representative of scenes typically viewed by the operators and passengers of motor vehicles, such as those described above with respect to scenes 3000, 3100, 4000, 5000.
  • Stage 608 may make a determination as to whether or not the user of the HSC 100 is travelling in (or on) a vehicle. If this is not the case, then stage 610 can be entered, where all available operating modes are active.
  • If the user is determined to be travelling in (or on) a vehicle, stage 612 is entered. At stage 612, a determination is made as to whether or not the user is a passenger in (or on) the vehicle. If the user is determined to be a passenger, then processing can continue to stage 610, where all operating modes are enabled.
  • If the user instead appears to be the operator of the vehicle, stage 614 is entered.
  • At stage 614, one or more operational features or functions of the HSC 100 are enabled or disabled.
  • For example, stage 620-1 can disable the display.
  • Stage 620-2 can disable the wireless communication interfaces such as 3G or 4G cellular.
  • Stage 620-3 can enable only audio functions.
  • In stage 620-4, the display, speaker, and microphones are enabled, with only a Bluetooth interface and cellular voice functions enabled.
  • The Bluetooth (BT) mode 620-4 can permit the driver to place a voice telephone call using an external, safe, in-vehicle Bluetooth system.
  • There may be a way for the user of the HSC 100 to override the driver-detection feature 6000, such as by providing certain specialized commands via the voice recognition functions.
  • While the example embodiments described herein are directed to ground vehicles, those having skill in the art should recognize that embodiments of the disclosed invention can be applied in other environments and in other contexts to ensure safe usage of the HSC 100.
  • The various "data processors" described herein may each be implemented by a physical or virtual general purpose computer having a central processor, memory, disk or other mass storage, communication interface(s), input/output (I/O) device(s), and other peripherals.
  • the general purpose computer is transformed into the processors and executes the processes described above, for example, by loading software instructions into the processor, and then causing execution of the instructions to carry out the functions described.
  • Such a computer may contain a system bus, where a bus is a set of hardware lines used for data transfer among the components of a computer or processing system.
  • The bus or busses are essentially shared conduit(s) that connect different elements of the computer system (e.g., processor, disk storage, memory, input/output ports, network ports, etc.) and that enable the transfer of information between the elements.
  • One or more central processor units are attached to the system bus and provide for the execution of computer instructions.
  • I/O device interfaces connect various input and output devices (e.g., keyboard, mouse, displays, printers, speakers, etc.) to the computer.
  • Network interface(s) allow the computer to connect to various other devices attached to a network.
  • Memory provides volatile storage for computer software instructions and data used to implement an embodiment.
  • Disk or other mass storage provides non-volatile storage for computer software instructions and data used to implement, for example, the various procedures described herein.
  • Embodiments may therefore typically be implemented in hardware, firmware, software, or any combination thereof.
  • The procedures, devices, and processes described herein can constitute a computer program product, including a computer readable medium (e.g., a removable storage medium such as one or more DVD-ROMs, CD-ROMs, diskettes, tapes, etc.) that provides at least a portion of the software instructions for the system.
  • Such a computer program product can be installed by any suitable software installation procedure, as is well known in the art.
  • At least a portion of the software instructions may also be downloaded over a cable, communication, and/or wireless connection.
  • Embodiments may also be implemented as instructions stored on a non-transient machine-readable medium, which may be read and executed by one or more processors.
  • A non-transient machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device).
  • A non-transient machine-readable medium may include read only memory (ROM); random access memory (RAM); storage including magnetic disk storage media; optical storage media; flash memory devices; and others.
  • Firmware, software, routines, or instructions may be described herein as performing certain actions and/or functions. However, it should be appreciated that such descriptions contained herein are merely for convenience and that such actions in fact result from computing devices, processors, controllers, or other devices executing the firmware, software, routines, instructions, etc.
  • The block and network diagrams may include more or fewer elements, be arranged differently, or be represented differently. It further should be understood that certain implementations may dictate that the block and network diagrams, and the number of block and network diagrams illustrating the execution of the embodiments, be implemented in a particular way.
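A pseudocode-style sketch of the FIG. 6 flow described in the items above; the stage numbers mirror the text, while the sensor, camera, and feature-control calls on the hypothetical `hsc` object are assumed device interfaces, not APIs taken from the patent.

```python
def process_6000(hsc):
    # Compare the current speed and/or acceleration against the threshold (e.g., 4 MPH).
    if not hsc.motion_above_threshold():
        hsc.enable_all_modes()                     # stage 610: all operating modes active
        return
    image = hsc.camera.capture()                   # stage 602: capture one or more images
    scene = hsc.scene_analysis(image, hsc.scene_templates)  # stage 604, using templates 606
    if not scene.in_vehicle:                       # stage 608: travelling in (or on) a vehicle?
        hsc.enable_all_modes()                     # stage 610
        return
    if scene.is_passenger:                         # stage 612: passenger rather than driver?
        hsc.enable_all_modes()                     # stage 610
        return
    # Stage 614: the wearer appears to be the driver, so restrict features (620-1 .. 620-4).
    hsc.disable("display")                         # 620-1: disable the display
    hsc.disable("cellular_3g_4g")                  # 620-2: disable 3G/4G cellular
    hsc.enable("audio_only")                       # 620-3: audio functions only
    hsc.enable("bluetooth")                        # 620-4: allow calls via the in-vehicle BT system
```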

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Navigation (AREA)
  • Telephone Function (AREA)
  • Traffic Control Systems (AREA)

Abstract

Operating conditions for a headset computer are determined using input from a speed sensor or accelerometer together with the results of scene analysis performed on images captured by a camera embedded in the headset computer. If the headset is travelling above a predetermined speed, and if the scene analysis returns a decision that the wearer is sitting in the driver's seat of a vehicle, then one or more features of the headset computer are disabled or restricted. For example, the headset computer may disable display or mobile phone operation, change audio interface options, or take other actions.

Description

ENABLING AND DISABLING FEATURES OF A HEADSET COMPUTER BASED ON REAL-TIME IMAGE ANALYSIS
RELATED APPLICATION(S)
[0001] This application is a continuation of U.S. Application No. 13/837,048 filed March 15, 2013 which claims the benefit of U.S. Provisional Application No. 61/665,400, filed on June 28, 2012, the entire teachings of which are incorporated herein by reference.
BACKGROUND
[0002] Mobile computing devices, such as notebook personal computers (PCs), Smartphones, and tablet computing devices, are now common tools used for producing, analyzing, communicating, and consuming data in both business and personal life. Consumers continue to embrace a mobile digital lifestyle as the ease of access to digital information increases with high-speed wireless communications technologies becoming ubiquitous. Popular uses of mobile computing devices include displaying large amounts of high-resolution computer graphics information and video content, often wirelessly streamed to the device. While these devices typically include a display screen, the preferred visual experience of a high-resolution, large-format display cannot be easily replicated in such mobile devices because the physical size of such devices is limited to promote mobility. Another drawback of the aforementioned device types is that the user interface is hands-dependent, typically requiring a user to enter data or make selections using a keyboard (physical or virtual) or touch-screen display. As a result, consumers are now seeking a hands-free, high-quality, portable, color display solution to augment or replace their hands-dependent mobile devices.
SUMMARY
[0003] The present disclosure relates to human/computer interfaces and more particularly to a headset computer that determines when a user may be wearing the headset computer while in a potentially unsafe situation, such as when operating a vehicle. If the potentially unsafe condition is detected, one or more operational features of the headset computer are disabled.
[0004] Recently developed micro-displays can provide large-format, high-resolution color pictures and streaming video in a very small form factor. One application for such displays can include integration into a wireless headset computer worn on the head of the user with the display positioned within the field of view of the user, similar in format to eyeglasses, an audio headset, or video eyewear. A "wireless computing headset" device includes one or more small, high-resolution micro-displays and optics to magnify the image. The WVGA micro-displays can provide super video graphics array (SVGA) (800 x 600) resolution or extended graphics array (XGA) (1024 x 768) or even higher resolutions. A wireless computing headset contains one or more wireless computing and communication interfaces, enabling data and streaming video capability, and provides greater convenience and mobility than hands-dependent devices.
[0005] For more information concerning such devices, see co-pending U.S. Application No. 12/348,646, entitled "Mobile Wireless Display Software Platform for Controlling Other Systems and Devices," by Parkinson et al., filed January 5, 2009; PCT International Application No. PCT/US09/38601, entitled "Handheld Wireless Display Devices Having High Resolution Display Suitable For Use as a Mobile Internet Device," by Jacobsen et al., filed March 27, 2009; and U.S. Application No. 61/638,419, entitled "Improved Headset Computer," by Jacobsen et al., filed April 25, 2012, each of which is incorporated herein by reference in its entirety.
[0006] A headset computer (HSC) may also be referred to as a headset computing device or headmounted device (HMD) herein. A headset computer can be equipped with a camera and other sensors, such as a speed or acceleration sensor. An image can be captured with the camera. The captured image can be processed using image processing techniques to perform feature extraction. Feature extraction can be performed locally at the headset computer (e.g., by the HSC processor) or remotely by a networked processor, for example in the cloud. The combination of the detected image features and the current speed and/or acceleration information can be used to determine if the current environment is safe for operating the headset computer. The operation of the headset computer functions or features can be modified based on the results of the safety determination. If an unsafe condition is detected, the operations, functions, and/or features controlled can include powering down the HSC to an "off" state, or operating the HSC in an "audio-only" mode, in which the display is disabled and turned off. If an unsafe condition is not detected, the HSC can operate unrestricted.
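As a rough illustration of the local-versus-remote feature extraction choice described above, the following sketch shows how a headset might pick between on-device and cloud processing; all helper callables and parameter names are hypothetical placeholders, not actual HSC APIs.

```python
from typing import Callable, List

def extract_features(image_bytes: bytes,
                     local_extractor: Callable[[bytes], List[str]],
                     cloud_extractor: Callable[[bytes], List[str]],
                     network_available: bool,
                     prefer_offload: bool) -> List[str]:
    """Return a list of detected scene features (e.g., 'steering_wheel', 'gauges')."""
    if prefer_offload and network_available:
        # Off-load the heavier analysis to a networked processor ("the cloud").
        return cloud_extractor(image_bytes)
    # Otherwise run lighter-weight feature extraction on the headset's own processor.
    return local_extractor(image_bytes)
```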
[0007] In an example embodiment, operating conditions for a headset computer are determined using input from a speed sensor or accelerometer together with the results of scene analysis (e.g., image processing with feature extraction) performed on images captured by the camera integrated with the headset computer. If the HSC is travelling above a predetermined speed or acceleration threshold, and if the scene analysis returns a decision that the wearer is apparently sitting in the driver's seat of a motor vehicle, then one or more operating features or functions of the headset computer can be disabled or restricted. For example, the display can be disabled, the mobile phone operation can be restricted, audio interface options can be changed, or other actions can be controlled.
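A minimal sketch of that two-part test, assuming illustrative feature names and using the 4 MPH figure mentioned later in this document as the example threshold:

```python
SPEED_THRESHOLD_MPH = 4.0   # example threshold; the text cites 4 MPH as one possibility

def decide_restrictions(speed_mph: float, accel_above_threshold: bool, scene: str) -> set:
    """scene is the scene-analysis verdict: 'driver', 'passenger', or 'not_in_vehicle'."""
    moving = speed_mph > SPEED_THRESHOLD_MPH or accel_above_threshold
    if moving and scene == "driver":
        # Restrict only when both tests agree, e.g., disable the micro-display and 3G/4G radio.
        return {"micro_display", "cellular_3g_4g"}
    return set()   # a passenger or a stationary user keeps full functionality
```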
[0008] The scene analysis may detect the presence of a steering wheel, manufacturers' logos, handlebars, gauges, levers, or other elements indicative of what an operator of a vehicle typically sees while operating the vehicle.
[0009] Further, the scene analysis may account for a typical view from the perspective of a passenger of a vehicle, when determining whether the user of the headset computer is driving.
[0010] Generally, in accordance with principles of the present invention, the HSC can automatically turn off its display or control other features when the user wearing the HSC is attempting to operate, or is operating, a moving vehicle. Thus, the driver/user is protected against the temptation to use the HSC while driving and thereby causing a potentially dangerous situation. At the same time, a passenger can continue to use a fully functional HSC while travelling in a vehicle.
[0011] Example embodiments using both (i) speed and/or acceleration data and (ii) scene analysis results provide additional fidelity that is useful compared to using either in isolation.
[0012] An example method of controlling operation of a headset computer, according to principles of the present invention, includes determining whether an acceleration or a velocity of the headset computer is greater than a predetermined threshold, capturing an image from a perspective of a user of the headset computer using a camera of the headset computer, comparing the captured image against one or more template images representative of elements of a vehicle as seen by occupants of the vehicle, and disabling one or more features of the headset computer based on the comparison of the captured image and the template images indicating that the user of the headset computer is operating a vehicle.
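The text leaves the image-comparison algorithm open; as one illustration, a normalized cross-correlation template match (here via OpenCV, which is an assumed choice, not a library named in the patent) could implement the comparison step:

```python
import cv2

def template_matches(captured_gray, template_gray, threshold=0.7):
    """Return True if the template (e.g., a steering wheel crop) is found in the capture."""
    result = cv2.matchTemplate(captured_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_score, _, _ = cv2.minMaxLoc(result)
    return max_score >= threshold

def user_is_operating_vehicle(captured_gray, driver_templates, threshold=0.7):
    # Treat the wearer as the driver if any driver-perspective template matches.
    return any(template_matches(captured_gray, t, threshold) for t in driver_templates)

# Example usage (grayscale images loaded from disk; file names are illustrative):
# capture = cv2.imread("capture.png", cv2.IMREAD_GRAYSCALE)
# wheel = cv2.imread("templates/steering_wheel.png", cv2.IMREAD_GRAYSCALE)
# print(user_is_operating_vehicle(capture, [wheel]))
```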
[0013] For example, the one or more features disabled can include operation of a micro-display or a 3G/4G cellular radio.
[0014] Example methods of controlling operation of a headset computer can further include enabling one or more features of the headset computer based on the comparison of the captured image and the template images indicating that the user of the headset computer is not operating a vehicle.
[0015] Further, the one or more features enabled can include operation of the headset computer in an audio-only mode or operation of the headset computer wireless communications in a Bluetooth-only mode.
[0016] One or more of the template images can be stored in a local memory of the headset computer or in a non-local memory accessible to the HSC.
[0017] An example method can further include determining a current global positioning location of the headset computer and an associated jurisdiction based on the current location, and updating the one or more template images to reflect a right-hand drive or left-hand drive vehicle based on the determined jurisdiction.
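One possible realization of the jurisdiction step, assuming the GPS fix has already been reverse-geocoded to a country code; the table below is a small illustrative subset, not an exhaustive mapping:

```python
# Illustrative subset only; a real implementation would consult a complete jurisdiction table.
DRIVE_SIDE_BY_COUNTRY = {
    "US": "left-hand-drive",   # steering wheel on the left
    "CA": "left-hand-drive",
    "GB": "right-hand-drive",  # steering wheel on the right
    "JP": "right-hand-drive",
    "AU": "right-hand-drive",
}

def templates_for_location(country_code: str, template_sets: dict) -> list:
    """Pick the template set matching the drive side typical of the jurisdiction."""
    side = DRIVE_SIDE_BY_COUNTRY.get(country_code, "left-hand-drive")
    return template_sets.get(side, [])
```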
[0018] The elements compared can include any one of the following: a steering wheel, manufacturer logos, speedometer, tachometer, fuel level gauge, battery gauge, oil pressure gauge, temperature gauge, stick shift, heating/air conditioning vents, a windshield orientation relative to a side window(s), car doors, and navigation systems.
[0019] According to principles of the present invention, a headset computer can have a micro-display, audio components, a camera, a motion sensor, data storage media, and a programmable data processor including one or more data processing machines that execute instructions retrieved from the data storage media. The instructions can be for: (i) determining whether an acceleration or a velocity received from the motion sensor is greater than a predetermined threshold, (ii) capturing image data using the camera, (iii) processing the image data to extract one or more image features, (iv) combining the image features and velocity and/or acceleration information to determine if a current environment is safe for operating at least one function of the headset computer, and (v) selectively enabling or disabling the headset computer function depending on the result of determining if the current environment is safe.
[0020] In example embodiments, for a determination that the current environment is not safe, the micro-display can be disabled, an audio-only function can be enabled, a 3G/4G cellular radio function can be disabled, and a Bluetooth wireless communications function can be enabled. For a determination that the current environment is safe, the HSC functions can be fully enabled.
[0021] Example embodiments can further include accessing one or more image features from a network-based storage media in the determination if the current environment is safe.
[0022] Another example embodiment can further include a Global Positioning System (GPS) receiver to determine a current location, determine the jurisdiction associated with that location, and further combine a right-hand drive or left-hand drive determination based on the jurisdiction to determine if the current environment is safe or to update an image template.
[0023] The extracted one or more image features can represent any one of the following: a steering wheel, manufacturer logos, speedometer, tachometer, fuel level gauge, battery gauge, oil pressure gauge, temperature gauge, stick shift, heating/air conditioning vents, a windshield orientation relative to a side windows, car doors, and navigation systems.
[0024] A still further example embodiment includes a non-transitory computer program product for controlling operation of a headset computer, the computer program product comprising a computer readable medium having computer readable instructions stored thereon, which, when loaded and executed by a processor, cause the processor to determine whether an acceleration or a velocity of the headset computer is greater than a predetermined threshold, capture an image from a perspective of a user of the headset computer, compare the captured image against one or more template images representative of elements of a vehicle as seen by occupants of the vehicle, and disable or enable one or more features of the headset computer based on the comparison of the captured image and the template images indicating whether the user of the headset computer is operating a vehicle.
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The foregoing will be apparent from the following more particular description of example embodiments of the invention, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating embodiments of the present invention.
[0026] FIG. 1 A is a perspective view of an example embodiment of a headset computer in which the approaches described herein may be implemented.
[0027] FIG. 1B illustrates an example embodiment of a headset computer wirelessly communicating with a host computing device (e.g., Smartphone, PC, etc.) and employing a user interface responsive to voice commands, head motions, and hand movements.
[0028] FIG. 2 is a high-level electronic system block diagram of the components of the headset computer.
[0029] FIGs. 3A and 3B are example scenes including image features taken from inside an automobile from the perspective of the driver and passenger, respectively.
[0030] FIG. 4 is an example scene including image features from the perspective of a motorcycle operator.
[0031] FIG. 5 is an example scene including image features from an operator of an antique tractor.
[0032] FIG. 6 is a flow diagram of a process executed by a processor in the headset to control operation based on speed and scene information.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0033] FIGs. 1A and 1B show an example embodiment of a wireless hands-free computing headset device 100 (also referred to herein as a headset computing device, headset computer (HSC) or headmounted device (HMD)) that incorporates a high-resolution (VGA or better) micro-display element 1010, and other features described below.
[0034] FIG. 1A depicts an HSC 100, which generally includes a frame 1000, strap 1002, housing section 1004, speaker(s) 1006, cantilever or arm 1008, micro-display 1010, and camera 1020. Also located within the housing 1004 are various electronic circuits including, as will be understood shortly, a microcomputer (single or multi-core processor), one or more wired or wireless interfaces and/or optical interfaces, associated memory and/or storage devices, and various sensors.
[0035] A head worn frame 1000 and strap 1002 are generally configured so that a user can wear the headset computer device 100 on the user's head. A housing 1004 is generally a low profile unit which houses the electronics, such as the microprocessor, memory or other storage device, and low power wireless communications device(s), along with other associated circuitry. Speakers 1006 provide audio output to the user so that the user can hear information, such as the audio portion of a multimedia presentation, or an audio prompt, alert, or feedback signaling recognition of a user command.
[0036] Micro-display subassembly 1010 is used to render visual information, such as images and video, to the user. Micro-display 1010 is coupled to the arm 1008. The arm 1008 generally provides physical support such that the micro-display subassembly is able to be positioned within the user's field of view, preferably in front of the eye of the user or within its peripheral vision, preferably slightly below or above the eye. Arm 1008 also provides the electrical or optical connections between the micro-display subassembly 1010 and the control circuitry housed within housing unit 1004.
[0037] The electronic circuits located within the housing 1004 can include display drivers for the micro-display element 1010 and input and/or output devices, such as one or more microphone(s), speaker(s), geo-position sensors, 3-axis to 9-axis degrees of freedom orientation sensing, atmospheric sensors, health condition sensors, GPS, digital compass, pressure sensors, environmental sensors, energy sensors, acceleration, position, altitude, motion, velocity or optical sensors, cameras (visible light, infrared (IR), ultraviolet (UV), etc.), additional wireless radios (Bluetooth®, Wi-Fi®, LTE, 3G Cellular, 4G Cellular, NFC, FM, etc.), auxiliary lighting, range finders, or the like, and/or an array of sensors embedded in the headset frame and/or attached via one or more peripheral ports. (Bluetooth is a registered trademark of Bluetooth SIG, Inc., of Kirkland, Washington; and Wi-Fi is a registered trademark of Wi-Fi Alliance Corporation of Austin, Texas.)
[0038] As illustrated in FIG. 1B, example embodiments of the HSC 100 can receive user input through recognizing voice commands, sensing head movements 110, 111, 112, and hand gestures 113, or any combination thereof. Microphone(s) operatively coupled to or, preferably, integrated into the HSC 100 can be used to capture speech commands, which are then digitized and processed (2310, Fig. 2) using automatic speech recognition (ASR) techniques. Speech can be a primary input interface to the HSC 100, which is capable of detecting a user's voice and, using speech recognition, deriving commands. The HSC 100 then uses the commands derived from the speech recognition to perform various functions.
[0039] Gyroscopes, accelerometers, and other micro-electromechanical system sensors can be integrated into the HSC 100 and used to track the user's head movement to provide user input commands. Cameras or other motion tracking sensors can be used to monitor a user's hand gestures for user input commands. The camera(s), motion sensor(s) and/or positional sensor(s) are used to track the motion and/or position of the user's head, hands and/or body in at least a first axis 111 (horizontal), but preferably also a second (vertical) 112, third (depth) 113, fourth (pitch), fifth (roll) and sixth (yaw). A three-axis magnetometer (digital compass) can be added to provide the wireless computing headset or peripheral device with full 9-axis degrees of freedom positional accuracy. The voice command automatic speech recognition and head motion tracking features of such a user interface overcome the hands-dependent formats of other mobile devices.
[0040] The headset computing device 100 can wirelessly communicate with a remote host computing device 200. Such communication can include streaming video signals received from the host 200, such that the HSC 100 can be used as a remote auxiliary display. The host 200 may be, for example, a notebook PC, Smartphone, tablet device, or other computing device having sufficient computational complexity to communicate with the HSC 100. The host may be further capable of connecting to other networks 210, such as the Internet. The HSC 100 and host 200 can wirelessly communicate via one or more wireless protocols, such as Bluetooth®, Wi-Fi®, WiMAX, or other wireless radio link 150.
[0041] The HSC 100 can also be used as a standalone, fully functional, wireless Internet-connected computer system.
[0042] The HSC 100 with microdisplay 1010 can enable the user to select a field of view 300 within a much larger area defined by a virtual display 400. The user can control the position, extent (e.g., X - Y or 3D range), and/or magnification of the field of view 300.
[0043] The HSC may be embodied in various physical forms such as a monocular head mounted computer as shown, but also as a wearable computer, digital eyewear, electronic eyeglasses, and in other forms.
[0044] In one embodiment the HSC may take the form of the HSC described in co-pending U.S. Patent Application No. 13/018,999, entitled "Wireless Hands-Free Computing Headset With Detachable Accessories Controllable By Motion, Body Gesture And/Or Vocal Commands" by Jacobsen et al., filed February 1, 2011, which is hereby incorporated by reference in its entirety.
[0045] FIG. 2 is a high-level block diagram of the electronic system of the headset computer 100. The electronics system includes a processor 2100, memory 2102, and mass storage 2104, as is typical for any programmable digital computer system. Also included in the electronics system are the microdisplay 2110, one or more microphones 2112, 2114, speakers 2106, 2108, wireless communication module(s) 2105, camera 2120, and accelerometer 2150 or other speed sensors 2200, such as a Global Positioning System (GPS) receiver that can deliver speed and/or acceleration information.
[0046] In order to determine whether to restrict or inhibit certain features of the HSC 100 due to an unsafe environment, such as the operation of a vehicle by the HSC 100 user, the processor 2100 executes instructions 2510 that are stored in the memory 2102 and accesses data stored in the memory 2102 and/or storage 2104. The processor 2100 may for example execute instructions 2510 embodied as software code. The processor 2100 may also make use of an operating system 2400 and applications 2410 running within the context of the operating system 2400 to provide various functions.
[0047] In an example embodiment, the processor 2100 can execute stored instructions 2510 to perform image capture 2350 and perform scene analysis 2360. The instructions to perform image capture 2350 may include calls for the camera 2120 (1020 in Fig. 1A) to first activate autofocusing, autobalancing and/or other image capturing features, and then take a picture. Performing scene analysis 2360 can determine whether or not the image data contains some specific object, feature, element or activity. Scene analysis 2360 can be performed in any of a variety of ways, including, for example, object or feature recognition, identification, or detection, and can include content-based image retrieval. The image capture 2350 and scene analysis 2360 preferably occur in real time and, therefore, are preferably implemented as low-level system calls, or even kernel-level functions in the operating system 2400. But in some instances image capture 2350 and scene analysis 2360 may also be implemented as applications 2410 running on top of the operating system 2400.
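A minimal capture routine along these lines might look like the following sketch, which uses OpenCV only as a stand-in for the headset's camera driver; the camera index and warm-up frame count are assumptions, and a real implementation would sit at the system-call or kernel level as described above.

```python
# Illustrative image-capture step (2350): let auto-focus/auto-balance settle,
# then grab a single frame for scene analysis. OpenCV stands in here for the
# headset's camera driver; the camera index and warm-up count are assumptions.
import cv2

def capture_frame(camera_index: int = 0, warmup_frames: int = 5):
    cap = cv2.VideoCapture(camera_index)
    if not cap.isOpened():
        raise RuntimeError("camera not available")
    try:
        for _ in range(warmup_frames):   # discard frames while exposure/focus settle
            cap.read()
        ok, frame = cap.read()           # the frame handed to scene analysis 2360
        if not ok:
            raise RuntimeError("image capture failed")
        return frame
    finally:
        cap.release()
```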
[0048] The memory 2102 and/or storage 2104 not only store instructions 2510 for the processor to carry out, but can also store one or more scene data templates 2300. Scene data templates 2300 are digital representations of images typically seen by the operator and/or occupants of a motor vehicle.
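By way of illustration only, a scene data template could be represented by a small record such as the one sketched below; the field names and example contents are assumptions, not the data format used by the HSC 100.

```python
# Illustrative representation of a scene data template 2300. The field names
# and example contents are assumptions, not the format used by the HSC 100.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SceneTemplate:
    name: str                 # e.g. "sedan_driver_lhd"
    role: str                 # "driver" or "passenger"
    drive_side: str           # "LHD" (left-hand drive) or "RHD"
    zoom_level: float         # camera zoom this template was prepared for
    features: List[str] = field(default_factory=list)  # elements expected in view

DRIVER_LHD = SceneTemplate(
    name="sedan_driver_lhd",
    role="driver",
    drive_side="LHD",
    zoom_level=1.0,
    features=["steering_wheel", "speedometer", "tachometer", "stick_shift"],
)
```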
[0049] More particularly, the processor 2100 is programmed to automatically use the embedded camera 2120 and accelerometer 2150 to determine when a vehicle operator is wearing the headset computer 100. When the HSC 100 determines that such a condition exists, one or more features of the HSC 100 are then disabled. However, even when the accelerometer 2150 (or GPS 2200, etc.) indicates the vehicle is moving above the predetermined speed, if the scene analysis 2360 concludes that the user of the headset computer is not operating the vehicle, but rather is a passenger in the vehicle, the HSC may remain fully functional. The combination of the speed or acceleration sensor 2150, 2200 and scene analysis 2360 provides a useful safety feature for drivers while providing a pleasant experience for passengers. Passengers are able to fully use and enjoy the HSC 100 while traveling in a motor vehicle, while the automatic shut-off safety feature prevents the driver of the vehicle from using the HSC 100 entirely, or at least limits the driver to certain features known to be safe for the situation. In such a diminished operating mode, the HSC 100 may enable only the audio functions, and/or other functions, such as just the Bluetooth connectivity function. Thus, the driver may still be able to use the Bluetooth audio system built into the vehicle to make calls using the 3G/4G cellular radios in the HSC 100 or stream other audio content.
[0050] FIGs. 3A and 3B illustrate image data representing typical scene data 2300 that can be stored in the HSC 100 and that is representative of an image captured by the camera 2120.
[0051] FIG. 3A is a scene 3000 of the components inside a vehicle taken from the perspective of a driver. The primary recognizable element or image feature of the scene 3000 is a steering wheel 3010. But other elements or image features of the scene 3000 can be useful in scene analysis 2360, including manufacturer logos 3012 (in the center of the steering wheel 3010), a speedometer 3014, tachometer 3016, fuel level gauge 3018 and other gauges, operator controls such as a stick shift 3021, heating/air conditioning vents 3023, the relative orientation of the windshield 3025 and side windows 3027, the presence of car doors 3029 and floors 3031, and other instruments located to the side of the dashboard, such as navigation systems 3033. Image features that specify the relative orientation of doors 3029, windshields 3025, and side windows 3027 for both left-hand and right-hand drive automobiles can be included in image templates and scene data 2300.
[0052] Stored scene data 2300 or template images can include data for both right-hand and left-hand drive vehicles. Further, such stored scene data 2300 can include jurisdictional data. The jurisdictional data can include the geographical locations of jurisdictions and whether each is a left-hand drive or right-hand drive jurisdiction. For example, an HSC 100 with GPS can provide location information, which can then be used to determine the jurisdiction in which the HSC 100 is located. Such jurisdictional information can be used to prioritize scene analysis for left-hand drive or right-hand drive vehicles. For example, if the GPS determines that the HSC 100 is located in Canada, then scene analysis for a left-hand drive vehicle can be prioritized.
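One way such prioritization could be realized is sketched below under assumed conventions: a coarse lookup from country code to drive side selects which class of templates is tried first. The country table is a small, assumed sample and is not part of the disclosure.

```python
# Illustrative sketch: use a coarse jurisdiction lookup to decide whether
# right-hand-drive (RHD) or left-hand-drive (LHD) templates are tried first.
# The country table is a small, assumed sample, keyed by ISO country code.

RHD_JURISDICTIONS = {"GB", "IE", "JP", "AU", "NZ", "IN", "ZA"}  # traffic keeps left

def template_priority(country_code: str) -> list:
    """Return the order in which scene analysis should try template categories."""
    if country_code.upper() in RHD_JURISDICTIONS:
        return ["RHD", "LHD"]
    return ["LHD", "RHD"]   # e.g. Canada or the United States: LHD vehicles first

if __name__ == "__main__":
    print(template_priority("CA"))   # ['LHD', 'RHD']
```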
[0053] The stored scene elements 2300 can also account for the possible zoom settings of the camera 2120. For example, on some zoom settings only a portion of the dashboard may be visible (such as only a portion of the wheel 3010 and a few gauges 3018), whereas on other zoom settings the windshield 3025, side windows 3027, doors 3029 and even portions of the floor 3031 may be visible. Such various possibilities can be accounted for by storing the scene data in particularly efficient ways, for example, by storing multiple versions of a given scene for different zoom levels or by using hierarchical scene element models.
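A minimal sketch of the multiple-versions-per-zoom approach follows; the zoom levels and template names are assumptions chosen only to illustrate the selection step.

```python
# Illustrative sketch: keep template sets keyed by camera zoom level and select
# the nearest set at analysis time. Zoom levels and template names are assumptions.

TEMPLATES_BY_ZOOM = {
    1.0: ["dashboard_wide", "windshield", "doors", "floor"],
    2.0: ["steering_wheel_full", "gauge_cluster"],
    4.0: ["steering_wheel_partial", "single_gauge"],
}

def templates_for_zoom(zoom: float) -> list:
    """Pick the template set whose stored zoom level is closest to the current zoom."""
    nearest = min(TEMPLATES_BY_ZOOM, key=lambda z: abs(z - zoom))
    return TEMPLATES_BY_ZOOM[nearest]

if __name__ == "__main__":
    print(templates_for_zoom(3.5))   # falls back to the 4.0x template set
```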
[0054] Stored scene data 2300 may also include representations of vehicle occupant scenes such as scene 3100 of FIG. 3B, which is typical of what is viewed by a passenger in the front seat. While some elements remain the same (such as the presence of a navigation system 3033 and stick shift 3021), here they are located on the opposite side of the view or scene 3100 compared to the driver scene 3000 of FIG. 3A. Most prominently, however, the scene 3100 is missing the steering wheel 3010 and gauges 3018, and includes other indicative items, such as a glovebox 3110.
[0055] Any convenient known scene analysis (image recognition) algorithm may be used by scene analysis 2360 to compare the images obtained by image capture 2350 against the scene data templates 2300. Such algorithms should preferably be relatively high-speed, since the user's access to the device or device features is being controlled. The algorithms are preferably carried out in real time and, therefore, can be embodied as high-priority operating system calls, interrupts, or even embedded in the operating system kernel, depending on the processor type and operating system selected for implementation.

[0056] In an alternative example embodiment, the processor 2100 can execute stored instructions 2510 to perform image capture 2350, upload the scene data to a host 200 for cloud-based scene analysis, and receive a scene analysis decision. By utilizing cloud-based resources, a cloud-based scene analysis can be more computationally intense than the scene analysis 2360 performed on board the HSC 100 (i.e., locally). Cloud-based scene analysis can have access to a vast library of vehicle scenes that may be impractical to store in the local memory 2102 due to resource limitations. Cloud-based scene analysis, in coordination with an appropriate scene analysis (image recognition) algorithm chosen to enable sufficiently quick processing and decision making, can also be used to limit the user's access to operational features of the HSC 100. Such cloud-based analysis can be useful to unburden and off-load some of the memory-intense and computationally intense processes from the HSC 100.
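As one concrete, non-limiting example of such a matcher, the sketch below compares a captured frame against stored template images using OpenCV normalized cross-correlation; the 0.7 score threshold and the template labels are assumptions, and any other recognition algorithm (local or cloud-based) could be substituted.

```python
# Illustrative scene analysis (2360): compare a captured frame against stored
# template images using normalized cross-correlation and return the best match.
# OpenCV template matching is one possible matcher; the 0.7 score threshold is
# an assumption.
import cv2

def best_template_match(frame, templates: dict, threshold: float = 0.7):
    """templates maps a label (e.g. 'driver_scene_3000') to a grayscale image."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    best_label, best_score = None, 0.0
    for label, template in templates.items():
        result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, _ = cv2.minMaxLoc(result)   # peak correlation score
        if score > best_score:
            best_label, best_score = label, score
    if best_score < threshold:
        return None, best_score          # no stored scene matched well enough
    return best_label, best_score
```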
[0057] FIG. 4 is a scene 4000 typical of the operator of a motorcycle. Here, elements such as handlebars 4010, gas tanks and gauges 4014, mirrors 4028, and shifter 4021 can be included in the scene data templates 2300.
[0058] FIG. 5 is a scene 5000 from the perspective of an operator of an antique tractor. In scene 5000, the operator may be sitting very close to a very large steering wheel 5010 and, therefore, only a few portions 5012 of the steering wheel 5010 are visible. Other elements may include gauge(s) 5018, levers 5021 and hood section 5033 of the tractor that can be extracted as image features for recognition of scene 5000.
[0059] FIG. 6 is a flow diagram of a process 6000 that can be executed by the processor 2100 to implement control over the HSC 100 using the speed sensor 2150 and scene analysis 2360. In a first stage 600, a speed and/or acceleration is determined and compared to a threshold. For example, the accelerometer 2150 or GPS 2200 may indicate rapid acceleration or a constant speed above a certain amount, such as 4 miles per hour (MPH).
[0060] If the speed and/or acceleration are low (i.e., less than the threshold), then the processing moves forward to stage 610, where all features, modes and functions of the headset computer 100 may be enabled.

[0061] However, if the acceleration or speed is above the predetermined amount (i.e., greater than the threshold), then stage 602 is entered. At stage 602, one or more images are captured using the camera 2120. The images captured in stage 602 are then processed by scene analysis 2360 in stage 604. The scene analysis stage 604 may make use of various scene data templates 606, accessed via either the memory 2102 or storage 2104. The scene data templates 606 (or 2300) can be representative of scenes typically viewed by the operators and passengers of motor vehicles, such as those described above with respect to scenes 3000, 3100, 4000, 5000.
[0062] Stage 608 may make a determination as to whether or not the user of the HSC 100 is travelling in (or on) a vehicle. If this is not the case, then stage 610 can be entered, where all available operating modes are active.
[0063] If the scene analysis of stage 608 concludes that the user is inside a vehicle, then stage 612 is entered. At stage 612, a determination is made as to whether or not the user is a passenger in (or on) the vehicle. If the user is determined to be a passenger, then processing can continue to stage 610, where all operating modes are enabled.
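Read as pseudocode, the flow of stages 600 through 612 might be sketched as follows; the 4 MPH threshold comes from the example above, the helper callables are assumed stand-ins for the stages of FIG. 6, and the restricted mode entered at stage 614 is detailed in the next paragraph.

```python
# Illustrative control flow for process 6000. The helper callables
# (read_speed_mph, capture_frame, classify_scene, set_mode) are assumed
# stand-ins for stages 600-614; only the 4 MPH threshold is from the text.

SPEED_THRESHOLD_MPH = 4.0

def process_6000(read_speed_mph, capture_frame, classify_scene, set_mode):
    speed = read_speed_mph()                      # stage 600: speed/acceleration check
    if speed <= SPEED_THRESHOLD_MPH:
        return set_mode("all_features_enabled")   # stage 610
    frame = capture_frame()                       # stage 602: capture image(s)
    scene = classify_scene(frame)                 # stages 604/606/608: compare against
                                                  # templates; assumed to return 'driver',
                                                  # 'passenger', or 'not_in_vehicle'
    if scene in ("not_in_vehicle", "passenger"):  # stages 608 and 612
        return set_mode("all_features_enabled")   # stage 610
    return set_mode("restricted")                 # stage 614 (modes described below)

if __name__ == "__main__":
    # Example with stubbed sensors: a passenger moving at 30 MPH keeps all features.
    process_6000(lambda: 30.0, lambda: None, lambda f: "passenger",
                 lambda mode: print("mode:", mode))
```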
[0064] However, if the wearer is determined to be a driver, then stage 614 is entered. At stage 614, one or more modes, operational features or functions of the HSC 100 are enabled or disabled. As one example, stage 620-1 can disable the display. Stage 620-2 can disable the wireless communication interfaces, such as 3G or 4G cellular. Stage 620-3 can enable only audio functions, such as the microphones and speakers. In stage 620-4, the display, speakers, and microphones remain enabled, but only the Bluetooth interface and cellular voice functions are enabled among the wireless communication features. The Bluetooth (BT) mode 620-4 can permit the driver to place a voice telephone call using an external, in-vehicle, safe, Bluetooth system.
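These stages could be expressed, for illustration only, as a table of feature switches such as the one sketched below; the feature names are assumptions, while the groupings follow stages 620-1 through 620-4 as described above.

```python
# Illustrative mapping of stages 620-1 .. 620-4 to feature switches.
# Feature names are assumptions; the stage groupings follow the text above.

MODES = {
    "620-1": {"display": False},                          # display disabled
    "620-2": {"cellular_3g_4g": False},                   # 3G/4G data disabled
    "620-3": {"display": False, "audio": True},           # audio-only operation
    "620-4": {"display": True, "audio": True,             # display/speakers/microphones on,
              "bluetooth": True, "cellular_voice": True,  # only BT + cellular voice radios
              "cellular_3g_4g": False, "wifi": False},
}

def apply_mode(stage: str, features: dict) -> dict:
    """Return a copy of the feature map with the chosen stage's switches applied."""
    updated = dict(features)
    updated.update(MODES[stage])
    return updated

if __name__ == "__main__":
    baseline = {"display": True, "audio": True, "bluetooth": True,
                "cellular_voice": True, "cellular_3g_4g": True, "wifi": True}
    print(apply_mode("620-4", baseline))
```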
[0065] Other variations are possible. For example, there may be a way for the user of the HSC 100 to override the driver-detection feature 6000, such as by providing certain specialized commands via the voice recognition functions.
[0066] Although the example embodiments described herein are limited to ground vehicles, those having skill in the art should recognize that embodiments of the disclosed invention can be applied in other environments and in other contexts to ensure safe usage of the HSC 100.

[0067] It should be understood that the example embodiments described above may be implemented in many different ways. In some instances, the various "data processors" described herein may each be implemented by a physical or virtual general purpose computer having a central processor, memory, disk or other mass storage, communication interface(s), input/output (I/O) device(s), and other peripherals. The general purpose computer is transformed into the processors and executes the processes described above, for example, by loading software instructions into the processor, and then causing execution of the instructions to carry out the functions described.
[0068] As is known in the art, such a computer may contain a system bus, where a bus is a set of hardware lines used for data transfer among the components of a computer or processing system. The bus or busses are essentially shared conduit(s) that connect different elements of the computer system (e.g., processor, disk storage, memory, input/output ports, network ports, etc.) and that enable the transfer of information between the elements. One or more central processor units are attached to the system bus and provide for the execution of computer instructions. Also typically attached to the system bus are I/O device interfaces for connecting various input and output devices (e.g., keyboard, mouse, displays, printers, speakers, etc.) to the computer. Network interface(s) allow the computer to connect to various other devices attached to a network. Memory provides volatile storage for computer software instructions and data used to implement an embodiment. Disk or other mass storage provides non-volatile storage for computer software instructions and data used to implement, for example, the various procedures described herein.
[0069] Embodiments may therefore typically be implemented in hardware, firmware, software, or any combination thereof.
[0070] In certain embodiments, the procedures, devices, and processes described herein are a computer program product, including a computer readable medium (e.g., a removable storage medium such as one or more DVD-ROMs, CD-ROMs, diskettes, tapes, etc.) that provides at least a portion of the software instructions for the system. Such a computer program product can be installed by any suitable software installation procedure, as is well known in the art. In another embodiment, at least a portion of the software instructions may also be downloaded over a cable, communication and/or wireless connection.
[0071] Embodiments may also be implemented as instructions stored on a non-transient machine-readable medium, which may be read and executed by one or more processors. A non-transient machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device). For example, a non-transient machine-readable medium may include read only memory (ROM); random access memory (RAM); storage including magnetic disk storage media; optical storage media; flash memory devices; and others.
[0072] Furthermore, firmware, software, routines, or instructions may be described herein as performing certain actions and/or functions. However, it should be appreciated that such descriptions contained herein are merely for convenience and that such actions in fact result from computing devices, processors, controllers, or other devices executing the firmware, software, routines, instructions, etc.
[0073] It also should be understood that the block and network diagrams may include more or fewer elements, be arranged differently, or be represented differently. But it further should be understood that certain implementations may dictate the block and network diagrams and the number of block and network diagrams illustrating the execution of the embodiments be implemented in a particular way.
[0074] Accordingly, further embodiments may also be implemented in a variety of computer architectures, physical, virtual, cloud computers, and/or some combination thereof, and thus the computer systems described herein are intended for purposes of illustration only and not as a limitation of the embodiments.
[0075] Therefore, while this invention has been particularly shown and described with references to example embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention encompassed by the appended claims.


CLAIMS
What is claimed is:
1. A method of controlling operation of a headset computer comprising:
determining whether an acceleration or a velocity of a headset computer is greater than a predetermined threshold;
through a camera of the headset computer, capturing an image from a perspective of a user of the headset computer;
comparing the captured image against one or more template images representative of elements of a vehicle as seen by occupants of the vehicle; and
disabling one or more features of the headset computer based on the comparison of the captured image and the template images indicating that the user of the headset computer is operating a vehicle.
2. The method of controlling operation of a headset computer of Claim 1, wherein the one or more features disabled includes operation of a micro-display.
3. The method of controlling operation of a headset computer of Claim 1, wherein the one or more features disabled includes operation of a 3G/4G cellular radio.
4. The method of controlling operation of a headset computer of Claim 1, further including enabling one or more features of the headset computer based on the comparison of the captured image and the template images indicating that the user of the headset computer is not operating a vehicle.
5. The method of controlling operation of a headset computer of Claim 4, wherein the one or more features enabled includes operation of the headset computer in an audio only mode.
6. The method of controlling operation of a headset computer of Claim 4, wherein the one or more features enabled includes operation of the headset computer wireless communications in a Bluetooth only mode.
7. The method of controlling operation of a headset computer of Claim 1, wherein one or more of the one or more template images are not stored in a local memory of the headset computer.
8. The method of controlling operation of a headset computer of Claim 1, further including:
determining a current global positioning location of the headset computer and associated jurisdiction based thereon; and
updating the one or more template images to reflect right-hand drive or left-hand drive vehicles based on the determined jurisdiction.
9. The method of controlling operation of a headset computer of Claim 1, wherein the elements compared include any one of the following: a steering wheel, manufacturer logos, speedometer, tachometer, fuel level gauge, battery gauge, oil pressure gauge, temperature gauge, stick shift, heating/air conditioning vents, a windshield orientation relative to side windows, car doors, and navigation systems.
10. A headset computer comprising:
a micro-display;
audio components;
a camera;
a motion sensor;
a data storage media;
a programmable data processor comprising one or more data processing machines that execute instructions retrieved from the data storage media, the instructions for:
determining whether an acceleration or a velocity received from the motion sensor is greater than a predetermined threshold;
capturing image data using the camera;
processing the image data to extract one or more image features;
combining the image features and velocity and/or acceleration information to determine if a current environment is safe for operating at least one function of the headset computer; and
selectively enabling or disabling the headset computer function depending on the result of determining if the current environment is safe.
11. The apparatus of Claim 10, wherein the current environment is determined to not be safe, and the micro-display is disabled.
12. The apparatus of Claim 10, wherein the current environment is determined to not be safe, and an audio-only function is enabled.
13. The apparatus of Claim 10, wherein the current environment is determined to be safe, and the headset computer function is fully enabled.
14. The apparatus of Claim 10, wherein the current environment is determined to not be safe, and a 3G/4G cellular radio function is disabled.
15. The apparatus of Claim 10, wherein the current environment is determined to not be safe, and a Bluetooth wireless communications function is enabled.
16. The apparatus of Claim 10, further including accessing one or more image features from a network-based storage media in the determination if the current environment is safe.
17. The apparatus of Claim 10, further including a Global Positioning System (GPS) receiver to determine a current location and jurisdiction associated therewith, and further combining a right-hand drive or left-hand drive determination based on the jurisdiction to determine if the current environment is safe.
18. The apparatus of Claim 10, wherein the extracted one or more image features represent any one of the following: a steering wheel, manufacturer logos, speedometer, tachometer, fuel level gauge, battery gauge, oil pressure gauge, temperature gauge, stick shift, heating/air conditioning vents, a windshield orientation relative to side windows, car doors, and navigation systems.
19. A non-transitory computer program product for controlling operation of a headset computer, the computer program product comprising a computer readable medium having computer readable instructions stored thereon, which, when loaded and executed by a processor, cause the processor to:
determine whether an acceleration or a velocity of a headset computer is greater than a predetermined threshold;
capture an image from a perspective of a user of the headset computer;
compare the captured image against one or more template images representative of elements of a vehicle as seen by occupants of the vehicle; and
disable or enable one or more features of the headset computer based on the comparison of the captured image and the template images indicating that the user of the headset computer is operating a vehicle.

