WO2017147164A1 - Wireless remote input device for head-up display - Google Patents


Info

Publication number
WO2017147164A1
Authority
WO
WIPO (PCT)
Prior art keywords
rotatable wheel
communicating system
wireless
wireless communicating
circuitry
Prior art date
Application number
PCT/US2017/018904
Other languages
French (fr)
Inventor
Jesse MADSEN
Douglas Simpson
Patrick Mulcahy
Brandon Lynne
Sung Ook Yang
Dan Berg
Jon Godston
Original Assignee
Navdy, Inc.
Priority date
Filing date
Publication date
Application filed by Navdy, Inc. filed Critical Navdy, Inc.
Publication of WO2017147164A1 publication Critical patent/WO2017147164A1/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/0312 Detection arrangements using opto-electronic means for tracking the rotation of a spherical or circular member, e.g. optical rotary encoders used in mice or trackballs using a tracking ball or in mouse scroll wheels
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0362 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • a wireless communicating system comprising: (i) a rotatable wheel; (ii) a selectable button fixed relative to the rotatable wheel; (iii) a flexible strap for affixing the rotatable wheel and the selectable button relative to a steering wheel in the vehicle; (iv) circuitry for translating actuation of the selectable button into a wireless signal representative of the actuation; and (v) circuitry for translating rotation of the rotatable wheel into a wireless signal representative of an extent of the rotation.
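Elements (iv) and (v) describe circuitry that turns a button actuation or a wheel rotation into a wireless signal. As a minimal illustrative sketch only (the two-byte report format, field widths, and function names here are assumptions, not taken from the patent), such events could be packed into a compact report before transmission:

```python
def encode_report(rotation_steps: int, button_pressed: bool) -> bytes:
    """Pack one wheel/button event into a hypothetical 2-byte wireless report.

    Byte 0: flags (bit 0 = button state).
    Byte 1: signed rotation delta, in detent steps since the last report.
    """
    if not -128 <= rotation_steps <= 127:
        raise ValueError("rotation delta does not fit in one signed byte")
    flags = 0x01 if button_pressed else 0x00
    return bytes([flags, rotation_steps & 0xFF])


def decode_report(report: bytes) -> tuple[int, bool]:
    """Recover (rotation_steps, button_pressed) from a 2-byte report."""
    flags, delta = report[0], report[1]
    if delta > 127:          # undo the two's-complement wrap
        delta -= 256
    return delta, bool(flags & 0x01)
```

On this sketch, the receiving head-up display device would decode each report and apply the rotation delta to whatever on-screen control currently has focus.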
  • Figure 2a illustrates a side view of an HUD device constructed according to a preferred embodiment.
  • Figure 3 illustrates a perspective view of a wireless communication system 300 for communication with, and inclusion in a system including, the HUD device of the above Figures.
  • Figure 4 illustrates a perspective exploded view of preferred embodiment components of wireless communication system 300 of Figure 3.
  • a remote wireless input device for use with a head-up display device mounted to an automobile dashboard is provided according to these embodiments.
  • An example of a head-up display suitable for use with an input device according to these embodiments is described in copending U.S. application S.N. 14/806,530 filed July 22, 2015, which published as U.S. Patent Application Publication No. US 2016/0025973, and copending U.S. design application S.N. 29/554,431 filed February 11, 2016, both commonly assigned herewith and incorporated herein by reference.
  • the exterior appearance of an example of this input device is also described in copending U.S. design application S.N. 29/554,435 filed February 11, 2016, commonly assigned herewith and incorporated herein by this reference.
  • the wireless remote input device will provide the driver of a vehicle, and user of a head-up display device, with the ability to forward inputs/commands to the head-up display device. It is also contemplated that inputs to the head-up display device may be provided by the wireless remote input device in combination with hand gestures and audio commands; alternatively, the choice of input mechanism, between this input device and hand gestures and audio commands, may be at the user's option during setup or during operation of the vehicle and display device.
  • the construction and operation of a wireless remote input device according to these preferred embodiments is further illustrated and described in the following pages.
  • HUD device 2 in this context sits on top of car dashboard DSH, typically in view by driver DRV above the speedometer and other gauges or operational displays (not shown) provided within dashboard DSH, and above steering wheel SWH.
  • HUD device 2 provides a see-through image that displays information relevant to driver DRV while driving the vehicle, without blocking the view of the road through windshield WSH.
  • HUD device 2 is constructed so as to be portable, easily placed atop dashboard DSH of a variety of vehicles, and easily removable for use in another vehicle or for security purposes, such as when driver DRV is parking the car in a public parking area.
  • HUD device 2 is constructed to have a compact size so that it can sit on top of dashboard DSH, without significantly interfering with the driver's view.
  • The cross-sectional view of Figure 2a schematically illustrates the various components of HUD device 2 according to some embodiments.
  • housing 4 encloses control electronics 6, for example as may be mounted on one or more printed circuit boards, and which carry out the data and image processing involved in the operation of HUD device 2 as will be described below.
  • the architecture and functionality of control electronics 6 will be described in detail below.
  • Housing 4 also encloses projector engine 10 which, for purposes of this description, refers to a projection system, including the optics, light modulation, and light source devices necessary to project an image suitable for use in HUD device 2 according to these embodiments.
  • the optics included in projector engine 10 are contemplated to include some or all of the appropriate lenses, mirrors, light homogenization devices, polarization devices, filters such as dichroic filters that combine light, and such other optical devices known in the art and included in the construction of a modern projector.
  • projector engine 10 is contemplated to also include the appropriate electronics for controlling these elements, as known in the art.
  • projector engine 10 may be affixed along the bottom of housing 4, so as to mount it directly to the same structure as screen 12, as further discussed below.
  • Projector engine 10 projects images rearwardly (i.e., toward driver DRV) to curved screen 12 within screen enclosure 13 mounted near the rear edge of housing 4 in this embodiment.
  • screen 12 is a reflective surface, for example a high-gain curved reflective surface, positioned relative to projector engine 10 so that the light projected by projector engine 10 forms a "real" (i.e., human viewable) image on screen 12. Preferred embodiments involving various constructions of screen 12 will be described in further detail below.
  • screen 12 reflects the real image it displays (from projector engine 10) in a forward direction (i.e., toward the windshield) to combiner 14.
  • Figure 2a shows combiner 14 as physically coupled to housing 4 by way of hinge 16C.
  • screen 12 is preferably physically coupled directly to housing 4 and, even further, directly to the same portion of the housing (e.g., the bottom) as is projector engine 10.
  • screen 12 may be physically coupled to housing 4 by way of a hinge (not shown).
  • Hinge 16C enables the angle of combiner 14 to be rotationally adjusted about its axis, so as to receive the image reflected by screen 12. This adjustability ensures good visibility of the image displayed to driver DRV for a variety of dashboard DSH geometries (i.e., regardless of the flatness of the top surface of dashboard DSH) and with minimal distortion of the image, as will be described in further detail below.
  • the surface of combiner 14 may have a 30% reflective coating, in which case 30% of the reflected light from screen 12 will be reflected toward driver DRV, while roughly 70% of the external light received through windshield WSH will be transmitted through combiner 14 to be visible to driver DRV.
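The 30%/70% split described above is simple arithmetic: with a coating of reflectance R, roughly a fraction R of the image light from screen 12 is reflected toward the driver, while roughly 1 - R of the outside scene is transmitted through the combiner. A small illustrative helper (not from the patent; absorption and other losses are ignored):

```python
def combiner_split(image_luminance: float, scene_luminance: float,
                   reflectance: float = 0.30) -> tuple[float, float]:
    """Return (image, scene) luminance as seen through a partially
    reflective combiner, ignoring absorption and other losses."""
    image_seen = image_luminance * reflectance          # reflected toward driver
    scene_seen = scene_luminance * (1.0 - reflectance)  # transmitted through
    return image_seen, scene_seen
```

Raising the reflectance brightens the displayed image at the cost of dimming the driver's view of the road, which is the basic trade-off in choosing the coating.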
  • the particular construction of combiner 14 according to embodiments will be described in further detail below.
  • the degree of curvature of screen 12 is selected so that the light rays reflected from the surface of screen 12 to combiner 14, and reflected from combiner 14, are focused at the eye pupils of driver DRV and with preferably a minimal amount of varying brightness and/or distortion.
  • spherical surfaces are concave (from an inner perspective) or convex (from an outer perspective) surfaces that approximate a section of the surface of a sphere.
  • substantially spherical refers to a surface that is not perfectly spherical but is sufficiently close to being spherical so as to behave similarly to a perfectly spherical surface within the context of these embodiments.
  • screen 12 is constructed to have a "substantially spherical" surface, meaning that the surface behaves similarly to one that is perfectly spherical for purposes of preferred embodiments, but is not perfectly spherical, specifically by being slightly aspherical so as to help correct for the keystone distortion or barrel distortion, or both, resulting from the tilt and curvature of the inner surface of screen 12.
  • screen 12 is constructed to have a convex cylindrical surface.
  • distortions may be corrected for optically by the design of the projector lens in projector engine 10, or by also making combiner 14 slightly aspherical (while remaining "substantially spherical" as defined above), or by digital processing of the image being projected to pre-distort the image so it will look correct at combiner 14 as viewed by driver DRV, or by a combination of these techniques.
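As one hedged illustration of the digital pre-distortion option mentioned above (the linear keystone model, the `spread` factor, and the function name are assumptions, not taken from the patent): if the tilt stretches the top of the projected image horizontally relative to the bottom, each row can be pre-shrunk by the inverse factor so the result appears rectangular at combiner 14:

```python
def predistort_x(x: float, y: float, height: float, spread: float = 1.2) -> float:
    """Map a source x coordinate in [-1, 1] at row y (0 = bottom row) to a
    pre-distorted x, assuming the optics stretch the top row by `spread`
    relative to the bottom row (a simple linear keystone model)."""
    row_gain = 1.0 + (spread - 1.0) * (y / height)  # optical stretch at this row
    return x / row_gain                             # cancel it in advance
```

A full implementation would apply such a mapping per pixel (or as a warp mesh) in the image pipeline before the frame reaches projector engine 10.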
  • the measurement of gain at this point is known as "Peak Gain at Zero Degrees Viewing Axis".
  • Surfaces having a gain of 1.0 include a block of magnesium carbonate (MgCO3) and a matte white screen.
  • a screen having a gain above 1.0 will reflect brighter light than that projected; for example, a screen rated at a gain of 1.5 reflects 50% more light in the direction normal to the screen than a screen rated at a gain of 1.0.
  • screens with a gain greater than 1.0 do not reflect light at the same brightness at all viewing angles. Rather, if one moves to the side so as to view the screen at an angle, the brightness of the projected image will drop.
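This angular falloff can be modeled, purely for illustration (the cosine-power lobe and its exponent are modeling assumptions, not the patent's optics), as a brightness lobe that narrows as peak gain increases:

```python
import math

def relative_brightness(peak_gain: float, view_angle_deg: float) -> float:
    """Brightness relative to a unity-gain (matte white) screen viewed
    on-axis, under an assumed cosine-power falloff model."""
    if peak_gain <= 1.0:
        return peak_gain                  # matte screens scatter uniformly
    exponent = 2.0 * (peak_gain - 1.0)    # assumed lobe-width relation
    return peak_gain * math.cos(math.radians(view_angle_deg)) ** exponent
```

The qualitative behavior matches the text: a gain-1.5 screen is 50% brighter than matte on-axis, but its advantage shrinks as the viewing angle grows.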
  • FIG. 2c illustrates HUD device 2 from the rear (i.e., from the viewpoint of driver DRV of Figure 1).
  • driver DRV sees the back surface of a screen enclosure 13, within which screen 12 ( Figure 2a) is disposed so as to face projector engine 10 (shown in shadow in Figure 2c).
  • the image presented by the light projected by projector engine 10 forms image IMG 12 (not shown) at screen 12.
  • This image IMG 12 will reflect from screen 12 and appear on combiner 14 as image IMG 14, as shown in Figure 2c.
  • Image IMG 14 thus presents graphics and other visual information generated by control electronics 6 within housing 4 as appropriate for the particular functions being executed, in a manner that is visible to driver DRV.
  • Screen enclosure 13 serves to block light emitted by projector engine 10 from directly reaching driver DRV, as evident in the view of Figure 2c.
  • driver DRV will be able to see the road ahead through combiner 14, with image IMG 14 effectively overlaid onto that view of the road.
  • FIGS 2a through 2c also illustrate various auxiliary components of HUD device 2 that may be implemented in various embodiments.
  • Rear-facing camera 18R is mounted, in this embodiment, on the driver side of screen enclosure 13, and as such is aimed at driver DRV.
  • image data acquired by rear-facing camera 18R are communicated to control electronics 6, which processes those data to identify gestures made by driver DRV and carry out various control functions responsive to those identified gestures.
  • rear-facing camera 18R is sensitive to infrared light, and an infrared illuminant 19 (e.g., an LED emitting infrared light) is mounted on the driver-side surface of screen enclosure 13 and also facing driver DRV.
  • gesture-detection technologies alternatively may be implemented in place of or in addition to rear-facing camera 18R, examples of which include depth sensors, photometric stereo sensors, and dual camera arrangements.
  • Other aspects also may be included on the front of screen enclosure 13, including an on/off button 20 and a status indicator (e.g., LED) 21.
  • a front-facing camera 18F may be provided in some embodiments, for example mounted to the top edge of combiner 14 and aimed in the direction of windshield WSH.
  • front-facing camera 18F communicates image data pertaining to the location of the vehicle within or among lanes of the roadway, road conditions, or other environmental parameters visible through windshield WSH to control electronics 6, which in turn generates information for display at combiner 14 in response to that information.
  • Figure 2a also shows ambient light sensor 22 mounted on housing 4, which will communicate the level of ambient light to control electronics 6, in some embodiments; more than one such ambient light sensor 22 may be implemented in HUD device 2 if desired. If ambient light sensor 22 is implemented, control electronics 6 can adjust the brightness and other attributes of the light projected by projector engine 10, typically to increase brightness of the displayed images under bright ambient conditions and reduce brightness at nighttime.
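A minimal sketch of this brightness adjustment (the lux thresholds, levels, and log-scale interpolation are hypothetical calibration choices, not specified by the patent):

```python
import math

# Hypothetical calibration points for mapping an ambient-light reading
# from sensor 22 to a projector brightness setting.
NIGHT_LUX = 10.0
DAYLIGHT_LUX = 10_000.0

def projector_brightness(ambient_lux: float,
                         min_level: int = 10, max_level: int = 100) -> int:
    """Scale projector brightness between a night floor and a daylight
    maximum, interpolating on a log scale (perceived brightness is
    roughly log-linear in luminance)."""
    if ambient_lux <= NIGHT_LUX:
        return min_level
    if ambient_lux >= DAYLIGHT_LUX:
        return max_level
    frac = (math.log10(ambient_lux) - math.log10(NIGHT_LUX)) / (
        math.log10(DAYLIGHT_LUX) - math.log10(NIGHT_LUX))
    return round(min_level + frac * (max_level - min_level))
```

Control electronics 6 would call something like this on each sensor reading and forward the resulting level to projector engine 10.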
  • the system of the embodiment shown in Figures 2a-2c also may include one or more rear cameras RCM, which may be deployed within the automobile, for example on the exterior rear of the vehicle, or internally to the vehicle such as on its ceiling or behind the driver's seat; communication between HUD device 2 and rear camera RCM allows HUD device 2 to display images on combiner 14 showing views from behind the vehicle or of the interior behind driver DRV, as the case may be, without requiring driver DRV to physically turn around or take her eyes off the road.
  • Wired communications may be effected in various manners, such as via USB port (not shown) or other wired communication with an on-board diagnostic port OBDP of the vehicle in which HUD device 2 is installed; by way of this connection, information regarding the operating parameters or condition of the vehicle, either directly or in combination with navigation information (e.g., distance to the next filling station) can be displayed to driver DRV at combiner 14. It is contemplated that those skilled in the art having reference to this specification will be readily capable of implementing these functions, and additionally or alternatively other functions beyond those described, as desired, without undue experimentation.
  • HUD device 2 is further operable in response to driver commands, whether communicated by the above-referenced remote wireless device, by hand gesture, or by voice. Specifically, after a power-on sequence, such as may be commenced by a user pressing on/off button 20 and/or a button on the wireless device, or via a communication from another source (e.g., vehicle OBD port), device 2 executes appropriate initialization routines by electronics 6, including its system CPU, so as to perform power-on self-test sequences, and the like.
  • control electronics 6 places HUD device 2 in a default condition that forwards the corresponding image data to projector engine 10 for display at combiner 14. It is contemplated that this default condition may be to display the current velocity of the vehicle, or the current location on a navigation system map, or even simply a "splash" screen at combiner 14 in the field of view of driver DRV. At this point in its operation, HUD device 2 is ready to receive commands from driver DRV, or to respond to incoming communications.
  • driver DRV can invoke a function by HUD device 2 by operation of the remote wireless device, or by making a pre-determined hand gesture that is detected by rear-facing camera 18R.
  • This "home" gesture may be a "thumbs-up" gesture, a "two-fingers up" gesture, or some other distinctive hand position or motion, preferably made by driver DRV above steering wheel SWH (Figure 1) so as to be in the field of view of rear-facing camera 18R.
  • a relatively wide range of wireless or audio commands may be available for execution by the system CPU (e.g., "search" for executing an Internet search for a type of business; "tweet" for creating a short text message to be posted on the TWITTER social network, via smartphone SPH; "text" for creating a text message to be sent to a contact via the telecommunications network; "call" for making a telephone call via smartphone SPH; and other such commands including invocation of a navigation function).
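The command set above maps naturally onto a dispatch table. A sketch only (the handler bodies are placeholder strings standing in for forwarding the request to smartphone SPH or the navigation function; the patent does not specify this structure):

```python
def make_dispatcher():
    """Build a command dispatcher for the example commands listed above."""
    handlers = {
        "search": lambda arg: f"searching for {arg}",
        "tweet":  lambda arg: f"posting tweet: {arg}",
        "text":   lambda arg: f"texting: {arg}",
        "call":   lambda arg: f"calling {arg}",
    }

    def dispatch(command: str, argument: str = "") -> str:
        handler = handlers.get(command)
        if handler is None:
            return "unrecognized command"   # fall back to the current display
        return handler(argument)

    return dispatch
```

New commands (e.g., navigation) would be added by registering another entry in the table rather than changing the dispatch logic.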
  • the system CPU executes the corresponding command and HUD device 2 displays the corresponding content on combiner 14.
  • additional displays and wireless selections therefrom, or voice commands or hand gestures, may be required in the execution of a command. It is contemplated that those skilled in the art having reference to this specification will be readily able to implement such functionality as appropriate for a particular implementation.
  • control electronics 6 then returns to await further instruction or to respond to incoming communications, as the case may be, with the then-current image being displayed at combiner 14.
  • That then-current image may be the default state, or it may be the result of a different command, for example navigation information regarding the next turn to be made toward the desired destination.
  • In response to receiving an external communication, for example as communicated by the connected device (smartphone SPH) upon its receiving a communication, control electronics 6 produces and displays a notification at combiner 14 corresponding to that external communication.
  • notifications displayed by HUD device 2 in response to receiving an external communication could include: (i) for a "tweet" received over the TWITTER social network, the profile photo of the "tweeter" and their screen name, and images; (ii) notifications for an incoming phone call; and (iii) notifications for an incoming text message.
  • HUD device 2 may function primarily as a simple display device for an attached computing device, such as smartphone SPH, in which case control electronics 6 would generate the appropriate graphics data to serve as a display for applications running on the attached computing device.
  • HUD device 2 would then leverage features implemented on that attached computing device, which may include connection to the internet, GPS, or other forms of communication, and could be realized by way of less circuitry than in more computationally capable implementations, retaining as little as only that functionality involved in operating the display, for example the functionality for controlling projector engine 10 in response to ambient light sensors to adjust the brightness.
  • HUD device 2 itself could be a complete computing platform on its own, or it may have some intermediate level of functionality in which some of the computing is carried out by control electronics 6 with other operations performed on the attached computing device.
  • FIG. 3 illustrates a perspective view of a wireless communication system 300.
  • wireless communication system 300 is for affixing to a location, preferably a vehicle steering wheel SWH, so that it may be easily reached and operated by a driver of the vehicle.
  • Figure 3 also illustrates the HAND of a user and the readily-accessible features of wireless communication system 300, as may be operated, for example, by the user's thumb TH, while the user's palm remains safely in contact with the steering wheel SWH, as will be explored in the remainder of this document.
  • wireless communication system 300 includes a rotatable wheel 302 and a depressible (or touch-sensitive or otherwise selectable) push button 304, each for communicating input signals or commands to HUD device 2, in a manner comparable to a user operating the wheel or buttons on a computer mouse.
  • rotatable wheel 302 may allow a user to scroll between and therefore identify one or more choices depicted on combiner 14 of HUD device 2, while button 304 may allow the user to then select a highlighted choice or otherwise select the function of a feature depicted on combiner 14 of HUD device 2 (e.g., choosing to answer or end an incoming call; choosing to have the image on combiner 14 of HUD device 2 change to a different mode; and so forth).
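The scroll-then-select interaction described here amounts to maintaining a highlighted index that wheel 302 moves and button 304 commits. A minimal sketch (wrap-around at the list ends is a design assumption, not stated in the patent):

```python
class MenuCursor:
    """Track which on-screen choice is highlighted on combiner 14."""

    def __init__(self, num_items: int):
        self.num_items = num_items
        self.index = 0

    def scroll(self, steps: int) -> int:
        """Advance the highlight by signed wheel detent steps (wrapping)."""
        self.index = (self.index + steps) % self.num_items
        return self.index

    def select(self) -> int:
        """Called when button 304 is pressed; return the committed choice."""
        return self.index
```

The HUD side would redraw the highlighted item after each `scroll` and invoke the corresponding function on `select`.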
  • wireless communication system 300 includes a strapping apparatus 306 for affixing rotatable wheel 302/button 304 relative to steering wheel SWH and, more preferably, so that rotatable wheel 302/button 304 are located within the inner perimeter of the generally circular shape of steering wheel SWH.
  • the user is still able to steer the vehicle and control steering wheel SWH as is customary in driving, while also having a natural grip position on the wheel so as to permit ready access to rotatable wheel 302 and button 304 (e.g., again, typically expected to be by way of the user's thumb TH).
  • Figure 4 illustrates a perspective exploded view of preferred embodiment components of wireless communication system 300.
  • rotatable wheel 302 and button 304 are affixed relative to strapping apparatus 306. Each component, and its relationship to other components of system 300 as a whole, is described below.
  • System 300 includes a cap 310, which provides the upper tactile surface for rotatable wheel 302, introduced above.
  • cap 310 is dome-like in shape with an aperture 310A in its center; in a preferred embodiment, cap 310 has an outer diameter of approximately 33.5 mm.
  • aperture 310A has an outer diameter of approximately 13.7 mm which, in a preferred embodiment, is slightly larger than the outer diameter of button 304. In this manner, the depressible surface of the depressible button 304 is aligned in a plane with the top center of rotatable wheel 302, with only a slight gap between the outer perimeter of button 304 and aperture 310A.
  • System 300 also includes a button cap 312, a button adhesive 314, and a button actuator 316, which together in general form the above-introduced button 304.
  • Button cap 312 is generally a disk having a diameter slightly smaller than aperture 310 A of cap 310.
  • Button adhesive 314 is also a disk shape and is a double-sided adhesive so as to adhere button cap 312 to button actuator 316.
  • Button actuator 316 includes appropriate apparatus to couple to a rotary/button printed circuit board assembly (PCBA) 318.
  • a set (e.g., three) of screws 320 is used to affix rotary/button PCBA 318 to a PCB carrier 322, where the ends of each screw pass beyond PCB carrier 322 to fit into respective retaining cutouts (e.g., 120 degrees apart around the circular shape) of a battery PCB 324.
  • battery PCB 324 includes appropriate contacts to receive voltage from a battery 326 that is positioned between those contacts and a battery cover 354.
  • system 300 includes a main housing 346, into which each of the above-described components relating to wheel 302/button 304 is aligned along a common axis passing through the center of each component (with the exception of the perimeter-located screws 320).
  • wireless communication system 300 includes a flexible strap 330 having a main length portion 332 and an angled portion 334, so named as it departs, based on pre-formed shaping in strap 330, at an angle θ away from main length portion 332.
  • main length portion 332 has a curved portion 336 at its opposite end.
  • Materials for flexible strap 330 provide a sufficient amount of elasticity so that it may be pulled tight around the curved cylindrical cross- section of a steering wheel and affixed to an anchoring point, where the elasticity will retain it in place; hence, rubber or other flexible and elastic materials are contemplated.
  • Angled portion 334 includes an aperture 338 sized and positioned so that, as detailed later, flexible strap 330 may be wrapped around a steering wheel and whereby aperture 338 fits around a respective protrusion also associated with system 300.
  • main length portion 332 also includes an aperture 340, around which and preferably molded into strap 330 is an aperture reinforcement boundary 342.
  • Aperture 340 (and boundary 342) is sized and positioned so that, as detailed later, when flexible strap 332 is wrapped around a steering wheel, aperture 340 (and boundary 342) also fits around a respective protrusion also associated with system 300.
  • Curved portion 336 includes a concave curvature portion 344, where the curvature is shaped to approximate a contour of a respective portion of the outer curved contour of a steering wheel.
  • Curved portion 336 also attaches to main housing 346. More particularly, preferably a mounting plate 349 is insert molded into curved portion 336 of strap 330, and a set (e.g., four) of screws 349S are respectively threaded through corner-located holes in plate 349, through strap 330, into respective holes (not shown) in housing 346.
  • a pin 352 also retains strap 330 to housing 346, for example to prevent additional gapping near the top of the connection, as pin 352 is passed through a cylindrical sleeve 330SL that is formed in strap 330 where main length portion 332 meets curved portion 336, and whereby each end tip of pin 352 fits within a respective pin hole 346PH molded into an upper end of housing 346, as may be further retained in position by a pin screw 352S.
  • Also affixed inside the interior of the outer circumferential wall of housing 346 is a light pipe 358, retained in place, by way of example, via a respective adhesive 360, for providing an illuminating visual indication (e.g., by piping light from an LED), through an aperture 362 in housing 346, of one or more operable features of system 300, such as when the system is on or is communicating certain signals, which by way of example can include pairing with HUD device 2 or other subsequent communications therewith.
  • a strap hook 364 is removably inserted into a side of a curved ridge 366 that extends from the bottom of housing 346 and opposite the steering wheel curvature of concave curvature portion 344, where strap hook 364 is sized and positioned to mate with aperture 340 of main length portion 332, as introduced above and as further discussed below.
  • Figure 5 illustrates a top perspective view, and Figure 6 illustrates a side view, of wireless communication system 300 once the exploded-view components from Figure 4 are assembled, but prior to affixing system 300 in a vehicle. From these views, one skilled in the art can further appreciate some of the structural, functional, and operable aspects described above, and additional observations are now discussed with respect to installing system 300 into a vehicle. Particularly, from the Figure 5 and 6 views, a BEND DIRECTION is indicated which, from the angles shown, involves moving angled portion 334 in a clockwise direction.
  • curved ridge 366 is positioned on the inside perimeter of a steering wheel, with cap 310 at a position that will be readily accessible to the vehicle operator when the vehicle is steered in a straight direction, such as at a position between approximately 1:00 and 5:00 o'clock, if perceiving the perimeter of the steering wheel in terms of clock position, a common reference in steering (and driving school) instruction.
  • angled portion 334 is bent around the outer portion of the steering wheel and with apertures 338 and 340 then pulled toward the underside of housing 346.
  • strap hook 364 includes an additional flange 364F pointing in a direction opposite of the pulling force that is created by the elastic stretching of strap 330 as described.
  • flange 364F assists with retaining strap 330 affixed to hook 364.
  • aperture 338 in angled portion 334 is placed around protrusion 356. This completed affixation, of strap 330 to housing 346, is shown in Figure 7.
  • the angle between main length portion 332 and angled portion 334 is matched or approximated by the angle on the underside of housing 346, that is, the angle between the flat surface from which protrusion 356 extends and curved ridge 366.
  • the distal length of strap 330, where aperture 338 is located, can then seat properly about protrusion 356; in this regard, therefore, this provides additional indication to the user that battery cover 354 is properly in place, or if the strap will not properly attach, that they should check to ensure that battery cover 354 is properly secured.
  • FIG. 8 is a flow diagram illustrating a method 800 of the generalized operation of HUD device 2 including its response to driver commands, whether by operation of system 300, hand gesture or voice.
  • Power-on sequence 810 begins with the powering-on of HUD device 2, including execution of the appropriate initialization routines by its system CPU, power-on self-test sequences, and the like.
  • the HUD device system CPU executes the appropriate routines to pair its communications with the various devices in its vicinity, including system 300, smartphone SPH and perhaps certain functions of the vehicle, including the vehicle audio system, the on-board diagnostic port OBDP, rear-mounted vehicle camera RCM, and the like, as available and enabled for this installation.
  • control electronics 6 places HUD device 2 in a default condition in process 814, and forwards the corresponding image data to projector engine 10 for display at combiner 14. It is contemplated that this default condition may be to display the current velocity of the vehicle, or the current location on a navigation system map, or even simply a "splash" screen at combiner 14 in the field of view of driver DRV. At this point in its operation, HUD device 2 is ready to receive commands from driver DRV, or to respond to other incoming communications. It is contemplated that rear-facing camera(s) 18R and other functions associated with control electronics 6 are operable, in this default state, to receive input from driver DRV or over the communications network, as appropriate.
  • Figure 9b illustrates a few such commands by way of example, including "tweet" 952 for creating a short text message to be posted on the TWITTER social network, via smartphone SPH; "text" 956 for creating a text message to be sent to a contact via the telecommunications network; "call" 954 for making a telephone call via smartphone SPH; and other such commands including invocation of a navigation function or an Internet search.
  • the system CPU executes the corresponding command and displays the corresponding content in process 820; for example by pushing one of images 906a through 906d to be displayed at combiner 14, in the example of Figure 9a.
  • additional system 300, voice, or hand gesture commands may be required in the execution of the command of process 820 (e.g., confirming a text or tweet by way of a hand gesture or a "send" voice command or push of button 304). It is contemplated that those skilled in the art having reference to this specification will be readily able to implement such functionality as appropriate for a particular implementation.
  • Following execution of the command in process 820, control electronics 6 then returns to await further instruction or to respond to incoming communications, as the case may be, with the then-current image being displayed at combiner 14.
  • That then-current image may be the default state, such as image 902 of Figure 9a, or it may be the result of the command executed in process 820, for example navigation information regarding the next turn to be made toward the desired destination, as shown by image 958 of Figure 9b.
  • In response to receiving an external communication in process 822, as communicated by a connected device (e.g., wireless communication system 300 or smartphone SPH), control electronics 6 produces and displays a notification at combiner 14 corresponding to that external communication, in process 824.
  • Figure 9b illustrates examples of notifications displayed by HUD device 2 in response to receiving an external communication.
  • Image 952 illustrates the notification for a "tweet" received over the TWITTER social network, including the profile photo of the "tweeter" and their screen name.
  • images 954 and 956 illustrate the notifications for an incoming phone call and incoming text message, respectively, each including the contact name or "caller ID" indication for the caller and their phone number.
  • These notifications may include secondary information, such as image 952' that is not immediately displayed with notification image 952 but is available on command, such as via rotation of rotatable wheel 302.
  • commands from wireless communication system 300, hand gestures, or voice from driver DRV provide inputs for controlling responses to incoming notifications. These commands are detected using routines executed by the system CPU in response to inputs from a wireless receiver in HUD device 2, rear-facing camera 18R, and an internal (or smartphone) microphone, for example, in process 820.
  • driver DRV indicates the desire to view that secondary notification by rotating wheel 302 counterclockwise, or making a leftward swipe with one finger raised; this wireless communication or hand gesture is detected by the system CPU in process 826, to which control electronics 6 responds in process 828 by displaying image 952'.
  • voice commands issued by driver DRV (e.g., "retweet", "reply", etc.) may be detected in another instance of process 826, with the appropriate action then taken by control electronics 6 in process 828.
  • the hand gesture of a rightward swipe with one finger raised by driver DRV that is detected in process 826 will cause control electronics 6 to "dismiss" the notification in process 828, returning the display to its previous state (e.g., navigation image 958) or to the default image (e.g., speedometer image 902), awaiting the next gesture, command, or incoming communication.
  • the preferred embodiments provide an improved wireless remote input in combination with a display device.
  • Preferred embodiments may include various features and benefits, such as the following.
  • a wireless remote input device is provided, for example for a head-up display, which employs an intuitive rotary knob and center button for navigating a graphical user interface, such as that being shown on the head-up display.
  • the remote input device can be efficiently used without requiring its user to look at the device.
  • the device also can be easily mounted to any steering wheel, at a location on the steering wheel that is most convenient for the user.
  • Rotational inputs are provided via an ergonomically-designed wheel having an optimized outer diameter and tactile ridges for easy rotation by the user's thumb, allowing the user to simultaneously grip the steering wheel in a safe manner while driving while also operating the remote wireless device.
  • the button, centered within the wheel, may have multiple functions, including: (i) selecting an input among several options in the user interface; and (ii) providing access to aspects of the user interface by multiple clicking within a time period, such as double or triple clicking (e.g., giving access to Bluetooth pairing; quick access to music functionality; etc.).
  • the device is relatively inexpensive to manufacture, and is easily assembled at the factory level, due to the optimized construction of its architecture.
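The multi-click behavior of the center button described above (single, double, or triple clicks grouped within a time period) can be sketched in software. The following is an illustrative sketch only, not part of the disclosed device; the 0.4-second grouping window and the action names ("select", "bluetooth_pairing", "music_shortcut") are assumptions chosen for illustration.

```python
# Illustrative sketch only: distinguishing single, double, and triple clicks
# of a center button by counting presses that fall within a time window.
# The window length and the action names are assumptions, not values taken
# from this specification.

MULTI_CLICK_WINDOW_S = 0.4  # presses closer together than this are grouped


class ClickCounter:
    """Groups button presses into single/double/triple clicks."""

    def __init__(self, window_s=MULTI_CLICK_WINDOW_S):
        self.window_s = window_s
        self.last_press_time = None
        self.count = 0

    def press(self, now_s):
        """Record a press at time now_s (seconds since an arbitrary epoch)."""
        if self.last_press_time is None or now_s - self.last_press_time > self.window_s:
            self.count = 1  # too late to group with the prior press: start over
        else:
            self.count += 1
        self.last_press_time = now_s

    def resolve(self, now_s):
        """After the window expires, map the press count to an action name."""
        if self.count == 0 or now_s - self.last_press_time <= self.window_s:
            return None  # window still open (or nothing pressed): keep waiting
        actions = {1: "select", 2: "bluetooth_pairing", 3: "music_shortcut"}
        action = actions.get(self.count, "select")
        self.count = 0
        self.last_press_time = None
        return action
```

A caller would invoke `press()` on each button interrupt and poll `resolve()` on a timer; only once the window has lapsed does a grouped click resolve to an action.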


Abstract

A wireless communicating system, comprising: (i) a rotatable wheel; (ii) a selectable button fixed relative to the rotatable wheel; (iii) a flexible strap for affixing the rotatable wheel and the selectable button relative to a steering wheel in a vehicle; (iv) circuitry for translating actuation of the selectable button into a wireless signal representative of the actuation; and (v) circuitry for translating rotation of the rotatable wheel into a wireless signal representative of an extent of the rotation.

Description

WIRELESS REMOTE INPUT DEVICE FOR HEAD-UP DISPLAY
Technical Field
[0001] The preferred embodiments relate to head-up display devices for vehicles such as automobiles and are more particularly directed to a wireless remote input for use, or in combination, with such a display device.
Background Art
[0002] Head-up display devices in vehicles such as automobiles are increasing in desirability and use, yet only a limited number of vehicle manufacturers offer these devices as original equipment. In vehicle aftermarket systems, however, various market entrants are offering or developing units that may be installed into an existing vehicle, often with certain advantages and considerations that arise in the competitive marketplace. Such considerations include, as examples, safety, product longevity, ease of installation, removability, reliability, functionality, and aesthetics.
[0003] The preferred embodiments seek to improve head-up display systems, based on the above considerations, as further explored below.
Disclosure of Invention
[0004] In one preferred embodiment, there is a wireless communicating system, comprising: (i) a rotatable wheel; (ii) a selectable button fixed relative to the rotatable wheel; (iii) a flexible strap for affixing the rotatable wheel and the selectable button relative to a steering wheel in a vehicle; (iv) circuitry for translating actuation of the selectable button into a wireless signal representative of the actuation; and (v) circuitry for translating rotation of the rotatable wheel into a wireless signal representative of an extent of the rotation.
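By way of illustration only, the circuitry of element (v), which translates rotation of the rotatable wheel into a signal representative of the extent of rotation, might be realized with a two-channel (quadrature) encoder read in software. The sketch below shows the standard Gray-code decoding of such an encoder; the choice of a quadrature encoder, and the class and method names, are assumptions for illustration, not details taken from this disclosure.

```python
# Illustrative sketch only: decoding a two-channel (quadrature) rotary
# encoder into a signed step count, which could then be sent as the
# "extent of rotation" payload of a wireless packet. The transition table
# is the standard Gray-code sequence.

# Map (previous AB state, new AB state) -> step direction (+1 CW, -1 CCW)
_TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}


class QuadratureDecoder:
    def __init__(self):
        self.state = 0b00
        self.position = 0  # accumulated signed steps since last report

    def sample(self, a, b):
        """Feed one sample of the A and B channel levels (each 0 or 1)."""
        new_state = (a << 1) | b
        # Invalid transitions (bounce, missed sample) contribute zero steps.
        self.position += _TRANSITIONS.get((self.state, new_state), 0)
        self.state = new_state

    def report(self):
        """Return and clear the accumulated extent of rotation."""
        extent, self.position = self.position, 0
        return extent
```

In such a realization, `report()` would be called once per transmit interval and its signed value packed into the outgoing wireless message.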
[0005] Other preferred embodiments, aspects and benefits are described and claimed.
Brief Description of Drawings
[0006] The preferred embodiments will be described in detail below by referring to the accompanying drawings:
[0007] Figure 1 illustrates a perspective view of an automotive application of a head-up display (HUD) device according to preferred embodiments.
[0008] Figure 2a illustrates a side view of an HUD device constructed according to a preferred embodiment.
[0009] Figure 2b illustrates a side cross-sectional view of an HUD device according to a preferred embodiment, illustrating light paths in its operation.
[0010] Figure 2c illustrates a rear cross-sectional view of an HUD device constructed according to a preferred embodiment.
[0011] Figure 3 illustrates a perspective view of a wireless communication system 300 for communication with, and inclusion in a system including, the HUD device of the above Figures.
[0012] Figure 4 illustrates a perspective exploded view of preferred embodiment components of wireless communication system 300 of Figure 3.
[0013] Figure 5 illustrates a top perspective view, and Figure 6 illustrates a side view, of wireless communication system 300 once the components shown in the exploded view of Figure 4 are assembled.
[0014] Figure 7 illustrates wireless communication system 300 with its strap in a wraparound position affixed to its housing.
[0015] Figure 8 is a flow diagram illustrating the operation of an HUD device according to a preferred embodiment in response to incoming wireless communications and other possible inputs (e.g., voice, gesture).
[0016] Figures 9a and 9b are flow diagrams illustrating examples of operation of an HUD device according to the embodiment of Figure 8.
Description of Embodiments
[0018] The one or more preferred embodiments described in this specification are implemented into a wireless input device for an automotive head-up display, as it is contemplated that such implementation is particularly advantageous in that context. However, it is also contemplated that concepts of this invention may be beneficially applied to many other applications. Accordingly, it is to be understood that the following description is provided by way of example only, and is not intended to limit the inventive scope as claimed.
[0019] For purposes of this description, a remote wireless input device for use with a head-up display device mounted to an automobile dashboard is provided according to these embodiments. An example of a head-up display suitable for use with an input device according to these embodiments is described in copending U.S. application S.N. 14/806,530 filed July 22, 2015, which published as U.S. Patent Application Publication No. US 2016/0025973, and copending U.S. design application S.N. 29/554,431 filed February 11, 2016, both commonly assigned herewith and incorporated herein by reference. The exterior appearance of an example of this input device is also described in copending U.S. design application S.N. 29/554,435 filed February 11, 2016, commonly assigned herewith and incorporated herein by this reference.
[0020] It is contemplated that the wireless remote input device according to preferred embodiments will provide the driver of a vehicle, and user of a head-up display device, with the ability to forward inputs/commands to the head-up display device. It is also contemplated that inputs to the head-up display device may be provided by the wireless remote input device in combination with hand gestures and audio commands; alternatively, the choice of input mechanism, between this input device and hand gestures and audio commands, may be at the user's option during setup or during operation of the vehicle and display device. The construction and operation of a wireless remote input device according to these preferred embodiments is further illustrated and described in the following pages.
[0021] As will be apparent from this description and as noted above, the preferred embodiments described herein relate to a see-through display device commonly known as a "head-up display" or "HUD." Figure 1 illustrates the general application of devices according to these embodiments in an automotive context. As shown in Figure 1, HUD device 2 in this context sits on top of car dashboard DSH, typically in view by driver DRV above the speedometer and other gauges or operational displays (not shown) provided within dashboard DSH, and above steering wheel SWH. HUD device 2 provides a see-through image that displays information relevant to driver DRV while driving the vehicle, without blocking the view of the road through windshield WSH.
[0022] As will be evident from the following description, HUD device 2 according to a preferred embodiment is constructed so as to be portable, easily placed atop dashboard DSH of a variety of vehicles, and easily removable for use in another vehicle or for security purposes, such as when driver DRV is parking the car in a public parking area. As such, in these embodiments HUD device 2 is constructed to have a compact size so that it can sit on top of dashboard DSH, without significantly interfering with the driver's view.
[0023] The cross-sectional view of Figure 2a schematically illustrates the various components of HUD device 2 according to some embodiments. As shown in that Figure, housing 4 encloses control electronics 6, for example as may be mounted on one or more printed circuit boards, and which carry out the data and image processing involved in the operation of HUD device 2 as will be described below. The architecture and functionality of control electronics 6 according to some embodiments will be described in detail below.
[0024] Housing 4 also encloses projector engine 10 which, for purposes of this description, refers to a projection system, including the optics, light modulation, and light source devices necessary to project an image suitable for use in HUD device 2 according to these embodiments. The optics included in projector engine 10 are contemplated to include some or all of the appropriate lenses, mirrors, light homogenization devices, polarization devices, filters such as dichroic filters that combine light, and such other optical devices known in the art and included in the construction of a modern projector. Light modulation devices included in projector engine 10 may be any one of a number of types, including those known in the art as digital micromirror array devices (DMD) such as the DLP™ device from Texas Instruments Incorporated, liquid crystal on silicon (LCOS) light modulators, and transmissive LCD displays such as those used in LCD projectors or other type of spatial light modulator; other types of light modulation device suitable for use in some embodiments include a laser beam scanning (LBS) projector, in which a laser light source is modulated electronically or otherwise and the laser beam is scanned by one or more moving mirrors to scan the image, and any other form of image projection. The light source included in projector engine 10 may be one or more LEDs, one or more lasers, or other sources of light. For example, red, green, and blue LEDs or lasers are commonly used with DMD and LCOS modulators, to support what is known as a "full color" display, but of course other colors of light may additionally or alternatively be used. In any of these technologies, projector engine 10 is contemplated to also include the appropriate electronics for controlling these elements, as known in the art. 
Lastly, while various of the figures depict projector engine 10 elevated relative to electronics 6, in a preferred embodiment projector engine 10 may be affixed along the bottom of housing 4, so as to mount it directly to the same structure as screen 12, as further discussed below.
[0025] Projector engine 10 projects images rearwardly (i.e., toward driver DRV) to curved screen 12 within screen enclosure 13 mounted near the rear edge of housing 4 in this embodiment. As will be described in detail below, screen 12 is a reflective surface, for example a high-gain curved reflective surface, positioned relative to projector engine 10 so that the light projected by projector engine 10 forms a "real" (i.e., human viewable) image on screen 12. Preferred embodiments involving various constructions of screen 12 will be described in further detail below.
[0026] According to preferred embodiments, screen 12 reflects the real image it displays (from engine 10) in a forward direction (i.e., toward the windshield) to combiner 14. Combiner 14 according to preferred embodiments is a semi-transparent curved element that combines light from two directions, namely that transmitted through windshield WSH and that reflected from screen 12, to form a combined "virtual" image that is viewable by driver DRV in the arrangement of Figure 1. Combiner 14 is semi-transparent in the sense that road conditions and other visual information ahead of the vehicle (i.e., light entering through windshield WSH) can be seen by driver DRV through combiner 14, but on which the images projected by projector engine 10 and reflected by screen 12 also will be visible to driver DRV. Further in this regard, Figure 2a shows combiner 14 as physically coupled to housing 4 by way of hinge 16C, and screen 12 is preferably physically coupled directly to housing 4 and, even further, directly to the same portion of the housing (e.g., bottom) as is projector engine 10. In an alternative preferred embodiment, screen 12 may be physically coupled to housing 4 by way of a hinge (not shown). Hinge 16C enables the angle of combiner 14 to be rotationally adjusted about its axis, so as to receive the image reflected by screen 12.
This adjustability ensures good visibility of the image displayed to driver DRV for a variety of dashboard DSH geometries (i.e., regardless of the flatness of the top surface of dashboard DSH) and with minimal distortion of the image, as will be described in further detail below.
[0027] Figure 2b illustrates a side elevation view of the basic optical path according to some preferred embodiments. As shown in Figure 2b, projector engine 10 projects light on image path IMG 10 toward screen 12 to form a real (human- viewable) image at screen 12. That image is reflected by screen 12 on image path IMG 12 to combiner 14, and in turn partially reflected by combiner 14 to be visible to driver DRV along a center line-of-sight CLOS. As noted above, combiner 14 is constructed to be semi-transparent to external light such as received through windshield WSH; this semi-transparency also connotes that combiner 14 is semi-reflective to the light reflected by screen 12. These properties may be attained by coatings on the surfaces of combiner 14. For example, the surface of combiner 14 may have a 30% reflective coating, in which case 30% of the reflected light from screen 12 will be reflected toward driver DRV, while roughly 70% of the external light received through windshield WSH will be transmitted through combiner 14 to be visible to driver DRV. The particular construction of combiner 14 according to embodiments will be described in further detail below.
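The 30%/70% split described in the preceding paragraph can be expressed as simple arithmetic. The following sketch is illustrative only; it idealizes the coating as lossless (reflectance plus transmittance summing to one, with no absorption), which is an assumption, and the luminance values are arbitrary examples.

```python
# Illustrative arithmetic only: with a partially reflective coating on
# combiner 14, the light reaching the driver is a mix of the screen image
# (reflected) and the outside scene (transmitted). The lossless-coating
# idealization here is an assumption, not a property of the device.

def combined_luminance(screen_nits, scene_nits, reflectance=0.30):
    """Approximate luminance seen at the combiner for a given coating."""
    transmittance = 1.0 - reflectance  # idealized: nothing absorbed
    return reflectance * screen_nits + transmittance * scene_nits
```

With a 30% coating, a 1000-nit screen image over a 500-nit scene yields roughly 300 + 350 nits at the driver's eye, matching the proportions described in the text.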
[0028] Because the image projected on screen 12 by projector engine 10 is a "real" image, it is useful for projector engine 10 to be constructed and arranged to project that image so as to be focused on screen 12. In this example, screen 12 is placed in the focal plane of the lens of projector engine 10. In one example in which the lens of projector engine 10 has a focal distance of about 100 mm, screen 12 is placed at a distance of about 100 mm from projector engine 10. For projector engine 10 constructed as a DMD or LCOS light modulator type projector, a focus adjustment may be required at manufacture that then remains fixed in place for system use. For those projectors 10 using laser illumination, however, the depth of focus may be sufficient that no additional focusing is required. It will also be understood by one skilled in the art that because lasers have much narrower bandwidths/linewidths at a given center frequency, the use of lasers can provide better performance with optical elements at screen 12 such as bandpass filters.
[0029] According to some embodiments, screen 12 is constructed to have a curved surface. In one preferred embodiment, the curved surface is convex relative to the received light image from projector engine 10 (i.e., curved toward it), where the convex shape may be spherical, substantially spherical, or biconic (e.g., cylindrically convex). In an alternative preferred embodiment, the curved surface may be concave relative to the received light image from projector engine 10, although such an approach may create bright spots or regions in the depiction of the screen image captured by combiner 14. In any event, the degree of curvature of screen 12 is selected so that the light rays reflected from the surface of screen 12 to combiner 14, and reflected from combiner 14, are focused at the eye pupils of driver DRV, preferably with a minimal amount of varying brightness and/or distortion. In addition, as known in the art, spherical surfaces are concave (from an inner perspective) or convex (from an outer perspective) surfaces that approximate a section of the surface of a sphere. The term "substantially spherical", for purposes of this description, refers to a surface that is not perfectly spherical but is sufficiently close to being spherical so as to behave similarly to a perfectly spherical surface within the context of these embodiments. Referring to the driver's view of Figure 2b, if the projected image from projector engine 10 is as designed for a flat screen with no projection lens offset, that image would appear at screen 12 as slightly "keystoned" (wider at the top than at the bottom) because of the tilt of screen 12, and slightly barrel distorted (smaller at the outsides of the screen than in the center) because of the curvature of screen 12.
When reflected to and appearing at combiner 14, the apparent barrel distortion would be further increased by the optical effect of the curvature of combiner 14 since the outside of curved screen 12 is farther away from combiner 14 than is the center of screen 12. In some embodiments, however, screen 12 is constructed to have a "substantially spherical" surface, meaning that the surface behaves similarly to one that is perfectly spherical for purposes of preferred embodiments, but is not perfectly spherical, specifically by being slightly aspherical so as to help correct for the keystone distortion or barrel distortion, or both, resulting from the tilt and curvature of the inner surface of screen 12. In another preferred embodiment, screen 12 is constructed to have a convex cylindrical surface. Alternatively or in addition, distortions may be corrected for optically by the design of the projector lens in projector engine 10, or by also making combiner 14 slightly aspherical (while remaining "substantially spherical" as defined above), or by digital processing of the image being projected to pre-distort the image so it will look correct at combiner 14 as viewed by driver DRV, or by a combination of these techniques.
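Of the correction techniques just listed, the digital pre-distortion option can be illustrated with a toy model: a keystone that widens the projected image toward the top can be countered by narrowing each source row by the inverse factor. The linear keystone model and the factor `k` below are assumptions for illustration only, not parameters of the described system.

```python
# Illustrative sketch only: digitally pre-distorting the projected image so
# it appears rectangular after keystone distortion from the screen tilt.
# The linear widening model and k = 0.1 are assumed for illustration.

def prewarp_row_scale(row, num_rows, k=0.1):
    """Horizontal scale for source row `row` (0 = top) so that an image
    widened by factor (1 + k) at the top relative to the bottom comes out
    uniform in width after projection."""
    t = row / (num_rows - 1)             # 0.0 at the top, 1.0 at the bottom
    keystone_gain = 1.0 + k * (1.0 - t)  # optics widen the top rows most
    return 1.0 / keystone_gain           # pre-shrink to compensate
```

Under this toy model the bottom row is left untouched (scale 1.0) while the top row is pre-shrunk by 1/1.1; a real implementation would instead apply a full 2-D warp derived from the measured optics.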
[0030] According to preferred embodiments, screen 12 is constructed to have a high screen "gain", in the optical sense. As known in the art, screen gain is a measure of the peak brightness of light reflected in a direction normal to the screen surface. As commonly understood in the art of projection screens, screen gain is typically a relative measure, where a gain of 1.0 refers to a screen that reflects light at the same brightness at which it is projected onto the screen with perfect uniformity from all viewing angles, with no light absorbed and all light re-radiated. Gain is typically measured from the vantage point where the screen is at its brightest, which is directly in front of and perpendicular to the tangent of the screen at that point. As such, the measurement of gain at this point is known as "Peak Gain at Zero Degrees Viewing Axis". Surfaces having a gain of 1.0 include a block of magnesium carbonate (MgCO3) and a matte white screen. A screen having a gain above 1.0 will reflect brighter light than that projected; for example, a screen rated at a gain of 1.5 reflects 50% more light in the direction normal to the screen than a screen rated at a gain of 1.0. However, screens with a gain greater than 1.0 do not reflect light at the same brightness at all viewing angles. Rather, if one moves to the side so as to view the screen at an angle, the brightness of the projected image will drop.
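The trade-off described in this paragraph — brighter on-axis but dimmer off-axis for gains above 1.0 — can be captured in a toy model. The cosine-power falloff below is an assumed illustrative model, not a property stated in this description; only the gain-1.0 baseline behavior comes from the text.

```python
# Illustrative model only: reasoning about screen gain. A gain-1.0 (matte)
# surface reflects roughly uniformly; a higher-gain screen is brighter
# on-axis but falls off with viewing angle. The cos^n falloff and the
# mapping from gain to exponent n are assumptions.

import math


def relative_brightness(gain, viewing_angle_deg):
    """Brightness relative to a matte (gain 1.0) screen viewed on-axis."""
    if gain <= 1.0:
        return gain  # matte-like: roughly uniform at all viewing angles
    # Higher gain -> narrower reflection lobe: model falloff as cos^n of
    # the viewing angle, with n growing with gain.
    n = 4.0 * (gain - 1.0)
    return gain * math.cos(math.radians(viewing_angle_deg)) ** n
```

For example, a gain-1.5 screen returns 1.5 on-axis (the 50% boost described above) but drops below the matte baseline at wide viewing angles, mirroring the off-axis dimming the paragraph describes.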
[0031] Figure 2c illustrates HUD device 2 from the rear (i.e., from the viewpoint of driver DRV of Figure 1). As evident from Figure 2c, driver DRV sees the back surface of a screen enclosure 13, within which screen 12 (Figure 2a) is disposed so as to face projector engine 10 (shown in shadow in Figure 2c). Because of the construction and arrangement of screen 12 and combiner 14 as will be described in detail below, the image presented by the light projected by projector engine 10 forms image IMG 12 (not shown) at screen 12. This image IMG 12 will reflect from screen 12 and appear on combiner 14 as image IMG 14, as shown in Figure 2c. Image IMG 14 thus presents graphics and other visual information generated by control electronics 6 within housing 4 as appropriate for the particular functions being executed, in a manner that is visible to driver DRV. Screen enclosure 13 serves to block light emitted by projector engine 10 from directly reaching driver DRV, as evident in the view of Figure 2c. And as discussed above and in further detail below, because combiner 14 is semi-transparent according to these embodiments, driver DRV will be able to see the road ahead through combiner 14, with image IMG 14 effectively overlaid onto that view of the road.
[0032] Figures 2a through 2c also illustrate various auxiliary components of HUD device 2 that may be implemented in various embodiments. Rear-facing camera 18R is mounted, in this embodiment, on the driver side of screen enclosure 13, and as such is aimed at driver DRV. As will be described in further detail below, image data acquired by rear-facing camera 18R are communicated to control electronics 6, which processes those data to identify gestures made by driver DRV and carry out various control functions responsive to those identified gestures. To enable this function in nighttime conditions, rear-facing camera 18R is sensitive to infrared light, and an infrared illuminant 19 (e.g., an LED emitting infrared light) is mounted on the driver-side surface of screen enclosure 13 and also facing driver DRV. Other gesture-detection technologies alternatively may be implemented in place of or in addition to rear-facing camera 18R, examples of which include depth sensors, photometric stereo sensors, and dual camera arrangements. Other aspects also may be included on the front of screen enclosure 13, including an on/off button 20 and a status indicator (e.g., LED) 21.
[0033] A front-facing camera 18F may be provided in some embodiments, for example mounted to the top edge of combiner 14 and aimed in the direction of windshield WSH. In these embodiments, front-facing camera 18F communicates image data pertaining to the location of the vehicle within or among lanes of the roadway, road conditions, or other environmental parameters visible through windshield WSH to control electronics 6, which in turn generates information for display at combiner 14 in response to that information. Figure 2a also shows ambient light sensor 22 mounted on housing 4, which will communicate the level of ambient light to control electronics 6, in some embodiments; more than one such ambient light sensor 22 may be implemented in HUD device 2 if desired. If ambient light sensor 22 is implemented, control electronics 6 can adjust the brightness and other attributes of the light projected by projector engine 10, typically to increase brightness of the displayed images under bright ambient conditions and reduce brightness at nighttime.
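The brightness adjustment driven by ambient light sensor 22 might, in one hypothetical implementation, interpolate projector brightness on a log-lux scale, since perceived brightness is roughly logarithmic in luminance. The lux thresholds and percentage range below are assumptions chosen for illustration, not values from this description.

```python
# Illustrative sketch only: mapping an ambient light reading to a projector
# brightness setting, as suggested for ambient light sensor 22. All
# numeric constants here are assumed for illustration.

import math


def projector_brightness(ambient_lux, min_pct=20, max_pct=100,
                         night_lux=10.0, day_lux=10000.0):
    """Return a brightness percentage: dim at night, full in daylight."""
    if ambient_lux <= night_lux:
        return min_pct
    if ambient_lux >= day_lux:
        return max_pct
    # Interpolate on a log scale between the night and day thresholds.
    t = (math.log10(ambient_lux) - math.log10(night_lux)) / (
        math.log10(day_lux) - math.log10(night_lux))
    return min_pct + t * (max_pct - min_pct)
```

Control electronics 6 could recompute this mapping periodically and forward the result to projector engine 10, increasing brightness in daylight and reducing it at night as the paragraph describes.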
[0034] HUD device 2 further includes apparatus for communication in combination with various functions in the automotive context, where such communications may be wired, wireless, or both. In one example, HUD device 2 includes wireless communications functionality as part of or in conjunction with its control electronics 6, operable to carry out wireless transmission and receipt according to a conventional technology such as Bluetooth or other near-field communications (NFC) types for local communication with nearby devices (i.e., in the vehicle); longer-range communication capability such as cellular, satellite, FM and other radio communications may additionally or alternatively be implemented. In this example, the communications may be with a smartphone SPH (see Figure 1), which will typically be personal to driver DRV and include the appropriate software for communicating with HUD device 2. By way of this communication with smartphone SPH, HUD device 2 will be capable of displaying online-accessible information regarding traffic, weather conditions, text messages, email, and the like. The wireless functionality of HUD device 2 also is operable to communicate with a remote wireless device in the vehicle, as preferred embodiments also contemplate such a device, as may be affixed to steering wheel SWH and operated by the driver DRV for providing scrolling and selection commands, among others, by manipulation of a rotary knob and/or selectable/depressible button on the wireless device.
The system of the embodiment shown in Figures 2a-2c also may include one or more rear cameras RCM, which may be deployed within the automobile, for example on the exterior rear of the vehicle, or internally to the vehicle such as on its ceiling or behind the driver's seat; communication between HUD device 2 and rear camera RCM allows HUD device 2 to display images on combiner 14 showing views from behind the vehicle or of the interior behind driver DRV, as the case may be, without requiring driver DRV to physically turn around or take her eyes off the road. Wired communications may be effected in various manners, such as via USB port (not shown) or other wired communication with an on-board diagnostic port OBDP of the vehicle in which HUD device 2 is installed; by way of this connection, information regarding the operating parameters or condition of the vehicle, either directly or in combination with navigation information (e.g., distance to the next filling station), can be displayed to driver DRV at combiner 14. It is contemplated that those skilled in the art having reference to this specification will be readily capable of implementing these functions, and additionally or alternatively other functions beyond those described, as desired, without undue experimentation.
[0035] Various of the generalized operations of HUD device 2 should be apparent to one skilled in the art from the preceding, and note that HUD device 2 is further operable in response to driver commands, whether communicated by the above-referenced remote wireless device, by hand gesture, or by voice. Specifically, after a power-on sequence, such as may be commenced by a user pressing on/off button 20 and/or a button on the wireless device, or via a communication from another source (e.g., vehicle OBD port), device 2 executes appropriate initialization routines by electronics 6, which may include a system CPU, so as to perform power-on self-test sequences, and the like. The CPU also executes appropriate routines to pair its communications with the various devices in its vicinity, including smartphone SPH and perhaps certain functions of the vehicle, including the vehicle audio system, the on-board diagnostic port OBDP, rear-mounted vehicle camera RCM, and the like, as available and enabled for this installation. Following power-on, control electronics 6 places HUD device 2 in a default condition that forwards the corresponding image data to projector engine 10 for display at combiner 14. It is contemplated that this default condition may be to display the current velocity of the vehicle, or the current location on a navigation system map, or even simply a "splash" screen at combiner 14 in the field of view of driver DRV. At this point in its operation, HUD device 2 is ready to receive commands from driver DRV, or to respond to incoming communications. It is contemplated that rear-facing camera(s) 18R and other functions associated with control electronics 6 are operable, in this default state, to receive input from driver DRV or over the communications network, as appropriate. 
[0036] According to this embodiment, driver DRV can invoke a function by HUD device 2 by operation of the remote wireless device, or by making a pre-determined hand gesture that is detected by rear-facing camera 18R. This "home" gesture may be a "thumbs-up" gesture, a "two-fingers up" gesture, or some other distinctive hand position or motion, preferably made by driver DRV above steering wheel SWH (Figure 1) so as to be in the field of view of rear-facing camera 18R. Rear-facing camera 18R forwards images to the system CPU, which in response executes image recognition routines to detect the pre-determined "home" gesture for indicating that driver DRV wishes to present a command to HUD device 2. Upon detecting the pre-determined "home" gesture, the system CPU activates a command wait routine, which could anticipate another command from the remote wireless device or could be an audio command listener routine, whereupon control electronics 6 issues the data to projector engine 10 to display a "listening" image at combiner 14. Appropriate speech recognition routines are then executed by the system CPU to detect the content of a voice command received over audio detecting apparatus, where for example HUD device 2 also may include a microphone or may communicate with smartphone SPH so as to avail itself of the microphone included with the latter. According to this embodiment, a relatively wide range of wireless or audio commands may be available for execution by the system CPU (e.g., "search" for executing an Internet search for a type of business; "tweet" for creating a short text message to be posted on the TWITTER social network, via smartphone SPH; "text" for creating a text message to be sent to a contact via the telecommunications network; "call" for making a telephone call via smartphone SPH; and other such commands including invocation of a navigation function). 
In response to receiving one of these commands, the system CPU executes the corresponding command and HUD device 2 displays the corresponding content on combiner 14. Of course, additional displays and wireless selections therefrom, or voice commands or hand gestures, may be required in the execution of a command. It is contemplated that those skilled in the art having reference to this specification will be readily able to implement such functionality as appropriate for a particular implementation.
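By way of illustration only, the command handling described in this paragraph amounts to dispatching a recognized command word ("search", "tweet", "text", "call", and so forth) to a corresponding action. The handler names in the following sketch are hypothetical placeholders and are not taken from the specification:

```python
def make_dispatcher():
    """Return a dispatcher mapping recognized command words to handlers.

    The command vocabulary mirrors paragraph [0036]; the handler functions
    are hypothetical placeholders for illustration, each returning a tuple
    naming the assumed action and its argument.
    """
    def unknown(argument):
        # Unrecognized commands fall through to a no-op result.
        return ("unrecognized", argument)

    handlers = {
        "search": lambda query: ("web_search", query),
        "tweet":  lambda text:  ("post_tweet", text),
        "text":   lambda text:  ("send_sms", text),
        "call":   lambda name:  ("dial_contact", name),
    }

    def dispatch(command, argument):
        return handlers.get(command, unknown)(argument)

    return dispatch
```

A table-driven dispatcher of this kind makes it straightforward to extend the command vocabulary (e.g., adding a navigation command) without altering the recognition front end.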
[0037] Following execution of the above, control electronics 6 then returns to await further instruction or to respond to incoming communications, as the case may be, with the then-current image being displayed at combiner 14. The then-current image may be the default state, or it may be the result of a different command, for example navigation information regarding the next turn to be made toward the desired destination.
[0038] In response to receiving an external communication, for example as communicated by the connected device (smartphone SPH) in response to it receiving a communication, control electronics 6 produces and displays a notification at combiner 14 corresponding to that external communication. Again as examples of notifications displayed by HUD device 2 in response to receiving an external communication, such could include: (i) the notification for a "tweet" received over the TWITTER social network, the profile photo of the "tweeter" and their screen name, and images; (ii) the notifications for an incoming phone call; and (iii) notifications for an incoming text message.
[0039] It is contemplated that those skilled in the art having reference to this specification will be readily able to recognize additional functions as may be provided by HUD device 2 and its control electronics 6, either itself or by way of a connected device such as smartphone SPH, and to realize those functions in a particular implementation, without undue experimentation. Still further, HUD device 2 may function primarily as a simple display device for an attached computing device, such as smartphone SPH, in which case control electronics 6 would generate the appropriate graphics data to serve as a display for applications running on the attached computing device. In this arrangement, HUD device 2 would then leverage features implemented on that attached computing device, which may include connection to the internet, GPS, or other forms of communication, and could be realized by way of less circuitry than in more computationally capable implementations, retaining as little as only that functionality involved in operating the display, for example the functionality for controlling projector engine 10 in response to ambient light sensors to adjust the brightness. In other embodiments, HUD device 2 itself could be a complete computing platform on its own, or it may have some intermediate level of functionality in which some of the computing is carried out by control electronics 6 with other operations performed on the attached computing device.
[0040] Having described numerous aspects of HUD device 2, attention is now turned to preferred embodiments directed toward the above-introduced remote wireless device for mounting inside a vehicle and that communicates with HUD device 2. In this regard, Figure 3 illustrates a perspective view of a wireless communication system 300. In general, wireless communication system 300 is for affixing to a location, preferably a vehicle steering wheel SWH, so that it may be easily reached and operated by a driver of the vehicle. Thus, Figure 3 also illustrates the HAND of a user and the readily-accessible features of wireless communication system 300, as may be operated, for example, by the user's thumb TH, while the user's palm remains safely in contact with the steering wheel SWH, as will be explored in the remainder of this document. By way of introduction, therefore, in a preferred embodiment wireless communication system 300 includes a rotatable wheel 302 and a depressible (or touch sensitive or otherwise selectable) push button 304, each for communicating input signals or commands to HUD device 2, in a manner comparable to a user operating the scroll wheel or buttons of a computer mouse. For example, therefore, rotatable wheel 302 may allow a user to scroll between and therefore identify one or more choices depicted on combiner 14 of HUD device 2, while button 304 may allow the user to then select a highlighted choice or otherwise select the function of a feature depicted on combiner 14 of HUD device 2 (e.g., choosing to answer or end an incoming call; choosing to have the image on combiner 14 of HUD device 2 change to a different mode; and so forth). 
Lastly and also by way of introduction, note that in a preferred embodiment wireless communication system 300 includes a strapping apparatus 306 for affixing rotatable wheel 302/button 304 relative to steering wheel SWH and, more preferably, so that rotatable wheel 302/button 304 are located within the inner perimeter of the generally circular shape of steering wheel SWH. In this position, the user is still able to steer the vehicle and control steering wheel SWH as is customary in driving, while also having a natural grip position on the wheel so as to permit ready access to rotatable wheel 302 and button 304 (e.g., again, typically expected to be by way of the user's thumb TH). Moreover, note that with this orientation, a preferred match is provided in the scrolling direction of information displayed by HUD device 2 and the rolling direction of rotatable wheel 302, that is, when the user rotates the wheel clockwise, the top of the wheel is therefore rotating in general from left to right, and preferably scrolling depicted by HUD device 2 is therefore likewise from left to right; in opposite fashion, when the user rotates the wheel counterclockwise, the top of the wheel is therefore rotating in general from right to left, and preferably scrolling depicted by HUD device 2 is therefore likewise from right to left.
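The direction-matching convention just described, wherein clockwise rotation (the top of the wheel moving left to right) scrolls the display left to right, can be expressed as a trivial mapping. This is an illustrative sketch only, not device firmware, and the string labels are assumed:

```python
def scroll_direction(rotation):
    """Map the sense of wheel rotation to an on-screen scroll direction.

    Per the convention in paragraph [0040]: clockwise rotation moves the
    top of the wheel left-to-right, so the displayed content scrolls
    left-to-right; counterclockwise rotation scrolls right-to-left.
    """
    if rotation == "clockwise":
        return "left_to_right"
    if rotation == "counterclockwise":
        return "right_to_left"
    raise ValueError(f"unknown rotation sense: {rotation}")
```

Keeping this mapping in one place makes the stated user-interface convention easy to verify and, if desired, to invert for other mounting orientations.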
[0041] Figure 4 illustrates a perspective exploded view of preferred embodiment components of wireless communication system 300. Along the left side of Figure 4 are generally items relating to rotatable wheel 302 and button 304, while along the right side of Figure 4 are generally items relating to strapping apparatus 306. Each component, and its relationship to other components of system 300 as a whole, is described below.
[0042] System 300 includes a cap 310, which provides the upper tactile surface for rotatable wheel 302, introduced above. Preferably, cap 310 is domelike in shape with an aperture 310A in its center, and where in a preferred embodiment cap 310 has an outer diameter of approximately 33.5 mm. Further, aperture 310A has an outer diameter of approximately 13.7 mm which, in a preferred embodiment, is slightly larger than the outer diameter of button 304. In this manner, the depressible surface of the depressible button 304 is aligned in a plane with a top center of rotatable wheel 302 and with only a slight gap between the outer perimeter of button 304 and aperture 310A. Cap 310 is preferably constructed of plastic and has a tactile or grip enhancing surface treatment around its periphery or sidewall, where such treatment also may extend along the top of cap 310. In a preferred embodiment, therefore, the treatment is formed by radial ridges that are molded into cap 310, whereby a user may rotate cap 310 by applying force to one or more ridges at a time.
[0043] System 300 also includes a button cap 312, a button adhesive 314, and a button actuator 316, that together in general form the above-introduced button 304. Button cap 312 is generally a disk having a diameter slightly smaller than aperture 310A of cap 310. Button adhesive 314 is also a disk shape and is a double-sided adhesive so as to adhere button cap 312 to button actuator 316. Button actuator 316 includes appropriate apparatus to couple to a rotary/button printed circuit board assembly (PCBA) 318. In a preferred embodiment, rotary/button PCBA 318 includes sufficient mechanisms and circuitry so as to detect both rotation of cap 310 and translate that to a signal for communicating between system 300 and HUD device 2, and likewise to detect actuation (e.g., depression, touch) of button 304 and to translate that to a signal for communicating between system 300 and HUD device 2, where circuitry for performing such mechanical-movement to signal translation is readily ascertainable by one skilled in the art. A set (e.g., three) of screws 320 is used to affix rotary/button PCBA 318 to a PCB carrier 322, where the ends of each screw pass beyond PCB carrier 322 to fit into respective retaining cutouts (e.g., 120 degrees apart around the circular shape) of a battery PCB 324. While not visible in the view of Figure 4, battery PCB 324 includes appropriate contacts to receive voltage from a battery 326 that is positioned between those contacts and a battery cover 354. Moreover, system 300 includes a main housing 346, into which each of the above-described components relating to wheel 302/button 304 are aligned along a common axis passing through the center of each component (with the exception of the perimeter-located screws 320).
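The mechanical-movement-to-signal translation attributed to rotary/button PCBA 318 is commonly realized with a quadrature encoder for the wheel. The following sketch shows the standard quadrature decode as an illustration of that class of circuitry; nothing here is taken from the PCBA 318 design itself:

```python
# Standard quadrature decode: for each (previous AB, current AB) pair of
# 2-bit encoder states, emit +1 (one step clockwise), -1 (one step
# counterclockwise), or 0 (no movement / invalid transition).
_TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def decode_rotation(states):
    """Accumulate net rotation steps from a sequence of 2-bit AB encoder states.

    A positive result represents net clockwise rotation; negative,
    counterclockwise. Illustrative sketch of a standard technique only.
    """
    total = 0
    for prev, cur in zip(states, states[1:]):
        total += _TRANSITIONS.get((prev, cur), 0)
    return total
```

The accumulated step count is what would then be framed into the wireless signal "representative of an extent of the rotation" recited in the claims; the framing itself is omitted here.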
[0044] Looking to the right side of Figure 4, wireless communication system 300 includes a flexible strap 330 having a main length portion 332 and an angled portion 334, so named as it departs, based on pre-formed shaping in strap 330, at an angle Θ away from main length portion 332. At its opposite end, main length portion 332 has a curved portion 336. Materials for flexible strap 330 provide a sufficient amount of elasticity so that it may be pulled tight around the curved cylindrical cross-section of a steering wheel and affixed to an anchoring point, where the elasticity will retain it in place; hence, rubber or other flexible and elastic materials are contemplated. Angled portion 334 includes an aperture 338 sized and positioned so that, as detailed later, flexible strap 330 may be wrapped around a steering wheel and whereby aperture 338 fits around a respective protrusion also associated with system 300. Similarly, main length portion 332 also includes an aperture 340, around which and preferably molded into strap 330 is an aperture reinforcement boundary 342. Aperture 340 (and boundary 342) is sized and positioned so that, as detailed later, when flexible strap 330 is wrapped around a steering wheel, aperture 340 (and boundary 342) also fits around a respective protrusion also associated with system 300. Curved portion 336 includes a concave curvature portion 344, where the curvature is shaped to approximate a contour of a respective portion of the outer curved contour of a steering wheel.
[0045] Curved portion 336 also attaches to main housing 346. More particularly, preferably a mounting plate 349 is insert molded into curved portion 336 of strap 330 and a set (e.g., four) of screws 349s are respectively threaded through corner-located holes in plate 349, through strap 330, into respective holes (not shown) in housing 346. In addition, a pin 352 also retains strap 330 to housing 346, for example to prevent additional gapping near the top of the connection, as pin 352 is passed through a cylindrical sleeve 330SL that is formed in strap 330 where main length portion 332 meets curved portion 336, and whereby each end tip of pin 352 fits within a respective pin hole 346pH molded into an upper end of housing 346, as may be further retained in position by a pin screw 352s.
[0046] Various additional components are affixed to housing 346. First, recalling that screws 320 attach PCBA 318 to PCB carrier 322 and the screw tips then align with and pass through respective retaining cutouts in battery PCB 324, the tips of those screws preferably terminate in respective receptacles along the bottom of housing 346, thereby retaining the assemblage of those components in place within the circumferential outer wall of housing 346. Along the bottom of housing 346 is removably affixed (e.g., by rotating ¼ turn) a battery cover 354, and extending away from battery cover 354 is a protrusion 356 which is sized and positioned to mate with aperture 338 of angled portion 334, as introduced above and as further discussed below. Also affixed inside the interior of the outer circumferential wall of housing 346 is a light pipe 358, retained in place by way of example via a respective adhesive 360, for providing illuminating visual indication (e.g., by piping light from an LED) through an aperture 362 in housing 346 of one or more operable features of system 300, such as when the system is on or is communicating certain signals, which by way of example can include pairing with HUD device 2 or other subsequent communications therewith. Lastly, a strap hook 364 is removably inserted into a side of a curved ridge 366 that extends from the bottom of housing 346 and opposite the steering wheel curvature of concave curvature portion 344, where strap hook 364 is sized and positioned to mate with aperture 340 of main length portion 332, as introduced above and as further discussed below.
[0047] Figure 5 illustrates a top perspective view, and Figure 6 illustrates a side view, of wireless communication system 300 once the exploded view of components from Figure 4 are assembled, but prior to affixing system 300 in a vehicle. From these views, one skilled in the art can further appreciate some of the structural, functional, and operable aspects described above, and additional observations are now discussed with respect to installing system 300 into a vehicle. Particularly, from the Figures 5 and 6 views, a BEND DIRECTION is indicated which, from the angles shown, involves moving angled portion 334 in a clockwise direction. In this regard, curved ridge 366 is positioned on the inside perimeter of a steering wheel, with cap 310 at a position that will be readily accessible to the vehicle operator when the vehicle is steered in a straight direction, such as at a position between approximately 1:00 and 5:00 o'clock, if perceiving the perimeter of the steering wheel in terms of clock position, as is sometimes a common reference in steering (and driving school). Hence, with curved ridge 366 abutting an inside perimeter of a steering wheel, angled portion 334 is bent around the outer portion of the steering wheel and with apertures 338 and 340 then pulled toward the underside of housing 346. Given the preferred embodiment elasticity of flexible strap 330, upon wrapping angled portion 334 in the BEND DIRECTION, flexible strap 330 is stretched and aperture 340 of main portion 332 is placed around strap hook 364. Further in this regard, in a preferred embodiment strap hook 364 includes an additional flange 364F pointing in a direction opposite of the pulling force that is created by the elastic stretching of strap 330 as described. Hence, flange 364F assists with retaining strap 330 affixed to hook 364. Additionally, with angled portion 334 so situated, aperture 338 in angled portion 334 is placed around protrusion 356. 
This completed affixation, of strap 330 to housing 346, is shown in Figure 7. From Figures 5 through 7, therefore, one skilled in the art will appreciate that the angle Θ between main length portion 332 and angled portion 334 is matched or approximated by the angle on the underside of housing 346, that is, the angle between the flat surface from which protrusion 356 extends and curved ridge 366. Moreover, so long as battery cover 354 is properly positioned, then the distal length of strap 330, where aperture 338 is located, can then seat properly about protrusion 356; in this regard, therefore, this provides additional indication to the user that battery cover 354 is properly in place, or, if the strap will not properly attach, that they should check to ensure that battery cover 354 is properly secured.
[0048] Figure 8 is a flow diagram illustrating a method 800 of the generalized operation of HUD device 2 including its response to driver commands, whether by operation of system 300, hand gesture or voice. Power-on sequence 810 begins with the powering-on of HUD device 2, including execution of the appropriate initialization routines by its system CPU, power-on self-test sequences, and the like. In process 812, the HUD device system CPU executes the appropriate routines to pair its communications with the various devices in its vicinity, including system 300, smartphone SPH and perhaps certain functions of the vehicle, including the vehicle audio system, the on-board diagnostic port OBDP, rear-mounted vehicle camera RCM, and the like, as available and enabled for this installation. Following power-on sequence 810 and pairing process 812, control electronics 6 places HUD device 2 in a default condition in process 814, and forwards the corresponding image data to projector engine 10 for display at combiner 14. It is contemplated that this default condition may be to display the current velocity of the vehicle, or the current location on a navigation system map, or even simply a "splash" screen at combiner 14 in the field of view of driver DRV. At this point in its operation, HUD device 2 is ready to receive commands from driver DRV, or to respond to other incoming communications. It is contemplated that rear-facing camera(s) 18R and other functions associated with control electronics 6 are operable, in this default state, to receive input from driver DRV or over the communications network, as appropriate.
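The startup portion of method 800 (self-test in sequence 810, pairing in process 812, default display in process 814) can be modeled as a small ordered sequence. The peripheral names mirror the description above, but the structure of this sketch is illustrative and not taken from the specification:

```python
def power_on_sequence(available_peripherals):
    """Model the method-800 startup flow: self-test, pairing, default display.

    `available_peripherals` names what is enabled for this installation
    (e.g., "smartphone", "OBDP", "rear_camera"); only those are paired.
    Returns the ordered log of steps. Illustrative sketch only.
    """
    # Candidate devices from paragraph [0048]; labels are assumed.
    known = ["remote_input", "smartphone", "audio_system", "OBDP", "rear_camera"]
    log = ["self_test"]                          # power-on sequence 810
    for peripheral in known:                     # pairing process 812
        if peripheral in available_peripherals:
            log.append(f"paired:{peripheral}")
    log.append("display:default")                # process 814 (e.g., speedometer)
    return log
```

The returned log makes the ordering constraint explicit: pairing always follows self-test, and the default display is only presented once pairing has been attempted for every enabled peripheral.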
[0049] According to this embodiment, driver DRV can invoke a function by HUD device 2 by making a pre-determined hand gesture that is detected by rear-facing camera 18R in process 816. This "home" gesture may be a "thumbs-up" gesture, a "two-fingers up" gesture, or some other distinctive hand position or motion, preferably made by driver DRV above steering wheel SWH (Figure 1) so as to be in the field of view of rear-facing camera 18R, which forwards images to the device 2 system CPU; in process 816, the system CPU in turn executes image recognition routines to detect the pre-determined "home" gesture for indicating that driver DRV wishes to present a command to HUD device 2. An example of this operation is shown in Figure 9a, in which the default state displayed in process 814 is image 902, which includes such information as the current velocity of the vehicle, the direction of travel, and a number of pending reminders, as shown. Upon detecting the pre-determined "home" gesture in process 816, in this example a "thumbs-up" gesture, process 818 is then executed by the system CPU to activate a command await routine, in which case the system may await receiving a command from one or more sources. In one example, therefore, the anticipated command may be from system 300, in which case step 818 represents awaiting of a wireless command from either rotatable wheel 302 or push button 304; in process 818, therefore, control electronics 6 issues the data to projector engine 10 to display an image corresponding with the wait state - for example, if the anticipated command may be from wireless communication system 300, then combiner 14 may display one or more onscreen choices, whereby the command by the user, via wireless communication system 300, may be to scroll across the choices via wheel 302 and/or to select a displayed choice via button 304. 
Alternatively, if the anticipated command is a spoken voice command from the driver DRV, then combiner 14 can depict a "listening" image, for example as shown by image 904 in Figure 9a. In this case, the appropriate speech recognition routines are then executed by the system CPU to detect the content of a voice command received over a connected microphone, smartphone SPH, or the appropriate audio input facility.
[0050] According to this embodiment, a relatively wide range of input commands may be available for execution by the system in process 820. Figure 9b illustrates a few such commands by way of example, including "tweet" 952 for creating a short text message to be posted on the TWITTER social network, via smartphone SPH; "text" 956 for creating a text message to be sent to a contact via the telecommunications network; "call" 954 for making a telephone call via smartphone SPH; and other such commands including invocation of a navigation function or an Internet search. In response to receiving one of these commands, the system CPU executes the corresponding command and displays the corresponding content in process 820; for example by pushing one of images 906a through 906d to be displayed at combiner 14, in the example of Figure 9a. Of course, additional system 300, voice, or hand gesture commands may be required in the execution of the command of process 820 (e.g., confirming a text or tweet by way of a hand gesture or a "send" voice command or push of button 304). It is contemplated that those skilled in the art having reference to this specification will be readily able to implement such functionality as appropriate for a particular implementation. [0051] Following execution of the command in process 820, control electronics 6 then returns to await further instruction or to respond to incoming communications, as the case may be, with the then-current image being displayed at combiner 14. The then-current image may be the default state, such as image 902 of Figure 9a, or it may be the result of the command executed in process 820, for example navigation information regarding the next turn to be made toward the desired destination, as shown by image 958 of Figure 9b.
[0052] In response to receiving an external communication in process 822, as communicated by a connected device (e.g., wireless communication system 300 or smartphone SPH), control electronics 6 produces and displays a notification at combiner 14 corresponding to that external communication, in process 824. Figure 9b illustrates examples of notifications displayed by HUD device 2 in response to receiving an external communication. Image 952 illustrates the notification for a "tweet" received over the TWITTER social network, including the profile photo of the "tweeter" and their screen name, and images 954 and 956 illustrate the notifications for an incoming phone call and incoming text message, respectively, each including the contact name or "caller ID" indication for the caller and their phone number. These notifications may include secondary information, such as image 952' that is not immediately displayed with notification image 952 but is available on command, such as via rotation of rotatable wheel 302.
[0053] According to this embodiment, commands from wireless communications system 300, hand gestures or voice from driver DRV provide inputs for controlling responses to incoming notifications. These commands are detected using routines executed by the system CPU in response to inputs from a wireless receiver in HUD device 2, rear-facing camera 18R, and an internal (or smartphone) microphone, for example, in process 820. For example, if secondary notifications such as image 952' are available, driver DRV indicates the desire to view that secondary notification by rotating wheel 302 counterclockwise, or making a leftward swipe with one finger raised; this wireless communication or hand gesture is detected by the system CPU in process 826, to which control electronics 6 responds in process 828 by displaying image 952'. At this point (as indicated by way of example with a "listening" icon included in image 952'), voice commands issued by driver DRV (e.g., "retweet", "reply", etc.) may be detected in another instance of process 826, and the appropriate action taken by control electronics 6 in process 828. Alternatively, in the example shown in Figure 9b, the hand gesture of a rightward swipe with one finger raised by driver DRV that is detected in process 826 will cause control electronics 6 to "dismiss" the notification in process 828, returning the display to its previous state (e.g., navigation image 958) or to the default image (e.g., speedometer image 902), awaiting the next gesture, command, or incoming communication.
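The notification interaction just described, wherein a counterclockwise wheel turn or leftward swipe reveals the secondary notification and a rightward swipe dismisses it, can be sketched as a small state update. The state and input names below are assumptions for illustration only:

```python
def handle_notification_input(state, user_input):
    """Update the display state for a notification per paragraph [0053].

    States and inputs are illustrative names, not from the specification:
    a counterclockwise wheel turn or leftward swipe expands the secondary
    notification; a rightward swipe dismisses back to the prior view.
    Unrecognized inputs leave the state unchanged.
    """
    if state == "notification" and user_input in ("wheel_ccw", "swipe_left"):
        return "secondary_notification"   # e.g., image 952'
    if state in ("notification", "secondary_notification") and user_input == "swipe_right":
        return "previous_view"            # e.g., navigation or speedometer image
    return state
```

Expressing the interaction as a pure state transition function keeps the recognition front ends (wireless receiver, camera, microphone) decoupled from the display logic, since each front end need only emit the appropriate input token.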
[0054] Given the preceding, the preferred embodiments provide an improved wireless remote input in combination with a display device. Preferred embodiments may include various features and benefits, such as the following. A wireless remote input device is provided, for example for a head-up display, and that employs an intuitive rotary knob and center button for navigating a graphical user interface, such as that being shown on the head-up display. The remote input device can be efficiently used without requiring its user to look at the device. The device also can be easily mounted to any steering wheel, at a location on the steering wheel that is most convenient for the user. Rotational inputs are provided via an ergonomically-designed wheel having an optimized outer diameter and tactile ridges for easy rotation by the user's thumb, allowing the user to simultaneously grip the steering wheel in a safe manner while driving while also operating the remote wireless device. The button, centered within the wheel, may have multiple functions, including: (i) selecting an input among several options in the user interface; (ii) providing access to aspects of the user interface by multiple clicking within a time period, such as double or triple clicking (e.g., giving access to Bluetooth pairing; quick access to music functionality; etc.). The device is relatively inexpensive to manufacture, and is easily assembled at the factory level, due to the optimized construction of its architecture. The device connects wirelessly to another device (e.g., head-up display, using, for example, Bluetooth). The device has an elastic and hence stretchable strap that can adapt to various thicknesses of steering wheels, and also has a "gripping" surface finish to resist slippage of the mounted device when it is being manipulated by the end user. 
The device is also readily removable, as its strap can be easily attached to the steering wheel (by engagement with the hook), while reversal of the process by removing the strap from the hook allows the entire system to be easily removed, and such removal is achieved without leaving any residue, as may be the case with other mounting schemes (e.g., adhesive). Still other features and benefits will be ascertainable by one skilled in the art. The inventive scope, therefore, is demonstrated by the teachings herein and is further guided by the following claims.

Claims

What is claimed is:
1. A wireless communicating system, comprising:
a rotatable wheel;
a selectable button fixed relative to the rotatable wheel;
a flexible strap for affixing the rotatable wheel and the selectable button relative to a steering wheel in a vehicle;
circuitry for translating actuation of the selectable button into a wireless signal representative of the actuation; and
circuitry for translating rotation of the rotatable wheel into a wireless signal representative of an extent of the rotation.
2. The wireless communicating system of claim 1 wherein a selectable surface of the selectable button is centrally positioned relative to a perimeter of the rotatable wheel.
3. The wireless communicating system of claim 2 wherein the selectable surface of the selectable button is aligned in a plane with a top center of the rotatable wheel.
4. The wireless communicating system of claim 3 wherein the flexible strap is for affixing the rotatable wheel and the selectable button inside an outer perimeter of the steering wheel in the vehicle.
5. The wireless communicating system of claim 1:
wherein the rotatable wheel further comprises an outer wall; and
wherein the outer wall comprises a plurality of radial ridges.
6. The wireless communicating system of claim 5:
wherein the rotatable wheel further comprises a top surface; and
wherein the top surface comprises a plurality of radial ridges.
7. The wireless communicating system of claim 1 and further comprising: a housing affixed relative to the rotatable wheel; and
wherein the circuitry for translating rotation of the rotatable wheel is located in the housing.
8. The wireless communicating system of claim 7 and further comprising a hook member coupled to an exterior of the housing, the hook member for coupling to an aperture in the flexible strap.
9. The wireless communicating system of claim 8 wherein the hook member comprises a first hook member and the aperture comprises a first aperture, and further comprising a second hook member coupled to an exterior of the housing, the second hook member for coupling to a second aperture in the flexible strap.
10. The wireless communicating system of claim 9:
wherein the flexible strap comprises a main body portion and an angled portion extending at an angle relative to the main body portion;
wherein the first aperture is formed in the angled portion; and
wherein the second aperture is formed in the main body portion.
11. The wireless communicating system of claim 1 and further comprising: a housing affixed relative to the rotatable wheel; and
wherein the circuitry for translating actuation of the selectable button is located in the housing.
12. The wireless communicating system of claim 1 and further comprising: a housing affixed relative to the rotatable wheel; and
circuitry within the housing for translating depression of the selectable button into a wireless signal representative of the depression.
13. The wireless communicating system of claim 1 and further comprising a head-up display unit for receiving wireless communications from the circuitry for translating depression and the circuitry for translating rotation.
14. The wireless communicating system of claim 13 wherein the head-up display comprises:
a panel for displaying images; and
circuitry for changing the displayed image in response to a communication received by the circuitry for translating depression or the circuitry for translating rotation.
15. The wireless communicating system of claim 1:
wherein the rotatable wheel further comprises a top surface; and
wherein the top surface comprises a plurality of radial ridges.
PCT/US2017/018904 2016-02-22 2017-02-22 Wireless remote input device for head-up display WO2017147164A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662298302P 2016-02-22 2016-02-22
US62/298,302 2016-02-22

Publications (1)

Publication Number Publication Date
WO2017147164A1 (en) 2017-08-31

Family

ID=59685615

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/018904 WO2017147164A1 (en) 2016-02-22 2017-02-22 Wireless remote input device for head-up display

Country Status (1)

Country Link
WO (1) WO2017147164A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1146409A2 (en) * 2000-04-15 2001-10-17 Robert Bosch Gmbh Control element
US20030109290A1 (en) * 2001-12-07 2003-06-12 Moffi Diane C. Hands free vehicle mounted mobile telephone with audio
US20040262462A1 (en) * 2003-06-27 2004-12-30 Polak Donald J. One-piece molded clamp
US20050197745A1 (en) * 2004-03-04 2005-09-08 Davis Alan C. Vehicle information system with remote communicator
CN101930661A (en) * 2009-06-26 2010-12-29 宏达国际电子股份有限公司 Remote control device and remote control method of electronic device
US20130220779A1 (en) * 2012-02-28 2013-08-29 Grayhill, Inc. Rotary pushbutton and touchpad device and system and method for detecting rotary movement, axial displacement and touchpad gestures
CN104627085A (en) * 2015-02-16 2015-05-20 北京精灵网络科技有限公司 Method of fixing vehicular intelligent hardware device
WO2015095849A1 (en) * 2013-12-20 2015-06-25 Amaru Michael Method and apparatus for in-vehicular communications

Similar Documents

Publication Publication Date Title
US20160025973A1 (en) Compact Heads-Up Display System
US20060103590A1 (en) Augmented display system and methods
US9268135B2 (en) Head-up projection system
US7734414B2 (en) Device, system and method for displaying a cell phone control signal in front of a driver
US20200057546A1 (en) User interface and methods for inputting and outputting information in a vehicle
US10247944B2 (en) Method and apparatus for in-vehicular communications
US20150054760A1 (en) Vehicle use portable heads-up display
JP2001506772A (en) Information display system for at least one person
JP2003512244A (en) Display device for vehicle information
WO2016123248A1 (en) An image projection medium and display projection system using same
JP4622794B2 (en) Screen moving display device
JPWO2017018122A1 (en) Projection display apparatus and projection control method
EP3130510B1 (en) Automobile room mirror adapter
FR3047933A1 Device for interactive presentation of information in a motor vehicle cabin
US8749363B2 (en) Universal accessory for viewing a smartphone display in an automobile
CN107111217B (en) Projection display unit
WO2017147173A1 (en) High-precision focus mechanism for a pico projector module
WO2017147164A1 (en) Wireless remote input device for head-up display
WO2017147155A1 (en) Low-profile suction apparatus with integrated gimbal adjustment for mounting to compound curvature surfaces
JP2007186016A (en) Head-up display device for vehicle
WO2018230526A1 (en) Input system and input method
JP3219639U (en) Embedded head-up display device
US10698209B2 (en) Embedded head-up display device
CN209063897U Downward-entry type head-up display device
CN218918341U (en) Image display device

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17757132

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17757132

Country of ref document: EP

Kind code of ref document: A1