WO2014209354A1 - Image stabilization in a pico projector - Google Patents

Image stabilization in a pico projector

Info

Publication number
WO2014209354A1
WO2014209354A1 — PCT/US2013/048537 — US2013048537W
Authority
WO
WIPO (PCT)
Prior art keywords
image
displayed image
instability
size
projection unit
Prior art date
Application number
PCT/US2013/048537
Other languages
French (fr)
Inventor
Mark Alan Schultz
James Edwin Hailey
Mark Francis Rumreich
Original Assignee
Thomson Licensing
Priority date
Filing date
Publication date
Application filed by Thomson Licensing filed Critical Thomson Licensing
Priority to PCT/US2013/048537 priority Critical patent/WO2014209354A1/en
Publication of WO2014209354A1 publication Critical patent/WO2014209354A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/12 Picture reproducers
    • H04N 9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3141 Constructional details thereof
    • H04N 9/3173 Constructional details thereof wherein the projection device is specially adapted for enhanced portability
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/14 Picture signal circuitry for video frequency region
    • H04N 5/144 Movement detection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/12 Picture reproducers
    • H04N 9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3179 Video signal processing therefor

Definitions

  • the present arrangement provides a system and method for compensating for movement and vibrations of a portable projection device to stabilize the display of an image.
  • projection devices were (and are) designed as non-mobile devices that are positioned in a room and project a series of audio-visual images on a screen that is viewable by individuals within the room and in the line of sight of the projected image.
  • these projection devices are precisely configured to minimize errors in the audio-visual images being displayed. Examples of these systems include but are not limited to movie theaters, professional meeting rooms, lecture halls and the like.
  • A pico projector may be included in any handheld device that can selectively project at least one of an image or series of images on a surface. Moreover, it is important for the pico projector to be able to generate a clear image of sufficient quality on any type of surface. This may include, for example, a conventional display screen or a wall in a room. It is therefore necessary for the pico projector to compensate for any surface impurities when generating and projecting a display image.
  • a further drawback associated with pico projection relates to the nature of the device itself. Because the pico projector is naturally handheld and/or portable, the pico projector suffers from increased visual display errors as compared to a traditional projection device. The increased visual errors (e.g. noise, distortion, etc) in images projected by pico projectors result from the often sub-optimal positioning of the pico projector with respect to the surface on which the images are being displayed as well as the orientation of individuals viewing the image to the surface on which the image is displayed.
  • the projected image from a pico projector is subject to movement and vibration from many sources, e.g. a shaky hand, heartbeat, hand twitching from nervousness, fan vibrations, people kicking the table, or even speaker feedback. These movements and vibrations interfere with the ability to provide a stable displayed image with the pico projector.
  • an apparatus for stabilizing an image displayed by a projection unit senses motion of the projection unit and an instability detector detects a level of instability of the displayed image using the motion sensed by the motion sensor.
  • a size calculator determines a size for the displayed image within the image area based on the detected level of instability and an image stabilizer maps the displayed image within the display area to reduce the effects caused by movement of the projection unit.
  • a method of stabilizing an image displayed by a projection unit includes sensing motion of the projection unit and detecting a level of instability of the displayed image. A size for the displayed image within the image area is determined based on the detected level of instability. A position of the displayed image within the image area is calculated and the displayed image within the display area is mapped to reduce the effects caused by movement of the projection unit.
  • an apparatus for stabilizing an image displayed by a projection unit includes a means, such as a motion sensor, for sensing motion of the projection unit and means, such as an instability detector, for detecting a level of instability of the displayed image and calculating a size of the displayed image within an image area.
  • the apparatus also includes a means, such as a size calculator, for determining a position for the displayed image within the image area based on the detected level of instability and size of the displayed image and means, such as an image stabilizer, for mapping the displayed image within the display area to reduce the effects caused by movement of the projection unit.
  • This invention attempts to compensate for the majority of projector vibrations by adjusting the picture to stabilize the image on the display surface even though the projector is not steady.
  • FIG. 1 is a block diagram of the portable projection device according to aspects of the present invention.
  • FIGS. 2A - 2D are exemplary light engines for use in the portable projection device according to aspects of the present invention.
  • FIG. 3 is a block diagram of the stabilization device according to aspects of the present invention.
  • FIG. 4 is a block diagram of the instability detector within the stabilization device according to aspects of the present invention.
  • FIG. 5 is a block diagram of the image stabilizer within the stabilization device according to aspects of the present invention.
  • FIGS. 6A-6D are diagrams of the active display area of the imager with respect to instability according to aspects of the present invention.
  • FIG. 7 is a flow diagram of the method for stabilizing a displayable image according to aspects of the present invention.
  • these elements are implemented in a combination of hardware and software on one or more appropriately programmed general-purpose devices, which may include a processor, memory and input/output interfaces.
  • The terms “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor ("DSP") hardware, read only memory ("ROM") for storing software, random access memory ("RAM"), and nonvolatile storage.
  • a component is intended to refer to hardware, or a combination of hardware and software in execution.
  • a component can be, but is not limited to being, a process running on a processor, a processor, an object, an executable, and/or a microchip and the like.
  • an application running on a processor and the processor can be a component.
  • One or more components can reside within a process and a component can be localized on one system and/or distributed between two or more systems. Functions of the various components shown in the figures can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.
  • any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode or the like, combined with appropriate circuitry for executing that software to perform the function.
  • the disclosure as defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. It is thus regarded that any means that can provide those functionalities are equivalent to those shown herein.
  • the present invention is directed towards a multifunction portable electronic device (hereinafter, the "device") that includes audiovisual image projection capabilities (e.g. a pico projector) and method of operating the same.
  • An exemplary block diagram of the device 10 is provided in Figure 1.
  • the device 10 includes a controller 12.
  • the controller 12 is a component that executes various operational algorithms that control the various functions of the device 10. In one embodiment, the controller 12 executes algorithms that enable audio and video processing of a source input signal.
  • the controller 12 may also include a memory in which various machine executable instructions controlling various device functionality may be stored and accessed as needed in response to various control signals generated by one of (a) a user; and (b) other components of the device 10 as will be discussed below.
  • the memory of the controller 12 may also store data associated with any input signal received by the controller 12.
  • the memory of controller 12 may also store user- specific information that is associated with a user of the device 10.
  • user specific information may include user preferences for configuring the device for a particular type of operation.
  • the user specific information may include global preference information that configures aspects of device operation that are common between the various functions as well as function specific preference information that configures the device to operate in a particular manner when executing a particular function.
  • the controller 12 is described as including a memory, one skilled in the art should understand that the memory (or other storage medium) within the device may be a separately embodied component that is read/write accessible by the controller 12 as needed.
  • the device 10 also includes a power converter 14 and battery 16 connected to the power converter 14.
  • the power converter 14 is selectively connectable to an input power source (either AC or DC) for receiving power therefrom. Power received by the power converter 14 is provided to the battery 16 and selectively charges the battery 16 as needed. It should be understood that the operation of charging is meant to include an initial charging of the battery 16 as well as recharging the battery 16 as the power level is being depleted. Power is also simultaneously provided by the power converter 14 to the controller 12 for powering operation thereof.
  • the controller 12 may selectively detect when input power is being provided to the power converter 14 causing the device 10 to operate in a first power mode when a connection to an input power source is detected and a second mode when no connection to an input power source is detected.
  • the controller 12 may execute a battery monitoring algorithm that enables the controller 12 to selectively detect a power level in the battery 16 and control the power converter 14 to direct power thereto. The controller 12 can also control charging of the battery 16 when the detected power level in the battery 16 is below a predetermined threshold. In another embodiment of the first power mode, the controller 12 may automatically direct power from the power converter 14 to be provided to the battery 16 in response to connection of the power converter with the input power source. In the second mode of operation, the controller 12 is powered by the battery 16 until such time that the battery power is depleted below a predetermined operational threshold representing a minimum amount of power needed to operate the device.
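The two power modes described above reduce to a simple selection rule. A minimal sketch in Python, with hypothetical threshold values and mode names (none of these identifiers appear in the source):

```python
# Illustrative thresholds; the patent only says "predetermined threshold".
MIN_OPERATIONAL_LEVEL = 0.05   # assumed minimum battery fraction to keep running
CHARGE_THRESHOLD = 0.95        # assumed level below which charging is enabled

def select_power_mode(input_power_connected: bool, battery_level: float) -> str:
    """Return the active power mode given the input-power connection state
    and the sensed battery level (0.0 to 1.0)."""
    if input_power_connected:
        # First power mode: run from the converter; charge the battery when low.
        if battery_level < CHARGE_THRESHOLD:
            return "charge"
        return "run_from_converter"
    # Second power mode: run from the battery until it is depleted below the
    # minimum operational threshold.
    if battery_level > MIN_OPERATIONAL_LEVEL:
        return "run_from_battery"
    return "shutdown"
```

The same logic could equally be expressed as a small state machine driven by the controller's battery monitoring algorithm.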
  • the controller 12 may receive an input audiovisual signal from one of a plurality of device inputs collectively referred to using reference numeral 15.
  • the controller 12 can control selective projection of the audiovisual input signal using projection unit/microdisplay 30.
  • the input audiovisual signal may include one of (a) a still image; (b) a series of images; (c) a video signal; and (d) audio signal.
  • the input audiovisual signal may also include an audio component that is intended to be audibly reproduced by speaker 29 in conjunction with the projection, by the projection unit 30, of the one still image or series of images as will be discussed below.
  • the plurality of inputs may include any combination of but is not limited to (a) a card reader 18; (b) a USB port 20; (c) a digital video input port (HDMI) 22; (d) a VGA/Component video input port 24; and (e) a composite/S-Video input port 26.
  • the depiction of the plurality of input ports 15 is for purposes of example only and the device 10 may include any combination of the described input ports or other known input ports.
  • the card reader selectively receives a storage card that may include data representative of the input audiovisual signal that is accessed by the controller 12 and provided to the projection unit 30 and/or speaker 29 for output thereof.
  • the card reader 18 may be a MicroSD card reader. This is described for purposes of example only and any card reading device able to read any standardized storage card may be included in device 10.
  • the USB port 20 enables the device 10 to be selectively connected to one of (a) a portable storage device (e.g. flash drive); or (b) a secondary device, that stores data representative of the audiovisual input signal.
  • Any of the digital video input 22, VGA/component input 24 and/or composite video input 26 may enable connection with a secondary device that includes the source audiovisual input signal and are coupled to the controller 12 via an input selector 28.
  • the input selector 28 selectively couples a respective one of the digital video input 22, VGA/component input 24 and/or composite video input 26 with the controller 12 such that the controller 12 may provide the audiovisual input signal to the projection unit 30 and speaker 29 for output thereof.
  • the device 10 further includes a plurality of user controls, collectively referred to using reference numeral 31, enabling the user to selectively control various device functions.
  • An input/output (IO) interface 32 may include at least one user selectable button associated with at least one device function such that selection thereof initiates a control signal received by the controller 12 that is used to control the particular device function.
  • the IO interface 32 may be a touch screen and the at least one button may be a user selectable image element displayed on the touch screen enabling selection thereof by a user.
  • the number and types of user selectable image elements may be generated by the controller 12 depending on the particular operational mode of the device. For example, during projection mode, the user selectable image elements may enable activation of image projection functionality and, if the device 10 is operating in a communication mode, the user selectable image elements displayed on the I/O interface 32 may relate to the communication functions of the device.
  • the IO interface 32 may include at least one dedicated button on a housing of the device 10 that may be manually activated by a user.
  • Another user control 31 included with the device 10 includes a keyboard 34.
  • the keyboard 34 enables a user to enter alphanumeric text-based input commands for controlling the operation of the device.
  • the keyboard is positioned on the housing of the device. In another embodiment, there is no dedicated keyboard and the keyboard may be generated by the controller 12 and provided for display by the IO interface 32.
  • a further user control 31 that may be provided is a remote infrared (IR) sensor 36.
  • Remote IR sensor 36 selectively receives an IR input signal that is generated by a remote control.
  • the IR input signal received by the remote IR sensor 36 is communicated to the controller 12 which interprets the received IR input signal and initiates operation of a particular function of the device corresponding to user input.
  • Any of the user controls 32, 34 and/or 36 may be used to generate control signals for selecting an input audiovisual signal from a respective input source of the plurality of input sources 15.
  • the control signals input via the user are received by the controller 12 which processes the user input signal and selects the source of the input audiovisual signal.
  • Input received from any of the user controls 31 may also condition the controller 12 to selectively output the audiovisual signal using projection unit 30 and speaker 29.
  • the projection unit 30 includes a panel driver 38, a light engine 39 and a projection lens 48.
  • the panel driver 38 receives the audiovisual input signal from the controller 12 and controls the light engine to emit light representative of the audiovisual input signal that may be projected via a projection lens 48 coupled thereto.
  • the light engine 39 may include a light source and light processing circuitry that is selectively controlled by the panel driver 38 to generate light and project an image representing the audiovisual signal onto a surface. Exemplary types of light engines 39 will be discussed in greater detail with respect to Figures 2A - 2D. However, persons skilled in the art will understand that any light engine used in any type of projection device (portable or otherwise) may be incorporated in the projection unit 30 of the device 10.
  • the light generated by the light engine 39 is provided to the projection lens 48 which projects the full color image onto a display surface (e.g. screen, wall, etc).
  • the projection lens 48 may be focused in response to user input received by the controller 12 as needed. Additionally, the operation and position of the various components of the projection unit 30 may be controlled via a control signal that is generated by either the user or another component of device 10.
  • the projection unit 30 of the device may also include an infrared light emitting diode (IR LED) 50 that is coupled to the panel driver 38.
  • the controller 12 may generate an IR audiovisual input signal based on the audiovisual input signal received from one of the plurality of inputs 15 or user controls 31.
  • the IR audiovisual signal may be provided to the panel driver 38 which conditions the IR LED 50 to project an IR version of the audiovisual input signal.
  • the IR signal is imperceptible to the human eye but may be used by other components as an input control signal in the manner discussed below.
  • the device 10 may also include a camera module 52.
  • the camera module 52 may include a lens 54 coupled to an image sensor 56. Image data received via the lens 54 and sensed by image sensor 56 may be processed by image processor 58.
  • the camera module 52 may operate as a conventional digital camera able to capture one of still images and video images.
  • the camera module 52 may also operate as a sensor that senses at least one type of image being displayed and uses the sensed image as a control signal for controlling at least one function of the device 10 as will be discussed below.
  • the lens 54 of the camera module 52 shown in conjunction with the projection lens 48 of the projection unit, is described for purposes of example only and the device may include a single lens that is shared between the projection unit 30 and camera module 52.
  • a motion sensor 60 is also provided.
  • the motion sensor 60 is coupled to the controller 12 and selectively senses data representing movement of the device 10.
  • the motion sensor 60 may sense the position of the device and generate an input control signal used by the controller 12 for controlling device operation.
  • the motion sensor 60 may include any type of motion sensor including but not limited to a gyroscope and/or an accelerometer.
  • the device 10 may include at least three accelerometers positioned on the X, Y and Z axes such that the accelerometers may sense the position of the device 10 with respect to gravity.
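Where three orthogonal accelerometers sense the device's orientation with respect to gravity, pitch and roll can be recovered from a static reading. A sketch under the assumption that the device is at rest (so the only sensed acceleration is gravity); the function name and sign conventions are illustrative, not from the patent:

```python
import math

def tilt_angles(ax: float, ay: float, az: float) -> tuple:
    """Estimate pitch and roll (radians) from a static 3-axis accelerometer
    reading (ax, ay, az in g), using gravity as the reference vector."""
    # Pitch: rotation about the Y axis, from the X component versus the Y-Z plane.
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    # Roll: rotation about the X axis, from the Y and Z components.
    roll = math.atan2(ay, az)
    return pitch, roll
```

A device lying flat (reading (0, 0, 1) g) yields zero pitch and roll; tilting it changes the gravity projection onto each axis accordingly.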
  • the motion sensor 60 may refer to a plurality of different sensors that are able to sense various types of data which may be provided to the controller 12 for analysis and processing thereof.
  • the device 10 also includes a communications processor 62 that enables bidirectional communication between the device 10 and a remote device.
  • the communication processor 62 is described generally and is intended to include all electronic circuitry and algorithms that enable bidirectional communication between devices.
  • the communication processor 62 enables the device to operate as a cellular phone.
  • the communication processor 62 includes all components and instructions for connecting the device 10 to the internet.
  • the communication processor 62 includes all components associated with a smartphone to enable a plurality of different types of bidirectional communication (e.g. telephone, email, messaging, internet, etc) between the device and a communications network.
  • Figures 2A - 2D are block diagrams representing different types of light engines 39 that may be employed within the projection unit 30 described in Figure 1. It should be understood that the portable projection device 10 as discussed herein may utilize any of the different light engines 39a - 39d described in Figures 2A - 2D. It should also be appreciated that the light engines are not limited to those described herein; any type of light engine able to generate and process light into a full color image for display on a surface may be used by the device 10.
  • Figure 2A represents a three-color LED light engine 39a.
  • the light engine 39a is controlled via the panel driver 38 (Fig. 1).
  • the panel driver 38 receives the audiovisual input signal from the controller 12 and controls the operation of light emitting diodes (LED) 40a, 40b, and 40c.
  • the LEDs 40a - c represent three color LEDs including a blue LED 40a, a green LED 40b and a red LED 40c.
  • the audiovisual input signal provided to the panel driver 38 has been separated into its component colors by the controller 12 and the panel driver 38 selectively controls the LEDs 40a-c to emit the necessary light to generate the desired audiovisual image for output. Light generated by the LEDs 40a-c is focused into a full color image by a focusing element 42.
  • the focusing element 42 may be an x-cube. In another embodiment, the focusing element 42 may be a dichroic mirror. These focusing elements are described for purposes of example only and any focusing element 42 able to combine light from a plurality of LEDs into a single full color image may be used in the projection unit 30.
  • the focused image is projected onto a liquid crystal on silicon (LCOS) chip 44, which receives light emitted from each of the LEDs 40a - c and optically combines the received light via a polarizing beam splitter 46.
  • the combined light is provided to the projection lens 48 which projects the combined full color image onto a display surface (e.g. screen, wall, etc).
  • Figure 2B depicts a white-light LED light engine 39b that may be used in the projection unit of the device 10.
  • Light engine 39b may include a white-light LED 41.
  • the panel driver 38 (in Fig. 1) receives the audiovisual input signal from the controller 12 and controls the operation of the white light LED 41.
  • the LED 41 is controlled to emit a pattern of light to generate the desired audiovisual image for output.
  • Light generated by the LED 41 is provided to a LCOS chip 44b.
  • the LCOS chip 44b has a predetermined pattern of primary color dots thereon.
  • the panel driver 38 controls the LCOS chip 44b to have certain of the dots illuminated by the light emitted by LED 41 to provide colored light to the polarizing beam splitter 46b which optically combines the colored light reflected off of the LCOS chip 44b.
  • the combined light is provided to the projection lens 48 which projects the combined full color image onto a display surface (e.g. screen, wall, etc).
  • FIG. 2C depicts a digital light processing (DLP) engine 39c.
  • the DLP engine 39c includes three colored light sources 40a, 40b, and 40c. In one embodiment, the light sources 40a - c represent three color LEDs including a blue LED 40a, a green LED 40b and a red LED 40c. While these are described as LED light sources, this is done for purposes of example only and the light sources may be any type of light source including, but not limited to, lasers as are known to be implemented in a DLP light engine.
  • the light sources 40a-c are not on simultaneously. Rather, the panel driver 38 controls the individual light sources in sequence and the emitted light is provided to the focusing element for producing the full color image.
  • a color wheel may be positioned between a light source and the focusing element 42. The panel driver 38 selectively controls the color wheel to rotate to one of the three primary colors based on the data in the audiovisual input signal to illuminate a respective light color at a given time.
  • the audiovisual input signal provided to the panel driver 38 has been separated into its component colors by the controller 12 and the panel driver 38 selectively controls the LEDs 40a-c to emit the necessary light to generate the desired audiovisual image for output.
  • Light generated by the LEDs 40a-c is projected and focused into a full color image by a focusing element 42.
  • the focusing element 42 may include a mirror unit 45 formed from at least one mirror which reflects the emitted light through prisms 47.
  • the focused image is provided to the projection lens 48 which projects the combined full color image onto a display surface (e.g. screen, wall, etc).
  • FIG. 2D depicts a laser-based light engine 39d.
  • the laser light engine 39d includes light sources 43a - c that each emit a respective color light based on an audiovisual input signal.
  • the light sources 43a - c are lasers that emit light in three distinct wavelengths.
  • light source 43a may be a laser that emits light at a wavelength associated with the color red whereas light source 43b may emit light at a wavelength associated with the color green and light source 43c may emit light at a wavelength associated with the color blue.
  • the panel driver 38 controls the light sources 43a-c to emit respective colored light based on the audiovisual input signal received from the controller 12.
  • the emitted light (either, concurrently or sequentially - depending on the panel driver being used) is provided to a focusing element 42.
  • the focusing element 42 includes a set of combiner optics 49 that receives and combines the emitted laser light and provides the light to the mirror unit 45 including a plurality of individual mirrors.
  • the mirror unit 45 is controlled by the panel driver 38 to rotate the plurality of mirrors based on the audiovisual input signal and reflects light to the projection lens 48 for projection onto a display surface (e.g. screen, wall, etc).
  • the projected image from a pico projector is subject to movement and vibration from many sources such as but not limited to a shaky hand, a heartbeat, hand twitching from nervousness, fan vibrations, people kicking the table, or even speaker feedback.
  • The present invention attempts to compensate for the majority of projector vibrations by adjusting the picture to stabilize the image on the screen even though the projector is not steady.
  • the stability of the device and thus the image is determined and the size of the image is adjusted according to the level of instability detected.
  • the size of the image displayed with respect to the display area is in correlation with the amount of instability detected.
  • A block diagram of the stabilization device 100 used to compensate for movement and vibrations affecting the display of an image is shown in Figure 3.
  • the stabilization device 100 includes the motion sensor 60 (as shown in Figure 1), an instability detector 110, a sub-image size calculator 120 and an image stabilizer 130.
  • the instability detector 110, sub-image size calculator 120 and image stabilizer 130 may be integrally formed within the controller 12. Alternatively, these elements may be included within the device 10 separate from the controller 12.
  • the motion sensor 60 measures displacement or motion of the device 10.
  • the motion may be caused by any number of external forces such as movement of the surface on which the device 10 is positioned or movement by a user holding the device.
  • the motion sensor may generate an angular velocity vector with pitch and yaw components to represent the motion.
  • the angular velocity is provided to the instability detector 110.
  • the instability detector 110 detects a level of instability for the image being displayed.
  • the manner of detecting the instability level will be described further with respect to Figure 4.
  • the determined level of instability is provided to the sub-image size calculator 120 which calculates the size at which the image is to be displayed within the display area for the image based on the detected level of instability. The larger the level of instability, the smaller the size of the displayed image.
  • a look-up table may be used to provide the desired size versus instability characteristic.
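Such a size-versus-instability look-up table might be realized as a small piecewise-linear map. The break points below are invented for illustration and are not taken from the patent; the only property carried over from the text is that a larger instability yields a smaller displayed image:

```python
# Hypothetical table: (instability level, fraction of the imager used).
SIZE_LUT = [
    (0.0, 1.00),   # steady projector: use the full display area
    (0.5, 0.90),
    (1.0, 0.80),
    (2.0, 0.70),   # heavy shake: shrink the image, leaving headroom to shift it
]

def sub_image_size(instability: float) -> float:
    """Piecewise-linear interpolation into the table, clamped at both ends."""
    if instability <= SIZE_LUT[0][0]:
        return SIZE_LUT[0][1]
    for (x0, y0), (x1, y1) in zip(SIZE_LUT, SIZE_LUT[1:]):
        if instability <= x1:
            t = (instability - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return SIZE_LUT[-1][1]
```

The difference between the full display area and the sub-image size is the spatial headroom available for shifting the image against the sensed motion.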
  • the determined size is provided by the sub-image size calculator 120 to the image stabilizer 130 along with an angular velocity signal from the motion sensor 60 to map the image within the display area.
  • pitch and yaw velocity components may be used to calculate x and y position offsets, subject to the amount of spatial headroom allotted for instability compensation by the size control signal.
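The offset calculation described above can be sketched as a leaky integration of the pitch and yaw velocities, clamped to the headroom allotted by the size control signal. The sign conventions, the leak coefficient, and the implicit small-angle mapping from angular deviation to pixel offset are all assumptions for illustration:

```python
def position_offsets(pitch_vel: float, yaw_vel: float, dt: float,
                     prev_x: float, prev_y: float,
                     headroom_x: float, headroom_y: float,
                     leak: float = 0.98) -> tuple:
    """Leaky-integrate angular velocities into x/y pixel offsets that oppose
    the sensed motion, clamped to the available stabilization headroom."""
    # Yaw (rotation about the vertical axis) moves the image horizontally,
    # pitch moves it vertically; the negative sign opposes the motion.
    x = leak * prev_x - yaw_vel * dt
    y = leak * prev_y - pitch_vel * dt
    # Never shift the sub-image outside the headroom left by the size signal.
    x = max(-headroom_x, min(headroom_x, x))
    y = max(-headroom_y, min(headroom_y, y))
    return x, y
```

The leak term gradually re-centers the sub-image when the projector is steady, which matches the leaky-integrator behavior described for the position calculator.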
  • FIG. 4 shows a block diagram of the instability detector 110.
  • the instability detector 110 includes a leaky integrator 112, a peak-to-peak detector 114 and a low pass filter 116. Since position is the integral of velocity, the leaky integrator 112 receives the angular velocity signal from the motion sensor 60 and calculates an angular deviation of the image. The "leakiness" of the integrator gradually returns the integrator output (and instability detector output) to zero when the projector is stable.
  • the peak-to-peak detector 114 calculates a peak-to-peak position instability of the image to determine the average energy of the instability. An RMS or other type of detector could be used in place of the peak-to-peak detector 114.
  • the low pass filter 116 selectively compensates for the determined peak-to-peak instability by gradually modifying the size of the image being displayed to prevent an abrupt re-sizing of the image when correcting for image instability.
  • the peak-to-peak position instability is used to decide the size of the sub-image which is calculated by the sub-image size calculator 120.
  • the output of the instability detector 110 is provided to the sub-image size calculator 120 for calculating the adaptive sub-image size of the display image.
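The detector chain just described (leaky integrator, then peak-to-peak detector, then low pass filter) could be sketched as below. The leak constant, window length and filter coefficient are illustrative assumptions; the patent does not specify them:

```python
from collections import deque

class InstabilityDetector:
    """Sketch of the leaky-integrator / peak-to-peak / low-pass chain."""

    def __init__(self, leak=0.98, window=64, lpf_alpha=0.05):
        self.leak = leak
        self.positions = deque(maxlen=window)  # recent angular deviations
        self.lpf_alpha = lpf_alpha
        self.position = 0.0  # leaky-integrated angular deviation
        self.level = 0.0     # low-pass-filtered instability output

    def update(self, angular_velocity, dt):
        # leaky integration: output decays to zero when the projector is stable
        self.position = self.leak * self.position + angular_velocity * dt
        self.positions.append(self.position)
        # peak-to-peak deviation over the recent window as the instability energy
        p2p = max(self.positions) - min(self.positions)
        # low-pass filter so that the displayed image is resized gradually
        self.level += self.lpf_alpha * (p2p - self.level)
        return self.level
```

The low-pass stage is what prevents an abrupt re-sizing of the image: a sudden bump raises the peak-to-peak value immediately, but the reported level ramps toward it over many samples.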
  • a block diagram showing the elements forming the image stabilizer 130 is shown in Figure 5.
  • the image stabilizer 130 is connected to the sub-image size calculator 120 and motion sensor 60.
  • the image stabilizer 130 includes a sub-image position calculator 132, a video resizer 134 and a display mapper 136.
  • the sub-image position calculator 132 may include a leaky integrator. The leaky integrator may be used to assist with re-positioning of the image back to a center point.
  • the sub-image position calculator 132 receives the angular velocity signal from the motion sensor 60 and a size signal from the sub-image size calculator 120. The angular velocity signal and size signal may be used by the sub-image position calculator 132 to move the image to the appropriate location of the display area.
  • the video resizer 134 receives the size signal from the sub-image size calculator 120 and the video signal from the input selector 28 (shown in Figure 1).
  • the sub-image position calculator 132 and video resizer 134 calculate the sub-image position and scale the video.
  • the display mapper 136 is connected to the sub-image position calculator 132 and video resizer 134.
  • the display mapper 136 maps the image to be displayed within the display area and stabilizes the image using the adaptive sub-image size.
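As a rough illustration of the mapping step, the resized sub-image could be positioned inside the total display area as follows. Centring plus an offset is an assumption about the mapping convention; the clamp keeps the stabilized image within the imager:

```python
def map_sub_image(total_w, total_h, scale, x_off, y_off):
    """Return (left, top, width, height) of the sub-image inside the total
    display area: centred, then shifted by the stabilization offsets."""
    sub_w = int(total_w * scale)
    sub_h = int(total_h * scale)
    left = (total_w - sub_w) // 2 + int(x_off)
    top = (total_h - sub_h) // 2 + int(y_off)
    # keep the sub-image fully inside the total display area
    left = max(0, min(total_w - sub_w, left))
    top = max(0, min(total_h - sub_h, top))
    return left, top, sub_w, sub_h
```

With `scale = 1.0` the sub-image coincides with the total display area and no offset is possible, matching the low-instability case of Figure 6A.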
  • Figures 6A-6D show the active display area and positioning of the active display area within the total display area based on varying levels of instability.
  • Figure 6A shows the total display area of the imager 200.
  • when the instability detector 110 detects a low level of instability, the active display area of the image 210 substantially fills the entire display area 200.
  • Figure 6B illustrates the active display area of the image 220 within the total display area 200 when a medium level of instability is detected.
  • the size of the active display area of the image is decreased from that when a low level of instability is detected, forming a border or frame between a periphery of the active display area of the imager 220 and the total display area 200. As the level of instability increases, the size of the active display area decreases accordingly. By reducing the size of the active display area according to the level of instability detected, the amount of spatial headroom allotted for instability compensation is increased. When a high level of instability is detected, the size of the active display area of the imager 230 is decreased further with respect to the total display area of the imager 200 as shown in Figure 6C. When a high level of instability is detected, the image displayed can be reduced by up to substantially fifty percent.
  • Figure 6D further illustrates the size and position of the active display area with respect to the total display area when both a high level of instability and a horizontal bump is detected.
  • the size of the active display area is minimized due to the high level of instability detected and the position of the active display area is also moved within the total display area to account for the horizontal bump which was detected.
  • the position of the active display area will move within the total display area in a direction opposite the detected movement. In the instance shown, the horizontal bump moved the device and total display area to the right.
  • the active display area moved to the left of the total display area. If the detected movement of the device was in an upwards vertical direction, the active display area would move toward the bottom of the total display area.
  • the return-to-zero property of the leaky integrators within sub-image position calculator 132 will act to center the active display area within the total display area.
  • a flow chart describing the method 700 of the present invention is shown in Figure 7.
  • the motion sensor 60 measures the motion of the device in step 710. Based on the measured motion, an angular position of the active image display area within the total display area is calculated in step 720.
  • a peak-to-peak position instability indicating a range of vibration of the active display area is calculated in step 730.
  • the position instability signal is then low pass filtered to slow the change in size of the active display area based on the detected range of vibration and level of instability in step 740.
  • An adaptive sub-image size for the active display area is then calculated in step 750 and the sub-image position is calculated and the video is scaled in the image stabilizer in step 760.
  • the active display area and displayed image are stabilized and mapped within the total display area using the adaptive sub-image size and position signals in step 770.
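The method steps above can be summarized as a single self-contained per-frame sketch. All constants (leak, window length, filter coefficient, size slope) are illustrative assumptions rather than values from the patent:

```python
def stabilize_frame(state, angular_velocity, dt, total_size=1000):
    """One iteration of the stabilization method (steps 710-770)."""
    pos, history, level = state
    pos = 0.98 * pos + angular_velocity * dt      # step 720: leaky integration
    history = (history + [pos])[-64:]             # keep a short recent window
    p2p = max(history) - min(history)             # step 730: peak-to-peak range
    level += 0.05 * (p2p - level)                 # step 740: low-pass filter
    scale = max(0.5, 1.0 - 5.0 * level)           # step 750: adaptive sub-image size
    headroom = (1.0 - scale) * total_size / 2     # step 760: position within headroom
    offset = max(-headroom, min(headroom, -pos * total_size))
    return (pos, history, level), scale, offset   # step 770: map and display
```

When the projector is steady, the leaky integrator drains to zero, the instability level decays, and the sub-image grows back toward the full display area and re-centres, matching the behaviour described for Figures 6A-6D.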
  • the implementations described herein may be implemented in, for example, a method or process, an apparatus, or a combination of hardware and software. Even if only discussed in the context of a single form of implementation (for example, discussed only as a method), the implementation of features discussed may also be implemented in other forms (for example, a hardware apparatus, hardware and software apparatus, or a computer-readable medium).
  • An apparatus may be implemented in, for example, appropriate hardware, software, and firmware.
  • the methods may be implemented in, for example, an apparatus such as, for example, a processor, which refers to any processing device, including, for example, a computer, a microprocessor, an integrated circuit, or a programmable logic device. Processing devices also include communication devices, such as, for example, computers, cell phones, tablets, portable/personal digital assistants ("PDAs"), and other devices that facilitate communication of information between end-users.
  • the methods may be implemented by instructions being performed by a processor, and such instructions may be stored on a processor or computer-readable media such as, for example, an integrated circuit, a software carrier or other storage device such as, for example, a hard disk, a compact diskette, a random access memory ("RAM"), a read-only memory ("ROM") or any other magnetic, optical, or solid state media.
  • the instructions may form an application program tangibly embodied on a computer-readable medium such as any of the media listed above.
  • a processor may include, as part of the processor unit, a computer-readable medium having, for example, instructions for carrying out a process.
  • the instructions corresponding to the method of the present invention, when executed, can transform a general purpose computer into a specific machine that performs the methods of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An apparatus and method for stabilizing an image displayed by a projection unit (30) is provided. A motion sensor (60) senses motion of the projection unit (30) and an instability detector (110) detects a level of instability of the displayed image based on the motion sensed by the motion sensor. A size calculator (120) determines a size for the displayed image within the image area based on the detected level of instability and an image stabilizer (130) maps the displayed image within the display area to reduce the effects caused by movement of the projection unit (30).

Description

IMAGE STABILIZATION IN A PICO PROJECTOR
FIELD
[001] The present arrangement provides a system and method for compensating for movement and vibrations of a portable projection device to stabilize the display of an image.
BACKGROUND
[002] Conventionally, projection devices were (and are) designed as non-mobile devices that are positioned in a room and project a series of audio-visual images on a screen that is viewable by individuals within the room and in the line of sight of the projected image. To ensure projection quality and an optimal viewing experience for the individuals, these projection devices are precisely configured to minimize errors in the audio-visual images being displayed. Examples of these systems include but are not limited to movie theaters, professional meeting rooms, lecture halls and the like.
[003] However, the rapid miniaturization of electronic devices has also extended to projection devices. Currently, there exist portable electronic projection devices that may be easily transported and are able to turn virtually any room into a projection room. These portable electronic projection devices are termed pico projectors. A pico projector may be included in any handheld device that can selectively project at least one of an image or series of images on a surface. Moreover, it is important for the pico projector to be able to generate a clear image of sufficient quality on any type of surface. This may include, for example, a conventional display screen or a wall in a room. It is therefore necessary for the pico projector to compensate for any surface impurities when generating and projecting a display image.
[004] Moreover, a further drawback associated with pico projection relates to the nature of the device itself. Because the pico projector is naturally handheld and/or portable, the pico projector suffers from increased visual display errors as compared to a traditional projection device. The increased visual errors (e.g. noise, distortion, etc.) in images projected by pico projectors result from the often sub-optimal positioning of the pico projector with respect to the surface on which the images are being displayed as well as the orientation of individuals viewing the image to the surface on which the image is displayed.
[005] Additionally, as pico projectors are increasingly being embodied in multifunction devices, activities associated with functions other than the projection of images may interrupt, distort and/or otherwise affect the image being projected by the pico projector and/or the experience of the individuals viewing the projected images. An example of these drawbacks is present in a multi-function portable electronic device that, in addition to being a pico projector, is also a portable communication device (e.g. smartphone). Various call and message functionality associated with the portable communication device may interfere with the functionality of the pico projector embodied in the multifunction portable electronic device.
[006] Furthermore, due to the portable nature of a pico projector, the projected image from a pico projector is subject to movement and vibration from many sources, e.g. a shaky hand, heartbeat, hand twitching from nervousness, fan vibrations, people kicking the table, or even speaker feedback. These movements and vibrations interfere with the ability to provide a stable displayed image with the pico projector.
[007] It would therefore be desirable to correct any of the above identified drawbacks associated with pico projectors. A system and method according to the present invention addresses these deficiencies.
SUMMARY
[008] In one embodiment, an apparatus for stabilizing an image displayed by a projection unit is provided. A motion sensor senses motion of the projection unit and an instability detector detects a level of instability of the displayed image using the motion sensed by the motion sensor. A size calculator determines a size for the displayed image within the image area based on the detected level of instability and an image stabilizer maps the displayed image within the display area to reduce the effects caused by movement of the projection unit.
[009] In another embodiment, a method of stabilizing an image displayed by a projection unit is provided. The method includes sensing motion of the projection unit and detecting a level of instability of the displayed image. A size for the displayed image within the image area is determined based on the detected level of instability. A position of the displayed image within an image area is calculated and the displayed image within the display area is mapped to reduce the effects caused by movement of the projection unit.
[0010] In another embodiment, an apparatus for stabilizing an image displayed by a projection unit is provided. The apparatus includes a means, such as a motion sensor, for sensing motion of the projection unit and means, such as an instability detector, for detecting a level of instability of the displayed image and calculating a size of the displayed image within an image area. The apparatus also includes a means, such as a size calculator, for determining a position for the displayed image within the image area based on the detected level of instability and size of the displayed image and means, such as an image stabilizer, for mapping the displayed image within the display area to reduce the effects caused by movement of the projection unit.
[0011] This invention attempts to compensate for the majority of projector vibrations by adjusting the picture to stabilize the image on the display surface even though the projector is not steady.
[0012] The above presents a simplified summary of the subject matter in order to provide a basic understanding of some aspects of subject matter embodiments. This summary is not an extensive overview of the subject matter. It is not intended to identify key/critical elements of the embodiments or to delineate the scope of the subject matter. Its sole purpose is to present some concepts of the subject matter in a simplified form as a prelude to the more detailed description that is presented later.
[0013] To the accomplishment of the foregoing and related ends, certain illustrative aspects of embodiments are described herein in connection with the following description and the annexed drawings. These aspects are indicative, however, of but a few of the various ways in which the principles of the subject matter can be employed, and the subject matter is intended to include all such aspects and their equivalents. Other advantages and novel features of the subject matter can become apparent from the following detailed description when considered in conjunction with the drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] FIG. 1 is a block diagram of the portable projection device according to aspects of the present invention;
[0015] FIGS. 2A - 2D are exemplary light engines for use in the portable projection device according to aspects of the present invention;
[0016] FIG. 3 is a block diagram of the stabilization device according to aspects of the present invention;
[0017] FIG. 4 is a block diagram of the instability detector within the stabilization device according to aspects of the present invention;
[0018] FIG. 5 is a block diagram of the image stabilizer within the stabilization device according to aspects of the present invention;
[0019] FIG. 6A-6D are diagrams of the active display area of the imager with respect to instability according to aspects of the present invention; and
[0020] FIG. 7 is a flow diagram of the method for stabilizing a displayable image according to aspects of the present invention.
DETAILED DESCRIPTION
[0021] It should be understood that the elements shown in the FIGS. may be implemented in various forms of hardware, software or combinations thereof.
Preferably, these elements are implemented in a combination of hardware and software on one or more appropriately programmed general-purpose devices, which may include a processor, memory and input/output interfaces.
[0022] The present description illustrates the principles of the present disclosure. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the disclosure and are included within its spirit and scope.
[0023] All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the principles of the disclosure and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions.
[0024] Moreover, all statements herein reciting principles, aspects, and embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
[0025] Thus, for example, it will be appreciated by those skilled in the art that the block diagrams presented herein represent conceptual views of illustrative circuitry embodying the principles of the disclosure. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudocode, and the like represent various processes which may be substantially represented in computer readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
[0026] The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term "processor" or "controller" should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor ("DSP") hardware, read only memory ("ROM") for storing software, random access memory ("RAM"), and nonvolatile storage.
[0027] If used herein, the term "component" is intended to refer to hardware, or a combination of hardware and software in execution. For example, a component can be, but is not limited to being, a process running on a processor, a processor, an object, an executable, and/or a microchip and the like. By way of illustration, both an application running on a processor and the processor can be a component. One or more components can reside within a process and a component can be localized on one system and/or distributed between two or more systems. Functions of the various components shown in the figures can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.
[0028] Other hardware, conventional and/or custom, may also be included. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
[0029] In the claims hereof, any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode or the like, combined with appropriate circuitry for executing that software to perform the function. The disclosure as defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. It is thus regarded that any means that can provide those functionalities are equivalent to those shown herein. The subject matter is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the subject matter. It can be evident, however, that subject matter embodiments can be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the embodiments.
[0030] The present invention is directed towards a multifunction portable electronic device (hereinafter, the "device") that includes audiovisual image projection capabilities (e.g. a pico projector) and method of operating the same. An exemplary block diagram of the device 10 is provided in Figure 1. The device 10 includes a controller 12. The controller 12 is a component that executes various operational algorithms that control the various functions of the device 10. In one embodiment, the controller 12 executes algorithms that enable audio and video processing of a source input signal. The controller 12 may also include a memory in which various machine executable instructions controlling various device functionality may be stored and accessed as needed in response to various control signals generated by one of (a) a user; and (b) other components of the device 10 as will be discussed below. The memory of the controller 12 may also store data associated with any input signal received by the controller 12. The memory of controller 12 may also store user-specific information that is associated with a user of the device 10. In one embodiment, user specific information may include user preferences for configuring the device for a particular type of operation. The user specific information may include global preference information that configures aspects of device operation that are common between the various functions as well as function specific preference information that configures the device to operate in a particular manner when executing a particular function. While the controller 12 is described as including a memory, one skilled in the art should understand that the memory (or other storage medium) within the device may be a separately embodied component that is read/write accessible by the controller 12 as needed.
[0031] The device 10 also includes a power converter 14 and battery 16 connected to the power converter 14.
The power converter 14 is selectively connectable to an input power source (either AC or DC) for receiving power therefrom. Power received by the power converter 14 is provided to the battery 16 and selectively charges the battery 16 as needed. It should be understood that the operation of charging is meant to include an initial charging of the battery 16 as well as recharging the battery 16 as the power level is being depleted. Power is also simultaneously provided by the power converter 14 to the controller 12 for powering operation thereof. The controller 12 may selectively detect when input power is being provided to the power converter 14 causing the device 10 to operate in a first power mode when a connection to an input power source is detected and a second mode when no connection to an input power source is detected. In one embodiment of the first power mode, the controller 12 may execute a battery monitoring algorithm that enables the controller 12 to selectively detect a power level in the battery 16 and control the power converter 14 to direct power thereto. The controller 12 can also control charging of the battery 16 when the detected power level in the battery 16 is below a predetermined threshold. In another embodiment of the first power mode, the controller 12 may automatically direct power from the power converter 14 to be provided to the battery 16 in response to connection of the power converter with the input power source. In the second mode of operation, the controller 12 is powered by the battery 16 until such time that the battery power is depleted below a predetermined operational threshold representing a minimum amount of power needed to operate the device.
[0032] The controller 12 may receive an input audiovisual signal from one of a plurality of device inputs collectively referred to using reference numeral 15. The controller 12 can control selective projection of the audiovisual input signal using projection unit/microdisplay 30. The input audiovisual signal may include one of (a) a still image; (b) a series of images; (c) a video signal; and (d) audio signal. The input audiovisual signal may also include an audio component that is intended to be audibly reproduced by speaker 29 in conjunction with the projection, by the projection unit 30, of the one still image or series of images as will be discussed below.
[0033] The plurality of inputs may include any combination of but is not limited to (a) a card reader 18; (b) a USB port 20; (c) a digital video input port (HDMI) 22; (d) a VGA/Component video input port 24; and (e) a composite/S-Video input port 26. The depiction of the plurality of input ports 15 is for purposes of example only and the device 10 may include any combination of the described input ports or other known input ports.
[0034] The card reader selectively receives a storage card that may include data representative of the input audiovisual signal that is accessed by the controller 12 and provided to the projection unit 30 and/or speaker 29 for output thereof. In one embodiment, the card reader 18 may be a MicroSD card reader. This is described for purposes of example only and any card reading device able to read any standardized storage card may be included in device 10. The USB port 20 enables the device 10 to be selectively connected to one of (a) a portable storage device (e.g. flash drive); or (b) a secondary device, that stores data representative of the audiovisual input signal. Any of the digital video input 22, VGA/component input 24 and/or composite video input 26 may enable connection with a secondary device that includes the source audiovisual input signal and are coupled to the controller 12 via an input selector 28. The input selector 28 selectively couples a respective one of the digital video input 22, VGA/component input 24 and/or composite video input 26 with the controller 12 such that the controller 12 may provide the audiovisual input signal to the projection unit 30 and speaker 29 for output thereof.
[0035] The device 10 further includes a plurality of user controls, collectively referred to using reference numeral 31, enabling the user to selectively control various device functions. An input/output (IO) interface 32 may include at least one user selectable button associated with at least one device function such that selection thereof initiates a control signal received by the controller 12 that is used to control the particular device function. In one embodiment, the IO interface 32 may be a touch screen and the at least one button may be a user selectable image element displayed on the touch screen enabling selection thereof by a user. In this embodiment, the number and types of user selectable image elements may be generated by the controller 12 depending on the particular operational mode of the device. For example, during projection mode, the user selectable image elements may enable activation of image projection functionality and, if the device 10 is operating in a communication mode, the user selectable image elements displayed on the I/O interface 32 may relate to
communication functionality. In another embodiment, the IO interface 32 may include at least one dedicated button on a housing of the device 10 that may be manually activated by a user.
[0036] Another user control 31 included with the device 10 is a keyboard 34. The keyboard 34 enables a user to enter alphanumeric text-based input commands for controlling the operation of the device. In one embodiment, the keyboard is positioned on the housing of the device. In another embodiment, there is no dedicated keyboard and the keyboard may be generated by the controller 12 and provided for display by the IO interface 32.
[0037] A further user control 31 that may be provided is a remote infrared (IR) sensor 36. Remote IR sensor 36 selectively receives an IR input signal that is generated by a remote control. The IR input signal received by the remote IR sensor 36 is communicated to the controller 12 which interprets the received IR input signal and initiates operation of a particular function of the device corresponding to user input.
[0038] Any of the user controls 32, 34 and/or 36 may be used to generate control signals for selecting an input audiovisual signal from a respective input source of the plurality of input sources 15. The control signals input via the user are received by the controller 12 which processes the user input signal and selects the source of the input audiovisual signal. Input received from any of the user controls 31 may also condition the controller 12 to selectively output the audiovisual signal using projection unit 30 and speaker 29.
[0039] Operation of the projection unit 30 will now be discussed. The projection unit 30 includes a panel driver 38, a light engine 39 and a projection lens 48. The panel driver 38 receives the audiovisual input signal from the controller 12 and controls the light engine to emit light representative of the audiovisual input signal that may be projected via a projection lens 48 coupled thereto. The light engine 39 may include a light source and light processing circuitry that is selectively controlled by the panel driver 38 to generate light and project an image representing the audiovisual signal onto a surface. Exemplary types of light engines 39 will be discussed in greater detail with respect to Figures 2A - 2D. However, persons skilled in the art will understand that any light engine used in any type of projection device (portable or otherwise) may be incorporated in the projection unit 30 of the device 10. In operation, the light generated by the light engine 39 is provided to the projection lens 48 which projects the full color image onto a display surface (e.g. screen, wall, etc). The projection lens 48 may be focused in response to user input received by the controller 12 as needed. Additionally, the operation and position of the various components of the projection unit 30 may be controlled via a control signal that is generated by either the user or another component of device 10.
[0040] The projection unit 30 of the device may also include an infrared light emitting diode (IR LED) 50 that is coupled to the panel driver 38. In certain exemplary operations, the controller 12 may generate an IR audiovisual input signal based on the audiovisual input signal received from one of the plurality of inputs 15 or user controls 31. The IR audiovisual signal may be provided to the panel driver 38 which conditions the IR LED 50 to project an IR version of the audiovisual input signal. The IR signal is imperceptible to the human eye but may be used by other components as an input control signal in the manner discussed below.
[0041] The device 10 may also include a camera module 52. The camera module 52 may include a lens 54 coupled to an image sensor 56. Image data received via the lens 54 and sensed by image sensor 56 may be processed by image processor 58. The camera module 52 may operate as a conventional digital camera able to capture still images and video images. The camera module 52 may also operate as a sensor that senses at least one type of image being displayed and uses the sensed image as a control signal for controlling at least one function of the device 10 as will be discussed below. The lens 54 of the camera module 52, shown in conjunction with the projection lens 48 of the projection unit, is described for purposes of example only; the device may instead include a single lens that is shared between the projection unit 30 and the camera module 52.
[0042] A motion sensor 60 is also provided. The motion sensor 60 is coupled to the controller 12 and selectively senses data representing movement of the device 10. The motion sensor 60 may sense the position of the device and generate an input control signal used by the controller 12 for controlling device operation. The motion sensor 60 may include any type of motion sensor including but not limited to a gyroscope and/or an accelerometer. For example, in an embodiment where the motion sensor 60 includes an accelerometer, the device 10 may include at least three accelerometers positioned on the X, Y and Z axes such that the accelerometers may sense the position of the device 10 with respect to gravity. The motion sensor 60 may refer to a plurality of different sensors that are able to sense various types of data which may be provided to the controller 12 for analysis and processing thereof.
[0043] The device 10 also includes a communications processor 62 that enables bidirectional communication between the device 10 and a remote device. The communication processor 62 is described generally and is intended to include all electronic circuitry and algorithms that enable bidirectional communication between devices. In one embodiment, the communication processor 62 enables the device to operate as a cellular phone. In another embodiment, the communication processor 62 includes all components and instructions for connecting the device 10 to the internet. In a further embodiment, the communication processor 62 includes all components associated with a smartphone to enable a plurality of different types of bidirectional communication (e.g. telephone, email, messaging, internet, etc) between the device and a communications network.
[0044] Figures 2A - 2D are block diagrams representing different types of light engines 39 that may be employed within the projection unit 30 described in Figure 1. It should be understood that the portable projection device 10 as discussed herein may utilize any of the different light engines 39a - 39d described in Figures 2A - 2D. It should also be appreciated that the light engines are not limited to those described herein and any type of light engine able to generate and process light into a full color image for display on a surface may be used by the device 10.
[0045] Figure 2A represents a three-color LED light engine 39a. The light engine 39a is controlled via the panel driver 38 (Fig. 1). The panel driver 38 receives the audiovisual input signal from the controller 12 and controls the operation of light emitting diodes (LED) 40a, 40b, and 40c. The LEDs 40a - c represent three color LEDs including a blue LED 40a, a green LED 40b and a red LED 40c. The audiovisual input signal provided to the panel driver 38 has been separated into its component colors by the controller 12 and the panel driver 38 selectively controls the LEDs 40a-c to emit the necessary light to generate the desired audiovisual image for output. Light generated by the LEDs 40a-c is focused into a full color image by a focusing element 42. In one embodiment, the focusing element 42 may be an x-cube. In another embodiment, the focusing element 42 may be a dichroic mirror. These focusing elements are described for purposes of example only and any focusing element 42 able to combine light from a plurality of LEDs into a single full color image may be used in the projection unit 30.
[0046] The focused image is projected on a liquid crystal on silicon (LCOS) chip 44 for receiving light emitted from each of the LEDs 40a - c and optically combines the received light via a polarizing beam splitter 46. The combined light is provided to the projection lens 48 which projects the combined full color image onto a display surface (e.g. screen, wall, etc). The projection lens 48 may be focused in response to user input received by the controller 12 as needed. Additionally, the operation and position of the various components of the projection unit 30 may be controlled via a control signal that is generated by either the user or another component of device 10.
[0047] Figure 2B depicts a white-light LED light engine 39b that may be used in the projection unit of the device 10. Light engine 39b may include a white light LED 41. The panel driver 38 (in Fig. 1) receives the audiovisual input signal from the controller 12 and controls the operation of the white light LED 41. The LED 41 is controlled to emit a pattern of light to generate the desired audiovisual image for output. Light generated by the LED 41 is provided to a LCOS chip 44b. The LCOS chip 44b has a predetermined pattern of primary color dots thereon. The panel driver 38 controls the LCOS chip 44b to have certain of the dots illuminated by the light emitted by LED 41 to provide colored light to the polarizing beam splitter 46b which optically combines the colored light reflected off of the LCOS chip 44b. The combined light is provided to the projection lens 48 which projects the combined full color image onto a display surface (e.g. screen, wall, etc).
[0048] Figure 2C depicts a digital light processing (DLP) engine 39c. The DLP engine 39c includes three colored light sources 40a, 40b, and 40c. In one
embodiment, the light sources 40a - c represent three color LEDs including a blue LED 40a, a green LED 40b and a red LED 40c. While these are described as LED light sources, this is done for purposes of example only and the light sources may be any type of light source including, but not limited to, lasers, as are known to be implemented in a DLP light engine. In operation, the light sources 40a-c are not on simultaneously. Rather, the panel driver 38 controls the individual light sources in sequence and the emitted light is provided to the focusing element for producing the full color image. In another embodiment of a DLP engine, a color wheel may be positioned between a light source and the focusing element 42. The panel driver 38 selectively controls the color wheel to rotate to one of the three primary colors based on the data in the audiovisual input signal to illuminate a respective light color at a given time.
[0049] The audiovisual input signal provided to the panel driver 38 has been separated into its component colors by the controller 12 and the panel driver 38 selectively controls the LEDs 40a-c to emit the necessary light to generate the desired audiovisual image for output. Light generated by the LEDs 40a-c is projected and focused into a full color image by a focusing element 42. The focusing element 42 may include a mirror unit 45 formed from at least one mirror which reflects the emitted light through prisms 47. The focused image is provided to the projection lens 48 which projects the combined full color image onto a display surface (e.g. screen, wall, etc).
[0050] Figure 2D depicts a laser-based light engine 39d. The laser light engine 39d includes light sources 43a - c that each emit a respective color light based on an audiovisual input signal. The light sources 43a - c are lasers that emit light in three distinct wavelengths. For example, light source 43a may be a laser that emits light at a wavelength associated with the color red whereas light source 43b may emit light at a wavelength associated with the color green and light source 43c may emit light at a wavelength associated with the color blue. The panel driver 38 controls the light sources 43a-c to emit respective colored light based on the audiovisual input signal received from the controller 12. The emitted light (either concurrently or sequentially, depending on the panel driver being used) is provided to a focusing element 42. The focusing element 42 includes a set of combiner optics 49 that receives and combines the emitted laser light and provides the light to the mirror unit 45 including a plurality of individual mirrors. The mirror unit 45 is controlled by the panel driver 38 to rotate the plurality of mirrors based on the audiovisual input signal and reflects light to the projection lens 48 for projection onto a display surface (e.g. screen, wall, etc).
[0051] Due to the portable nature of a pico projector, the projected image from a pico projector is subject to movement and vibration from many sources such as but not limited to a shaky hand, a heartbeat, hand twitching from nervousness, fan vibrations, people kicking the table, or even speaker feedback. The present invention attempts to compensate for the majority of projector vibrations by adjusting the picture to stabilize the image on the screen even though the projector is not steady. The stability of the device, and thus of the image, is determined and the size of the image is adjusted according to the level of instability detected. The size of the displayed image with respect to the display area is inversely correlated with the amount of instability detected. By adjusting the size and the position of the displayed image within a display area, it is possible to create an image that appears more stable during periods of movement and vibration.

[0052] A block diagram of the stabilization device 100 used to compensate for movement and vibrations affecting the display of an image is shown in Figure 3. The stabilization device 100 includes the motion sensor 60 (as shown in Figure 1), an instability detector 110, a sub-image size calculator 120 and an image stabilizer 130. The instability detector 110, sub-image size calculator 120 and image stabilizer 130 may be integrally formed within the controller 12. Alternatively, these elements may be included within the device 10 separate from the controller 12.
[0053] The motion sensor 60 measures displacement or motion of the device 10. The motion may be caused by any number of external forces such as movement of the surface on which the device 10 is positioned or movement by a user holding the device. The motion sensor may generate an angular velocity vector with pitch and yaw components to represent the motion. The angular velocity is provided to the instability detector 110.
[0054] The instability detector 110 detects a level of instability for the image being displayed. The manner of detecting the instability level will be described further with respect to Figure 4. The determined level of instability is provided to the sub-image size calculator 120 which calculates the size at which the image is to be displayed within the display area for the image based on the detected level of instability. The larger the level of instability, the smaller the size of the displayed image. A look-up table may be used to provide the desired size versus instability characteristic. The determined size is provided by the sub-image size calculator 120 to the image stabilizer 130 along with an angular velocity signal from the motion sensor 60 to map the image within the display area. Within the image stabilizer 130, pitch and yaw velocity components may be used to calculate x and y position offsets, subject to the amount of spatial headroom allotted for instability compensation by the size control signal.
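The size-versus-instability look-up table described above can be sketched as follows. The table entries, the 50% floor, and the linear interpolation between entries are illustrative assumptions for this sketch, not values taken from the disclosure:

```python
def sub_image_size(instability,
                   table=((0.0, 1.00), (0.5, 0.85), (1.0, 0.70), (2.0, 0.50))):
    """Map a detected instability level to a displayed-image scale factor.

    `table` is a hypothetical look-up table of (instability, scale) pairs:
    higher instability yields a smaller sub-image, bottoming out at 50%
    of the stable size. Values between entries are linearly interpolated.
    """
    # Clamp below the first and above the last table entry.
    if instability <= table[0][0]:
        return table[0][1]
    if instability >= table[-1][0]:
        return table[-1][1]
    # Linear interpolation between the two bracketing entries.
    for (x0, y0), (x1, y1) in zip(table, table[1:]):
        if x0 <= instability <= x1:
            t = (instability - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
```

A table of this shape directly realizes "the larger the level of instability, the smaller the size of the displayed image" while leaving the exact characteristic a tuning choice.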
[0055] Figure 4 shows a block diagram of the instability detector 110. The instability detector 110 includes a leaky integrator 112, a peak-to-peak detector 114 and a low pass filter 116. Since position is the integral of velocity, the leaky integrator 112 receives the angular velocity signal from the motion sensor 60 and calculates an angular deviation of the image. The "leakiness" of the integrator gradually returns the integrator output (and instability detector output) to zero when the projector is stable. The peak-to-peak detector 114 calculates a peak-to-peak position instability of the image to determine the average energy of the instability. An RMS or other type of detector could be used in place of the peak-to-peak detector 114. The low pass filter 116 selectively compensates for the determined peak-to-peak instability by gradually modifying the size of the image being displayed to prevent an abrupt re-sizing of the image when correcting for image instability. The peak-to-peak position instability is used to decide the size of the sub-image which is calculated by the sub-image size calculator 120. The output of the instability detector 110 is provided to the sub-image size calculator 120 for calculating the adaptive sub-image size of the display image.
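The signal chain of the instability detector 110 (leaky integrator 112, peak-to-peak detector 114, low pass filter 116) can be sketched as below. The decay, window, and smoothing coefficients are illustrative assumptions; the disclosure does not give numeric values:

```python
class InstabilityDetector:
    """Sketch of the instability detector: leaky integration of angular
    velocity into position, peak-to-peak measurement over a sliding
    window, and a one-pole low-pass filter so the resulting instability
    level (and hence image size) changes gradually."""

    def __init__(self, leak=0.98, window=30, smooth=0.05):
        self.leak = leak        # leaky-integrator decay toward zero
        self.window = window    # samples over which peak-to-peak is taken
        self.smooth = smooth    # low-pass coefficient for the output
        self.position = 0.0     # integrated angular position
        self.history = []       # recent positions for peak-to-peak
        self.level = 0.0        # filtered instability level

    def update(self, angular_velocity, dt=1.0 / 60.0):
        # Position is the (leaky) integral of velocity; the leak pulls
        # the output back to zero when the projector is steady.
        self.position = self.position * self.leak + angular_velocity * dt
        self.history.append(self.position)
        if len(self.history) > self.window:
            self.history.pop(0)
        # Peak-to-peak position instability over the window.
        p2p = max(self.history) - min(self.history)
        # Low-pass filter to prevent abrupt re-sizing of the image.
        self.level += self.smooth * (p2p - self.level)
        return self.level
```

As noted in the text, an RMS detector could replace the peak-to-peak measurement without changing the surrounding structure.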
[0056] A block diagram showing the elements forming the image stabilizer 130 is shown in Figure 5. The image stabilizer 130 is connected to the sub-image size calculator 120 and motion sensor 60. The image stabilizer 130 includes a sub-image position calculator 132, a video resizer 134 and a display mapper 136. The sub-image position calculator 132 may include a leaky integrator. The leaky integrator may be used to assist with re-positioning of the image back to a center point. The sub-image position calculator 132 receives the angular velocity signal from the motion sensor 60 and a size signal from the sub-image size calculator 120. The angular velocity signal and size signal may be used by the sub-image position calculator 132 to move the image to the appropriate location of the display area. The video resizer 134 receives the size signal from the sub-image size calculator 120 and the video signal from the input selector 28 (shown in Figure 1). The sub-image position calculator 132 and video resizer 134 calculate the sub-image position and scale the video. The display mapper 136 is connected to the sub-image position calculator 132 and video resizer 134. The display mapper 136 maps the image to be displayed within the display area and stabilizes the image using the adaptive sub-image size.
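One way to sketch the sub-image position calculator 132, with its leaky integrator and the headroom limit imposed by the size signal, is shown below. The leak and gain values are illustrative assumptions, and the sign convention (offset opposite the detected motion) follows the behavior described for Figure 6D:

```python
class SubImagePosition:
    """Sketch of the sub-image position calculator: leaky integration of
    pitch/yaw angular velocity gives x/y offsets, applied opposite the
    detected movement and clamped to the spatial headroom left by the
    reduced sub-image size. The leak re-centers the image afterwards."""

    def __init__(self, leak=0.95, gain=1.0):
        self.leak = leak
        self.gain = gain
        self.x = 0.0
        self.y = 0.0

    def update(self, yaw_vel, pitch_vel, size, dt=1.0 / 60.0):
        # Headroom on each side of the sub-image, as a fraction of the
        # total display area (size is the sub-image scale, e.g. 0.7).
        headroom = (1.0 - size) / 2.0
        # Leaky integration; the minus sign moves the image opposite
        # the detected motion, and the leak returns it toward center.
        self.x = self.x * self.leak - self.gain * yaw_vel * dt
        self.y = self.y * self.leak - self.gain * pitch_vel * dt
        # Clamp to the available headroom.
        self.x = max(-headroom, min(headroom, self.x))
        self.y = max(-headroom, min(headroom, self.y))
        return self.x, self.y
```

Note that when the size signal is 1.0 (a fully stable image filling the display), the headroom collapses to zero and no positional compensation is possible, which is why the size and position calculations are coupled.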
[0057] Figures 6A-6D show the active display area and positioning of the active display area within the total display area based on varying levels of instability. Figure 6A shows the total display area of the imager 200. When the instability detector 110 detects a low level of instability, the active display area of the image 210 substantially fills the entire display area 200. As the detected instability is low, there is little movement and vibration of the device and thus also of the displayed image. It is thus possible to fill the total display area with the displayed image without noticeable vibration or jittering of the image. Figure 6B illustrates the active display area of the image 220 within the total display area 200 when a medium level of instability is detected. When a medium level of instability is detected, the size of the active display area of the image is decreased from that used when a low level of instability is detected, forming a border or frame between a periphery of the active display area of the image 220 and the total display area 200. As the level of instability increases, the size of the active display area decreases accordingly. By reducing the size of the active display area according to the level of instability detected, the amount of spatial headroom allotted for instability compensation is increased. When a high level of instability is detected, the size of the active display area of the image 230 is decreased further with respect to the total display area of the imager 200 as shown in Figure 6C. When a high level of instability is detected, the image displayed can be reduced by up to substantially fifty percent. Figure 6D further illustrates the size and position of the active display area with respect to the total display area when both a high level of instability and a horizontal bump are detected.
The size of the active display area is minimized due to the high level of instability detected and the position of the active display area is also moved within the total display area to account for the horizontal bump which was detected. The position of the active display area will move within the total display area in a direction opposite the detected movement. In the instance shown, the horizontal bump moved the device and total display area to the right. Thus, in order to stabilize the displayed image, the active display area moved to the left of the total display area. If the detected movement of the device was in an upwards vertical direction, the active display area would move toward the bottom of the total display area. Once the active display area is moved closer to the border of the total display area to adjust for the detected bump, the return-to-zero property of the leaky integrators within sub-image position calculator 132 will act to center the active display area within the total display area.
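The placement of the reduced active display area within the total display area, including the clamping that keeps it inside the imager, might be sketched as follows. The (left, top, width, height) rectangle convention and the fractional offsets are assumptions for illustration:

```python
def map_sub_image(display_w, display_h, scale, dx, dy):
    """Sketch of the display mapper: place a sub-image of the given
    scale within the total display area, offset by (dx, dy) expressed
    as fractions of the display size (positive dx moves the image to
    the right). Returns the (left, top, width, height) pixel rectangle."""
    w = int(round(display_w * scale))
    h = int(round(display_h * scale))
    # Centered position plus the stabilization offset, clamped so the
    # active display area stays inside the total display area.
    left = int(round((display_w - w) / 2 + dx * display_w))
    top = int(round((display_h - h) / 2 + dy * display_h))
    left = max(0, min(display_w - w, left))
    top = max(0, min(display_h - h, top))
    return left, top, w, h
```

A rightward bump of the device would be countered with a negative dx, moving the active display area toward the left edge of the total display area, exactly as described for Figure 6D.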
[0058] A flow chart describing the method 700 of the present invention is shown in Figure 7. As shown in the Figure, the motion sensor 60 measures the motion of the device in step 710. Based on the measured motion, an angular position of the active image display area within the total display area is calculated in step 720. In step 730, a peak-to-peak position instability indicating a range of vibration of the active display area is calculated. The position instability signal is then low pass filtered to slow the change in size of the active display area based on the detected range of vibration and level of instability in step 740. An adaptive sub-image size for the active display area is then calculated in step 750, and the sub-image position is calculated and the video is scaled in the image stabilizer in step 760. The active display area and displayed image are stabilized and mapped within the total display area using the adaptive sub-image size and position signals in step 770.
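The steps of method 700 can be tied together in one self-contained sketch. The coefficients, window length, and the simple linear size rule are illustrative assumptions standing in for the look-up table and filters described earlier:

```python
def method_700(velocities, leak=0.98, smooth=0.05, dt=1.0 / 60.0):
    """Self-contained sketch of method 700 for a stream of (yaw, pitch)
    angular velocities: integrate motion (steps 710-720), measure
    peak-to-peak instability (730), low-pass filter it (740), derive a
    sub-image size (750), compute a compensating position (760) and
    emit a (size, x, y) triple per frame for display mapping (770)."""
    pos_x = pos_y = level = 0.0
    history = []
    out = []
    for yaw, pitch in velocities:
        # 710-720: leaky integration of angular velocity into position.
        pos_x = pos_x * leak + yaw * dt
        pos_y = pos_y * leak + pitch * dt
        history.append((pos_x, pos_y))
        history = history[-30:]
        # 730: peak-to-peak position instability over a sliding window.
        p2p = max(max(x for x, _ in history) - min(x for x, _ in history),
                  max(y for _, y in history) - min(y for _, y in history))
        # 740: low-pass filter so the size changes gradually.
        level += smooth * (p2p - level)
        # 750: smaller sub-image for larger instability, floored at 50%.
        size = max(0.5, 1.0 - level)
        # 760: offset opposite the integrated motion, within the headroom.
        headroom = (1.0 - size) / 2.0
        x = max(-headroom, min(headroom, -pos_x))
        y = max(-headroom, min(headroom, -pos_y))
        # 770: (size, x, y) maps the image into the total display area.
        out.append((size, x, y))
    return out
```

Run on a steady stream, the sketch leaves the image full-size and centered; run on an oscillating stream, it shrinks the image and counter-steers it within the freed headroom.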
[0059] The implementations described herein may be implemented in, for example, a method or process, an apparatus, or a combination of hardware and software. Even if only discussed in the context of a single form of implementation (for example, discussed only as a method), the implementation of features discussed may also be implemented in other forms (for example, a hardware apparatus, a hardware and software apparatus, or a computer-readable medium). An apparatus may be implemented in, for example, appropriate hardware, software, and firmware. The methods may be implemented in, for example, an apparatus such as, for example, a processor, which refers to any processing device, including, for example, a computer, a microprocessor, an integrated circuit, or a programmable logic device. Processing devices also include communication devices, such as, for example, computers, cell phones, tablets, portable/personal digital assistants ("PDAs"), and other devices that facilitate communication of information between end-users.
[0060] Additionally, the methods may be implemented by instructions being performed by a processor, and such instructions may be stored on a processor- or computer-readable medium such as, for example, an integrated circuit, a software carrier or other storage device such as, for example, a hard disk, a compact diskette, a random access memory ("RAM"), a read-only memory ("ROM") or any other magnetic, optical, or solid state media. The instructions may form an application program tangibly embodied on a computer-readable medium such as any of the media listed above. As should be clear, a processor may include, as part of the processor unit, a computer-readable medium having, for example, instructions for carrying out a process. The instructions, corresponding to the method of the present invention, when executed, can transform a general purpose computer into a specific machine that performs the methods of the present invention.
[0061] What has been described above includes examples of the embodiments. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the embodiments, but one of ordinary skill in the art can recognize that many further combinations and permutations of the embodiments are possible. Accordingly, the subject matter is intended to embrace all such alterations, modifications and variations that fall within the spirit and scope of the appended claims. Furthermore, to the extent that the term "includes" is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term "comprising" as "comprising" is interpreted when employed as a transitional word in a claim.

Claims

1. An apparatus (100) for stabilizing an image displayed by a projection unit (30), comprising:
a motion sensor (60) that senses motion of the projection unit;
an instability detector (110) that detects a level of instability of the displayed image using the motion sensed by the motion sensor;
a size calculator (120) that determines a size for the displayed image within the image area based on the detected level of instability; and
an image stabilizer (130) that maps the displayed image within the display area to reduce the effects caused by movement of the projection unit (30).
2. The apparatus as claimed in claim 1, wherein said image stabilizer (130) sets the size of the displayed image based on the level of instability detected, such that the image size decreases as the instability increases over a substantial portion of the instability characteristic.
3. The apparatus as claimed in claim 1, wherein said projection unit (30) is within a pico projector.
4. The apparatus as claimed in claim 1, wherein said instability detector (110) calculates an angular position of the displayed image and a peak-to-peak position instability level.
5. The apparatus as claimed in claim 1, wherein said image stabilizer (130) includes a display mapper (136) that moves the displayed image within the display area in a direction opposite to a detected movement of the projection unit.
6. The apparatus as claimed in claim 5, wherein said image stabilizer (130) resets the position of the display area to locate the displayed image in a central area thereof after moving the displayed image.
7. The apparatus as claimed in claim 2, wherein said image stabilizer (130) includes a video resizer (134) that resets a size of the displayed image to up to 50% of a stable size based on the measured level of instability and the size calculated by said size calculator.
8. The apparatus as claimed in claim 1, wherein said instability detector (110) includes a leaky integrator (112) that restores a default location of the displayed image within the display area.
9. A method of stabilizing an image displayed by a projection unit (30), comprising:
sensing motion of the displayed image;
detecting a level of instability of the projection unit (30);
determining a size for the displayed image within the image area based on the detected level of instability;
calculating a position of the displayed image within an image area; and
mapping the displayed image within the display area to reduce the effects caused by movement of the projection unit (30).
10. The method as claimed in claim 9, wherein the activity of setting sets the size of the displayed image inversely proportional to the level of instability detected.
11. The method as claimed in claim 9, further comprising calculating an angular position of the displayed image and a peak-to-peak position instability level.
12. The method as claimed in claim 9, further comprising moving the displayed image within the display area upon detection of movement of the projection unit in a direction opposite to a detected movement of the projection unit.
13. The method as claimed in claim 12, further comprising resetting a position of the display area to locate the displayed image in a central area thereof after moving the displayed image.
14. The method as claimed in claim 10, further comprising resetting a size of the displayed image to up to 50% of a stable size based on the measured level of instability and the size calculated by said size calculator.
15. The method as claimed in claim 9, further comprising restoring a default location of the displayed image within the display area.
16. An apparatus (100) for stabilizing an image displayed by a projection unit (30), comprising:
means for sensing motion (60) of the projection unit;
means for detecting (110) a level of instability of the displayed image and calculating a size of the displayed image within an image area;
means for determining (120) a position for the displayed image within the image area based on the detected level of instability and size of the displayed image; and
means for mapping the displayed image within the display area to reduce the effects caused by movement of the projection unit.
17. The apparatus as claimed in claim 16, wherein said means for setting (130) sets the size of the displayed image inversely proportional to the level of instability detected.
18. The apparatus as claimed in claim 16, wherein said projection unit is within a pico projector.
19. The apparatus as claimed in claim 16, wherein said means for detecting (110) calculates an angular position of the displayed image and a peak-to-peak position instability level.
20. The apparatus as claimed in claim 16, wherein said means for setting (130) includes a display mapper (136) that moves the displayed image within the display area in a direction opposite to a detected movement of the projection unit.
21. The apparatus as claimed in claim 20, wherein said means for setting (130) resets the position of the display area to locate the displayed image in a central area thereof after moving the displayed image.
22. The apparatus as claimed in claim 17, wherein said means for setting (130) includes a video resizer (134) that resets a size of the displayed image to up to 50% of a stable size based on the measured level of instability and the size calculated by said size calculator.
23. The apparatus as claimed in claim 16, wherein said means for detecting (110) includes a leaky integrator (112) that restores a default location of the displayed image within the display area.
PCT/US2013/048537 2013-06-28 2013-06-28 Image stabilization in a pico projector WO2014209354A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2013/048537 WO2014209354A1 (en) 2013-06-28 2013-06-28 Image stabilization in a pico projector


Publications (1)

Publication Number Publication Date
WO2014209354A1 (en) 2014-12-31

Family

ID=48783375


Country Status (1)

Country Link
WO (1) WO2014209354A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030038928A1 (en) * 2001-08-27 2003-02-27 Alden Ray M. Remote image projector for hand held and wearable applications
JP2009081616A (en) * 2007-09-26 2009-04-16 Funai Electric Co Ltd Projection type video display device and mobile electronic device
JP2009186646A (en) * 2008-02-05 2009-08-20 Nikon Corp Projector, and camera


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105427347B (en) * 2015-12-05 2018-11-16 中国航空工业集团公司洛阳电光设备研究所 A kind of image sequence global motion estimating method and device
EP3985966A1 (en) * 2020-10-15 2022-04-20 Koninklijke Philips N.V. A system comprising a wearable device, and a method of spatially stabilizing display information upon operation of such device
WO2022078898A1 (en) * 2020-10-15 2022-04-21 Koninklijke Philips N.V. A system comprising a wearable or hand-held device, and a method of spatially stabilizing display information upon operation of such device


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13736727

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13736727

Country of ref document: EP

Kind code of ref document: A1