GB2519311A - A device capable of projecting an image frame - Google Patents


Info

Publication number
GB2519311A
GB2519311A (application GB1318282.9A; also published as GB201318282A and GB201318282D0)
Authority
GB
United Kingdom
Prior art keywords
projected
image frames
image
frame
frame rate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1318282.9A
Other versions
GB201318282D0 (en)
Inventor
Matthew Charles Porth
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to GB1318282.9A
Publication of GB201318282D0
Publication of GB2519311A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/74 Projection arrangements for image reproduction, e.g. using eidophor
    • H04N7/00 Television systems
    • H04N7/01 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0127 Conversion of standards by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter
    • H04N7/0132 Conversion of standards in which the field or frame frequency of the incoming video signal is multiplied by a positive integer, e.g. for flicker reduction
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141 Constructional details thereof
    • H04N9/3173 Constructional details thereof wherein the projection device is specially adapted for enhanced portability
    • H04N9/3179 Video signal processing therefor
    • H04N9/3182 Colour adjustment, e.g. white balance, shading or gamut
    • H04N9/3185 Geometric adjustment, e.g. keystone or convergence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/80 Geometric correction

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Projection Apparatus (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A device capable of projecting an image frame (such as a camera, mobile telephone or a projector) comprises a graphics processing unit arranged to process image frames to be projected, wherein the frame rate at which image frames are processed is greater than the frame rate at which image frames are provided from an image source. The frame rate increase may be an integer multiple of the rate at which images are received from the source. The device may include sensors to compensate/correct for motion/displacement of the projector during projection of image frames. Notably, a perspective distortion may be applied to image frames to correct for keystone distortion. Brightness adjustment may also be applied to at least a portion of the projected image frames, dependent upon the perspective distortion performed. The perspective distortion may be performed under user control via a graphical user interface (GUI) presented upon a device display screen. The images may also be displayed upon the display screen. A corresponding method of processing image frames to be projected is also described. The increase in frame rate for the processing of image frames allows other processing (i.e. correction/compensation) to be performed within each frame period.

Description

A Device Capable Of Projecting An Image Frame

The present invention relates generally to a device capable of projecting an image frame, and also to a method of processing image frames to be projected.
Traditionally, images to be projected onto a screen or other structure were projected by passing light through a sheet of material on which the image was provided. This often involved the use of cumbersome projection units which passed light through a surface which was equal to or greater than the size of the actual image to be projected. A typical example of such projection was the overhead projector. In more recent years, electronic projection has become far more prevalent, with the image to be projected being in some way electronically controlled. For instance, light may be passed through a relatively small liquid crystal display, the liquid crystal display being in some way manipulated to form an image to be projected.
This has allowed the projectors themselves to become smaller and lighter, and therefore far more portable. This is to the extent that projectors are now found in readily portable formats, and might even take the form of hand-held devices. For instance, dedicated hand-held projectors are now available, or mobile telephones or digital cameras may be provided with some form of image projection functionality.
Although technological developments have allowed images to be projected from smaller devices, this is not always as advantageous as might at first be expected. Take for example the traditional overhead projector, which was heavy and bulky. Although the size of the overhead projector might have been a problem in some instances, in other instances its size might be an advantage. An advantage might be the physical stability of the projector during use. The size and bulk of the projector gave the projector stability, which in turn gave the projection of an image an associated stability. This stability is lacking from more recent, smaller devices. Take for example a very small portable projector placed on a desk. Due to the small and associated lightweight nature of the portable projector, it is extremely easy to knock or in some other way perturb the projector, which has a resultant effect on the image that is projected, and likely a magnified effect. This is, of course, a problem that is even more prevalent when the image is projected from a hand-held device. Very small movements of the user of the device will cause an associated, if not magnified, movement of the image that is projected. This might be to the extent that even breathing or the pulse of the user might cause a significant shift of the projected image.
It is an aim of example embodiments of the present invention to at least partially obviate or mitigate one or more disadvantages of the prior art, whether identified herein or elsewhere, or to at least provide an alternative to existing apparatus and/or methods.
According to the present invention there is provided an apparatus and method as set forth in the appended claims. Other features of the invention will be apparent from the dependent claims, and the description which follows.
According to a first aspect of the invention, there is provided a device capable of projecting an image frame, the device comprising: a graphics processing unit arranged to process image frames to be projected; wherein a frame rate at which image frames to be projected are processed is greater than a frame rate at which image frames are provided from an image frame source.
The frame rate at which image frames to be projected are processed may be an integer multiple greater than a frame rate at which image frames are provided from the image source.
The device may comprise a controller arranged to control a property of the image frames to be projected via input to and/or control of the operation of the graphics processing unit.
The device may comprise one or more movement sensors for sensing movement of the device. The controller may be arranged to use an output from the one or more movement sensors to at least partially stabilise image frames to be projected relative to movement of the device.
The graphics processing unit may be (possibly via, or with the control of the controller) arranged to perform perspective distortion of the image frames to be projected to at least partially stabilise those image frames relative to movement of the device.
The perspective distortion and/or stabilisation may be relative to a default perspective configuration of the image frames to be projected.
The controller may be arranged to facilitate user control of a perspective distortion of the image frames to be projected. This control may be used, in one example, to allow the user to set the default perspective configuration. In another example, the default perspective configuration may be set by the device, for example at the time when image frame projection begins.
The device may be capable of projecting image frames having a first perimeter, with undistorted image frames to be projected having a second perimeter that fits within the first perimeter.
When a distortion of the image frame to be projected is or would be such that the second perimeter would extend beyond the first perimeter, image frames to be projected may be processed (and thus projected from the device) in an undistorted manner.
The controller may be arranged to adjust a brightness of at least a portion of an image frame to be projected, in proportion to a perspective distortion performed on that image frame.
The graphics processing unit might also be arranged to process image frames to be displayed on a display screen of the device.
The device might be arranged to provide a graphical user interface using a screen of the device, for receiving user input and control of the device.
The device may comprise the image source. The device may be connectable to an external image source, in a wired or wireless manner. The image source may comprise software, hardware, or a combination.
The device may be one or more of (which includes a combination of): portable, hand-held, a digital camera, a mobile telephone, a projector.
According to a second aspect of the present invention there is provided a method of processing image frames to be projected, the method comprising: processing image frames to be projected using a graphics processing unit; wherein a frame rate at which image frames to be projected are processed is greater than a frame rate at which image frames are provided from an image frame source.
It should be clear to the skilled person that one or more features described in relation to one aspect of the invention may, unless clearly mutually exclusive, be combined with and/or replace one or more features of another aspect of the present invention.
For a better understanding of the invention, and to show how embodiments of the same may be carried into effect, reference will now be made, by way of example, to the accompanying diagrammatic Figures in which:

Figure 1 schematically depicts a view of a typical image projection scenario;
Figure 2 schematically depicts the outline of an image projected in accordance with the scenario of Figure 1;
Figure 3 schematically depicts a view of another typical image projection scenario;
Figure 4 schematically depicts the outline of an image projected in accordance with the scenario of Figure 3;
Figure 5 schematically depicts a flow chart showing the methodology implemented by/control layout of a device capable of projecting an image frame according to an example embodiment;
Figure 6 schematically depicts first and second perimeters associated with image projection, in accordance with an example embodiment;
Figures 7 and 8 schematically depict distortion of the second perimeter relative to the first perimeter of the image frame of Figure 6; and
Figure 9 schematically depicts the situation where the second perimeter exceeds the first perimeter of the image frame of Figure 6.
Figure 1 schematically depicts a device capable of projecting an image frame 2. The device 2 is shown as projecting an image frame 4 onto a screen 6 or similar. The device 2 is typically a portable device, such as a hand-held device which might be a hand-held projector, or a mobile telephone or digital camera capable of projecting an image. However, the device 2 could represent a larger device that is not necessarily as portable as the devices previously described. Example embodiments of the present invention are still applicable to such less portable devices, although the invention finds particular application to more portable devices.
Figure 2 schematically depicts the image frame as projected 8 onto the screen. As is typical of image projection, the image frame has an outermost perimeter that is rectangular in shape.
However, even if the image frame to be projected has a rectangular outermost perimeter, the ultimately projected image frame may not have this rectangular form due to the orientation or change in orientation of the projection device itself.
Figure 3 shows how the device 2 has changed in orientation. The change is such that the device 2 is no longer oriented perpendicularly/normally with respect to the screen 6, but is instead oriented at an angle slightly offset from the normal/perpendicular. Such a change in orientation might be intentional, for example due to constraints imposed on the orientation of the device 2, or accidental or unintentional, for example due to the device 2 being knocked or otherwise perturbed or moved.
Figure 4 shows that the image frame 8 projected onto the screen is no longer rectangular in shape, but is distorted. In more detail, the image 8 has undergone perspective distortion due to the change in perspective of the device relative to the screen onto which the image is projected. Clearly, perspective distortion is undesirable, since it is typically desired to project an image onto a screen or other structure without such distortion.
One solution to the problem of distortion of the image is to simply re-orient the device that projected the image. If the device is large or bulky, this might involve a simple positional and/or orientational shift of the device that projected the image. However, this is far less straightforward to achieve if the device is small, for example portable and in particular hand-held. When the device is more portable, it may not be possible to simply implement a single step positional/orientational correction, but instead such correction might be required on an almost continual basis to correct for almost continual movement of the device.
According to an example embodiment of the present invention, the problems described in relation to Figures 1-4 may be at least partially obviated or mitigated. An example embodiment provides a device that is capable of projecting an image frame. The device comprises a graphics processing unit that is arranged to process image frames to be projected. Processing of the image frames using a graphics processing unit (commonly referred to as a GPU) is far more efficient and effective than, for instance, processing the images using only software. A frame rate at which image frames to be projected are processed is greater than a frame rate at which image frames are provided from an image frame source. This might be referred to as (temporal) over-sampling, or up-sampling. Such an increased processing frame rate allows the image frames to be processed without affecting the viewing experience of a user of the device. For example, movement of the device can be taken account of and compensated in an electronic manner, and far more quickly than the movement itself, therefore resulting in a smooth correction that is in practice invisible to the user.
In isolation and/or combination, the device features allow for more efficient and effective image processing of the image frames to be projected, which allows for efficient and effective image stabilisation when the device projecting the image is moved or re-oriented or similar.
Figure 5 schematically depicts a device, and/or control methodology for a device, capable of projecting an image frame in accordance with an example embodiment. An image frame source 10 is provided. The image frame source 10 may be part of the device, or separate from the device and connectable to the device in a wired or wireless manner. The image frame source may comprise hardware and/or software, and for example might be an application, or a memory. The image frame source provides image frames that are to be projected 12. Collectively, the image frame source 10 and the generation of image frames to be projected are operated or otherwise controlled at a first frame rate 14. The first frame rate might typically be 60Hz. Image frames to be projected 12 are input to and processed by a graphics processing unit 16.
A graphics processing unit, also occasionally called a visual processing unit (VPU), might be defined as a specialized electronic circuit for the creation and/or processing of images. The circuit might be further defined as being designed to rapidly manipulate and alter memory to accelerate the creation and/or processing of images in a frame buffer intended for output to a display. In an alternative and/or additional definition, a graphics processing unit might be described as being used primarily for 3-D applications. A graphics processing unit is a single-chip processor that creates lighting effects and/or transforms objects every time a 3D scene is redrawn. These are mathematically-intensive tasks, which, otherwise, would put quite a strain on a central processing unit of the device. Lifting this burden from the CPU frees up cycles that can be used for other jobs. So, a device can, and typically will, comprise a graphics processing unit in addition to a central processing unit. Despite the above advantages, it would appear that no-one has yet contemplated such a combination, or more simply, use of a graphics processing unit, in a projection device.
The graphics processing unit 16 operates at a second frame rate 18 which is higher than the first frame rate 14. Typically and advantageously, the second frame rate 18 will be an integer multiple greater than the first frame rate 14 at which image frames to be projected 12 are provided from the image source 10. The integer multiple allows for more straightforward processing and syncing and the like of the frames at the different frame rates throughout the device as a whole. The integer multiple might be 2x, 3x, 4x, 5x, or higher (e.g. 10x). Thus, the second frame rate 18 might be, for example, 120Hz, 180Hz, 240Hz, 300Hz or higher (e.g. 600Hz).
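By way of an illustrative sketch only (the function name is hypothetical and not part of the claimed apparatus), the relationship between the two frame rates reduces to a simple integer multiplication:

```python
# Illustrative sketch: the second (processing) frame rate as an integer
# multiple of the first (source) frame rate.

def second_frame_rate(first_rate_hz: int, multiple: int) -> int:
    """Return the GPU processing/projection rate for a given source
    rate and integer multiple (2x, 3x, 4x, 5x, 10x, ...)."""
    if multiple < 1:
        raise ValueError("multiple must be a positive integer")
    return first_rate_hz * multiple

# A 60Hz source at the multiples named in the text:
print([second_frame_rate(60, m) for m in (2, 3, 4, 5, 10)])
# [120, 180, 240, 300, 600]
```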
The graphics processing unit processes the images to be projected 12 to form processed image frames 20. These are then input to the projection element or elements 22 for projection. The elements 22 might comprise one or more lens elements or display elements. The projection element 22 also operates at the second, higher frame rate 18 in accordance with, for instance, a VSYNC 24.
As discussed above, such an increased processing frame rate allows the image frames to be processed without affecting the viewing experience of a user of the device. For example, and as described in more detail below, movement of the device can be taken account of and compensated in an electronic manner, and far more quickly than the movement itself, therefore resulting in a smooth correction that is in practice invisible to the user. Although similar image stabilisation has of course been employed when receiving images, for example in digital camera lenses, it would appear that no-one has yet contemplated such stabilisation when outputting or projecting images, despite the clear advantages of doing so.
The VSYNC 24 may be viewed as a clock or similar, or comprise, or be in connection with and/or driven by a clock. The VSYNC 24 is subjected to division 26 by, in this embodiment, an integer multiple to provide a driving signal in the form of a clock or similar to the image source 10 which is synchronised with the projection of image frames. Thus, the entire system may operate from and/or be synchronised by a single clock or VSYNC 24, which may be divided down to ensure that the projection is undertaken at a higher frame rate than image frame provision. In another example, the reverse might be true, where the clock or otherwise synchronisation of the image frame source is multiplied up to ensure that the images to be projected are processed and projected at the higher frame rate.
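The divided-VSYNC arrangement might be sketched as follows; the class and its tick-based behaviour are illustrative assumptions, showing only how every Nth projection VSYNC pulse could be forwarded to the image source so that the two rates stay synchronised:

```python
# Illustrative sketch of VSYNC division: the projection element runs at
# the high (second) frame rate, and every Nth pulse drives the source.

class VsyncDivider:
    def __init__(self, multiple: int):
        self.multiple = multiple  # integer ratio of the two frame rates
        self.count = 0

    def tick(self) -> bool:
        """Called once per projection VSYNC; returns True when the
        image source should deliver its next frame."""
        fire = (self.count % self.multiple == 0)
        self.count += 1
        return fire

div = VsyncDivider(3)  # e.g. 180Hz projection from a 60Hz source
print([div.tick() for _ in range(6)])
# [True, False, False, True, False, False]
```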
A controller 28 is also provided. The controller 28 is arranged to control a property of the image frames to be projected via input to and/or control of the operation of the graphics processing unit 16. The controller 28 might be connected to the graphics processing unit as shown in Figure 5, or in other embodiments could be a part of the graphics processing unit, or the graphics processing unit may be part of the controller.
The device comprises one or more movement sensors 30. The movement sensors 30 are provided to sense movement of the device, which includes translational movement and/or orientational movement. For instance, the one or more sensors may comprise one or more accelerometers used to calculate or otherwise determine motion and/or orientation of the device (which includes changes thereto) relative to a default position/orientation of the device (corresponding to a default perspective configuration of the image frames, described in more detail below). Alternatively, and/or additionally, the one or more sensors may comprise one or more gyroscopes for refining the motion and/or orientation determination or calculation provided by what might be coarser calculation or determination undertaken by the accelerometers.
Output from the one or more movement sensors is provided to the controller 28. In general terms, the controller 28 is arranged to use an output from the one or more movement sensors 30 to at least partially stabilise image frames to be projected relative to movement of the device. This means that the device itself does not necessarily need to be re-oriented or re-positioned to correct for the perspective distortion already described in relation to Figures 3 and 4. In more detail, the graphics processing unit 16 is, with the input of the controller 28, arranged to perform perspective distortion of the image frames to be projected 12 to at least partially stabilise those image frames relative to the movement of the device. This ensures that image frames to be projected 20 are ultimately distortion free, or at least subjected to less distortion than might otherwise be the case.
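As a simplified illustration of sensor-driven stabilisation (the geometry here, assuming small angles and a flat screen, is an assumption for illustration rather than the method claimed), a detected rotation of the device can be countered by an opposite lateral shift of the frame:

```python
# Illustrative sketch: cancelling a small rotation of the projector by
# shifting the frame the opposite way. Small-angle, flat-screen model.
import math

def compensating_offset(tilt_rad: float, throw_distance: float) -> float:
    """Lateral shift (same units as throw_distance) that cancels a
    small rotation of the projector about its vertical axis."""
    return -throw_distance * math.tan(tilt_rad)

# A 1-degree knock at a 2m throw distance shifts the projected image by
# roughly 3.5cm; the frame is shifted the opposite way to cancel it.
print(round(compensating_offset(math.radians(1.0), 2.0), 4))  # -0.0349
```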
The perspective distortion and/or image stabilisation is undertaken relative to a default perspective configuration of the image frames to be projected 12. This default might be pre-determined, and built into the device, or in some way established or determined by the device when image frame projection begins. However, this would of course make little sense if the device moves away from what is, or is expected to be, or is determined as, the default perspective configuration. Therefore, a far more versatile option is to facilitate user control 32 (or user override and setting) of a default perspective distortion of the image frames to be projected 12, via a user interface and the controller 28. The interface may typically be a graphical user interface. The graphical user interface may be provided on a screen of the device, for example a touch screen or similar. Perspective distortion could be visualised and controlled on-screen by changing the viewing perspective of a 3-D object such as a cube or cuboid.
Setting a default perspective distortion of the image frames to be projected 12 provides a baseline relative to which distortion and/or stabilisation takes place. For instance, it might well be that the device is oriented with respect to a structure onto which images are to be projected which results in perspective distortion of the projected images. The user can, in a live feedback manner, use the user interface to control a perspective distortion of the image frames to be projected so that a default perspective distortion is applied. Any subsequent stabilisation/correction can be achieved relative to this default setting, if and when the device moves.
In use, the device may be set up for projecting images onto a structure, for example a wall or screen. The orientation of the device may be such that images projected onto the structure would, without correction, be projected with perspective distortion. Thus, user input 32 may be provided to the controller 28 and the graphics processing unit 16 to set and apply a default perspective correction to the image frames 12 to account for the orientation of the device. At, for example, a subsequent time, the device may be perturbed. The perturbation is detected by the one or more movement sensors 30, the output of which sensors 30 is provided to the controller 28. The controller 28 controls the graphics processing unit 16 to compensate for the movement relative to the pre-set and thus default perspective distortion set by the user, to ensure that, even though the device has been perturbed, or is continually perturbed, there is little or no change to the projected images 20. Because the graphics processing unit 16 is performing the distortion correction, this is a far more efficient and effective correction than if the transformations or the like were applied using software. Also, because the corrections are applied at a higher frame rate 18 than the frame rate 14 at which images are provided, and at which perturbation occurs, the correction is largely invisible to the user.
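A minimal sketch of the keystone pre-distortion idea follows; the corner model and the factor k are illustrative assumptions, showing only how the frame could be pre-warped into the inverse trapezoid of the distortion a tilted projector would introduce:

```python
# Illustrative sketch: pre-distorting the frame's corners so that a
# vertically tilted projector still produces a rectangular image.

def keystone_corners(width: float, height: float, k: float):
    """Return pre-distorted corner positions for keystone factor k
    (0 = no tilt; positive k pulls the top edge inwards)."""
    inset = k * width / 2
    return [
        (inset, 0),           # top-left pulled inwards
        (width - inset, 0),   # top-right pulled inwards
        (width, height),      # bottom edge unchanged
        (0, height),
    ]

print(keystone_corners(100, 60, 0.1))
# [(5.0, 0), (95.0, 0), (100, 60), (0, 60)]
```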
Any appropriate transformation can be applied to reflect the detected motion. For example, rotation of the device may be detected. Depending on where this rotation takes place, for example about what axis of the device, then the image frame may be processed by rotation or by tilting or by skewing or the like. In general, it will be appreciated that so long as the motion can be detected, an image transformation should be possible to correct for that motion.
It will be appreciated that movement of the device may be such that it becomes difficult or even impossible to correct for distortion in the projected image. Boundaries in the form of perimeters may therefore be set to ensure that image correction is implemented, or at least attempted or considered, up to a certain stage, beyond which stage image correction is no longer applied. After that stage, image correction may be attempted relative to a newly determined orientation or position of the device. Figures 6-9 show how transformations may be undertaken in practice, using the perimeters described above. Figure 6 shows an image frame that the device is capable of projecting. The device is capable of projecting image frames having a first perimeter 40.
Undistorted (or unprocessed or default) image frames to be projected have a second perimeter 42 that fits within (e.g. sits within extremities of) the first perimeter 40. This results in what might be described as a "dead zone" 44. The dead zone can be used to implement perspective correction.
Figure 7 shows an example distortion applied to the image frame to be projected, and it can be seen that the perimeter 42 of the image frame is now substantially trapezoidal in shape. The perimeter 42 of the image frame still sits within the perimeter 40 that the device is capable of projecting. This means that, even with perspective correction, the entire image frame that it is intended to project can in fact be projected and viewed by the user. In other words, the perspective correction is undertaken such that the image frame to be projected is expanded or otherwise distorted into the dead zone 44, but not beyond the perimeter 40 that the device is capable of projecting.
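The perimeter test this implies might be sketched as follows (the helper is hypothetical): the distorted frame is only usable if every corner of the second perimeter 42 still lies within the first perimeter 40 that the device is capable of projecting.

```python
# Illustrative sketch: checking that a distorted frame's corners stay
# within the projectable rectangle (the first perimeter).

def fits_within(corners, width: float, height: float) -> bool:
    """True if all (x, y) corners lie inside the projectable area."""
    return all(0 <= x <= width and 0 <= y <= height for x, y in corners)

# Trapezoid expanded into the dead zone but not beyond it: projectable.
print(fits_within([(5, 5), (95, 5), (100, 60), (0, 60)], 100, 60))   # True
# A corner pushed past the edge: fall back to the undistorted frame.
print(fits_within([(-3, 5), (95, 5), (100, 60), (0, 60)], 100, 60))  # False
```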
Figure 7 also depicts an arrow 50. The arrow indicates a brightness adjustment undertaken by the controller and/or graphics processing unit, in proportion to the perspective distortion performed on the image frame. The brightness correction may be employed when it is determined that the device has moved such that parts of the projected image are now further from, or closer to, the device than intended or than was previously the case, which might result in a reduction or an increase in the brightness of parts of the image. The brightness correction takes this brightness differential into account.
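As a hedged sketch of such distance-proportional compensation (the inverse-square gain model is an assumption, not a model given in the description), regions of the frame that land further away can be scaled up in brightness:

```python
# Illustrative sketch: compensating the inverse-square brightness
# falloff for parts of the frame projected at a longer throw distance.

def brightness_gain(distance: float, reference_distance: float) -> float:
    """Scale factor restoring the brightness a region would have had
    at the reference throw distance (inverse-square model)."""
    return (distance / reference_distance) ** 2

# The top of a keystoned image landing at 2.2m, against a 2.0m
# reference, needs roughly 21% more brightness.
print(round(brightness_gain(2.2, 2.0), 2))  # 1.21
```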
Figure 8 shows another typical transformation that might take place.
Figure 9 shows that when a distortion of the image frame to be projected is, or would be, such that the perimeter of the image frame 42 exceeds the perimeter 40 that the device is capable of projecting, the image frame is not in fact projected. This is to avoid "clipping", where not all of the image frame is actually projected and therefore visible to the user. In this scenario, the image frames might instead be projected in an undistorted manner, and subsequent corrections undertaken from this new default starting point.
Alternatively or additionally, at this point the user could provide user input, as a result of the change in orientation being too great for the device to correct for effectively; i.e. the user could provide a new default perspective distortion.
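The Figure 9 fallback described above can be sketched as a simple selection rule. This is an illustrative sketch under assumed normalised coordinates; the function names and example values are hypothetical.

```python
# Hypothetical sketch: never project a frame whose perimeter exceeds
# the perimeter the device is capable of projecting (avoids clipping);
# fall back to the undistorted frame as the new default instead.

def inside(perimeter, corners):
    """True if all corners lie inside the axis-aligned perimeter."""
    (x0, y0), (x1, y1) = perimeter
    return all(x0 <= x <= x1 and y0 <= y <= y1 for x, y in corners)

def frame_to_project(perimeter, corrected, undistorted):
    """Use the corrected frame only when it fits entirely within the
    projectable perimeter; otherwise project undistorted."""
    return corrected if inside(perimeter, corrected) else undistorted

PERIMETER = ((0.0, 0.0), (1.0, 1.0))
undistorted = [(0.1, 0.1), (0.9, 0.1), (0.9, 0.9), (0.1, 0.9)]
too_wide = [(-0.05, 0.1), (1.05, 0.1), (0.9, 0.9), (0.1, 0.9)]

print(frame_to_project(PERIMETER, too_wide, undistorted) is undistorted)  # True
```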
The result of perspective correction is that movement of the device as shown in Figure 3 will not result in the distorted projected image of Figure 4, but rather in the undistorted image of Figure 2.
As discussed above, the invention might find particular use in devices that are capable of projecting image frames and that are portable in nature. This is because it is in such devices that small movements are likely to have a large effect on the resultant projected image. However, the invention can also be applied to larger, more physically stable devices, where image stabilisation may still be of some use.
In the above embodiments, a graphics processing unit has been described. This graphics processing unit may be used solely for perspective correction and stabilisation and the like of images that are to be projected. However, in at least some devices, it is likely that the device will already have a graphics processing unit that might be used, for example, to process image frames to be displayed on a display screen of the device. In such a scenario, it will be appreciated that the graphics processing unit functionality already inherent in the device might be additionally and/or alternatively used for image projection. That is, the unit has multi-purpose functionality. In fact, the present invention might be applied retrospectively to a device capable of projecting that already has a graphics processing unit which is not yet employed for processing of image frames to be projected.
In the above embodiments, the use of different image frames has been described. In some instances, these image frames will comprise different images, for example to construct, over time, a moving image visible to the user, or simply to present different static images. In other embodiments, the different image frames may comprise the same image, and the projection of each subsequent image might simply constitute or amount to a refresh or re-projection of the image.
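The refresh or re-projection behaviour described above connects naturally with processing image frames at a rate greater than the rate at which the source provides them (as claimed below). The sketch that follows is illustrative only: the frame rates, the integer multiple, and the generator interface are assumptions, not taken from the patent.

```python
# Hypothetical sketch: each source frame is processed (re-projected)
# several times, so the processing frame rate is an integer multiple
# of the source frame rate, and each repeat could carry a fresh
# stabilising perspective correction.

SOURCE_FPS = 30                      # rate of the image frame source
MULTIPLE = 4                         # assumed integer multiple
PROCESS_FPS = SOURCE_FPS * MULTIPLE  # 120 processed frames per second

def processed_frames(source_frames, multiple):
    """Yield (frame, repeat) pairs: every source frame is emitted
    `multiple` times for projection."""
    for frame in source_frames:
        for repeat in range(multiple):
            yield (frame, repeat)

out = list(processed_frames(["f0", "f1"], MULTIPLE))
print(PROCESS_FPS)  # 120
print(len(out))     # 8
print(out[0])       # ('f0', 0)
```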
Although a few preferred embodiments have been shown and described, it will be appreciated by those skilled in the art that various changes and modifications might be made without departing from the scope of the invention, as defined in the appended claims.
Attention is directed to all papers and documents which are filed concurrently with or previous to this specification in connection with this application and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference.
All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
The invention is not restricted to the details of the foregoing embodiment(s). The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.

Claims (16)

  1. A device capable of projecting an image frame, the device comprising: a graphics processing unit arranged to process image frames to be projected; wherein a frame rate at which image frames to be projected are processed is greater than a frame rate at which image frames are provided from an image frame source.
  2. The device of claim 1, wherein the frame rate at which image frames to be projected are processed is an integer multiple greater than a frame rate at which image frames are provided from the image source.
  3. The device of claim 1, comprising a controller arranged to control a property of the image frames to be projected via input to and/or control of the operation of the graphics processing unit.
  4. The device of claim 3, wherein the device comprises one or more sensors for sensing movement of the device, and the controller is arranged to use an output from the one or more sensors to at least partially stabilise image frames to be projected relative to movement of the device.
  5. The device of claim 4, wherein the graphics processing unit is arranged to perform perspective distortion of the image frames to be projected to at least partially stabilise those image frames relative to movement of the device.
  6. The device of claim 3 or claim 4, wherein the perspective distortion and/or stabilisation is relative to a default perspective configuration of the image frames to be projected.
  7. The device of any of claims 3 to 5, wherein the controller is arranged to facilitate user control of a perspective distortion of the image frames to be projected, in order to set the default perspective configuration.
  8. The device of any preceding claim, wherein the device is capable of projecting image frames having a first perimeter, and undistorted image frames to be projected have a second perimeter that fits within the first perimeter.
  9. The device of claim 8, wherein, when a distortion of the image frame to be projected is or would be such that the second perimeter would extend beyond the first perimeter, image frames to be projected are processed in an undistorted manner.
  10. The device of any claim dependent on claim 3, wherein the controller is arranged to adjust a brightness of at least a portion of an image frame to be projected, in proportion to a perspective distortion performed on that image frame.
  11. The device of any preceding claim, wherein the graphics processing unit is also arranged to process image frames to be displayed on a display screen of the device.
  12. The device of any preceding claim, wherein the device is arranged to provide a graphical user interface using a screen of the device, for receiving user input and control of the device.
  13. The device of any preceding claim, wherein the device comprises the image source.
  14. The device of any preceding claim, wherein the device is one or more of: portable, hand-held, a digital camera, a mobile telephone, a projector.
  15. A method of processing image frames to be projected, the method comprising: processing image frames to be projected using a graphics processing unit; wherein a frame rate at which image frames to be projected are processed is greater than a frame rate at which image frames are provided from an image frame source.
  16. A device substantially as shown in the accompanying Figures, a device substantially as described herein with reference to the accompanying Figures, and/or a device substantially as described herein.
GB1318282.9A 2013-10-16 2013-10-16 A device capable of projecting an image frame Withdrawn GB2519311A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1318282.9A GB2519311A (en) 2013-10-16 2013-10-16 A device capable of projecting an image frame

Publications (2)

Publication Number Publication Date
GB201318282D0 GB201318282D0 (en) 2013-11-27
GB2519311A true GB2519311A (en) 2015-04-22

Family

ID=49680112

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1318282.9A Withdrawn GB2519311A (en) 2013-10-16 2013-10-16 A device capable of projecting an image frame

Country Status (1)

Country Link
GB (1) GB2519311A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030058368A1 (en) * 2001-09-24 2003-03-27 Mark Champion Image warping using pixel pages
US20040136686A1 (en) * 2002-11-07 2004-07-15 Seiko Epson Corporation Conversion of frame rate according to image data
EP1447975A2 (en) * 2003-02-17 2004-08-18 Seiko Epson Corporation Projected image correction method and projector
US20090027549A1 (en) * 2004-05-17 2009-01-29 Weisgerber Robert C Method for processing motion pictures at high frame rates with improved temporal and spatial resolution, resulting in improved audience perception of dimensionality in 2-D and 3-D presentation
US20090059098A1 (en) * 2007-08-31 2009-03-05 Sony Corporation Image display apparatus and image processing apparatus
US20090080789A1 (en) * 2007-08-31 2009-03-26 Sony Corporation Projection display and projection display control program
US20090174810A1 (en) * 2003-11-01 2009-07-09 Taro Endo Video display system
US20100231593A1 (en) * 2006-01-27 2010-09-16 Samuel Zhou Methods and systems for digitally re-mastering of 2d and 3d motion pictures for exhibition with enhanced visual quality
JP2012168897A (en) * 2011-02-16 2012-09-06 Canon Inc Image processing device, control method therefor, and program

Similar Documents

Publication Publication Date Title
JP6145123B2 (en) System and method for injection of mapping function
JP5849560B2 (en) Display device, projector, and display method
US10893246B2 (en) Projection system and automatic setting method thereof
EP3141993A3 (en) Mobile terminal and method for controlling the same
JP6429545B2 (en) Control device and control method
JP2017199982A5 (en)
US10630949B2 (en) Projection system and automatic setting method thereof
WO2005121867A8 (en) Polarized stereoscopic display device and method
JP2012028963A5 (en)
US10075644B2 (en) Information processing apparatus and information processing method
JP6540099B2 (en) IMAGE PROCESSING DEVICE, DISPLAY DEVICE, AND IMAGE PROCESSING METHOD
JP6201358B2 (en) Image processing apparatus, projector, and image processing method
JP2018159847A5 (en)
WO2012120586A1 (en) Projection type image display device and light quantity adjustment method
JP2008061160A5 (en)
GB2519311A (en) A device capable of projecting an image frame
JP2014219572A (en) Display device and control method of the same
JP5828400B2 (en) Video display device and video display method
JP2007132984A (en) Display apparatus, program, information memory medium and on-screen display image display method
JP4851860B2 (en) Projector and angle of view control method
JP2015197781A5 (en)
JP2015198321A5 (en)
US20240179288A1 (en) 3d projection method and 3d projection device
US11908353B2 (en) Information processing apparatus and information processing system
US20200145628A1 (en) Projection apparatus and correcting method of display image

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)