WO2011040864A1 - Method relating to digital images
- Publication number
- WO2011040864A1 (PCT/SE2010/051019)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- digital image
- representation
- signal
- representations
- group
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04805—Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
The present invention relates to a method for selecting an image, comprising forming a group of digital image representations, displaying a first digital image representation of the group of digital image representations on a touch sensitive display, generating a position signal in response to a detection of a pointing device on the touch sensitive display, said position signal indicating a touch position, identifying a selected position within the displayed first digital image representation based on the position signal, generating a zoom-in signal in response to a detection of the pointing device sliding away from the touch position on the touch sensitive display, said zoom signal indicating a sliding distance from the touch position, displaying an enlarged representation of the first digital image representation in response to the zoom-in signal, generating a shift signal in response to a detection of a second sliding motion of the pointing device on the touch sensitive display, and displaying a second digital image representation and an enlarged representation of the second digital image representation in response to the shift signal, the enlargement of the second digital image representation being based on the zoom signal generated during displaying of the first digital image representation.
Description
METHOD RELATING TO DIGITAL IMAGES
Technical Field of Invention
The present invention relates to a method for selecting an image and a method for facilitating generation or capturing of a desired image.
Background to the Invention
A great number of the graphical or photographical images of today are generated digitally. Generally, this means that more images are created, and often a greater number of undesirable images are created as well. One of the problems of today is that even undesired images are stored and thereby occupy storage capacity. One simple solution to this problem is to delete all undesired images. However, the likelihood of undesired images still occupying storage capacity becomes greater as time passes from the moment an image was created.
Summary of the Invention
It is an object of the invention to improve operations on images and to improve the experience for the user of continuous operations on images.
This object is achieved by means of a method for selecting an image according to claim 1. Further embodiments of the invention are disclosed in the dependent claims.
In particular, according to a first aspect of the invention, a method for selecting an image comprises forming a group of digital image representations, displaying a first digital image representation of the group of digital image representations on a touch sensitive display, generating a position signal in response to a detection of a pointing device on the touch sensitive display, said position signal indicating a touch position, identifying a selected position within the displayed first digital image representation based on the position signal, generating a zoom-in signal in response to a detection of the pointing device sliding away from the touch position on the touch display, said zoom signal indicating a sliding distance from the touch position, displaying an enlarged representation of the first digital image representation in response to the zoom-in signal, generating a shift signal in response to a detection of a second sliding motion of the pointing device on the touch sensitive display, and displaying a second digital image representation and an enlarged representation of the second digital image representation in response to the shift signal, the enlargement of the second digital image representation being based on the zoom signal generated during displaying of the first digital image representation.
The advantage of forming a group of images is that the chance of having at least one good image is increased. By implementing the selection method the selection of the best image is facilitated and thereby it becomes easier for a user to discard less desirable images. Moreover, the zoom-in facilitates viewing of details in the high resolution image on a smaller preview screen.
According to one embodiment the sliding direction of the second sliding motion is along a trajectory that is substantially circular. This is an advantage in that the sliding motion determines the switching between images, and if the number of images in the group of image representations is large, the touch sensitive display may not be big enough. However, by making the sliding motion circular there is no longer any such limitation, as it becomes possible to slide the pointing device many turns.
According to yet another embodiment the touch sensitive display is a multi-touch sensitive display, wherein said generating of a position signal further includes a detection of a second pointing device on the multi-touch sensitive display, said position signal indicating a touch position which is based on each position of the two pointing devices respectively, wherein said zoom-in signal is generated in response to a detection of the two pointing devices sliding on the multi-touch display away from each other, and wherein said shift signal is generated in response to a second sliding motion of the two pointing devices at substantially constant distance between the pointing devices.
In another embodiment the touch position is a position between the two detected pointing devices.
In yet another embodiment the act of displaying an enlarged representation of the first image includes displaying an enlarged subarea of the first digital image representation in response to the zoom-in signal, the position of the subarea within the first digital image representation being based on the selected position and the enlargement of the subarea being based on the zoom-in signal.
In one embodiment the size of the subarea to be enlarged is based on the distance between the two pointing devices at the generation of the position signal.
In another embodiment said forming of a group of digital image representations includes capturing a plurality of different digital image representations of essentially the same view at different points in time.
According to another embodiment said forming of a group of digital image representations includes capturing a plurality of different digital image representations of essentially the same view at different exposure settings.
According to yet another embodiment said forming of a group of digital image representations includes capturing a plurality of different digital image representations of essentially the same view having different focus distances.
According to a further embodiment said forming of a group of digital image representations includes capturing a plurality of different digital image representations of essentially the same view being exposed for different transforms.
According to yet another embodiment said forming of a group of digital image representations includes generating a plurality of digital image representations from one single original image by manipulating the original image differently for each digital image representation, the manipulation including applying a transform or a parameter to the original image.
According to another embodiment a pointing device is a fingertip.
Brief Description of the Drawings
The invention will now be described in further detail by way of example under reference to the accompanying drawings, on which:
Fig 1a is a schematic view of a display side of an image presentation device according to one embodiment of the invention,
Fig 1b is a schematic view of a lens side of an image presentation device incorporating a camera according to one embodiment of the invention,
Fig 1c is a schematic block diagram of the image presentation device in Figs 1a-b,
Fig 2 shows a group of image representations according to one embodiment of the invention,
Fig 3a-d schematically shows acts of a method according to one embodiment of the invention.
Detailed Description of a Preferred Embodiment
The present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments are shown. Like numbers refer to like elements throughout.
In Fig 1 an image presentation device 10 for implementing the invention is shown. According to one embodiment the image presentation device 10 includes a housing 12 and a touch sensitive display 14, see Fig 1a. It may also include a lens 16, see Fig 1b, for focusing light to be captured as an image on an image sensor, not shown. The image presentation device 10 may, thus, be included in a camera or have a camera incorporated.
Moreover, now referring to Fig 1c, the image presentation device 10 includes a processor 18, a volatile memory 20, a non-volatile memory 22, a display driver 24, a touch screen driver 26, touch screen circuitry 28, and camera circuitry 30 including an image sensor.
The processor 18, the volatile memory 20 and the non-volatile memory 22 may be arranged and connected in a way known to the skilled person for operation of the image presentation device and execution of applications stored in the non-volatile memory 22.
The design and implementation of the touch screen circuitry 28 depends on the type of touch sensitive display that is to be used. The implementation of the touch screen driver 26 depends on the type of touch sensitive display and the operating system of the image presentation device 10.
In the present application the term touch sensitive display or touch screen is used for a display that is arranged to detect the presence, location, and/or movement of a "touch" within the display area. The touch screen may be designed to detect presence, location, and/or movement on the display by a finger, a hand, a stylus, a pen, etc.
Depending on the usage of the image presentation device one of a plurality of types of touch screens may be selected. For example, the touch screen may be a resistive touch screen, a touch screen based on surface acoustic wave technology, a capacitive touch screen, a touch screen using surface capacitance, a touch screen based on projected capacitive touch technology, a system based on infrared LEDs and photo sensors, a system based on a strain gauge configuration, a touch screen based on dispersive signal technology, a touch screen based on acoustic pulse recognition technology, etc.
According to one embodiment a method for selecting images is part of a greater scheme of achieving a desired image having specific characteristics. The embodiment relates to an image selecting method operating on a group of digital image representations in order to achieve this result. The images forming the group of image representations may be images retrieved from a storage device, e.g. a hard drive, the non-volatile memory 22, an image server accessed via a network, etc. The images may alternatively be acquired by means of a camera arranged in the device 10 or by means of transforming one original image retrieved from a storage device or one original image acquired by said camera. The image sequence may also be calculated from one or more source images, and the image itself may be a virtual representation based on one or more mathematical schemes applied to one or more original images.
One example of how to generate the group of image representations is to bracket, i.e. to take photographs at more than one exposure in order to ensure that the desired exposure is obtained in at least one exposure. Other examples are to take a plurality of photographs at different points in time, with different depths of field, at different focus distances, or by varying any other setting of the camera. The camera used in these examples may well be a camera implemented in the same device or system as the image presentation device 10. Moreover, the group of image representations may be generated by applying different transforms to the images.
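By way of illustration, the following is a minimal sketch of forming such a group from one original image by applying a different exposure-like gain to each copy, one of the manipulations mentioned above. It assumes the image is held as an 8-bit NumPy array; the function name and gain values are illustrative only and are not defined by the patent.

```python
import numpy as np

def form_group_from_original(original: np.ndarray, gains=(0.5, 1.0, 2.0)):
    """Return a group of digital image representations, one per simulated exposure gain."""
    group = []
    for gain in gains:
        # Scale pixel values and clip back to the valid 8-bit range.
        variant = np.clip(original.astype(np.float32) * gain, 0, 255).astype(np.uint8)
        group.append(variant)
    return group
```

A group formed by bracketing with the camera would instead be collected as successive captures at different exposure settings rather than computed from a single original.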
The number of images in a group of image representations may be as few as two and as many as hundreds; it depends largely on the application in which the method is planned to be used. The group of images may be separate images/photos or a sequence of frames in a video. In Fig 2 a group of image representations 50 including three image representations is shown. In order to facilitate the depiction of the different images, this particular sequence of images represents photographs 52, 54, 56, also referred to as first, second and third photographs, taken at different points in time. Hence these images could very well have been taken by the camera of the device shortly before the user decides to use the selection method. The selection method may be started in many ways, e.g. by pressing a button on the image presentation device 10, by touching the touch sensitive display 14 by means of a pointing device either within a displayed image representation from the group of image representations or within an area on the display indicating a button for activating this functionality.
Fig 3 shows one embodiment of the invention by depicting subsequent acts of the selection method of this embodiment. The image presentation device 10 displays a representation of the second photograph 54, see Fig 3a, from the group of image representations 50. Then the pointing device 70 is positioned to touch the touch sensitive display 14 at position 72, see Fig 3b, and a position signal is generated, including the touch position, and is sent from the touch sensitive display 14 to the processor 18, i.e. from the touch screen circuitry 28 and the touch screen driver 26 to the processor 18.
Then the touch sensitive display 14 detects a sliding motion 74, performed by means of the pointing device 70, along the display 14 away from the touch position 72. This detection results in the generation of a zoom signal that is sent to the processor 18. The zoom signal includes an indication of the distance of the sliding movement 74, referred to as the zoom value. Based on the zoom signal, the displayed image representation 54 is enlarged to a degree that is based on the zoom value, see Fig 3c. In the embodiment of Fig 3c the enlarging of the image representation is centered at the touch position 72.
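A minimal sketch of this zoom step is given below, assuming touch coordinates in display pixels and the image held as a NumPy array; the scale constant and helper names are assumptions for illustration and are not taken from the patent.

```python
import numpy as np

def zoom_value_from_slide(touch_xy, current_xy, pixels_per_step=200.0):
    """Derive a zoom value from how far the pointing device has slid away from the touch position."""
    dx = current_xy[0] - touch_xy[0]
    dy = current_xy[1] - touch_xy[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return 1.0 + distance / pixels_per_step  # 1.0 corresponds to no enlargement

def enlarged_view(image: np.ndarray, touch_xy, zoom: float) -> np.ndarray:
    """Crop a window centred on the touch position; the window shrinks as the zoom value grows.
    Scaling the crop back up to the display size is left to the display driver."""
    h, w = image.shape[:2]
    win_w, win_h = max(1, int(w / zoom)), max(1, int(h / zoom))
    cx = min(max(int(touch_xy[0]), win_w // 2), w - win_w // 2)
    cy = min(max(int(touch_xy[1]), win_h // 2), h - win_h // 2)
    return image[cy - win_h // 2: cy + win_h // 2, cx - win_w // 2: cx + win_w // 2]
```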
Now referring to Fig 3d, the pointing device is moved in another direction 76, in this specific embodiment essentially perpendicular to the previous movement 74. This second movement 76 may advantageously follow a circular trajectory, as indicated by 76 in Fig 3d, in order to facilitate shifting through a large group of image representations without running out of touch sensitive display surface to slide on. The second sliding motion 76 is detected by the touch sensitive display 14 and a shift signal is generated and sent to the processor 18. In response to the shift signal the image displayed 54 is shifted to another image representation 56 in the group of image representations 50. In Fig 3d the third photograph 56 is shifted in for display. In this embodiment the same enlargement is applied to the newly shifted image representation as for the previous one.
The number of images shifted may be proportional to the length of the second sliding motion 76. Hence, in the example above, in which the group of image representations 50 only includes three photographs 52, 54, 56, the shift would continue to present the image representation of the first photograph if the second sliding motion 76 is continued.
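The proportional relationship between the length of the circular sliding motion and the number of images shifted can be expressed as a simple wrap-around mapping. The sketch below assumes the angle swept around the touch position is available from the touch screen driver; the step size is an arbitrary example value, not a figure from the patent.

```python
import math

def shifted_index(start_index: int, swept_angle: float, group_size: int,
                  radians_per_image: float = math.pi / 3) -> int:
    """Map the angle swept by the circular sliding motion to an index in the group.
    The modulo wrap lets the user slide many turns through a small group, as described above."""
    steps = int(swept_angle / radians_per_image)
    return (start_index + steps) % group_size
```

With the three photographs 52, 54, 56 and this step size, sliding one sixth of a turn advances one photograph, and a continued motion eventually wraps back to the first photograph 52, mirroring the behaviour described above.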
According to another embodiment the enlargement is not applied to the entire image representation as depicted in Figs 3c and 3d, but rather to a predetermined area surrounding the touch position indicated by the position signal. The area may be defined by a radius and may be substantially circular.
According to yet another embodiment two pointing devices are used, e.g. a finger and the thumb of a user's hand. In this embodiment a position signal is generated when the two pointing devices are detected on the touch sensitive display, the touch position being indicated as a position between the detection points of the two pointing devices.
Then, upon detection of the two pointing devices sliding away from each other, a zoom signal is generated and in response to the zoom signal an enlarged image representation of the image representation presently displayed is presented on the display. The degree of enlargement is based on the distance the two pointing devices have been sliding away from each other. Moreover, according to one embodiment, not the entire image representation is zoomed but only a subarea. The size of this sub area may correspond to an area defined by the initial positions of the pointing devices, i.e. when the touch position is indicated.
Then, in response to detection of a second sliding motion by the two pointing devices, wherein the two pointing devices are sliding at a substantially constant distance from each other, a shift signal is generated and in response to the shift signal the image representation displayed is shifted to another image representation from the group of image representations. In one embodiment the pointing devices are rotated substantially around a position in-between the two pointing devices and at a substantially constant distance from each other, e.g. following a substantially circular trajectory. The length of the sliding motion determines which image representation from the group of image representations to display. The enlargement applied to the initial image is displayed in the shifted images as well.
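A sketch of how the two-finger case could be told apart is given below: the touch position is the midpoint of the initial contacts, a change in finger distance is treated as the zoom gesture, and motion at roughly constant distance is treated as the shift gesture. The threshold value and function names are assumptions for illustration, not part of the patent.

```python
import math

PINCH_THRESHOLD = 0.15  # relative change in finger distance treated as zooming (assumed value)

def touch_position(p1, p2):
    """The position signal indicates a position between the two detected pointing devices."""
    return ((p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0)

def classify_two_finger_motion(p1_start, p2_start, p1_now, p2_now):
    """Return ('zoom', factor) when the fingers slide apart or together,
    or ('shift', swept_angle) when they rotate at a roughly constant distance."""
    d0 = math.dist(p1_start, p2_start)
    d1 = math.dist(p1_now, p2_now)
    if d0 == 0:
        return None
    if abs(d1 - d0) / d0 > PINCH_THRESHOLD:
        return ('zoom', d1 / d0)  # enlargement based on how far the fingers have moved apart
    a0 = math.atan2(p2_start[1] - p1_start[1], p2_start[0] - p1_start[0])
    a1 = math.atan2(p2_now[1] - p1_now[1], p2_now[0] - p1_now[0])
    return ('shift', a1 - a0)     # the swept angle drives the image shift
```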
According to one specific embodiment only portions of the initially displayed image representation are shifted. The portion to be shifted may for instance be indicated manually by tracing the contours of the area, and the shifting then results in the corresponding area of another image from the group of image representations being displayed. The contours of the area can also be computed automatically, by tracing a contour along which the pixels of the two images, aligned to substantially the same positions within the particular area, are substantially similar. By means of this embodiment combined with a group of image representations being a bracketed image sequence it is possible to generate HDR images, High Dynamic Range images.
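Assuming the bracketed images have already been aligned and that a boolean mask marking the traced (or automatically computed) area is available, the region shift itself reduces to copying the masked pixels from another exposure. This is a sketch only; the contour computation described above is not implemented here.

```python
import numpy as np

def shift_region(displayed: np.ndarray, other: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Replace only the masked portion of the displayed image representation with the
    corresponding pixels of another, pre-aligned image from the bracketed group."""
    out = displayed.copy()
    out[mask] = other[mask]  # e.g. paste a correctly exposed region from a different exposure
    return out
```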
According to one embodiment the method may advantageously be used for browsing images. In such an application the zoom-in step may be skipped and a rotational/circular motion using one or two pointing devices may be used to switch images. In this embodiment the group of image representations probably includes the images of a folder in a file system or a database or of a particular category in a database.
According to one particular embodiment the image presentation device 10 is a mobile telephone equipped with a camera.
According to another aspect of the present invention yet another embodiment is shown in Fig 4. The image presentation device 10 displays a representation of the second photograph 54, see Fig 4a, from the group of image representations 50. Then the pointing device 70 is positioned to touch the touch sensitive display 14 at a first position 82, see Fig 4b, and a position signal is generated, including the touch position, and is sent from the touch sensitive display 14 to the processor 18, i.e. from the touch screen circuitry 28 and the touch screen driver 26 to the processor 18.
Then the displayed image representation 54 is enlarged to a degree that is based on a predetermined zoom value, see Fig 4b. In the embodiment of Fig 4b the enlarging of the image representation is centered at the touch position. According to this embodiment the enlargement is not applied to the entire image representation but rather to a predetermined area 81 surrounding the touch position indicated by the position signal. According to the embodiment shown in Fig 4b the predetermined area 81 is defined by a radius and is substantially circular.
By moving the pointing device 70 on the touch sensitive display 14 from the first position 82 to a second position 84, see Fig 4c, the touch position, and thereby the position signal, is updated. The updated position signal is sent from the touch sensitive display 14 to the processor 18, i.e. from the touch screen circuitry 28 and the touch screen driver 26 to the processor 18. In response to the updated position signal the predetermined area 81 of the image representation that is enlarged, in this case the image representation of the second photograph 54, is also updated so that the enlargement is applied to the predetermined area 81 surrounding the touch position indicated by the position signal. The predetermined area 81 of the image representation that is enlarged may be live updated whilst the pointing device 70 is moved from the first position 82 to the second position 84. Thus, as long as the pointing device 70 is moved along the trajectory between the first position 82 and the second position 84, the predetermined area 81 of the image representation that is enlarged is live updated.
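The live-updated magnifier can be thought of as repeatedly extracting and enlarging a circular patch around the latest touch position. The sketch below assumes a NumPy image and uses nearest-neighbour enlargement to stay dependency-free; its radius and zoom values are examples only.

```python
import numpy as np

def magnified_patch(image: np.ndarray, touch_xy, radius: int = 80, zoom: int = 2):
    """Return the enlarged square patch around the current touch position together with a
    circular mask; the display layer would draw the masked pixels and refresh them each
    time an updated position signal arrives."""
    h, w = image.shape[:2]
    x = min(max(int(touch_xy[0]), radius), w - radius)
    y = min(max(int(touch_xy[1]), radius), h - radius)
    patch = image[y - radius: y + radius, x - radius: x + radius]
    patch = np.repeat(np.repeat(patch, zoom, axis=0), zoom, axis=1)  # nearest-neighbour enlargement
    side = patch.shape[0]
    yy, xx = np.ogrid[:side, :side]
    mask = (xx - side // 2) ** 2 + (yy - side // 2) ** 2 <= (side // 2) ** 2
    return patch, mask
```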
The size of the predetermined area 81 may be reduced or enlarged upon detection of two pointing devices sliding towards each other or away from each other, respectively. This may e.g. be done by placing the two pointing devices on opposite sections of the border of the predetermined area 81, and then sliding the two pointing devices towards each other or away from each other.
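Resizing of the predetermined area under such a two-finger motion can be sketched as scaling the radius by the ratio of the current to the initial finger distance; the helper below is an assumed illustration, with clamping values chosen arbitrarily.

```python
import math

def resized_radius(radius: float, p1_start, p2_start, p1_now, p2_now,
                   min_radius: float = 20.0, max_radius: float = 400.0) -> float:
    """Grow or shrink the radius of the predetermined area 81 in proportion to how much
    the two pointing devices on its border have moved apart or towards each other."""
    d0 = math.dist(p1_start, p2_start)
    d1 = math.dist(p1_now, p2_now)
    if d0 == 0:
        return radius
    return min(max_radius, max(min_radius, radius * d1 / d0))
```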
As a next step the image representation shown in the predetermined area 81 may be shifted. Now referring to Fig 4d, the pointing device 70 is moved in a direction 86 along the border of the enlarged area of the image representation. Thus, according to this specific embodiment, it is moved along a substantially circular trajectory, as indicated by a sliding motion 86 in Fig 4d, in order to facilitate shifting through a large group of image representations without running out of touch sensitive display surface to slide on. The sliding motion 86 is detected by the touch sensitive display 14 and a shift signal is generated and sent to the processor 18. In response to the shift signal the image displayed in the enlarged predetermined area 81 is shifted to another image representation in the group of image representations 50, in this case to the image representation of the third photograph 56. In this embodiment the same enlargement is applied to the newly shifted image representation as for the previous one. The number of images shifted may be proportional to the length of the sliding motion 86. Hence, in the example above, in which the group of image representations 50 only includes three photographs 52, 54, 56, the shift would continue to present the image representation of the first photograph 52 if the sliding motion 86 is continued.
Furthermore, according to the above embodiment, when a specific image representation has been selected among the group of image representations by performing the sliding motion 86, the not selected digital image representations of the group of digital image representations may be discarded.
It is recognized that the embodiment of figure 4 may be altered in various ways. For example, according to one embodiment the shifting of the image representations may be performed by moving the pointing device 70 in a substantially linear manner. This embodiment may be developed even further by showing small miniatures of the image representations at e.g. the bottom of the touch sensitive display 14. The shifting of the image representations may then be performed by moving the small miniatures by means of the pointing device 70 in a substantially linear manner along a path that is substantially parallel with the lower border of the touch sensitive display 14.
Claims
1. Method for selecting a digital image representation comprising:
(a) forming a group of digital image representations,
(b) displaying a first digital image representation of the group of digital image representations on a touch sensitive display,
(c) generating a position signal in response to a detection of a pointing device on the touch sensitive display, said position signal indicating a touch position,
(d) identifying a selected position within the displayed first digital image representation based on the position signal,
(e) generating a zoom-in signal in response to a detection of the pointing device sliding away from the touch position on the touch display,
(f) displaying an enlarged representation of the first digital image representation as well as the first digital image representation in response to the zoom-in signal,
(g) selecting a desired digital image representation among the group of digital image representations by repeating the acts (h)-(i) until a desired digital image representation among the group of digital image representations is selected,
(h) generating a shift signal, for shifting among the digital image representations of the group of image representations, in response to a detection of a sliding motion of the pointing device on the touch sensitive display, wherein the sliding direction of the sliding motion is along a trajectory that is substantially circular, and
(i) displaying an enlarged representation of another digital image representation in response to the shift signal, wherein the enlargement of the second digital image representation being based on the zoom signal generated during displaying of the first digital image representation, and wherein the number of digital image representations shifted is proportional to the length of the sliding motion.
2. Method according to claim 1, wherein only a portion of the displayed first digital image representation is shifted.
3. Method according to claim 1 or 2, wherein the enlarged representations of the digital image representations are displayed at a predetermined area surrounding the touch position indicated by the position signal.
4. Method according to claim 3, wherein the predetermined area is substantially circular.
5. Method according to any one of claims 3-4, wherein the portion being shifted is indicated by tracing the contours of the predetermined area.
6. Method according to any one of claims 1-5, further comprising discarding the not selected digital image representations of the group of digital image representations.
7. Method according to any one of claims 1-6, comprising displaying the other digital image representation as well as the enlarged representation of the other digital image representation in response to the shift signal.
8. Method according to any one of claims 1-6, comprising displaying the first image representation as well as the enlarged representation of the other digital image representation in response to the shift signal.
9. Method according to any one of claims 1-8, wherein the touch sensitive display is a multi-touch sensitive display, wherein said generating of a position signal further includes a detection of a second pointing device on the multi-touch sensitive display, said position signal indicating a touch position which is based on each position of the two pointing devices respectively, wherein said zoom-in signal is generated in response to a detection of the two pointing devices sliding on the multi-touch display away from each other, and wherein said shift signal is generated in response to a sliding motion of the two pointing devices at substantially constant distance between the pointing devices.
10. Method according to claim 9, wherein the touch position being a position between the two detected pointing devices.
11. Method according to any one of claims 1-10, wherein the act of displaying an enlarged representation of the first image includes displaying an enlarged subarea of the first digital image representation in response to the zoom-in signal, the position of the subarea within the first digital image representation being based on the selected position and the enlargement of the subarea being based on the zoom-in signal.
12. Method according to any one of claims 9-11, wherein the size of the subarea to be enlarged is based on the distance between the two pointing devices at the generation of the position signal.
13. Method according to any one of claims 1-12, wherein said forming of a group of digital image representations includes capturing a plurality of different digital image representations of essentially the same view at different points in time.
14. Method according to any one of claims 1-12, wherein said forming of a group of digital image representations includes capturing a plurality of different digital image representations of essentially the same view at different exposure settings.
15. Method according to any one of claims 1-12, wherein said forming of a group of digital image representations includes capturing a plurality of different digital image representations of essentially the same view having different focus distances.
16. Method according to any one of claims 1-12, wherein said forming of a group of digital image representations includes capturing a plurality of different digital image representations of essentially the same view being exposed for different transforms.
17. Method according to any one of claims 1-12, wherein said forming of a group of digital image representations includes generating a plurality of digital image representations from one single original image by manipulating the original image differently for each digital image representation, the manipulation includes applying a transform or a parameter to the original image.
18. Method according to any one of claims 1-17, wherein the pointing device is a fingertip.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP10820903.2A EP2483767B1 (en) | 2009-10-01 | 2010-09-22 | Method relating to digital images |
US13/499,711 US9792012B2 (en) | 2009-10-01 | 2010-09-22 | Method relating to digital images |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE0901263 | 2009-10-01 | ||
SE0901263-4 | 2009-10-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011040864A1 true WO2011040864A1 (en) | 2011-04-07 |
Family
ID=43826514
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/SE2010/051019 WO2011040864A1 (en) | 2009-10-01 | 2010-09-22 | Method relating to digital images |
Country Status (3)
Country | Link |
---|---|
US (1) | US9792012B2 (en) |
EP (1) | EP2483767B1 (en) |
WO (1) | WO2011040864A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2541553A3 (en) * | 2011-06-27 | 2013-01-23 | Yamaha Corporation | Parameter controlling apparatus |
WO2013012370A1 (en) | 2011-07-15 | 2013-01-24 | Scalado Ab | Method of providing an adjusted digital image representation of a view, and an apparatus |
EP2919456A1 (en) * | 2014-03-11 | 2015-09-16 | Canon Kabushiki Kaisha | Display control apparatus and display control method |
US9196069B2 (en) | 2010-02-15 | 2015-11-24 | Mobile Imaging In Sweden Ab | Digital image manipulation |
US9344642B2 (en) | 2011-05-31 | 2016-05-17 | Mobile Imaging In Sweden Ab | Method and apparatus for capturing a first image using a first configuration of a camera and capturing a second image using a second configuration of a camera |
US9792012B2 (en) | 2009-10-01 | 2017-10-17 | Mobile Imaging In Sweden Ab | Method relating to digital images |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016111652A (en) * | 2014-12-10 | 2016-06-20 | オリンパス株式会社 | Imaging apparatus, imaging method and program |
CN113542625B (en) * | 2021-05-28 | 2024-08-20 | 北京迈格威科技有限公司 | Image processing method, device, equipment and storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060038908A1 (en) * | 2004-08-18 | 2006-02-23 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, program, and storage medium |
WO2007006075A1 (en) * | 2005-07-14 | 2007-01-18 | Canon Information Systems Research Australia Pty Ltd | Image browser |
US20080062141A1 (en) | 2006-09-11 | 2008-03-13 | Imran Chandhri | Media Player with Imaged Based Browsing |
WO2008038883A1 (en) * | 2006-09-29 | 2008-04-03 | Lg Electronics Inc. | Method of generating key code in coordinate recognition device and video device controller using the same |
EP1942401A1 (en) * | 2007-01-05 | 2008-07-09 | Apple Inc. | Multimedia communication device with touch screen responsive to gestures for controlling, manipulating and editing of media files |
US20090019399A1 (en) * | 2007-07-10 | 2009-01-15 | Brother Kogyo Kabushiki Kaisha | Image displaying device, and method and computer readable medium for the same |
US20090141046A1 (en) * | 2007-12-03 | 2009-06-04 | Apple Inc. | Multi-dimensional scroll wheel |
Family Cites Families (120)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5138460A (en) | 1987-08-20 | 1992-08-11 | Canon Kabushiki Kaisha | Apparatus for forming composite images |
US5657402A (en) * | 1991-11-01 | 1997-08-12 | Massachusetts Institute Of Technology | Method of creating a high resolution still image using a plurality of images and apparatus for practice of the method |
EP0592136B1 (en) | 1992-10-09 | 1999-12-08 | Sony Corporation | Producing and recording images |
JP3036439B2 (en) | 1995-10-18 | 2000-04-24 | 富士ゼロックス株式会社 | Image processing apparatus and image attribute adjustment method |
US6985172B1 (en) | 1995-12-01 | 2006-01-10 | Southwest Research Institute | Model-based incident detection system with motion classification |
US6075905A (en) | 1996-07-17 | 2000-06-13 | Sarnoff Corporation | Method and apparatus for mosaic image construction |
JPH10134163A (en) | 1996-11-05 | 1998-05-22 | Pfu Ltd | Device with scanner |
US6621524B1 (en) | 1997-01-10 | 2003-09-16 | Casio Computer Co., Ltd. | Image pickup apparatus and method for processing images obtained by means of same |
US6249616B1 (en) | 1997-05-30 | 2001-06-19 | Enroute, Inc | Combining digital images based on three-dimensional relationships between source image data sets |
US6542645B1 (en) | 1997-07-15 | 2003-04-01 | Silverbrook Research Pty Ltd | Adaptive tracking of dots in optical storage system using ink dots |
JP3931393B2 (en) | 1997-09-04 | 2007-06-13 | ソニー株式会社 | Camera-integrated video recorder and photographing method |
US6466701B1 (en) | 1997-09-10 | 2002-10-15 | Ricoh Company, Ltd. | System and method for displaying an image indicating a positional relation between partially overlapping images |
US6552744B2 (en) | 1997-09-26 | 2003-04-22 | Roxio, Inc. | Virtual reality camera |
EP1717677B1 (en) | 1998-01-26 | 2015-06-17 | Apple Inc. | Method and apparatus for integrating manual input |
JP3695119B2 (en) | 1998-03-05 | 2005-09-14 | 株式会社日立製作所 | Image synthesizing apparatus and recording medium storing program for realizing image synthesizing method |
US6304284B1 (en) | 1998-03-31 | 2001-10-16 | Intel Corporation | Method of and apparatus for creating panoramic or surround images using a motion sensor equipped camera |
JP3485543B2 (en) | 1998-06-22 | 2004-01-13 | 富士写真フイルム株式会社 | Imaging device and method |
US6317141B1 (en) | 1998-12-31 | 2001-11-13 | Flashpoint Technology, Inc. | Method and apparatus for editing heterogeneous media objects in a digital imaging device |
US6927874B1 (en) | 1999-04-02 | 2005-08-09 | Canon Kabushiki Kaisha | Image processing method, apparatus and storage medium therefor |
US6778211B1 (en) * | 1999-04-08 | 2004-08-17 | Ipix Corp. | Method and apparatus for providing virtual processing effects for wide-angle video images |
US6415051B1 (en) | 1999-06-24 | 2002-07-02 | Geometrix, Inc. | Generating 3-D models using a manually operated structured light source |
US7064783B2 (en) | 1999-12-31 | 2006-06-20 | Stmicroelectronics, Inc. | Still picture format for subsequent picture stitching for forming a panoramic image |
WO2001059709A1 (en) | 2000-02-11 | 2001-08-16 | Make May Toon, Corp. | Internet-based method and apparatus for generating caricatures |
JP4126640B2 (en) | 2000-03-08 | 2008-07-30 | 富士フイルム株式会社 | Electronic camera |
JP4208113B2 (en) | 2000-04-19 | 2009-01-14 | 富士フイルム株式会社 | Album creating method and apparatus, and recording medium |
US6930703B1 (en) | 2000-04-29 | 2005-08-16 | Hewlett-Packard Development Company, L.P. | Method and apparatus for automatically capturing a plurality of images during a pan |
US20020025796A1 (en) | 2000-08-30 | 2002-02-28 | Taylor William Stuart | System and method conducting cellular POS transactions |
JP5361103B2 (en) | 2000-10-24 | 2013-12-04 | 株式会社東芝 | Image processing device |
US6959120B1 (en) | 2000-10-27 | 2005-10-25 | Microsoft Corporation | Rebinning methods and arrangements for use in compressing image-based rendering (IBR) data |
US7099510B2 (en) | 2000-11-29 | 2006-08-29 | Hewlett-Packard Development Company, L.P. | Method and system for object detection in digital images |
US6975352B2 (en) | 2000-12-18 | 2005-12-13 | Xerox Corporation | Apparatus and method for capturing a composite digital image with regions of varied focus and magnification |
SE518050C2 (en) | 2000-12-22 | 2002-08-20 | Afsenius Sven Aake | Camera that combines sharply focused parts from various exposures to a final image |
GB2372165A (en) * | 2001-02-10 | 2002-08-14 | Hewlett Packard Co | A method of selectively storing images |
US7162080B2 (en) | 2001-02-23 | 2007-01-09 | Zoran Corporation | Graphic image re-encoding and distribution system and method |
WO2002076090A1 (en) | 2001-03-16 | 2002-09-26 | Vision Robotics Corporation | System and method to increase effective dynamic range of image sensors |
DE10122919A1 (en) | 2001-05-11 | 2002-09-05 | Infineon Technologies Ag | Mobile phone chip circuit, converting oscillator frequency to GHz carrier frequency, mixes divided- and direct local oscillator frequencies |
US6930718B2 (en) | 2001-07-17 | 2005-08-16 | Eastman Kodak Company | Revised recapture camera and method |
US7298412B2 (en) | 2001-09-18 | 2007-11-20 | Ricoh Company, Limited | Image pickup device, automatic focusing method, automatic exposure method, electronic flash control method and computer program |
US6724386B2 (en) | 2001-10-23 | 2004-04-20 | Sony Corporation | System and process for geometry replacement |
US7583293B2 (en) | 2001-12-06 | 2009-09-01 | Aptina Imaging Corporation | Apparatus and method for generating multi-image scenes with a camera |
US7573509B2 (en) | 2002-01-30 | 2009-08-11 | Ricoh Company, Ltd. | Digital still camera, reproduction device, and image processor |
US20030189647A1 (en) * | 2002-04-05 | 2003-10-09 | Kang Beng Hong Alex | Method of taking pictures |
US20030190090A1 (en) | 2002-04-09 | 2003-10-09 | Beeman Edward S. | System and method for digital-image enhancement |
WO2003088650A1 (en) | 2002-04-17 | 2003-10-23 | Seiko Epson Corporation | Digital camera |
WO2003105466A1 (en) | 2002-06-07 | 2003-12-18 | Koninklijke Philips Electronics N.V. | Method of imaging an object and mobile imaging device |
AU2003246268A1 (en) | 2002-08-09 | 2004-02-25 | Sharp Kabushiki Kaisha | Image combination device, image combination method, image combination program, and recording medium containing the image combination program |
US20040174434A1 (en) | 2002-12-18 | 2004-09-09 | Walker Jay S. | Systems and methods for suggesting meta-information to a camera user |
CN1771740A (en) | 2003-01-24 | 2006-05-10 | 米科伊公司 | Steroscopic panoramic image capture device |
US6856705B2 (en) | 2003-02-25 | 2005-02-15 | Microsoft Corporation | Image blending by guided interpolation |
US20040189849A1 (en) | 2003-03-31 | 2004-09-30 | Hofer Gregory V. | Panoramic sequence guide |
JP4120677B2 (en) | 2003-04-17 | 2008-07-16 | セイコーエプソン株式会社 | Generation of still images from multiple frame images |
US20040223649A1 (en) | 2003-05-07 | 2004-11-11 | Eastman Kodak Company | Composite imaging method and system |
US20040239767A1 (en) | 2003-05-29 | 2004-12-02 | Stavely Donald J. | Systems and methods for providing tactile feedback |
US20050062845A1 (en) | 2003-09-12 | 2005-03-24 | Mills Lawrence R. | Video user interface system and method |
JP2005117296A (en) | 2003-10-07 | 2005-04-28 | Matsushita Electric Ind Co Ltd | Apparatus and method for assisting search |
US7317479B2 (en) | 2003-11-08 | 2008-01-08 | Hewlett-Packard Development Company, L.P. | Automated zoom control |
EP1685537B1 (en) | 2003-11-18 | 2015-04-01 | Mobile Imaging in Sweden AB | Method for processing a digital image and image representation format |
US7656429B2 (en) | 2004-02-04 | 2010-02-02 | Hewlett-Packard Development Company, L.P. | Digital camera and method for in creating still panoramas and composite photographs |
JP5055686B2 (en) | 2004-05-13 | 2012-10-24 | ソニー株式会社 | Imaging system, imaging apparatus, and imaging method |
US7743348B2 (en) * | 2004-06-30 | 2010-06-22 | Microsoft Corporation | Using physical objects to adjust attributes of an interactive display application |
JP4579980B2 (en) | 2004-07-02 | 2010-11-10 | Sony Ericsson Mobile Communications AB | Taking a series of images |
EP1613060A1 (en) | 2004-07-02 | 2006-01-04 | Sony Ericsson Mobile Communications AB | Capturing a sequence of images |
JP2006040050A (en) * | 2004-07-28 | 2006-02-09 | Olympus Corp | Reproduction device, camera and display switching method for reproduction device |
JP4293089B2 (en) | 2004-08-05 | 2009-07-08 | ソニー株式会社 | Imaging apparatus, imaging control method, and program |
JP4477968B2 (en) | 2004-08-30 | 2010-06-09 | Hoya株式会社 | Digital camera |
US7375745B2 (en) | 2004-09-03 | 2008-05-20 | Seiko Epson Corporation | Method for digital image stitching and apparatus for performing the same |
TWI246031B (en) | 2004-09-17 | 2005-12-21 | Ulead Systems Inc | System and method for synthesizing multi-exposed image |
US7646400B2 (en) | 2005-02-11 | 2010-01-12 | Creative Technology Ltd | Method and apparatus for forming a panoramic image |
US7595823B2 (en) | 2005-02-17 | 2009-09-29 | Hewlett-Packard Development Company, L.P. | Providing optimized digital images |
EP1875402A2 (en) | 2005-04-15 | 2008-01-09 | Clifford R. David | Interactive image activation and distribution system and associated methods |
US9621749B2 (en) | 2005-06-02 | 2017-04-11 | Invention Science Fund I, Llc | Capturing selected image objects |
US7659923B1 (en) | 2005-06-24 | 2010-02-09 | David Alan Johnson | Elimination of blink-related closed eyes in portrait photography |
US7424218B2 (en) | 2005-07-28 | 2008-09-09 | Microsoft Corporation | Real-time preview for panoramic images |
US20070024721A1 (en) | 2005-07-29 | 2007-02-01 | Rogers Sean S | Compensating for improperly exposed areas in digital images |
GB2428927A (en) | 2005-08-05 | 2007-02-07 | Hewlett Packard Development Co | Accurate positioning of a time lapse camera |
JP4288612B2 (en) | 2005-09-14 | 2009-07-01 | ソニー株式会社 | Image processing apparatus and method, and program |
US7483061B2 (en) | 2005-09-26 | 2009-01-27 | Eastman Kodak Company | Image and audio capture with mode selection |
US20070081081A1 (en) | 2005-10-07 | 2007-04-12 | Cheng Brett A | Automated multi-frame image capture for panorama stitching using motion sensor |
CN1750593A (en) | 2005-10-13 | 2006-03-22 | 上海交通大学 | Digital camera with image split function |
US9270976B2 (en) | 2005-11-02 | 2016-02-23 | Exelis Inc. | Multi-user stereoscopic 3-D panoramic vision system and method |
US20090295830A1 (en) * | 2005-12-07 | 2009-12-03 | 3Dlabs Inc., Ltd. | User interface for inspection of photographs |
US7639897B2 (en) | 2006-01-24 | 2009-12-29 | Hewlett-Packard Development Company, L.P. | Method and apparatus for composing a panoramic photograph |
JP4790446B2 (en) | 2006-03-01 | 2011-10-12 | 三菱電機株式会社 | Moving picture decoding apparatus and moving picture encoding apparatus |
US7715831B2 (en) | 2006-03-24 | 2010-05-11 | Sony Ericsson Mobile Communications AB | Methods, systems, and devices for detecting and indicating loss of proximity between mobile devices |
US7787664B2 (en) | 2006-03-29 | 2010-08-31 | Eastman Kodak Company | Recomposing photographs from multiple frames |
EP2003877B1 (en) * | 2006-03-31 | 2015-05-06 | Nikon Corporation | Image processing method |
WO2008064349A1 (en) | 2006-11-22 | 2008-05-29 | Nik Software, Inc. | Method for dynamic range editing |
US7839422B2 (en) | 2006-12-13 | 2010-11-23 | Adobe Systems Incorporated | Gradient-domain compositing |
US7809212B2 (en) | 2006-12-20 | 2010-10-05 | Hantro Products Oy | Digital mosaic image construction |
JP4853320B2 (en) | 2007-02-15 | 2012-01-11 | ソニー株式会社 | Image processing apparatus and image processing method |
US7729602B2 (en) | 2007-03-09 | 2010-06-01 | Eastman Kodak Company | Camera using multiple lenses and image sensors operable in a default imaging mode |
US7859588B2 (en) | 2007-03-09 | 2010-12-28 | Eastman Kodak Company | Method and apparatus for operating a dual lens camera to augment an image |
US8717412B2 (en) | 2007-07-18 | 2014-05-06 | Samsung Electronics Co., Ltd. | Panoramic image production |
JP4930302B2 (en) | 2007-09-14 | 2012-05-16 | ソニー株式会社 | Imaging apparatus, control method thereof, and program |
JP2009124206A (en) | 2007-11-12 | 2009-06-04 | Mega Chips Corp | Multimedia composing data generation device |
US8494306B2 (en) | 2007-12-13 | 2013-07-23 | Samsung Electronics Co., Ltd. | Method and an apparatus for creating a combined image |
US8750578B2 (en) | 2008-01-29 | 2014-06-10 | DigitalOptics Corporation Europe Limited | Detecting facial expressions in digital images |
JP4492724B2 (en) | 2008-03-25 | 2010-06-30 | ソニー株式会社 | Image processing apparatus, image processing method, and program |
US20090244301A1 (en) * | 2008-04-01 | 2009-10-01 | Border John N | Controlling multiple-image capture |
US8891955B2 (en) | 2008-04-04 | 2014-11-18 | Whitham Holdings, Llc | Digital camera with high dynamic range mode of operation |
US8249332B2 (en) | 2008-05-22 | 2012-08-21 | Matrix Electronic Measuring Properties Llc | Stereoscopic measurement system and method |
US20090303338A1 (en) | 2008-06-06 | 2009-12-10 | Texas Instruments Incorporated | Detailed display of portion of interest of areas represented by image frames of a video signal |
US8497920B2 (en) | 2008-06-11 | 2013-07-30 | Nokia Corporation | Method, apparatus, and computer program product for presenting burst images |
JP4513903B2 (en) | 2008-06-25 | 2010-07-28 | ソニー株式会社 | Image processing apparatus and image processing method |
US8768070B2 (en) | 2008-06-27 | 2014-07-01 | Nokia Corporation | Method, apparatus and computer program product for providing image modification |
US8463020B1 (en) * | 2008-07-08 | 2013-06-11 | Imove, Inc. | Centralized immersive image rendering for thin client |
JP2010020581A (en) | 2008-07-11 | 2010-01-28 | Shibaura Institute Of Technology | Image synthesizing system eliminating unnecessary objects |
US8654085B2 (en) * | 2008-08-20 | 2014-02-18 | Sony Corporation | Multidimensional navigation for touch sensitive display |
US8072504B2 (en) | 2008-08-27 | 2011-12-06 | Micron Technology, Inc. | Method and system for aiding user alignment for capturing partially overlapping digital images |
US8176438B2 (en) * | 2008-09-26 | 2012-05-08 | Microsoft Corporation | Multi-modal interaction for a screen magnifier |
US20100091119A1 (en) | 2008-10-10 | 2010-04-15 | Lee Kang-Eui | Method and apparatus for creating high dynamic range image |
KR20100070043A (en) | 2008-12-17 | 2010-06-25 | 삼성전자주식회사 | Method for displaying scene recognition of digital image signal processing apparatus, medium for recording the method and digital image signal processing apparatus applying the method |
SE0802657A1 (en) | 2008-12-23 | 2010-06-24 | Scalado Ab | Extraction of digital information |
US9792012B2 (en) | 2009-10-01 | 2017-10-17 | Mobile Imaging In Sweden Ab | Method relating to digital images |
EP2323102A1 (en) | 2009-10-23 | 2011-05-18 | ST-Ericsson (France) SAS | Image capturing aid |
US10080006B2 (en) | 2009-12-11 | 2018-09-18 | Fotonation Limited | Stereoscopic (3D) panorama creation on handheld device |
SE534551C2 (en) | 2010-02-15 | 2011-10-04 | Scalado Ab | Digital image manipulation including identification of a target area in a target image and seamless replacement of image information from a source image |
KR20120046802A (en) | 2010-10-27 | 2012-05-11 | 삼성전자주식회사 | Apparatus and method of creating 3 dimension panorama image by using a camera |
SE1150505A1 (en) | 2011-05-31 | 2012-12-01 | Mobile Imaging In Sweden Ab | Method and apparatus for taking pictures |
EP2718896A4 (en) | 2011-07-15 | 2015-07-01 | Mobile Imaging In Sweden Ab | Method of providing an adjusted digital image representation of a view, and an apparatus |
- 2010
  - 2010-09-22 US US13/499,711 patent/US9792012B2/en active Active
  - 2010-09-22 WO PCT/SE2010/051019 patent/WO2011040864A1/en active Application Filing
  - 2010-09-22 EP EP10820903.2A patent/EP2483767B1/en not_active Not-in-force
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9792012B2 (en) | 2009-10-01 | 2017-10-17 | Mobile Imaging In Sweden Ab | Method relating to digital images |
US9196069B2 (en) | 2010-02-15 | 2015-11-24 | Mobile Imaging In Sweden Ab | Digital image manipulation |
US9396569B2 (en) | 2010-02-15 | 2016-07-19 | Mobile Imaging In Sweden Ab | Digital image manipulation |
US9344642B2 (en) | 2011-05-31 | 2016-05-17 | Mobile Imaging In Sweden Ab | Method and apparatus for capturing a first image using a first configuration of a camera and capturing a second image using a second configuration of a camera |
EP2541553A3 (en) * | 2011-06-27 | 2013-01-23 | Yamaha Corporation | Parameter controlling apparatus |
WO2013012370A1 (en) | 2011-07-15 | 2013-01-24 | Scalado Ab | Method of providing an adjusted digital image representation of a view, and an apparatus |
EP2718896A4 (en) * | 2011-07-15 | 2015-07-01 | Mobile Imaging In Sweden Ab | Method of providing an adjusted digital image representation of a view, and an apparatus |
US9432583B2 (en) | 2011-07-15 | 2016-08-30 | Mobile Imaging In Sweden Ab | Method of providing an adjusted digital image representation of a view, and an apparatus |
EP2919456A1 (en) * | 2014-03-11 | 2015-09-16 | Canon Kabushiki Kaisha | Display control apparatus and display control method |
CN104915110A (en) * | 2014-03-11 | 2015-09-16 | 佳能株式会社 | Display control apparatus and display control method |
US9438789B2 (en) | 2014-03-11 | 2016-09-06 | Canon Kabushiki Kaisha | Display control apparatus and display control method |
CN104915110B (en) * | 2014-03-11 | 2019-04-05 | 佳能株式会社 | Display control unit and display control method |
Also Published As
Publication number | Publication date |
---|---|
EP2483767A4 (en) | 2016-01-20 |
EP2483767B1 (en) | 2019-04-03 |
EP2483767A1 (en) | 2012-08-08 |
US9792012B2 (en) | 2017-10-17 |
US20120262490A1 (en) | 2012-10-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9792012B2 (en) | Method relating to digital images | |
US11481096B2 (en) | Gesture mapping for image filter input parameters | |
US11550420B2 (en) | Quick review of captured image data | |
KR101893275B1 (en) | Method and apparatus for zooming in on selected area of preview interface | |
US20230094025A1 (en) | Image processing method and mobile terminal | |
US9438789B2 (en) | Display control apparatus and display control method | |
CN106993131B (en) | Information processing method and electronic equipment | |
JP2009500884A (en) | Method and device for managing digital media files | |
JP2013070303A (en) | Photographing device for enabling photographing by pressing force to screen, photographing method and program | |
CN107087102B (en) | Focusing information processing method and electronic equipment | |
JP2011050038A (en) | Image reproducing apparatus and image sensing apparatus | |
KR101739318B1 (en) | Display control apparatus, imaging system, display control method, and recording medium | |
JP2009140368A (en) | Input device, display device, input method, display method, and program | |
US20070097089A1 (en) | Imaging device control using touch pad | |
JP2001298649A (en) | Digital image forming device having touch screen | |
JP2011060111A (en) | Display device | |
CN105607825B (en) | Method and apparatus for image processing | |
US11442613B2 (en) | Electronic apparatus, control method of electronic apparatus, and non-transitory computer readable medium | |
JP5976166B2 (en) | Shooting device, shooting method and program capable of shooting by pressing on screen | |
JP2016192230A (en) | User interface device in which display is variable according to whether device is held by right or left hand, display control method, and program | |
WO2021005415A1 (en) | Method for operating an electronic device in order to browse through photos | |
JP6362110B2 (en) | Display control device, control method therefor, program, and recording medium | |
RU2792413C1 (en) | Image processing method and mobile terminal | |
JP2017152841A (en) | Imaging apparatus, operation control method, and program | |
KR101477540B1 (en) | Photographing apparatus with automatic panning shot function and photographing method for panning shot | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10820903; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 2010820903; Country of ref document: EP |
NENP | Non-entry into the national phase | Ref country code: DE |
WWE | Wipo information: entry into national phase | Ref document number: 13499711; Country of ref document: US |