EP1220530A2 - Displaying an image - Google Patents

Displaying an image

Info

Publication number
EP1220530A2
Authority
EP
European Patent Office
Prior art keywords
image
visual effect
user equipment
version
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP01310861A
Other languages
German (de)
English (en)
Other versions
EP1220530A3 (fr)
Inventor
Jyrki Hoisko
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Oyj
Original Assignee
Nokia Oyj
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Oyj filed Critical Nokia Oyj
Publication of EP1220530A2
Publication of EP1220530A3
Legal status: Ceased

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N1/32128 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title attached to the image data, e.g. file header, transmitted message header, information on the same page or in the same computer file as the image
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3274 Storage or retrieval of prestored additional information

Definitions

  • the present invention relates to displaying an image, and in particular, but not exclusively, to displaying an image based on image data by means of a user equipment.
  • An image may comprise, for example, a photograph of a target.
  • the image may present a target such as a scene, an object, or one or more living subjects such as one or more persons or animals.
  • An image may also comprise a picture of a graphical or textual presentation and so on.
  • image data can also be stored as digitised data in an appropriate data storage means such as in a hard disc of a computer or in a CD (compact disc).
  • the image can then be presented to a viewer by displaying the image on a display of a user equipment based on the image data.
  • a user equipment typically comprises display means for displaying the image. Examples of possible user equipment for displaying images based on image data comprise various computers (both fixed and portable computers) and other user devices or equipment adapted to be able to display images based on image data.
  • the image data may be generated by the same user equipment by which it is displayed or by another equipment.
  • the image data may be stored in memory means of the user equipment.
  • the image data may also be stored in a remote data storage. It is also possible to transport the image data from the remote storage to the user equipment by means of appropriate communication media, such as over a data communication network or by means of a data carrier. If the user equipment is not directly an element of the data network, a data communication media is required between the data network and the user equipment.
  • the transmission of data between the data network and the user equipment may be accomplished over a wireless interface.
  • Arrangements utilising mobile user equipment and wireless interfaces for the transportation and presentation of images may be referred to by the term 'mobile visual communication'.
  • the mobile visual communication can be understood as a technology that enables a mobile user to send and receive images e.g. by using some proprietary method or via a multimedia messaging system (MMS) gateway.
  • a still image itself tells nothing more than what was captured in the image when it was taken. Thus the image may not always be sufficiently informative to describe all desired information associated with the image.
  • the image itself may not describe enough about e.g. the context of the target.
  • a prior art solution to present additional information has been the addition of informative text within or beside the image area.
  • Textual information may also have been contained in the name of the image or the textual information may have been transported as a separate text/description file.
  • text information embedded in the picture may use a substantial area of the display screen or otherwise disturb the image. This is a disadvantage especially with mobile user equipment, since its display screen is typically substantially small.
  • reading and understanding text messages requires attention, and thus solutions based on textual messages typically require the user to consciously perceive and process the text.
  • the recipient of the image may wish to view the image or e.g. an MMS message immediately after he/she becomes aware that the reception of image data has commenced.
  • downloading of all image data that associates with the image may take a while, especially if the transportation occurs over one or more wireless interfaces.
  • Embodiments of the present invention aim to address one or several of the above problems.
  • a method for displaying an image comprising:
  • a method in a mobile station for displaying an image on a display means thereof comprising: receiving image data associated with the image and additional information from a data communication system, said image data and additional information being transmitted over a wireless interface between the mobile station and the data communication system; generating a visual effect to be presented in association with a version of the image, said visual effect being generated based on said additional information; displaying said visual effect by the display means; and displaying the image by the display means.
  • a user equipment for displaying an image comprising:
  • a communication system comprising: a data communication media for transporting data between at least two user equipment; a first user equipment for generating image data associated with an image, said first user equipment being adapted to associate additional information with the image data; a second user equipment comprising a receiver means for receiving the image data, a processor means for processing said received image data, and a display means for displaying the image based on the received image data, said second user equipment being also adapted to display an altered version of the image, wherein the altered version comprises a visual effect generated based on said additional information associated with the image.
  • the embodiments of the invention may provide a possibility to present additional information to the viewer of the image in a visualised form.
  • the visualised information may be presented prior to displaying the original i.e. complete image to the viewer.
  • the additional information about the imaging target may provide the viewer with a better understanding of the image and/or conditions affecting the target of the image, and thus enhance the visual communication.
  • Some of the embodiments enable provision of additional information even before the complete image has been downloaded into the user equipment. Thus the viewer may not need to wait to receive all image data before he/she can already start experiencing information associated with the image.
  • Figure 1 shows user equipment and a communication network that may be used for the embodiments of the present invention. More particularly, Figure 1 shows a first user equipment 1 and a second user equipment 2 and a communication network 5. The users of the devices 1 and 2 are not shown in Figure 1 for clarity reasons. The user equipment 1 and 2 are adapted for communication with each other over the data network 5. The communication between the respective user equipment 1,2 and the communication network 5 occurs via a respective wireless interface 3,4.
  • the communication network 5 may be any appropriate network capable of transporting image data.
  • the first user equipment 1 is provided with imaging means 10, such as a digital camera means.
  • the imaging means 10 are adapted to enable the user of the device 1 to point a lens of the imaging means 10 towards a target 6 of an image and for generating digitised image data of the object 6.
  • the user equipment 1 may be provided with a display 11 for viewing the image e.g. during the "shooting" thereof and/or before transmission of the image data to the second user equipment 2 via the data network 5. Control instructions may be input to the device 1 by means of the control buttons 12.
  • the second user equipment 2 is provided with a display 21 enabling the user thereof to view an image that has been generated based on image data.
  • the image data may be received from the network 5 via the wireless interface 4.
  • Image data may be stored in an image data storage 24 of the device 2.
  • the operation of the display and/or presentation of any images on the display is controlled by processor means 23.
  • a display software may be adapted to be run in the processor 23 of the user device 2 so that the processor is capable of expressing the additional information on the display 21.
  • Control instructions may be input to the device 2 by means of the control buttons 22.
  • the mobile imaging devices 1 and 2 may also use other data transportation media, such as a GSM data connection, for the exchange of binary information such as images without departing from the scope of the present invention.
  • Figure 1 shows also an external database 7. This may be used in some embodiments for storing image data such that the image data may be retrieved from the database upon request from the user device 1 or 2.
  • visual effects such as different colouring or image quality schemes or graphical elements may be presented to the user with a modified version of the image while the "real" i.e. original version of the image is presented without these visual effects.
  • the effect of the embodiments is based on the fact that about half of the primate cortex is devoted to the handling of visual information. Thus humans are substantially good at analysing any visual information shown to them. Visualised information can typically be understood more effortlessly than other types of information. Visual information can also be understood substantially quickly, such as within half a second or so.
  • visualisation of additional information may help the viewer to more easily understand the image and possible background information associated with the image.
  • the following will disclose in more detail different possibilities for dynamic visualisation of additional information to the viewers. Some possible uses of such dynamic visualisation are also discussed.
  • the additional information to be presented to the viewer of the image may be defined at the time of creation of the image. A possibility for this is to use the imaging apparatus to insert instructions regarding the presentation of the image into the image data, as will be explained in more detail later.
  • the additional information is then visualised in the displaying stage at the receiving terminal, i.e. while the recipient is watching the image.
  • the visualisation process comprises modifying the image between two or more different versions thereof.
  • the image may first appear as coloured with a certain colour and intensity, and then quickly turn back to normal.
  • the additional information regarding the context of the image that may be visualised by such a colour change may relate e.g. to the temperature at the target or to the age of the image.
  • visualisation examples include, without limitation, visualisation of a spatial geographic difference between the two user equipment 1, 2 or of the location of the transmitting station 1. This may occur e.g. by zooming the image from different areas or edges of the display area 21.
  • Visualisation schemes may also be used to separate different geographic locations or regions that associate with the image or the source (e.g. caller groups or individual callers) of the image.
  • the visualisation may also be used for expressing additional information that associates with emotional feelings or priority information (e.g. low, medium, high). Auditory information about the context of the picture may also be visualised. As a further example, it is also possible to visualise any movements of the user of the sending device 1 or the target 6. Examples of the implementations of these possibilities are described in more detail below with reference to Figures 2 to 5.
  • the user of the transmitting or source device 1 may determine what additional information should be presented to the user of the receiving device 2.
  • the additional information may be inserted manually by the sender of the image. According to a possibility the additional information is added automatically e.g. by sensors in the camera means or similar means capturing the image.
  • the additional information can be embedded by the user device 2 within the name of the image file.
  • the name of the image data file could be: scenery_Temp_Hot.jpg; or Friend_laughing_MoodHappy.jpg.
  • the processor 23 of the device 2 is adapted to recognise any image file names containing this type of indications. After recognition the image can be presented accordingly.
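Such file-name recognition logic can be sketched as follows; the token table and the effect descriptions are illustrative assumptions following the naming convention of the examples above, not definitions from the patent text:

```python
# Hypothetical mapping from file-name tokens to visual effects.
EFFECT_TABLE = {
    "Temp_Hot": ("tint", "red"),
    "Temp_Cold": ("tint", "blue"),
    "MoodHappy": ("tint", "green"),
}

def extract_effects(filename: str):
    """Return the visual effects indicated by tokens in the image file name."""
    stem = filename.rsplit(".", 1)[0]  # drop the file extension
    return [effect for token, effect in EFFECT_TABLE.items() if token in stem]
```

A file named scenery_Temp_Hot.jpg would then yield a red initial tint before the normal image is shown.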
  • the additional information is visualised e.g. by altering the colours of the image. This can be done e.g. by simply altering the colour index table of the device 2.
  • the colour values are interpreted e.g. by means of an appropriate hardware and/or software and the colours of the image will be changed accordingly.
  • the altering may be controlled by the processor 23.
  • This approach is advantageous in that no modification to the bitmap image itself is necessarily needed. In an indexed colour scheme, only the colour index table needs to be modified. That is, if the image is a 256-colour image, only 256 values must be modified. However, this approach requires that colour indexing is used for the images to be displayed.
  • examples of colour indexing schemes include PNG (Portable Network Graphics) and GIF (Graphics Interchange Format) or similar formats.
  • colour displays that have true colours, e.g. 16/24-bit displays, do not generally need such indexing.
  • Some of the true colour systems use indexing to quickly and easily accomplish colour balancing or calibration of the image.
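The index-table approach can be sketched as follows; the palette is represented as a plain list of (r, g, b) tuples, as in an indexed PNG or GIF, and the tint amounts are illustrative assumptions:

```python
def tint_index_table(palette, cold=True):
    """Rewrite only the (up to 256) palette entries of an indexed image,
    leaving the bitmap of pixel indices untouched."""
    tinted = []
    for r, g, b in palette:
        if cold:   # push entries towards blue for a "cold" first version
            tinted.append((r // 2, g // 2, min(255, b + 80)))
        else:      # push entries towards red for a "hot" first version
            tinted.append((min(255, r + 80), g // 2, b // 2))
    return tinted
```

Displaying the image with the tinted table, then restoring the original table, produces the two versions without ever copying the bitmap.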
  • the system may be adapted to modify the whole bitmap.
  • two different images are generated, one comprising the visual effect and the other the actual image.
  • the original image may need to be copied and modified several times to an image buffer of the device 2.
  • this approach may require more processing capacity from the processor 23 than the ones based on indexing.
  • the modification may also be applied to only a portion of the image. For example, only the middle area of the image window may be altered to form the visual effect. This may be accomplished by defining in the image data an area of the image which is to be altered. The creator of the image may use this feature e.g. to highlight the importance of a certain area or object of the image.
  • the additional information may also be included with the image data as a separate text file.
  • a file may be based e.g. on ASCII, HTML (Hypertext Markup Language), XML (Extensible Markup Language), RDF (Resource Description Framework), JPEG or JPEG 2000 techniques, or any other file format that includes a free-form field for additional information.
  • in HTML/XML/RDF type formats, the content of the additional text file may look like the following example of a description file for the received image:
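A hedged sketch of what such a description file might look like; the element and attribute names below are illustrative assumptions, not defined by the patent text:

```xml
<!-- Hypothetical description file accompanying the image -->
<image-context>
  <image href="scenery.jpg"/>
  <temperature value="hot"/>
  <mood value="happy"/>
  <location lat="60.17" lon="24.94"/>
</image-context>
```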
  • This information may alternatively be embedded within the image structure or within the image binary content, thus providing more space for the required descriptions.
  • the implementation of systems that use this approach depends on the format that is used for the image.
  • the flowchart of Figure 6 illustrates a possible operation of the invention.
  • the visual effects are presented to the viewer, and thus the additional information is visually presented to the viewer before the actual image is displayed.
  • a different version of the image is presented first for a certain period of time after the picture has first appeared on a display device, prior to displaying the complete or final version of the image.
  • the required time period depends on the application.
  • the time period is preferably defined such that it is long enough for the viewers to see the visual effect and to understand the additional information based on the effect but also such that the users do not experience the visual effect as disturbing.
  • this embodiment may also provide another advantage that relates to the capability of the receiving device to instantly display an image. That is, a first version of the image may be displayed before all data that associates with the image has been received. The first version does not necessarily need to be as sharp and/or "correct" as the final version. Therefore all image data may not be required before it is possible to display the first version of the image to the viewer.
  • This may be an advantage especially when the image is received over a wireless interface, as the downloading of substantially large image files over the wireless interface may take a while.
  • the embodiment provides a possibility to display additional information to the viewer during the downloading of the image data.
  • a user shoots, by means of his user equipment 1, an image of a snowy scene.
  • the image is digitised by means of the processor 13 of the user equipment 1 for the transmission over the data network 5.
  • the user may then transmit the image data to a user device 2 e.g. by means of the MMS.
  • the image data is received at the mobile user device 2.
  • the processor means 23 recognises that the image data transmission contains text, audio and/or context information that associates with the image.
  • the mobile user device 2 displays the image dynamically. More particularly, at a first step the image is shown with blue and white colours only. However, after a certain period of time, such as 0.5 to 1 seconds or so, the image is presented in normal colours. The user of the device 2 may visually identify the differences between the initial colours and the actual colours. The change in colours will assist the user to understand that it is cold at the imaging target. Vice versa, e.g. for a photograph taken in hot conditions, the first image may be e.g. coloured with orange or red after which the image is changed back to normal.
  • Figure 2A shows the first version of an image, said first version being coloured (the colouring being illustrated by the grid of lines) to indicate a hot temperature.
  • Figure 2B illustrates the normal version of the image that is shown after the Figure 2A version has been displayed for a certain period of time.
  • a user browses images taken by his imaging device and/or images otherwise provided to the user device.
  • the images may be stored in a remote picture archive such as in the database 7 of Figure 1.
  • the remote image storage 7 may be offered by an appropriate service provider.
  • the database 7 may also comprise e.g. a home PC of a user (not shown). The PC is accessible by the mobile station of the user.
  • the images may be stored for long periods of time. During image browsing and prior to presentation thereof to the user, the images may be dynamically modified depending on the age of the image.
  • the visualisation may associate with the age of the image relative to the moment when the image is viewed.
  • the age effect may also indicate the relative difference in age between the image and another image (e.g. a previously viewed image).
  • the visual effect may also indicate the relative age of an image when compared to a set of images.
  • the visualisation of the age of the image may be based on the known ageing phenomenon of conventional paper pictures, which tend to change in colour as they get older. Typically old photographs tend to become more yellowish and/or greyish and/or brownish than they originally were. This known effect can be utilised with digital images as well. If the digital image is substantially old, the image may be digitally modified such that it first appears e.g. as a greyish and yellowish image, then transmuting back to a normal colour image. Fresh images may also be indicated e.g. by a green colour, and so on.
  • Image data files typically comprise data associated with the creation date and time of the image, e.g. by means of the so-called time-stamp.
  • the age of the image may be determined based on this data by the user equipment 2 and the visual effect may be set accordingly.
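A minimal sketch of how the terminal might map the time-stamp to an ageing effect; the linear mapping and the ten-year saturation point are illustrative assumptions:

```python
from datetime import datetime

def ageing_strength(timestamp: datetime, now: datetime, max_years: float = 10.0) -> float:
    """Map the image's age, derived from its time-stamp, to a yellow/grey
    tint strength in [0, 1]: 0 for a fresh image, 1 for a fully 'aged' one."""
    age_years = (now - timestamp).days / 365.25
    return max(0.0, min(1.0, age_years / max_years))
```

The returned strength could then scale how far the first displayed version is shifted towards yellow/grey before transmuting back to normal colours.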
  • the third example utilises location information regarding the location of the user equipment 1 used for transmitting an image.
  • the skilled person is aware of various possibilities for the provision of location information associated with the geographical location of a mobile user equipment, and thus they will not be explained here in detail. It is sufficient to note that substantially accurate geographical location information can be obtained, for example, based on the satellite based GPS (Global Positioning System). More accurate location information can be obtained through a differential GPS.
  • a possibility is to use a location service (LCS) provided by a cellular system for the provision of the location information. In the latter the cells or similar geographically limited access entities and associated controllers of the communication system can be utilised to produce location information associated with a mobile station.
  • the cellular communication system may also be provided with location measurement units that provide more accurate data concerning the location of the user equipment within the service area of the cellular system.
  • the location data may be processed in a specific location service (LCS) node implemented either within the cellular system or connected thereto. Alternatively, or in addition to the LCS node, the location data may be processed in the user equipment provided with appropriate processing capacity. All these techniques are known, and will thus not be discussed herein in more detail.
  • the user equipment 2 may also be made aware of its position.
  • the distance and direction to the user equipment 1 transmitting image data is visualised in the display phase.
  • the farther away the image originates from, the smaller the initial picture frame may be.
  • the direction of the sender may also be visualised to the user of the receiving terminal. For example, if the image is taken or transmitted geographically north of the user, the image starts zooming from the top of the display. If the image is taken or transmitted east of the user, the image starts zooming from the right of the display, and so on.
  • the receiving mobile device 2 may be made aware of its position and direction (e.g. by means of a digital compass). If the location data concerning the transmitting device 1 is also provided to the receiving device 2, it is possible to control the display such that the image moves relatively on the display 21 based on the relative direction between the transmitting device 1 and the receiving device 2. For example, if an image arrived from the north and the user is facing the east, the image could appear from the left of the display 21, instead of the top of the display.
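The relative-direction logic can be sketched as follows; the quantisation into four display edges is an illustrative assumption:

```python
def zoom_edge(sender_bearing_deg: float, viewer_heading_deg: float) -> str:
    """Choose the display edge the image should zoom in from, given the
    compass bearing towards the sender and the viewer's own heading (e.g.
    from a digital compass), both in degrees clockwise from north."""
    relative = (sender_bearing_deg - viewer_heading_deg) % 360
    if relative >= 315 or relative < 45:
        return "top"      # sender is ahead of the viewer
    if relative < 135:
        return "right"
    if relative < 225:
        return "bottom"
    return "left"
```

With a sender due north and a viewer facing east, the image would appear from the left of the display, matching the example above.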
  • the capability to receive visualised information regarding relative positioning may be useful e.g. in future professional tools used by guards, police officers and maintenance personnel, who collaborate as a team at a scene and exchange images of the views they are facing.
  • the users may even be provided with head-mounted displays and cameras, wearable computers and wireless connectivity.
  • a map of the area where the transmitting user equipment is located, or of the area illustrated by the image, is shown on the display. Based on the location information, the image received by the receiving user equipment "pops up" from a certain location on the map. For example, an image taken in London pops up or zooms out from the London area on a map showing England. In a city, the image may pop up from a specific street or building on a local map, and so on. This is illustrated by Figure 5.
  • the visual effect is provided by displaying a distorted version of the image.
  • the image may be inclined, stretched, granular, unwound from a spiral, or otherwise purposively distorted from the final image.
  • the user may associate colouring schemes with different locations. For example, if the user equipment 2 receives images of the region of the user's workplace (sent e.g. by his colleagues), these images may first appear as red. If the images show the user's home region, the images may first appear e.g. as green. This colour information would immediately indicate additional information about the geographical source of the image. If the user knows that the image is taken in a certain environment (e.g. home/work), this may accelerate and ease the perception and identification of the image.
  • Images received from persons belonging to different caller groups may be visualised in a similar manner as well. For example, images sent by family members will first appear as green, images sent by work-mates appear as red, images sent by friends appear as blue and so on. It shall be appreciated that other visualisation effects, such as movements of the images and so on, may also be used to identify the caller groups.
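A trivial sketch of the caller-group mapping; the group names follow the example colours in the text, while resolving a sender to a group would normally use the terminal's phonebook or caller-group settings (an assumption here):

```python
# Hypothetical caller-group to tint mapping.
GROUP_TINTS = {"family": "green", "work": "red", "friends": "blue"}

def initial_tint(sender_group: str, default: str = "none") -> str:
    """Return the initial tint for an image from the given caller group."""
    return GROUP_TINTS.get(sender_group, default)
```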
  • emotional feelings of the sender are transmitted to the receiver of the image by means of the visualised additional information.
  • for example, vacation photographs of subjects such as scenery, people or statues may have a strong emotional impact on the person taking the picture. However, these photographs may not create the same emotional feeling in the recipient as what the sender felt. It may be important for the sender to be able to convey his emotional state (e.g. excitement), since it may have been the reason for taking the picture in the first place.
  • the emotional or affective information may be attached to the image and visualised in the display phase by using different colours (as described above) or other visual effects, such as shaking of the image, zooming of the image, despeckling of the image, building up the image out of blocks, and so on.
  • the user equipment 1 of the sender may be provided with appropriate physiological sensors for sensing emotional information about the sender so that the information may be derived automatically.
  • the emotional state may be determined using emotion recognition methods, such as recognition of mood from voice, gestures or other physiological signals.
  • the emotional status may also be given manually by the sender by means of appropriate input means provided in the sending user equipment.
  • information regarding auditory events is shown visually to the viewer of the image.
  • a short auditory scene sample associated with the target of the image may be recorded and transmitted along with the image.
  • in order for the recipient user to learn this auditory information during the display phase, the user must typically listen to the recordings while watching the image. However, in many instances, such as on a bus or train, the user may not wish to do so, as it might be disruptive and/or non-deliberate eavesdropping might occur.
  • the auditory information about the image's context may instead be visualised at the beginning of watching the image by appropriate visual effects. These effects may comprise the effects explained above, that is, a colour change, zooming and so on. For example, if loud noises were present at the scene, a red initial colour and/or shaky movement may be used to visualise this. If the target scene was silent, a grey and still image may be displayed to the recipient user.
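A sketch of mapping the recorded sample's loudness to an initial effect; the RMS normalisation to [0, 1], the thresholds, and the effect choices are illustrative assumptions:

```python
def loudness_effect(rms: float, loud_threshold: float = 0.5, silent_threshold: float = 0.05):
    """Map the loudness of a recorded auditory sample (RMS level normalised
    to [0, 1]) to an initial visual effect for the image."""
    if rms >= loud_threshold:
        return {"tint": "red", "shake": True}    # loud scene
    if rms <= silent_threshold:
        return {"tint": "grey", "shake": False}  # silent scene
    return {"tint": "none", "shake": False}      # unremarkable loudness
```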
  • a further embodiment relates to a possibility of visualising movements of an object in a stationary still image.
  • the still images explain little about the movements in the picture, although a motion blur in the image might tell something.
  • Video would naturally reveal this but the recipient user equipment may support only still-images, or more expensive video transmission is not desired for the transmission of the image data.
  • the movement of the sender may be analysed using a set of accelerometer sensors provided in the imaging device.
  • the movement in the scene may be analysed by appropriate image analysis methods.
  • the movement information is attached with the image data sent to the recipient.
  • the data is visualised to the user of the recipient device 2.
  • the image may move from the left of the display to the middle, or shake for a short period of time prior to displaying the proper image. This would indicate that the sending user 1 was moving while he/she was taking the image.
  • although a camera shot is usually stationary, the camera may be e.g. head-mounted and the user may be moving. An example of such a user is a worker operating with a wearable computer.
  • the change between different versions of an image may occur at once.
  • the change may also occur by smoothly "sliding" from the first version of the image towards the real image.
  • the sliding may occur by means of small steps between e.g. differently coloured versions of an image.
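The stepwise sliding can be sketched as simple per-channel interpolation between the effect colour and the final colour; linear interpolation and the step count are illustrative choices:

```python
def slide_versions(effect_rgb, final_rgb, steps: int = 8):
    """Generate the intermediate colours for a smooth 'slide' from the
    effect-coloured first version towards the real image, in small steps."""
    frames = []
    for i in range(steps + 1):
        t = i / steps  # interpolation factor from 0 (effect) to 1 (final)
        frames.append(tuple(
            round(a + (b - a) * t) for a, b in zip(effect_rgb, final_rgb)
        ))
    return frames
```

Applied to every entry of a colour index table, this yields the small colour steps described above.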
  • the absolute degree of coloration may not always be an especially effective way of describing any accurate information, such as an accurate value of temperature.
  • human beings easily recognise differences between colours and/or changes in colours. Different colours are thus believed to be a good visual effect for describing approximate or relative values, which is often sufficient.
  • For more accurate information it is possible to add e.g. a textual notification in addition to the visual effect.
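Combining the two ideas above: colour for the rough impression, text for the exact value. The temperature thresholds and colour names below are purely illustrative assumptions.

```python
def temperature_effect(temp_c):
    """Return an approximate colour cue plus an exact textual
    notification for a temperature reading. The colour carries the
    relative impression; the text carries the accurate value.
    Thresholds are illustrative only."""
    if temp_c <= 0:
        colour = "blue"     # freezing
    elif temp_c < 20:
        colour = "green"    # mild
    else:
        colour = "red"      # warm
    return colour, f"{temp_c:.1f} C"
```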
  • User equipment such as mobile stations with image processing capability, and data communication systems that enable mobile users to transfer substantially large data files to each other, are becoming available to the public.
  • the users of such devices may find the display of visual effects on images to be of great benefit. For example, mobile subscribers may wish to send images to each other just for fun.
  • the visual effects may add to the entertainment value of the images, even in instances where the effects may have no informative value.
  • the additional information may also be used for personalization of the images.
  • Examples of the possibilities for the data communication network 5 of Figure 1 include, without being limited to these, standards and/or specifications such as GSM (Global System for Mobile communications) or various GSM-based systems (such as GPRS: General Packet Radio Service), AMPS (American Mobile Phone System), DAMPS (Digital AMPS), WCDMA (Wideband Code Division Multiple Access), IS-95, 3rd generation UMTS (Universal Mobile Telecommunications System), IMT 2000 (International Mobile Telecommunications 2000) and so on.
  • the user equipment may be switched between a mode in which the visual effects are displayed and a mode in which they are not. User equipment that is not capable of displaying the visual effect may simply ignore the additional information and display the image in a conventional manner. It is also possible to associate with the image data an indication which either allows or prevents the display of any visual effects.
  • the indication may be e.g. a permission flag.
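A permission flag of this kind can be honoured with a simple check on the receiving side. The field names (`effects_allowed`, `effect`, `image`) are hypothetical; the point is that equipment which does not understand the extra fields just ignores them and shows the plain image.

```python
def render(image_data):
    """Display an image, honouring a hypothetical permission flag
    carried with the image data. If effects are not permitted, or no
    effect is attached, fall back to conventional display."""
    if image_data.get("effects_allowed", False) and "effect" in image_data:
        return "apply {} then show {}".format(
            image_data["effect"], image_data["image"])
    return "show {}".format(image_data["image"])
```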

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Information Transfer Between Computers (AREA)
  • Telephonic Communication Services (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Facsimiles In General (AREA)
  • User Interface Of Digital Computer (AREA)
  • Mobile Radio Communication Systems (AREA)
EP01310861A 2000-12-28 2001-12-24 Affichage d'une image Ceased EP1220530A3 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0031752A GB2370709A (en) 2000-12-28 2000-12-28 Displaying an image and associated visual effect
GB0031752 2000-12-28

Publications (2)

Publication Number Publication Date
EP1220530A2 true EP1220530A2 (fr) 2002-07-03
EP1220530A3 EP1220530A3 (fr) 2004-07-14

Family

ID=9905964

Family Applications (1)

Application Number Title Priority Date Filing Date
EP01310861A Ceased EP1220530A3 (fr) 2000-12-28 2001-12-24 Affichage d'une image

Country Status (3)

Country Link
US (1) US7755566B2 (fr)
EP (1) EP1220530A3 (fr)
GB (1) GB2370709A (fr)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2843515A1 (fr) * 2002-08-08 2004-02-13 Eastman Kodak Co Procede de mise en page de messages multimedias
EP1422639A2 (fr) 2002-11-25 2004-05-26 Eastman Kodak Company Procédé de traitement d'images et système
EP1445938A1 (fr) * 2002-11-25 2004-08-11 Eastman Kodak Company Méthode et système d'imagerie de surveillance de la santé et de la sécurite personelle
WO2004110055A1 (fr) * 2003-06-11 2004-12-16 Nokia Corporation Procede et systeme de traitement d'image, dispositif et enregistrement d'image
US7046924B2 (en) 2002-11-25 2006-05-16 Eastman Kodak Company Method and computer program product for determining an area of importance in an image using eye monitoring information
US7206022B2 (en) 2002-11-25 2007-04-17 Eastman Kodak Company Camera system with eye monitoring
WO2012018517A1 (fr) * 2010-07-26 2012-02-09 Eastman Kodak Company Sélection automatique de mode de photographie d'un appareil photo numérique
EP2742416A1 (fr) * 2011-08-08 2014-06-18 Samsung Electronics Co., Ltd. Journalisation de vie et partage de mémoire
CN105830066A (zh) * 2013-12-19 2016-08-03 微软技术许可有限责任公司 用情绪状态信息来标记图像

Families Citing this family (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004120069A (ja) * 2002-09-24 2004-04-15 Canon Inc 画像処理装置、画像処理方法および該方法を実現するためのプログラム
US20040093432A1 (en) * 2002-11-07 2004-05-13 Eastman Kodak Company Method and system for conducting image processing from a mobile client device
US20050275566A1 (en) * 2004-06-14 2005-12-15 Nokia Corporation System and method for transferring content
US9704502B2 (en) * 2004-07-30 2017-07-11 Invention Science Fund I, Llc Cue-aware privacy filter for participants in persistent communications
US9779750B2 (en) * 2004-07-30 2017-10-03 Invention Science Fund I, Llc Cue-aware privacy filter for participants in persistent communications
US20150163342A1 (en) * 2004-07-30 2015-06-11 Searete Llc Context-aware filter for participants in persistent communication
US20060041848A1 (en) * 2004-08-23 2006-02-23 Luigi Lira Overlaid display of messages in the user interface of instant messaging and other digital communication services
NO323521B1 (no) 2004-09-10 2007-06-04 Telenor Asa Lokasjonsbildekommentering pa kameratelefoner
GB0502844D0 (en) 2005-02-11 2005-03-16 Univ Edinburgh Storing digital content for access using a captured image
JP4049173B2 (ja) * 2005-07-14 2008-02-20 コニカミノルタビジネステクノロジーズ株式会社 データ通信システム、画像処理装置、および画像処理装置におけるデータの管理方法
US8849821B2 (en) * 2005-11-04 2014-09-30 Nokia Corporation Scalable visual search system simplifying access to network and device functionality
US7392041B2 (en) * 2006-01-06 2008-06-24 Microsoft Corporation Mobile access to information using images
US20080071749A1 (en) * 2006-09-17 2008-03-20 Nokia Corporation Method, Apparatus and Computer Program Product for a Tag-Based Visual Search User Interface
US8775452B2 (en) 2006-09-17 2014-07-08 Nokia Corporation Method, apparatus and computer program product for providing standard real world to virtual world links
US20080071770A1 (en) * 2006-09-18 2008-03-20 Nokia Corporation Method, Apparatus and Computer Program Product for Viewing a Virtual Database Using Portable Devices
JP5480462B2 (ja) * 2007-02-27 2014-04-23 富士ゼロックス株式会社 文書処理プログラム、文書処理装置及び文書処理システム
US20080263449A1 (en) * 2007-04-20 2008-10-23 Microsoft Corporation Automated maintenance of pooled media content
US20090021531A1 (en) * 2007-07-20 2009-01-22 Vlad Vasilescu Window or door showing remote scenery in real-life motion
US20090150433A1 (en) * 2007-12-07 2009-06-11 Nokia Corporation Method, Apparatus and Computer Program Product for Using Media Content as Awareness Cues
US20090161963A1 (en) * 2007-12-20 2009-06-25 Nokia Corporation Method. apparatus and computer program product for utilizing real-world affordances of objects in audio-visual media data to determine interactions with the annotations to the objects
JP4952567B2 (ja) * 2007-12-21 2012-06-13 富士ゼロックス株式会社 画像検索システム、画像検索装置、プログラム
US8098894B2 (en) * 2008-06-20 2012-01-17 Yahoo! Inc. Mobile imaging device as navigator
US9218682B2 (en) * 2008-10-15 2015-12-22 Nokia Technologies Oy Method and apparatus for generating an image
US8935292B2 (en) * 2008-10-15 2015-01-13 Nokia Corporation Method and apparatus for providing a media object
US8121618B2 (en) 2009-10-28 2012-02-21 Digimarc Corporation Intuitive computing methods and systems
US9759917B2 (en) 2010-02-28 2017-09-12 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered AR eyepiece interface to external devices
US9097890B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc Grating in a light transmissive illumination system for see-through near-eye display glasses
US9366862B2 (en) 2010-02-28 2016-06-14 Microsoft Technology Licensing, Llc System and method for delivering content to a group of see-through near eye display eyepieces
US10180572B2 (en) 2010-02-28 2019-01-15 Microsoft Technology Licensing, Llc AR glasses with event and user action control of external applications
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US9091851B2 (en) 2010-02-28 2015-07-28 Microsoft Technology Licensing, Llc Light control in head mounted displays
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US20120194549A1 (en) * 2010-02-28 2012-08-02 Osterhout Group, Inc. Ar glasses specific user interface based on a connected external device type
US20120249797A1 (en) 2010-02-28 2012-10-04 Osterhout Group, Inc. Head-worn adaptive display
US9128281B2 (en) 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US9223134B2 (en) 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
WO2011106797A1 (fr) 2010-02-28 2011-09-01 Osterhout Group, Inc. Déclenchement de projection par un repère externe dans des lunettes intégrales
US9134534B2 (en) 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US20150309316A1 (en) 2011-04-06 2015-10-29 Microsoft Technology Licensing, Llc Ar glasses with predictive control of external device based on event input
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9341843B2 (en) 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
US9182596B2 (en) 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US9097891B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US9132691B2 (en) * 2010-09-13 2015-09-15 Travel Tags, Inc. Mass customization of articles having dimensional and/or animated images
CN102479024A (zh) * 2010-11-24 2012-05-30 国基电子(上海)有限公司 手持装置及其用户界面构建方法
US8552881B2 (en) * 2011-02-09 2013-10-08 Harris Corporation Electronic device with a situational awareness function
JP2014035592A (ja) * 2012-08-07 2014-02-24 Fujitsu Ltd 画像キャプチャ装置、画像キャプチャ方法およびプログラム
KR20140035713A (ko) * 2012-09-14 2014-03-24 한국전자통신연구원 실감 미디어 저작 방법 및 장치, 이를 이용하는 휴대형 단말 장치
JP6313014B2 (ja) * 2013-08-27 2018-04-18 東芝ライフスタイル株式会社 カメラシステム、冷蔵庫
US9311639B2 (en) 2014-02-11 2016-04-12 Digimarc Corporation Methods, apparatus and arrangements for device to device communication
US20170324993A1 (en) * 2014-11-17 2017-11-09 Nec Corporation Video processing system, transmission device, and video processing method
US10332282B2 (en) 2015-06-24 2019-06-25 Incubate Messenger, Inc. System and method for fragmented reveal of a multimedia content
KR102551358B1 (ko) * 2018-09-21 2023-07-04 삼성전자주식회사 냉장고 내 객체의 상태와 관련된 정보를 제공하는 방법 및 시스템
CN114079730B (zh) * 2020-08-19 2023-09-12 华为技术有限公司 一种拍摄方法和拍摄系统

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1217859A1 (fr) * 2000-12-20 2002-06-26 Nokia Corporation Procédé et appareil pour commander des appels contextuelles par signalisation DTMF, et échange d'informations contextuelles pendant une connexion de communication mobile

Family Cites Families (77)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AUPQ131399A0 (en) * 1999-06-30 1999-07-22 Silverbrook Research Pty Ltd A method and apparatus (NPAGE02)
US4209852A (en) * 1974-11-11 1980-06-24 Hyatt Gilbert P Signal processing and memory arrangement
US4554591A (en) * 1983-09-19 1985-11-19 Polaroid Corporation Reproducible image and information bearing record
US4586038A (en) * 1983-12-12 1986-04-29 General Electric Company True-perspective texture/shading processor
JP2521183Y2 (ja) * 1987-09-29 1996-12-25 ソニー株式会社 ディジタル信号処理回路
EP0488723B1 (fr) * 1990-11-30 1997-02-26 Canon Kabushiki Kaisha Appareil de détection de vecteur de mouvement
EP0559708A1 (fr) * 1990-11-30 1993-09-15 Cambridge Animation Systems Limited Synthese et traitement d'images
US5821523A (en) * 1992-03-12 1998-10-13 Bunte; Alan G. Combined code reader and digital camera using a common photodetector
JPH05304641A (ja) * 1992-04-24 1993-11-16 Victor Co Of Japan Ltd テレビジョン受像機
DE4405020C1 (de) * 1994-02-17 1995-01-26 Inst Rundfunktechnik Gmbh Verfahren zum Empfangen von in einem Fernsehsignal übertragenen Daten
US7843497B2 (en) * 1994-05-31 2010-11-30 Conley Gregory J Array-camera motion picture device, and methods to produce new visual and aural effects
US6321158B1 (en) * 1994-06-24 2001-11-20 Delorme Publishing Company Integrated routing/mapping information
JPH0888815A (ja) * 1994-09-19 1996-04-02 Olympus Optical Co Ltd 映像表示システム
US5689269A (en) * 1995-01-25 1997-11-18 American Technology Corporation GPS relative position detection system
US6590996B1 (en) * 2000-02-14 2003-07-08 Digimarc Corporation Color adaptive watermarking
DE69624732T2 (de) * 1995-08-21 2003-03-13 Matsushita Electric Ind Co Ltd Vorrichtung und Verfahren zur Wiedergabe von optischen Platten, welche dynamisches Schalten von reproduzierten Daten ermöglichen
GB9518530D0 (en) * 1995-09-11 1995-11-08 Informatix Inc Image processing
JP3454396B2 (ja) * 1995-10-11 2003-10-06 株式会社日立製作所 動画像の変化点検出制御方法とそれに基づく再生停止制御方法およびそれらを用いた動画像の編集システム
JP3679487B2 (ja) * 1996-01-30 2005-08-03 キヤノン株式会社 通信装置及び通信方法
EP0847549B1 (fr) * 1996-03-06 1999-09-22 GMD-Forschungszentrum Informationstechnik GmbH Systeme a robot mobile autonome pour navigation par capteurs et cartes dans des reseaux de conduites
ATE194046T1 (de) * 1996-04-19 2000-07-15 Spotzoom As Verfahren und system zur manipulation von objekten in einem fernsehbild
JP3950515B2 (ja) * 1996-06-11 2007-08-01 キヤノン株式会社 再生装置及び編集方法
US5903309A (en) * 1996-09-19 1999-05-11 Flashpoint Technology, Inc. Method and system for displaying images and associated multimedia types in the interface of a digital camera
US20010033732A1 (en) * 1996-09-20 2001-10-25 Masahiro Juen Image reproduction apparatus for reproducing multiple image files
US6343987B2 (en) * 1996-11-07 2002-02-05 Kabushiki Kaisha Sega Enterprises Image processing device, image processing method and recording medium
USRE38401E1 (en) * 1997-01-16 2004-01-27 Obvious Technology, Inc. Interactive video icon with designated viewing position
US6262694B1 (en) * 1997-03-11 2001-07-17 Fujitsu Limited Image display system
US6786420B1 (en) * 1997-07-15 2004-09-07 Silverbrook Research Pty. Ltd. Data distribution mechanism in the form of ink dots on cards
US20030091329A1 (en) * 1997-04-12 2003-05-15 Tetsuro Nakata Editing system and editing method
US6091956A (en) * 1997-06-12 2000-07-18 Hollenberg; Dennis D. Situation information system
JP3304290B2 (ja) * 1997-06-26 2002-07-22 シャープ株式会社 ペン入力装置及びペン入力方法及びペン入力制御プログラムを記録したコンピュータ読み取り可能な記録媒体
US6573907B1 (en) * 1997-07-03 2003-06-03 Obvious Technology Network distribution and management of interactive video and multi-media containers
JP3786242B2 (ja) * 1997-07-14 2006-06-14 富士写真フイルム株式会社 画像処理方法および装置、画像再生方法および装置並びにその方法に使用する画像確認装置
US7050143B1 (en) * 1998-07-10 2006-05-23 Silverbrook Research Pty Ltd Camera system with computer language interpreter
AUPO918697A0 (en) * 1997-09-15 1997-10-09 Canon Information Systems Research Australia Pty Ltd Enhanced information gathering apparatus and method
US6125175A (en) * 1997-09-18 2000-09-26 At&T Corporation Method and apparatus for inserting background sound in a telephone call
JPH11203837A (ja) * 1998-01-16 1999-07-30 Sony Corp 編集システムおよび編集方法
US6426778B1 (en) * 1998-04-03 2002-07-30 Avid Technology, Inc. System and method for providing interactive components in motion video
FR2778523B1 (fr) * 1998-05-07 2000-06-09 Alsthom Cge Alcatel Procede d'affichage de sequences d'images par un equipement mobile de radiocommunication cooperant avec un module d'identification d'abonne
US6323858B1 (en) * 1998-05-13 2001-11-27 Imove Inc. System for digitally capturing and recording panoramic movies
US6167469A (en) * 1998-05-18 2000-12-26 Agilent Technologies, Inc. Digital camera having display device for displaying graphical representation of user input and method for transporting the selected digital images thereof
JP3697064B2 (ja) * 1998-05-29 2005-09-21 キヤノン株式会社 画像編集装置、記録装置、画像編集方法及びコンピュータ読み取り可能な記録媒体
US6633685B1 (en) * 1998-08-05 2003-10-14 Canon Kabushiki Kaisha Method, apparatus, and storage media for image processing
US6435969B1 (en) * 1998-11-03 2002-08-20 Nintendo Co., Ltd. Portable game machine having image capture, manipulation and incorporation
US6504990B1 (en) * 1998-11-12 2003-01-07 Max Abecassis Randomly and continuously playing fragments of a video segment
US6747642B1 (en) * 1999-01-29 2004-06-08 Nintendo Co., Ltd. Method and apparatus for providing non-photorealistic cartoon outlining within a 3D videographics system
CA2266208C (fr) * 1999-03-19 2008-07-08 Wenking Corp. Systeme d'echange de donnees sur la circulation routiere a distance et de vehicules routiers intelligents
US6519571B1 (en) * 1999-05-27 2003-02-11 Accenture Llp Dynamic customer profile management
EP1056014A1 (fr) * 1999-05-28 2000-11-29 Hewlett-Packard Company Système pour fournir une interface utilisateur à mérite de confiance
US7831244B2 (en) * 1999-06-30 2010-11-09 Silverbrook Research Pty Ltd Retrieving an image via a coded surface
JP4431216B2 (ja) * 1999-07-09 2010-03-10 富士フイルム株式会社 データ通信システム
US20060274164A1 (en) * 1999-08-17 2006-12-07 Nikon Corporation Information processing apparatus, information processing system, image input apparatus, image input system and information exchange method
GB9919805D0 (en) * 1999-08-21 1999-10-27 Univ Manchester Video cording
AUPQ464099A0 (en) * 1999-12-14 2000-01-13 Canon Kabushiki Kaisha Emotive editing system
US6366907B1 (en) * 1999-12-15 2002-04-02 Napster, Inc. Real-time search engine
US6976229B1 (en) * 1999-12-16 2005-12-13 Ricoh Co., Ltd. Method and apparatus for storytelling with digital photographs
EP1670233A1 (fr) * 1999-12-28 2006-06-14 Sony Corporation Système de transactions commerciales d'images photographiques utilisant un lecteur portable de musique
CN100508560C (zh) * 2000-01-26 2009-07-01 索尼公司 信息处理设备和处理方法
JP4792686B2 (ja) * 2000-02-07 2011-10-12 ソニー株式会社 画像処理装置及び画像処理方法並びに記録媒体
CA2851301C (fr) * 2000-03-30 2015-11-03 Sony Corporation Systeme de traitement de dons
US20010055035A1 (en) * 2000-04-07 2001-12-27 Naoto Kinjo Image processing method and system using computer graphics
US6618051B1 (en) * 2000-04-26 2003-09-09 Sony Corporation System and method for creating and displaying representations of holographic images
JP4560897B2 (ja) * 2000-06-02 2010-10-13 ソニー株式会社 通信装置、通信方法及び媒体
US6781724B1 (en) * 2000-06-13 2004-08-24 Eastman Kodak Company Image processing and manipulation system
US6977630B1 (en) * 2000-07-18 2005-12-20 University Of Minnesota Mobility assist device
US20030025649A1 (en) * 2000-09-08 2003-02-06 Wynne Willson Peter David Image projection apparatus
JP3784289B2 (ja) * 2000-09-12 2006-06-07 松下電器産業株式会社 メディア編集方法及びその装置
US20020078172A1 (en) * 2000-09-14 2002-06-20 Tadashi Yoshikai Image server, image communication system, and control methods thereof
JP4729785B2 (ja) * 2000-11-10 2011-07-20 ソニー株式会社 番組付加データ処理装置、サーバ装置、番組情報表示方法及び記録媒体
US6993553B2 (en) * 2000-12-19 2006-01-31 Sony Corporation Data providing system, data providing apparatus and method, data acquisition system and method, and program storage medium
JP2002189801A (ja) * 2000-12-21 2002-07-05 Sony Corp サービス提供システム、管理サーバ、サービスプロバイダ、端末装置、記録媒体発行装置、サービス提供方法、記録媒体
US7003061B2 (en) * 2000-12-21 2006-02-21 Adobe Systems Incorporated Image extraction from complex scenes in digital video
FI20002828A (fi) * 2000-12-22 2002-06-23 Nokia Corp Menetelmä ja järjestelmä tunnetilan ilmaisemiseksi puhelimella tapahtuvassa viestinnässä
US6871231B2 (en) * 2001-01-03 2005-03-22 Ipac Acquisition Subsidiary I, Llc Role-based access to image metadata
GB2377147A (en) * 2001-06-27 2002-12-31 Nokia Corp A virtual reality user interface
JP3719659B2 (ja) * 2001-12-26 2005-11-24 株式会社日立製作所 情報受信システム及び情報受信端末
US7920827B2 (en) * 2002-06-26 2011-04-05 Nokia Corporation Apparatus and method for facilitating physical browsing on wireless devices using radio frequency identification

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1217859A1 (fr) * 2000-12-20 2002-06-26 Nokia Corporation Procédé et appareil pour commander des appels contextuelles par signalisation DTMF, et échange d'informations contextuelles pendant une connexion de communication mobile

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2843515A1 (fr) * 2002-08-08 2004-02-13 Eastman Kodak Co Procede de mise en page de messages multimedias
WO2004019603A1 (fr) * 2002-08-08 2004-03-04 Eastman Kodak Company Procede de mise en page pour messages multimedia
KR100978969B1 (ko) * 2002-08-08 2010-08-30 이스트맨 코닥 캄파니 복합 멀티미디어 메시지의 자동 배치 및 자동 디스플레이를 가능하게 하는 방법
CN100380910C (zh) * 2002-08-08 2008-04-09 伊斯曼柯达公司 多媒体消息的编排方法
US7046924B2 (en) 2002-11-25 2006-05-16 Eastman Kodak Company Method and computer program product for determining an area of importance in an image using eye monitoring information
CN100430934C (zh) * 2002-11-25 2008-11-05 伊斯曼柯达公司 成像方法与系统
EP1422639A2 (fr) 2002-11-25 2004-05-26 Eastman Kodak Company Procédé de traitement d'images et système
US7206022B2 (en) 2002-11-25 2007-04-17 Eastman Kodak Company Camera system with eye monitoring
US7233684B2 (en) 2002-11-25 2007-06-19 Eastman Kodak Company Imaging method and system using affective information
EP1422639A3 (fr) * 2002-11-25 2006-05-03 Eastman Kodak Company Procédé de traitement d'images et système
CN100352255C (zh) * 2002-11-25 2007-11-28 伊斯曼柯达公司 用于健康监视和个人安全的成像方法和系统
US7319780B2 (en) 2002-11-25 2008-01-15 Eastman Kodak Company Imaging method and system for health monitoring and personal security
EP1445938A1 (fr) * 2002-11-25 2004-08-11 Eastman Kodak Company Méthode et système d'imagerie de surveillance de la santé et de la sécurite personelle
US7418116B2 (en) 2002-11-25 2008-08-26 Eastman Kodak Company Imaging method and system
KR100746602B1 (ko) 2003-06-11 2007-08-06 노키아 코포레이션 이미지 처리를 위한 방법 및 시스템, 장치 및 이미지레코드
WO2004110055A1 (fr) * 2003-06-11 2004-12-16 Nokia Corporation Procede et systeme de traitement d'image, dispositif et enregistrement d'image
US8072509B2 (en) 2003-06-11 2011-12-06 Nokia Corporation Method and a system for image processing, a device, and an image record
WO2012018517A1 (fr) * 2010-07-26 2012-02-09 Eastman Kodak Company Sélection automatique de mode de photographie d'un appareil photo numérique
US8970720B2 (en) 2010-07-26 2015-03-03 Apple Inc. Automatic digital camera photography mode selection
US9270882B2 (en) 2010-07-26 2016-02-23 Apple Inc. System and method for contextual digital photography mode selection
US9686469B2 (en) 2010-07-26 2017-06-20 Apple Inc. Automatic digital camera photography mode selection
EP2742416A1 (fr) * 2011-08-08 2014-06-18 Samsung Electronics Co., Ltd. Journalisation de vie et partage de mémoire
EP2742416A4 (fr) * 2011-08-08 2015-01-07 Samsung Electronics Co Ltd Journalisation de vie et partage de mémoire
CN105830066A (zh) * 2013-12-19 2016-08-03 微软技术许可有限责任公司 用情绪状态信息来标记图像

Also Published As

Publication number Publication date
EP1220530A3 (fr) 2004-07-14
US20020113757A1 (en) 2002-08-22
GB2370709A (en) 2002-07-03
US7755566B2 (en) 2010-07-13
GB0031752D0 (en) 2001-02-07

Similar Documents

Publication Publication Date Title
US7755566B2 (en) Displaying an image
US10439678B2 (en) Method and system for transfering data between plurality of devices
US7844229B2 (en) Mobile virtual and augmented reality system
KR100649040B1 (ko) 화상 데이터 관리·열람 방법
US20090111434A1 (en) Mobile virtual and augmented reality system
US20090054084A1 (en) Mobile virtual and augmented reality system
US9912969B1 (en) Live streaming video sharing system and related methods
US20030065665A1 (en) Device, method and recording medium for information distribution
JP2014195297A (ja) 地理情報に関連付けられたコンテンツを取得及び共有するためのシステム及び方法
JP2007334696A (ja) データ共有システム、通信端末及びサーバ
JP2010520540A (ja) ウェブに基づくソーシャルネットワークのメンバーを自動的に見つけるためのシステムおよび方法
KR20120017172A (ko) 휴대용 단말기에서 지오 태깅 시 전력 제어 장치 및 방법
WO2018117303A1 (fr) Procédé de fourniture de plateforme d'édition de ppl d'image intermédiaire entre des clips vidéo
JP2003186903A (ja) 情報配信装置、方法及びプログラム
JP2008005297A (ja) 画像撮影再生システム
JP2005150941A (ja) 情報提供システム及び方法
KR101950355B1 (ko) 개인 방송 어플리케이션이 설치되어 있는 스마트단말
JP2014106618A (ja) サーバ装置、端末装置、arコンテンツ提供方法、およびプログラム
US8085981B2 (en) Object perception method and a related perceiving device
CN113038161A (zh) 一种基于二维码的视频展示方法及系统
KR20150064613A (ko) 영상 표시 기기 및 그의 동작 방법
KR101421059B1 (ko) 이미지 파일을 이용한 스트리밍 영상의 제공방법
JP2008228120A (ja) 動画ファイル・サーバおよび端末装置ならびにそれらの動作制御方法およびそれらの制御プログラム
JP2005228249A (ja) 電子メール配信用のサーバ
KR100652701B1 (ko) 이동 통신 단말기의 웹 사이트로의 일기 등록 장치 및 방법

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR

AX Request for extension of the european patent

Free format text: AL;LT;LV;MK;RO;SI

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR

AX Request for extension of the european patent

Extension state: AL LT LV MK RO SI

AKX Designation fees paid

Designated state(s): DE FR IT

17P Request for examination filed

Effective date: 20020115

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20120217