EP2833249B1 - Information processing device, information processing method and program - Google Patents

Information processing device, information processing method and program Download PDF

Info

Publication number
EP2833249B1
Authority
EP
European Patent Office
Prior art keywords
image
image information
information
captured image
displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP13767258.0A
Other languages
German (de)
English (en)
French (fr)
Other versions
EP2833249A4 (en)
EP2833249A1 (en)
Inventor
Shunichi Kasahara
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Publication of EP2833249A1 publication Critical patent/EP2833249A1/en
Publication of EP2833249A4 publication Critical patent/EP2833249A4/en
Application granted granted Critical
Publication of EP2833249B1 publication Critical patent/EP2833249B1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/96Management of image or video recognition tasks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4126The peripheral being portable, e.g. PDAs or mobile phones
    • H04N21/41265The peripheral being portable, e.g. PDAs or mobile phones having a remote control device for bidirectional communication between the remote control device and client device
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • A63F13/26Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/40Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
    • A63F2300/403Connection between platform and handheld device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/02Handling of images in compressed format, e.g. JPEG, MPEG
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00347Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with another still picture apparatus, e.g. hybrid still picture apparatus

Definitions

  • the present disclosure relates to a server device, an information processing method, and a program.
  • Patent Literature 1 describes a technique including superimposing an image of a virtual object imitating a real object, such as furniture, onto an image of a real space, and then presenting the resultant image to thereby facilitate a trial of the arrangement of the furniture or the like.
  • Patent Literature 2 discloses a method for moving an item displayed on a first display to a remote second display by using a touch projector.
  • a user may select an item displayed on the first display by touching the image of the item in a video displayed by the touch projector while the video shows the first display, and may then move the item to the second display by moving the touch projector so that the second display appears in the video.
  • in this way, an image of another device contained in a captured image of a real space can be operated.
  • however, the load of the processing of recognizing an image of another device contained in an image with a terminal device, for example, is high, and techniques for reducing this load have not yet been sufficiently proposed.
  • therefore, the present disclosure proposes a novel and improved server device, information processing method, and program which allow a reduction in the load of the processing of recognizing an image of another device displayed in an image.
  • the invention provides an information processing device in accordance with claim 1.
  • the invention provides an information processing method in accordance with claim 12.
  • the invention provides a program in accordance with claim 13. Further aspects of the invention are set forth in the dependent claims, the drawings and the following description of embodiments.
  • a server device including a captured image information acquisition portion which acquires captured image information corresponding to a captured image, a displayed image information acquisition portion which acquires displayed image information corresponding to a first image displayed on a display screen, and an object recognition portion which detects the position and the posture of the first image in the captured image using the displayed image information and the captured image information.
  • an information processing method including acquiring captured image information corresponding to a captured image, acquiring displayed image information corresponding to a first image displayed on a display screen, and detecting the position and the posture of the first image in the captured image using the displayed image information and the captured image information.
  • the position and the posture of the first image displayed on the display screen in the captured image are detected using the information corresponding to the first image. More specifically, the object recognition of the captured image containing the first image can be performed after acquiring the information corresponding to the first image beforehand. Therefore, the processing load of the object recognition processing can be reduced.
  • the load of the processing of recognizing an image of another device displayed in an image can be reduced.
  • FIG. 1 is a view for explaining the outline of this example
  • this embodiment relates to a terminal device 100 and a display device 200.
  • the terminal device 100 acquires a captured image of a real space containing the display device 200, and then displays an image 151 on a display portion 150 based on the captured image.
  • the terminal device 100 has a function of recognizing an object contained in the captured image and can utilize the object recognition result of the captured image when displaying the image 151 as described later.
  • the display device 200 has a display screen 250, and an image 251 is displayed on the display screen 250. Since the display device 200 is contained in the captured image acquired by the terminal device 100, the image 251 displayed on the display screen 250 is also contained in the captured image. The terminal device 100 recognizes the image 251 from the captured image, and then displays a virtual image 153 corresponding to the image 251 in the image 151.
  • the virtual image 153 may be one in which the image 251 contained in the captured image is drawn as it is or one which is re-drawn utilizing the object recognition result of the captured image.
  • the terminal device 100 is mainly a device which is held and operated by a user, such as a cellular phone (smartphone), a tablet personal computer (PC), or a portable game machine or media player, for example.
  • the display device 200 is mainly a stationary device, such as a television set, a desktop or notebook PC, or a PC monitor, for example.
  • the embodiment of the present disclosure is not limited to these examples.
  • both the terminal device 100 and the display device 200 may be smartphones.
  • the terminal device 100 has a function of recognizing the object contained in the captured image.
  • the terminal device 100 recognizes the image 251 contained in the captured image. Therefore, for example, the terminal device 100 can acquire an operation to the virtual image 153 in the image 151, and then can change the virtual image 153 in the same manner as in the case where the same operation is performed to the real image 251.
  • the terminal device 100 can assign the same functions as those of GUI (Graphical User Interface) components (a button, a link, a scroll bar, and the like) of the image 251 to the GUI components contained in the virtual image 153 by converting the coordinates on the display portion 150 to the coordinates on the display screen 250.
  • the virtual image 153 changes in the same manner as in the case where the same operation is performed to the GUI components of the image 251.
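The coordinate conversion described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes the detected position and posture of the image 251 reduce to a 2D similarity transform (translation, rotation, scale), whereas a real implementation would invert the full homography produced by the object recognition.

```python
import math

def screen_to_display_coords(touch_xy, pose):
    """Map a touch point on the terminal's display portion 150 to
    coordinates on the remote display screen 250.

    `pose` is (tx, ty, rotation, scale) — a simplified, assumed model of
    the detected position and posture of the image 251; this inverts a
    similarity transform rather than a full homography.
    """
    tx, ty, theta, s = pose
    # Translate so the detected image origin is at (0, 0)
    x = touch_xy[0] - tx
    y = touch_xy[1] - ty
    # Undo the detected rotation
    cos_t, sin_t = math.cos(-theta), math.sin(-theta)
    xr = x * cos_t - y * sin_t
    yr = x * sin_t + y * cos_t
    # Undo the detected scale
    return (xr / s, yr / s)

# A tap at (260, 140) on the terminal, with the remote image detected at
# (200, 100), unrotated, drawn at half size, maps to (120, 80) on the
# remote display:
print(screen_to_display_coords((260, 140), (200.0, 100.0, 0.0, 0.5)))
```

With a mapping like this, a tap on a button inside the virtual image 153 can be forwarded to the display device as a tap at the corresponding point of the real image 251.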
  • the processing of detecting the position and the posture of the image 251 contained in the captured image is processing with a relatively high load. Since the image 251 changes according to the operation state of the display device 200, the reproduction state of contents, and the like, for example, it is not easy to recognize the image 251 from static data prepared beforehand. Therefore, it is sometimes required to search for the image 251 using an enormous amount of data on the network, for example.
  • the load generated by performing such processing is not desirable from the viewpoint of the performance of the terminal device 100, for example.
  • the embodiment of the present disclosure proposes a technique of reducing the load of the processing of recognizing the image 251 from the captured image of the terminal device 100.
  • FIG. 2 is a view for explaining the device configuration of this embodiment.
  • FIG. 3 is a schematic block diagram showing the functional configuration of a system according to this embodiment. For simplicity, in FIG. 3 , a plurality of display devices 200a to 200c illustrated in FIG. 2 are represented by a single display device 200.
  • the device configuration of this example contains the terminal device 100, display devices 200a to 200c, and a server device 300 (one example of an information processing device).
  • the number of the display devices 200 is not limited to three as in the illustrated example and may be one, two, or four or more. As described above with reference to FIG. 1 , even when only one display device 200 is present, the load of the processing of recognizing the image 251 contained in the captured image is relatively high. When a plurality of display devices 200 are present, as in the illustrated example, the load of the processing of recognizing the image 251 may be still higher.
  • the server device 300 may not always be realized by a single device.
  • the function of the server device may be realized by the cooperation of resources of a plurality of devices through a network.
  • the terminal device 100 transmits captured image information 155 corresponding to the captured image to the server device 300.
  • the captured image information 155 may be image data of the captured image itself but is not limited thereto.
  • the captured image information 155 may be one which is adapted to be used for the object recognition processing in the server device 300 described later and, for example, may be compressed image data, a data array of feature points for use in the object recognition, or the like.
  • the display devices 200a to 200c transmit displayed image information 253a to 253c corresponding to the image 251 displayed on the display screens 250 of the display devices 200a to 200c, respectively, to the server device 300.
  • the displayed image information 253a to 253c may be the image data itself of the image 251 but are not limited thereto.
  • the displayed image information 253a to 253c may be those which are adapted to be used for the object recognition processing in the server device 300 described later and, for example, may be compressed image data, a data array of feature points for use in the object recognition, or the like.
  • the displayed image information 253a to 253c may not always be the same kind of information as the captured image information 155.
  • the captured image information 155 may be the image data itself of the captured image and the displayed image information 253a to 253c may be a data array of feature points.
  • the display devices 200a to 200c may not always transmit the same kind of displayed image information 253a to 253c.
  • the displayed image information 253a may be the image data itself of the image 251a and the displayed image information 253b and 253c may be data arrays of feature points.
  • the server device 300 may convert the displayed image information 253a to the same data array of feature points as those of the displayed image information 253b and 253c for use.
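The normalization of heterogeneous image information into a common form might look like the following sketch. The 'kind' tags and the toy chunk-hash feature extractor are assumptions made for illustration only; the patent does not specify a representation.

```python
import hashlib
import zlib

def to_feature_array(info):
    """Bring captured or displayed image information into one common form.

    `info` is a dict with an assumed 'kind' tag: 'features' (already a
    feature array), 'compressed' (zlib-compressed pixel bytes), or
    'raw' (pixel bytes).
    """
    if info["kind"] == "features":
        return info["data"]
    if info["kind"] == "compressed":
        data = zlib.decompress(info["data"])
    else:
        data = info["data"]
    return extract_features(data)

def extract_features(pixel_bytes, chunk=16):
    # Toy stand-in for a keypoint detector: hash fixed-size chunks of
    # the pixel data. A real server would run its actual detector here.
    return [hashlib.sha1(pixel_bytes[i:i + chunk]).hexdigest()[:8]
            for i in range(0, len(pixel_bytes), chunk)]

raw = {"kind": "raw", "data": b"\x00" * 64}
packed = {"kind": "compressed", "data": zlib.compress(b"\x00" * 64)}
# Both kinds normalize to the same feature array:
print(to_feature_array(raw) == to_feature_array(packed))
```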
  • the server device 300 performs the object recognition processing using the captured image information 155 acquired from the terminal device 100 and the displayed image information 253a to 253c acquired from the display devices 200a to 200c. This processing judges whether the image 251 displayed on any one of the display devices 200a to 200c is contained in the captured image of the terminal device 100. When the image 251 is contained in the captured image, its position and posture are also detected.
  • the server device 300 transmits recognition result information 157, including information showing the position and the posture of the detected image 251, to the terminal device 100.
  • the terminal device 100 displays the virtual image 153 in the image 151 using the recognition result information 157.
  • the configuration of the terminal device 100 is further described with reference to FIG. 3 .
  • the terminal device 100 contains an image pickup portion 110, a display control portion 120, and the display portion 150.
  • the image pickup portion 110 is realized by an image pickup device built in or connected to the terminal device 100, for example, and acquires a captured image.
  • the image pickup portion 110 may output the captured image as a dynamic image or as a still image.
  • the image 251 to be displayed on the display screen 250 of the display device 200 is contained in the captured image.
  • the image pickup portion 110 provides the generated captured image to the display control portion 120 and also transmits the captured image information 155 corresponding to the captured image to the server device 300.
  • the terminal device 100 may further contain a processing circuit for use in generating the captured image information 155 from the image data of the captured image, a communication device for transmitting the captured image information 155, and the like.
  • the display control portion 120 is realized by, for example, the operation of a central processing unit (CPU), a random access memory (RAM), and a read only memory (ROM) of the terminal device 100 according to a program stored in a storage device or a removable storage medium.
  • the display control portion 120 displays the image 151 on the display portion 150 based on the image data of the captured image provided from the image pickup portion 110.
  • the virtual image 153 corresponding to the image 251 recognized from the captured image is contained in the image 151.
  • the display control portion 120 receives the recognition result information 157 from the server device 300 through a communication device (not illustrated).
  • the information on the position and the posture of the image 251 may be contained in the recognition result information 157.
  • information on the contents of the image 251 may be contained in the recognition result information 157.
  • the display control portion 120 may display the virtual image 153 utilizing the recognition result information 157.
  • the display portion 150 is realized by a liquid crystal display (LCD), an organic electroluminescence display, or the like which is possessed by the terminal device 100 as an output device or which is connected to the terminal device 100 as an external connection device, for example.
  • the display portion 150 is not always limited to a flat display and may be a head mount display (HMD), for example.
  • the display portion 150 displays the image 151 according to the control of the display control portion 120.
  • the display device 200 contains the display control portion 210 and the display screen 250.
  • the display control portion 210 is realized by the operation of a CPU, a RAM, and a ROM of the display device 200 according to a program, for example.
  • the display control portion 210 displays an image stored in a storage of the display device 200 or an image received by the display device 200 using a communication device (not illustrated) as the image 251 on the display screen 250.
  • the display control portion 210 transmits the displayed image information 253 corresponding to the image 251 to the server device 300.
  • the display device 200 may further contain a processing circuit for use in generating the displayed image information 253 from the image data of the displayed image, a communication device for transmitting the displayed image information 253, and the like.
  • the display screen 250 is realized by a display, such as an LCD or an organic EL display, which is possessed by the display device 200 as an output device, for example.
  • the display screen 250 displays the image 251 according to the control of the display control portion 210.
  • the server device 300 contains a captured image information acquisition portion 310, a displayed image information acquisition portion 320, and an object recognition portion 330. These portions are all realized by the operation of a CPU, a RAM, and a ROM of the server device 300 according to a program, for example.
  • the captured image information acquisition portion 310 acquires the captured image information 155 transmitted from the terminal device 100 through a communication device (not illustrated). As described above, the captured image information 155 corresponds to the captured image acquired by the terminal device 100.
  • the captured image information 155 may be the image data itself of the captured image, compressed image data, a data array of feature points for use in the object recognition, or the like, for example.
  • the captured image information acquisition portion 310 provides the acquired captured image information 155 to the object recognition portion 330.
  • the displayed image information acquisition portion 320 acquires the displayed image information 253 transmitted from the display device 200 through a communication device (not illustrated). As described above, the displayed image information 253 corresponds to the image 251 displayed by the display device 200.
  • the displayed image information 253 may be the image data itself of the image 251, compressed image data, a data array of feature points for use in the object recognition, or the like, for example.
  • the displayed image information acquisition portion 320 provides the acquired displayed image information 253 to the object recognition portion 330.
  • the object recognition portion 330 recognizes an object contained in the captured image using the captured image information 155 provided from the captured image information acquisition portion 310. For example, the object recognition portion 330 compares a set of the feature points extracted from the captured image with the shape of the object defined by model data. The object recognition portion 330 may compare image data, such as a symbol mark or a text label defined by model data, with the captured image. Furthermore, the object recognition portion 330 may compare the amount of the features of the known object image defined by model data with the amount of the features extracted from the captured image.
  • the model data includes data defining the shape of each object, image data, such as a predetermined symbol mark or a text label attached to each object, data of the feature amount set extracted from the known image about each object, or the like.
  • the model data are acquired from a model DB stored in a storage device, for example. Or, the model data may be acquired from a network through a communication device (not illustrated).
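The matching against model data could be sketched like this. The feature-set representation and the overlap score are deliberate simplifications assumed for illustration; a real recognizer would also verify matches geometrically to obtain position and posture.

```python
def recognize(captured_features, model_db, threshold=0.3):
    """Return the best-matching model object, or None below threshold.

    Each model in `model_db` (name -> feature list) is scored by the
    fraction of its features also found in the captured image. This is
    a hypothetical, deliberately simple scoring scheme.
    """
    best_name, best_score = None, 0.0
    captured = set(captured_features)
    for name, model_features in model_db.items():
        if not model_features:
            continue
        score = len(set(model_features) & captured) / len(model_features)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

models = {"image_A": ["f1", "f2", "f3", "f4"], "image_B": ["g1", "g2"]}
# Three of image_A's four features are present in the capture:
print(recognize(["f1", "f2", "f3", "x9"], models))  # image_A
```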
  • the object recognition portion 330 recognizes the image 251 contained in the captured image.
  • the recognition of the image 251 may be performed by searching the contents of the image 251, for example.
  • the object recognition portion 330 acquires a uniform resource locator (URL) of the Web page and a uniform resource identifier (URI) showing the operation state thereof.
  • the object recognition portion 330 acquires information on the contents of the image 251 described above using the displayed image information 253 provided from the displayed image information acquisition portion 320.
  • the object recognition portion 330 can easily acquire the information on the contents of the image 251 by the use of the displayed image information 253 which is the information provided from the display device 200 itself which displays the image 251. Therefore, the object recognition portion 330 can recognize the image 251 with a lower processing load and with higher accuracy.
  • the object recognition portion 330 transmits the object recognition result described above to the terminal device 100 as the recognition result information 157 through a communication device (not illustrated).
  • the recognition result information 157 includes information showing objects appearing (contained) in the captured image and the positions and the postures of the objects appearing in the captured image, for example. Therefore, when the image 251 appears in the captured image, the recognition result information 157 includes information showing the position and the posture of the image 251.
  • the recognition result information 157 may also further contain information on the contents of the image 251.
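As a rough sketch, one entry of the recognition result information 157 might carry fields like these. The field names and the simplified posture representation are assumptions for illustration, not the patent's actual data format.

```python
from dataclasses import dataclass, field

@dataclass
class RecognitionResult:
    """One recognized image in the captured image (illustrative layout)."""
    object_id: str   # which display device's image was found
    position: tuple  # (x, y) in captured-image coordinates
    posture: tuple   # (rotation, tilt, scale), simplified
    contents: dict = field(default_factory=dict)  # optional content info, e.g. a URL

result = RecognitionResult("image_A", (320, 180), (0.1, 0.0, 0.8),
                           {"url": "http://example.com/page"})
print(result.object_id, result.position)
```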
  • the object recognition processing performed by the object recognition portion 330 includes the following two kinds of processing, for example.
  • the first processing is processing of judging which object appears (contained) in the captured image.
  • the second processing is processing of detecting the appearance manner of the object contained in the captured image: the position and the posture (inclination, rotation, and the like) thereof in the captured image.
  • viewed from another perspective, the object recognition processing performed by the object recognition portion 330 also includes the following two kinds of processing, for example.
  • the first processing is a search processing.
  • the search processing is performed when information on objects appearing in the captured image has not been acquired yet, for example. In this case, any object may appear in the captured image.
  • moreover, even when an object does appear in the captured image, it may appear at any position in the captured image.
  • the second processing is a tracking processing.
  • the tracking processing is performed when the information on the object appearing in a frame prior to the frame of the captured image has already been acquired, for example.
  • an object appearing in the former frame has a high possibility of also appearing in the present frame. Therefore, such objects can be subjected to the recognition processing with priority.
  • moreover, such an object is likely to appear in the present frame at a position near the position where it appeared in the former frame. Therefore, with respect to these objects, the region where the recognition processing is to be performed can be narrowed down to some extent. Thus, the tracking processing can be performed with a lower processing load than the search processing.
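The split between search and tracking described above can be sketched as a planning step: objects seen in the previous frame get a small window around their last position, while anything else still requires a full-frame search. The function name and the fixed margin are illustrative assumptions.

```python
def plan_recognition(last_results, frame_size, margin=50):
    """For each object seen in the previous frame, return the region of
    the new frame to re-examine (tracking); objects absent from
    `last_results` would fall back to a full-frame search.

    `last_results` maps object name -> (x, y) last-known position;
    regions are (x0, y0, x1, y1), clamped to the frame.
    """
    plans = {}
    w, h = frame_size
    for obj, (x, y) in last_results.items():
        plans[obj] = (max(0, x - margin), max(0, y - margin),
                      min(w, x + margin), min(h, y + margin))
    return plans

# An object last seen at (300, 200) in a 640x480 frame only needs a
# 100x100 window re-examined, instead of the whole frame:
print(plan_recognition({"image_A": (300, 200)}, (640, 480)))
```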
  • FIG. 4 is a data flow diagram showing processing in this embodiment.
  • two devices of the plurality of display devices 200a to 200c illustrated in FIG. 2 are shown as the display devices 200a and 200b.
  • the terminal device 100 transmits the captured image information 155 to the server device 300 in Step S101.
  • the display devices 200a and 200b transmit the displayed image information 253a and 253b, respectively, to the server device 300.
  • in Step S103, the object recognition portion 330 of the server device 300 searches for the images 251 (hereinafter referred to as images A and B) displayed on the display devices 200a and 200b, respectively, using the captured image information 155 and the displayed image information 253a and 253b.
  • suppose that the image A is found in Step S103.
  • the server device 300 transmits the recognition result information 157 showing the position and the posture of the image A in the captured image to the terminal device 100.
  • the terminal device 100 displays the virtual image 153 corresponding to the image A in the image 151 using the received recognition result information 157.
  • in Step S105, the terminal device 100 transmits the captured image information 155 to the server device 300.
  • the captured image is a dynamic image in this embodiment.
  • the captured image information 155 to be transmitted in Step S105 corresponds to a frame after a frame corresponding to the captured image information 155 transmitted in Step S101.
  • the captured image information 155 may not always be transmitted for every frame of the captured image. Therefore, the captured image information 155 to be transmitted in Step S105 may correspond to a frame several frames after the frame corresponding to the captured image information 155 transmitted in Step S101, for example.
  • the display device 200b transmits the displayed image information 253b to the server device 300.
  • the display device 200a may not transmit the displayed image information 253a. This is because the image A was already found in Step S103 above, and, in the following steps, tracking of the image A contained in the captured image can be performed using the detection result obtained in that process.
  • in Step S107, the object recognition portion 330 of the server device 300 tracks the already found image A and also searches for the image B using the captured image information 155 and the displayed image information 253b.
  • the image B is not found in Step S107 either. Therefore, the server device 300 transmits the recognition result information 157 showing the position and the posture of the image A detected by the tracking to the terminal device 100.
  • the terminal device 100 may further update the virtual image 153 corresponding to the image A using the received recognition result information 157.
  • in Step S109, the terminal device 100 transmits the captured image information 155 to the server device 300.
  • the captured image information 155 transmitted in Step S109 corresponds to a frame still later than the frame in Step S105.
  • in Step S109, the display devices 200a and 200b transmit the displayed image information 253a and 253b, respectively, to the server device 300.
  • the image 251 (image A) to be displayed on the display screen 250 changes between Step S105 and Step S109 in the display device 200a.
  • the display device 200a also transmits the displayed image information 253a to the server device 300. This is because, when the image A changes, the tracking of the image A in the object recognition portion 330 is likely to fail.
  • in Step S111, the object recognition portion 330 of the server device 300 searches for the images A and B using the captured image information 155 and the displayed image information 253a and 253b.
  • the search for the image A may be performed after the tracking of the image A by the object recognition portion 330 has actually failed.
  • both the images A and B are found in Step S111.
  • the server device 300 transmits the recognition result information 157 showing the positions and the postures of the images A and B in the captured image to the terminal device 100.
  • the terminal device 100 displays two virtual images 153 corresponding to the image A and the image B in the image 151 using the received recognition result information 157.
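The message flow in Steps S101 to S111 amounts to a per-frame loop on the server side: track images that are already found, and run a full search for the rest or for an image whose displayed contents have changed. The following Python is a minimal illustrative sketch; `search`, `track`, and the dictionary-based frames are simplified stand-ins for real image recognition, not APIs from the patent.

```python
def search(frame, name):
    """Full search: scan the whole captured frame (relatively expensive)."""
    return frame.get(name)          # position/posture if visible, else None

def track(frame, name, last_state):
    """Tracking: follow an already found image (relatively cheap)."""
    return frame.get(name)

def recognize_stream(frames, image_names, changed_at=()):
    """For each captured frame, track already found images and search for
    the rest; `changed_at` lists (image, frame index) pairs at which the
    displayed image changed, forcing a new search as in Step S111."""
    found = {}                       # image name -> last known state
    results = []
    for i, frame in enumerate(frames):
        per_frame = {}
        for name in image_names:
            if name in found and (name, i) not in changed_at:
                state = track(frame, name, found[name])
            else:
                state = search(frame, name)
            if state is None:
                found.pop(name, None)        # lost: search again next frame
            else:
                found[name] = state
                per_frame[name] = state
        results.append(per_frame)
    return results
```

Tracking an already found image is cheaper than searching the whole frame, which is why the server falls back to the search only when an image is new, lost, or changed.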
  • FIG. 5 is a view for explaining the timing of the image information acquisition in this embodiment.
  • the captured image information 155 for six frames (which do not necessarily constitute successive frames of a dynamic image) and the displayed image information 253-1 to 253-3 used with the captured image information 155 for recognition of the display screen 250 are shown on the time axis in a corresponding manner.
  • the captured image information 155 is acquired periodically at a substantially fixed interval.
  • the displayed image information 253-1 to 253-3 is acquired at timings different from those of the captured image information 155.
  • the displayed image information 253-1 to 253-3 may be acquired when the image 251 changes as shown in the example of FIG. 4 , for example.
  • the displayed image information 253 is desirably acquired at an interval longer than that of the captured image information 155.
  • the object recognition is usually performed for each piece of captured image information 155. Therefore, even when the displayed image information 253 is acquired at a frequency higher than that of the captured image information 155, some of the displayed image information 253 may consequently go unused.
  • the displayed image information 253-1 is used for the first frame
  • the displayed image information 253-2 is used for the following three frames
  • the displayed image information 253-3 is used for the following two frames.
  • the object recognition portion 330 may continuously use the displayed image information 253 acquired before.
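The assignment above (one frame using 253-1, the next three using 253-2, the following two using 253-3) follows from simply pairing each captured frame with the most recently acquired displayed image information. A minimal sketch with hypothetical timestamps chosen to reproduce the pattern of FIG. 5:

```python
import bisect

def pick_displayed_info(displayed_times, frame_time):
    """Return the index of the most recently acquired displayed image
    information whose timestamp is not later than the captured frame's
    timestamp, or None if none has been acquired yet."""
    i = bisect.bisect_right(displayed_times, frame_time)
    return i - 1 if i > 0 else None

# hypothetical acquisition times for 253-1, 253-2, 253-3 and six frames
displayed_times = [0.0, 1.5, 4.5]
frame_times = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
choices = [pick_displayed_info(displayed_times, t) for t in frame_times]
```

With these timestamps, `choices` pairs the first frame with 253-1 (index 0), the next three with 253-2, and the last two with 253-3, matching the figure.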
  • the captured image information acquisition portion 310 and the displayed image information acquisition portion 320 each acquire information in chronological order
  • the displayed image information 253 and the captured image information 155 to be matched with each other are determined based on synchronization information contained in at least one of the captured image information 155 and the displayed image information 253, for example.
  • the synchronization information includes information on the time at which each information is generated, for example.
  • the object recognition portion 330 may synchronize the displayed image information 253 and the captured image information 155 using the synchronization information as it is.
  • the object recognition portion 330 may select the displayed image information 253 to be used with the captured image information 155 according to a delay between the captured image information 155 and the displayed image information 253.
  • the delay may arise due to time lags between the devices, delays caused by communication between the devices, processing delays in each device, or the like, for example.
  • the object recognition portion 330 may detect the delay by comparing the timing at which the change of the image 251 is indicated by the displayed image information 253 with the timing, at which the contents of the image 251 change, recognized from the captured image.
  • the object recognition portion 330 can match suitable displayed image information 253 to the captured image information 155 by applying an offset to the synchronization information of either the captured image information 155 or the displayed image information 253, for example.
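As an illustration of the offset idea, the sketch below estimates the delay by comparing when a change of the image 251 was reported via the displayed image information with when the change was actually recognized in the captured image, and then applies that offset when matching. All timestamps and function names are hypothetical, not from the patent.

```python
def estimate_delay(change_reported_at, change_observed_at):
    """Estimate the delay of the displayed image information relative to
    the captured image: time the change was seen in the captured image
    minus time the change was reported by the display device."""
    return change_observed_at - change_reported_at

def pick_with_offset(displayed_times, frame_time, offset):
    """Choose the displayed image information for a captured frame after
    shifting the displayed-side synchronization times by the offset."""
    candidates = [i for i, t in enumerate(displayed_times)
                  if t + offset <= frame_time]
    return candidates[-1] if candidates else None
```

Without the offset, a frame captured just after a reported change could be matched to displayed image information that had not actually reached the screen yet; the shift corrects for that.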
  • information on an image which may be contained in the captured image of the terminal device is provided from the display device itself which displays the image.
  • the recognition processing of the displayed image contained in the captured image is performed by the server device, and then the result is transmitted to the terminal device.
  • the result of recognizing the displayed image of another device contained in the captured image can be utilized while suppressing the processing load in the terminal device to the minimum, for example.
  • FIG. 6A and FIG. 6B are views for explaining the device configuration of this embodiment.
  • FIG. 7 is a schematic block diagram showing the functional configuration of a system according to this embodiment. For simplicity, a plurality of display devices 400a to 400c illustrated in FIG. 6A and FIG. 6B are represented by a single display device 400 in FIG. 7 .
  • this example relates to a terminal device 100 and display devices 400a to 400c (one example of an information processing device).
  • the number of the display devices 400 need not be three as in the illustrated example and may be one, two, or four or more.
  • the terminal device 100 transmits captured image information 155 corresponding to a captured image to each of the display devices 400a to 400c.
  • the contents of the captured image information 155 are the same as those in the case of the first embodiment.
  • the display devices 400a to 400c internally acquire displayed image information 253a to 253c corresponding to images 251a to 251c displayed on a display screen 250, respectively. More specifically, the display device 400a internally acquires the displayed image information 253a, the display device 400b internally acquires the displayed image information 253b, and the display device 400c internally acquires the displayed image information 253c.
  • the contents of the displayed image information 253 are the same as those in the case of the first embodiment.
  • the display devices 400a to 400c perform object recognition processing using the captured image information 155 acquired from the terminal device 100 and the displayed image information 253a to 253c internally acquired by the display devices 400a to 400c, respectively. It is judged by this processing whether any one of the images 251a to 251c is contained in the captured image of the terminal device 100. When any one of the images 251a to 251c is contained in the captured image, the position and the posture thereof are also detected.
  • the display device 400a detects the image 251a contained in the captured image, and then transmits recognition result information 157a including information showing the position and the posture of the image 251a to the terminal device 100.
  • the display device 400c detects the image 251c contained in the captured image, and then transmits recognition result information 157c including information showing the position and the posture of the image 251c to the terminal device 100.
  • the terminal device 100 displays two virtual images 153 corresponding to the image 251a and the image 251c in the image 151 using the recognition result information 157a and 157c.
  • FIG. 6B illustrates a state where the display device 400a has lost (not recognized) the image 251a contained in the captured image in the state of FIG. 6A .
  • the display device 400a searches for and detects the image 251a contained in the captured image, and then successively performs tracking of the image 251a.
  • the display device 400a notifies the terminal device 100 of the failure.
  • the display device 400a switches the processing for the image 251a from tracking to searching.
  • the terminal device 100 may terminate the display of a virtual image 153 corresponding to the image 251a in an image 151 in response to the notification.
  • the display device 400c successively succeeds in tracking of the image 251c contained in the captured image in the state of FIG. 6B . Therefore, the display device 400c successively transmits the recognition result information 157c updated according to the tracking result to the terminal device 100.
  • the terminal device 100 may update the display of a virtual image 153 corresponding to the image 251c in the image 151 using the received recognition result information 157c.
  • the terminal device 100 contains an image pickup portion 110, a display control portion 120, and a display portion 150.
  • the display device 400 contains a display control portion 210, a display screen 250, a captured image information acquisition portion 310, a displayed image information acquisition portion 320, and an object recognition portion 430.
  • each portion is the same component as that described in the first embodiment described with reference to FIG. 3 . More specifically, it can be said that the functional configuration of this embodiment is a configuration in which the function realized by the server device 300 in the first embodiment is alternatively realized by the display device 400.
  • the object recognition portion 430 contained in the display device 400 in this embodiment differs from the object recognition portion 330 of the server device 300 in the first embodiment in that the object recognition processing is performed mainly on the image 251 displayed by the display device 400 itself (the image 251a in the display device 400a, the image 251b in the display device 400b, and the image 251c in the display device 400c).
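The division of labor in this embodiment, where each display device searches for and tracks only its own image and notifies the terminal device of results or failures, can be sketched as a small state machine. The class below is illustrative only; the dictionary lookup stands in for real search and tracking processing.

```python
class DisplayDevice:
    """One display device of the second embodiment: it recognizes only
    the image it displays itself and reports results to the terminal."""

    def __init__(self, name):
        self.name = name      # the image this device displays (e.g. "A")
        self.state = None     # last known position/posture, None = not found

    def on_captured_image(self, frame, notify):
        if self.state is not None:
            # tracking mode: follow the already found image
            new_state = frame.get(self.name)   # stand-in for real tracking
            if new_state is None:
                self.state = None
                notify(("lost", self.name))    # switch tracking -> search
                return
            self.state = new_state
            notify(("result", self.name, new_state))
        else:
            # search mode: full search for the image in the captured frame
            found = frame.get(self.name)       # stand-in for full search
            if found is not None:
                self.state = found
                notify(("result", self.name, found))
```

A device that loses its image sends one "lost" notification and silently keeps searching until the image reappears, at which point it resumes sending recognition results.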
  • FIG. 8 is a data flow diagram showing the processing in this embodiment.
  • the terminal device 100 first transmits the captured image information 155 to each of the display devices 400a to 400c in Step S201.
  • in Step S203, the object recognition portion 430 of the display device 400a searches for the image 251a (hereinafter referred to as an image A) using the captured image information 155 and the internally acquired displayed image information 253a.
  • the image A is found in Step S203.
  • the display device 400a transmits the recognition result information 157a showing the position and the posture of the image A in the captured image to the terminal device 100.
  • the terminal device 100 displays the virtual image 153 corresponding to the image A in the image 151 using the received recognition result information 157a.
  • in Step S203, the object recognition portion 430 of the display device 400b searches for the image 251b (hereinafter referred to as an image B) using the captured image information 155 and the internally acquired displayed image information 253b.
  • the image B is not found in the example illustrated in the figure.
  • the object recognition portion 430 searches for the image 251c (hereinafter referred to as an image C) using the captured image information 155 and the internally acquired displayed image information 253c in the display device 400c.
  • the image C is found in Step S203 in the example illustrated in the figure.
  • the display device 400c transmits the recognition result information 157c showing the position and the posture of the image C to the terminal device 100.
  • the terminal device 100 displays the virtual image 153 corresponding to the image C in the image 151 using the received recognition result information 157c.
  • in Step S205, the terminal device 100 transmits the captured image information 155 to each of the display devices 400a to 400c. Since the captured image is a dynamic image in this embodiment, the captured image information 155 transmitted in Step S205 corresponds to a frame later than the frame corresponding to the captured image information 155 transmitted in Step S201.
  • in Step S207, the object recognition portion 430 of the display device 400a tracks the already found image A.
  • the tracking of the image A is successfully performed.
  • the display device 400a transmits the recognition result information 157a showing the position and the posture of the image A updated according to the tracking result to the terminal device 100.
  • the terminal device 100 updates the display of the virtual image 153 contained in the image 151 using the received recognition result information 157a.
  • in Step S207, the display device 400b searches for the image B in the same manner as in Step S203, but the image B is not found.
  • the display device 400c performs tracking of the image C in the same manner as in the display device 400a, and then transmits the recognition result information 157c to the terminal device 100.
  • the terminal device 100 updates the display of the virtual image 153 contained in the image 151 using the received recognition result information 157c.
  • in Step S209, the terminal device 100 transmits the captured image information 155 to each of the display devices 400a to 400c.
  • the captured image information 155 transmitted in Step S209 corresponds to a frame still later than the frame in Step S205.
  • in Step S211, the object recognition portion 430 of the display device 400a continues the tracking of the image A.
  • the tracking of the image A fails in Step S211, i.e., the display device 400a has lost the image A.
  • the display device 400a transmits a notification that the image A has been lost to the terminal device 100.
  • the terminal device 100 receiving the notification terminates the display of the virtual image 153 corresponding to the image A in the image 151.
  • the display device 400a searches for the image A again using the captured image information 155 received in Step S209 and the internally acquired displayed image information 253a.
  • in Step S211, the display device 400b searches for the image B in the same manner as in Step S203 above, but the image B is not found.
  • the display device 400c performs tracking of the image C in the same manner as in Step S207 above, and then transmits the recognition result information 157c to the terminal device 100.
  • the terminal device 100 updates the display of the virtual image 153 contained in the image 151 using the received recognition result information 157c.
  • the search processing and the tracking processing of the image contained in the captured image are performed by the display device itself which displays the image, and then the result is transmitted to the terminal device.
  • a calculation resource of the display device can be effectively utilized, whereby the processing load in the terminal device can be suppressed.
  • a communication resource can be saved due to the fact that the displayed image information may not be transmitted between devices, for example.
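On the terminal device side, maintaining the virtual images 153 from the messages of the individual display devices reduces to a small update loop. The sketch below assumes hypothetical ("result", name, state) and ("lost", name) message tuples; neither the tuples nor the function are from the patent.

```python
def update_virtual_images(virtual_images, messages):
    """Maintain the set of virtual images 153 from per-device messages:
    a 'result' message adds or updates a virtual image, and a 'lost'
    message terminates the corresponding display."""
    for msg in messages:
        if msg[0] == "result":
            _, name, state = msg
            virtual_images[name] = state       # add or update position/posture
        elif msg[0] == "lost":
            virtual_images.pop(msg[1], None)   # terminate the virtual image
    return virtual_images
```

Because each display device reports only about its own image, the terminal merely merges independent streams of messages rather than performing any recognition itself.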
  • FIG. 9A to FIG. 9C are views for explaining the device configuration of this embodiment.
  • FIG. 10 is a schematic block diagram showing the functional configuration of a system according to this embodiment.
  • a plurality of display devices 600a to 600c illustrated in FIG. 9A to FIG. 9C are represented by a single display device 600 in FIG. 10.
  • this embodiment relates to a terminal device 500 and the display devices 600a to 600c (one example of an information processing device).
  • the number of the display devices 600 need not be three as in the illustrated example and may be one, two, or four or more.
  • the terminal device 500 transmits captured image information 155 corresponding to a captured image to each of the display devices 600a to 600c.
  • the contents of the captured image information 155 are the same as those in the case of the first embodiment.
  • the display devices 600a to 600c internally acquire displayed image information 253a to 253c corresponding to images 251a to 251c, respectively, displayed on a display screen 250. More specifically, the display device 600a internally acquires the displayed image information 253a, the display device 600b internally acquires the displayed image information 253b, and the display device 600c internally acquires the displayed image information 253c.
  • the contents of the displayed image information 253 are the same as those in the case of the first embodiment.
  • the display devices 600a to 600c perform object recognition processing using the captured image information 155 acquired from the terminal device 500 and the displayed image information 253a to 253c internally acquired by the display devices 600a to 600c, respectively. It is judged by this processing whether any one of the images 251a to 251c is contained in the captured image of the terminal device 500. When any one of the images 251a to 251c is contained in the captured image, the position and the posture thereof are also detected.
  • the display device 600a finds the image 251a contained in the captured image. Then, the display device 600a transmits tracking information 653a which can be utilized for tracking of the image 251a to the terminal device 500.
  • the tracking information 653a may include information on the position and the posture of the image 251a in the captured image and the contents of the image 251a, for example.
  • the terminal device 500 performs the tracking of the image 251a using the tracking information 653a, and then displays a virtual image 153 corresponding to the image 251a in the image 151.
  • FIG. 9B shows a state after the image 251a is detected in FIG. 9A .
  • the display device 600a delegates detection (tracking) of the position and the posture of the image 251a to the terminal device 500. More specifically, the terminal device 500 does not transmit the captured image information 155 to the display device 600a. The display device 600a also does not transmit the tracking information 653a to the terminal device 500. The terminal device 500 continues the tracking of the image 251a contained in the captured image using the tracking information 653a received before, and then updates the display of the virtual image 153 in the image 151 using the tracking result.
  • the terminal device 500 successively transmits the captured image information 155 to the display devices 600b and 600c.
  • the display devices 600b and 600c search for the images 251b and 251c, respectively, similarly as in the state of FIG. 9A .
  • FIG. 9C illustrates a state where the image 251a displayed on the display device 600a has changed or the tracking of the image 251a by the terminal device 500 has failed in the state of FIG. 9B .
  • the display device 600a detects (searches for) the position and the posture of the image 251a again, and then transmits new tracking information 653a to the terminal device 500.
  • the terminal device 500 performs tracking of the image 251a using the newly received tracking information 653a, and then updates the display of the virtual image 153 in the image 151.
  • the terminal device 500 contains an image pickup portion 110, a display control portion 120, a display portion 150, and an object recognition portion 530.
  • the display device 600 contains a display control portion 210, a display screen 250, a captured image information acquisition portion 310, a displayed image information acquisition portion 320, and an object recognition portion 430.
  • each portion is the same component as that described in the second embodiment with reference to FIG. 7 .
  • the object recognition portion is contained in both the terminal device 500 and the display device 600.
  • the object recognition portion 430 of the display device 600 mainly performs search processing of the image 251 displayed by the display device 600 itself. As described above with reference to FIG. 9A to FIG. 9C , the object recognition portion 430 of the display device 600 performs the processing of searching for the image 251 until the image 251 is found in the captured image. When the image 251 is found, the object recognition portion 430 transmits tracking information 653 containing information on the position and the posture of the image 251 and the contents of the image 251, for example, to the object recognition portion 530 of the terminal device 500. Thereafter, the object recognition portion 430 may stop the recognition processing of the image 251 until the tracking of the image 251 by the terminal device 500 fails or the contents of the image 251 are changed.
  • the object recognition portion 530 of the terminal device 500 performs tracking of the image 251 once the image 251 has been found in the captured image.
  • the information on the position and the posture of the image 251 found before and the contents of the image 251 to be used for the tracking processing may be acquired from the tracking information 653 transmitted by the object recognition portion 430 of the display device 600. Therefore, even when the search processing with relatively high processing load is not performed, the object recognition portion 530 can start tracking processing with relatively low processing load.
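The handover from the display device to the terminal device can be sketched as follows: the terminal-side object recognition portion never searches, it only tracks images delivered via tracking information 653 and reports which images it has lost so that the display device can resume searching. The `Terminal` class and its dictionary-based tracking below are illustrative assumptions, not the patent's implementation.

```python
class Terminal:
    """Terminal-side object recognition portion 530 of the third
    embodiment: it performs only the low-load tracking, starting from
    tracking information handed over by a display device."""

    def __init__(self):
        self.tracked = {}     # image name -> last known position/posture

    def receive_tracking_info(self, name, state):
        # handover: start tracking immediately, without any search
        self.tracked[name] = state

    def on_frame(self, frame):
        """Track all handed-over images; return the names of images whose
        tracking failed so that search can be delegated back to the
        corresponding display device."""
        lost = []
        for name in list(self.tracked):
            new_state = frame.get(name)   # stand-in for low-cost tracking
            if new_state is None:
                del self.tracked[name]
                lost.append(name)
            else:
                self.tracked[name] = new_state
        return lost
```

Because the position, posture, and contents of the image arrive in the tracking information, the terminal can begin the cheap tracking step without ever running the expensive search step itself.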
  • the recognition processing of an object other than the image 251 may be performed by either the object recognition portion 430 or the object recognition portion 530 described above.
  • an object recognition portion which performs the recognition processing of an object other than the image 251 may be contained in the terminal device 100 also in the first and second embodiments.
  • FIG. 11 is a data flow diagram showing the processing in this embodiment.
  • the terminal device 500 first transmits the captured image information 155 to each of the display devices 600a to 600c in Step S301.
  • in Step S303, the object recognition portion 430 of the display device 600a searches for the image 251a (hereinafter referred to as an image A) using the captured image information 155 and the internally acquired displayed image information 253a.
  • the image A is found in Step S303.
  • the display device 600a transmits the tracking information 653a which can be utilized for the tracking of the image 251a to the terminal device 500.
  • the object recognition portion 530 of the terminal device 500 performs tracking of the image A using the received tracking information 653a, and then displays a virtual image 153 corresponding to the image A in the image 151.
  • the object recognition portion 430 searches for the image 251b (hereinafter referred to as an image B) and the image 251c (hereinafter referred to as an image C), but the images are not found.
  • in Step S305, the terminal device 500 transmits the captured image information 155 to each of the display devices 600b and 600c.
  • the captured image is a dynamic image
  • the captured image information 155 transmitted in Step S305 corresponds to a frame later than the frame corresponding to the captured image information 155 transmitted in Step S301.
  • since the terminal device 500 recognizes the image A at this point, it does not transmit the captured image information 155 to the display device 600a.
  • in Step S307, the object recognition portion 530 of the terminal device 500 performs tracking of the image A.
  • the object recognition portion 430 searches for the image B and the image C but the images are not found.
  • in Step S309, the terminal device 500 transmits the captured image information 155 to each of the display devices 600b and 600c in the same manner as in Step S305.
  • in Step S311, the display device 600a detects a change of the image 251a. Then, the display device 600a transmits tracking information 653a corresponding to the changed image 251a to the terminal device 500. In the terminal device 500, the object recognition portion 530 performs tracking of the image 251a using the newly received tracking information 653a, and then updates the display of the virtual image 153 in the image 151 using the tracking result.
  • the tracking after the displayed image is found by the search is performed by the terminal device which acquires the captured image. More specifically, the search processing, which has a relatively high processing load, is offloaded from the terminal device, while the tracking processing, which has a relatively low processing load, is performed by the terminal device. Thus, unnecessary consumption of a calculation resource of the display device is prevented, for example. Moreover, a communication resource can be saved by eliminating the necessity of transmitting the captured image information to all the display devices, for example.
  • the terminal device may function as the information processing device, for example. Also in this case, the processing load of the object recognition in the terminal device is reduced due to the fact that the information corresponding to the image displayed by the display device is provided.
  • FIG. 12 is a block diagram for explaining the hardware configuration of the information processing device.
  • the information processing device 900 contains a central processing unit (CPU) 901, a read only memory (ROM) 903, and a random access memory (RAM) 905. Furthermore, the information processing device 900 may also contain a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925. The information processing device 900 may have a processing circuit, such as digital signal processor (DSP), in place of or together with the CPU 901.
  • the CPU 901 functions as an arithmetic processing device and a control device and controls all or some operations in the information processing device 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or a removable recording medium 927.
  • the ROM 903 stores a program, an operation parameter, and the like to be used by the CPU 901.
  • the RAM 905 primarily stores a program to be used in the execution of the CPU 901, parameters which change as appropriate in the execution thereof, and the like.
  • the CPU 901, the ROM 903, and the RAM 905 are connected to each other by the host bus 907 constituted by an internal bus, such as a CPU bus. Furthermore, the host bus 907 is connected to the external bus 911, such as a peripheral component interconnect/interface (PCI) bus, through the bridge 909.
  • An input device 915 is a device operated by a user, such as a mouse, a keyboard, a touch panel, a button, a switch, a lever, and the like, for example.
  • the input device 915 may be a remote control device utilizing infrared rays or other electric waves, for example, or may be an external connection device 929, such as a cellular phone corresponding to an operation of the information processing device 900.
  • the input device 915 contains an input control circuit which generates an input signal based on information input by a user, and then outputs the input signal to the CPU 901. The user operates the input device 915 to thereby input various kinds of data or direct a processing operation to the information processing device 900.
  • the output device 917 is constituted by a device capable of visually or audibly notifying a user of the acquired information.
  • the output device 917 may be, for example, a display device, such as a liquid crystal display (LCD), a plasma display panel (PDP), and an organic electro-luminescence (EL) display, a sound output device, such as a speaker and a headphone, a printer device, and the like.
  • the output device 917 outputs the result obtained by the processing of the information processing device 900 as a picture, such as a text or an image, or outputs it as a sound, such as a voice.
  • the storage device 919 is a data storing device constituted as one example of a storage portion of the information processing device 900.
  • the storage device 919 is constituted by a magnetic storage device, such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device, or the like, for example.
  • the storage device 919 stores a program to be executed by the CPU 901 and various kinds of data, various kinds of data acquired from the outside, and the like.
  • the drive 921 is a reader/writer for the removable recording media 927, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing device 900.
  • the drive 921 reads information recorded on the attached removable recording medium 927, and then outputs the read information to the RAM 905.
  • the drive 921 writes a record in the attached removable recording medium 927.
  • the connection port 923 is a port for directly connecting a device to the information processing device 900.
  • the connection port 923 may be a universal serial bus (USB) port, an IEEE1394 port, a small computer system interface (SCSI) port, or the like, for example.
  • the connection port 923 may also be an RS-232C port, an optical audio terminal, a high-definition multimedia interface (HDMI) port, or the like.
  • the communication device 925 is a communication interface constituted by a communication device for the connection with the communication network 931 and the like, for example.
  • the communication device 925 may be a communication card for a wired or wireless local area network (LAN), Bluetooth (Registered Trademark), a wireless USB (WUSB), or the like.
  • the communication device 925 may also be a router for optical communication, a router for an asymmetric digital subscriber line (ADSL), or a modem for various communications.
  • the communication device 925 transmits and receives a signal and the like to and from the Internet or other communication devices using predetermined protocols, such as TCP/IP, for example.
  • the communication network 931 connected to the communication device 925 is a wired or wirelessly-connected network and may be the Internet, a home LAN, infrared data communication, radio wave data communication, or satellite data communication, for example.
  • the image pickup device 933 is a device which captures an image of a real space to generate a captured image using various kinds of members, such as an image pickup element, for example a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS), and a lens for controlling formation of a target image on the image pickup element, for example.
  • the image pickup device 933 may capture still images or may capture moving images.
  • the sensor 935 includes various kinds of sensors, such as an accelerometer, a gyroscope, a geomagnetic sensor, an optical sensor, and a sound sensor, for example.
  • the sensor 935 acquires information on the state of the information processing device 900 itself, such as the posture of its case, and information on the surrounding environment of the information processing device 900, such as the brightness and noise around it, for example.
  • the sensor 935 may also include a global positioning system (GPS) sensor which receives a GPS signal to measure the latitude, longitude, and altitude of the device.
  • The above describes one example of the hardware configuration of the information processing device 900.
  • Each of the above-described components may be configured using general-purpose members or with hardware dedicated to the function of each component. Such configurations may be changed as appropriate according to the technical level at the time of implementation.
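The transmit/receive behavior described for the communication device 925 (predetermined protocols such as TCP/IP) can be sketched with a minimal, self-contained example; this is an illustrative sketch only, not part of the patent, and all names in it (`start_echo_server`, `send_and_receive`) are hypothetical:

```python
import socket
import threading

def start_echo_server(host="127.0.0.1", port=0):
    """Minimal TCP/IP peer: accepts one connection and echoes the bytes back."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))  # port=0 lets the OS pick a free port
    srv.listen(1)

    def serve():
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(data)  # echo the received payload back
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()  # (host, actual_port) to connect to

def send_and_receive(addr, payload: bytes) -> bytes:
    """Client side: transmit a payload over TCP/IP and receive the response."""
    with socket.create_connection(addr) as cli:
        cli.sendall(payload)
        return cli.recv(1024)

if __name__ == "__main__":
    addr = start_echo_server()
    reply = send_and_receive(addr, b"hello, device 925")
    print(reply.decode())  # prints "hello, device 925"
```

In a real device the same socket-level exchange would typically run over the wired or wireless LAN, WUSB, or ADSL links listed above, with the transport handled by the communication card or modem.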

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)
  • Digital Computer Display Output (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Telephone Function (AREA)
  • Information Transfer Between Computers (AREA)
EP13767258.0A 2012-03-26 2013-02-08 Information processing device, information processing method and program Active EP2833249B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012069713 2012-03-26
PCT/JP2013/053009 WO2013145883A1 (ja) 2012-03-26 2013-02-08 Information processing device, information processing method and program

Publications (3)

Publication Number Publication Date
EP2833249A1 EP2833249A1 (en) 2015-02-04
EP2833249A4 EP2833249A4 (en) 2016-04-06
EP2833249B1 true EP2833249B1 (en) 2019-06-19

Family

ID=49259168

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13767258.0A Active EP2833249B1 (en) 2012-03-26 2013-02-08 Information processing device, information processing method and program

Country Status (5)

Country Link
US (1) US9984468B2 (ja)
EP (1) EP2833249B1 (ja)
JP (2) JP5892236B2 (ja)
CN (1) CN104205012A (ja)
WO (1) WO2013145883A1 (ja)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102001218B1 (ko) * 2012-11-02 2019-07-17 삼성전자주식회사 Method for providing information related to an object, and device therefor
US9716842B1 (en) * 2013-06-19 2017-07-25 Amazon Technologies, Inc. Augmented reality presentation
US10152495B2 (en) 2013-08-19 2018-12-11 Qualcomm Incorporated Visual search in real world using optical see-through head mounted display with augmented reality and user interaction tracking
CN104898835B (zh) * 2015-05-19 2019-09-24 联想(北京)有限公司 Information processing method and electronic device
GB2555841A (en) * 2016-11-11 2018-05-16 Sony Corp An apparatus, computer program and method
CN107592520B (zh) * 2017-09-29 2020-07-10 京东方科技集团股份有限公司 Imaging apparatus and imaging method for an AR device
CN109582122B (zh) * 2017-09-29 2022-05-03 阿里巴巴集团控股有限公司 Augmented reality information providing method and apparatus, and electronic device
US20200042793A1 (en) * 2018-07-31 2020-02-06 Ario Technologies, Inc. Creating, managing and accessing spatially located information utilizing augmented reality and web technologies
CN111679731A (zh) * 2019-03-11 2020-09-18 三星电子株式会社 Display device and control method therefor
US11514594B2 (en) 2019-10-30 2022-11-29 Vergence Automation, Inc. Composite imaging systems using a focal plane array with in-pixel analog storage elements
CN111583329B (zh) * 2020-04-09 2023-08-04 深圳奇迹智慧网络有限公司 Augmented reality glasses display method and apparatus, electronic device, and storage medium
CN113094016B (zh) * 2021-06-09 2021-09-07 上海影创信息科技有限公司 System, method, and medium for information gain and display

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001136504A (ja) * 1999-11-08 2001-05-18 Sony Corp Information input/output system and information input/output method
JP4032776B2 (ja) 2002-03-04 2008-01-16 ソニー株式会社 Mixed reality display apparatus and method, storage medium, and computer program
JP2005122516A (ja) * 2003-10-17 2005-05-12 Aruze Corp Ordering system and portable terminal
KR100708178B1 (ko) * 2005-09-01 2007-04-16 삼성전자주식회사 Image processing method and apparatus, and information storage medium storing image information
JP2007193403A (ja) * 2006-01-17 2007-08-02 Hitachi Ltd Pointing device, pointer indication position control method, and display system
US8347213B2 (en) * 2007-03-02 2013-01-01 Animoto, Inc. Automatically generating audiovisual works
JP2008310446A (ja) * 2007-06-12 2008-12-25 Panasonic Corp Image retrieval system
US8677399B2 (en) * 2008-04-15 2014-03-18 Disney Enterprises, Inc. Preprocessing video to insert visual elements and applications thereof
JP5149744B2 (ja) * 2008-09-10 2013-02-20 株式会社デンソーアイティーラボラトリ Image retrieval apparatus, image retrieval system, image retrieval method, and program
US8731301B1 (en) * 2008-09-25 2014-05-20 Sprint Communications Company L.P. Display adaptation based on captured image feedback
JP5317798B2 (ja) * 2009-03-05 2013-10-16 ヤフー株式会社 Portable information retrieval device for searching for a target object in the real world
CN101996371A (zh) * 2009-08-17 2011-03-30 华为技术有限公司 Advertisement delivery method, apparatus, and system
US8913009B2 (en) * 2010-02-03 2014-12-16 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface
US8339364B2 (en) * 2010-02-03 2012-12-25 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface
JP5464661B2 (ja) * 2010-03-12 2014-04-09 Kddi株式会社 Information terminal device
EP2593920A4 (en) * 2010-07-12 2016-05-04 Google Inc SYSTEM AND METHOD FOR DETERMINING BUILDING NUMBERS
US8781152B2 (en) * 2010-08-05 2014-07-15 Brian Momeyer Identifying visual media content captured by camera-enabled mobile device
EP2619728B1 (en) * 2010-09-20 2019-07-17 Qualcomm Incorporated An adaptable framework for cloud assisted augmented reality
US8913171B2 (en) * 2010-11-17 2014-12-16 Verizon Patent And Licensing Inc. Methods and systems for dynamically presenting enhanced content during a presentation of a media content instance
US8984562B2 (en) * 2011-01-13 2015-03-17 Verizon Patent And Licensing Inc. Method and apparatus for interacting with a set-top box using widgets
US10204167B2 (en) * 2012-03-14 2019-02-12 Oath Inc. Two-dimension indexed carousels for in situ media browsing on mobile devices
US9438647B2 (en) * 2013-11-14 2016-09-06 At&T Intellectual Property I, L.P. Method and apparatus for distributing content

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
EP2833249A4 (en) 2016-04-06
CN104205012A (zh) 2014-12-10
JP6135783B2 (ja) 2017-05-31
JPWO2013145883A1 (ja) 2015-12-10
JP5892236B2 (ja) 2016-03-23
US9984468B2 (en) 2018-05-29
US20150070390A1 (en) 2015-03-12
JP2016129050A (ja) 2016-07-14
WO2013145883A1 (ja) 2013-10-03
EP2833249A1 (en) 2015-02-04

Similar Documents

Publication Publication Date Title
EP2833249B1 (en) Information processing device, information processing method and program
EP2832107B1 (en) Information processing apparatus, information processing method, and program
US10796157B2 (en) Hierarchical object detection and selection
US10768881B2 (en) Multi-screen interaction method and system in augmented reality scene
JP2022553174A (ja) Video retrieval method, apparatus, terminal, and storage medium
US20150020014A1 (en) Information processing apparatus, information processing method, and program
US20170068417A1 (en) Information processing apparatus, program, information processing method, and information processing system
CN105549878A (zh) E-book page-turning control method and device
EP3234746B1 (en) Electronic device and method for controlling a display
CN109618218B (zh) Video processing method and mobile terminal
US9424651B2 (en) Method of tracking marker and electronic device thereof
CN108769822B (zh) Video display method and terminal device
JPWO2014034256A1 (ja) Display control device, display control system, and display control method
KR102192159B1 (ko) Display method and electronic device processing the method
CN109388326A (zh) Electronic device, and control method and apparatus for a dual-screen electronic device
US20150145749A1 (en) Image processing apparatus and image processing method
JP7124281B2 (ja) Program, information processing device, and image processing system
WO2021104254A1 (zh) Information processing method and electronic device
US20220276822A1 (en) Information processing apparatus and information processing method
US10855639B2 (en) Information processing apparatus and information processing method for selection of a target user
CN115793904A (zh) Lock screen display method, mobile terminal, and storage medium
US20210342201A1 (en) Information processing apparatus, information processing method, information providing apparatus, information providing method, presentation controlling apparatus, presentation controlling method, and information processing system
KR20220147314A (ko) Video stream detection method and electronic device supporting the same
US20130342450A1 (en) Electronic devices
US20150213038A1 (en) Method for managing data and electronic device thereof

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20141014

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
RA4 Supplementary search report drawn up and despatched (corrected)

Effective date: 20160307

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/0346 20130101ALI20160301BHEP

Ipc: G06K 9/00 20060101ALI20160301BHEP

Ipc: G06F 3/01 20060101ALI20160301BHEP

Ipc: G06F 3/048 20060101AFI20160301BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20180208

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20190104

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1146330

Country of ref document: AT

Kind code of ref document: T

Effective date: 20190715

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602013056823

Country of ref document: DE

REG Reference to a national code

Ref country code: NL

Ref legal event code: FP

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190919

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190920

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190919

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1146330

Country of ref document: AT

Kind code of ref document: T

Effective date: 20190619

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191021

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191019

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200224

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602013056823

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG2D Information on lapse in contracting state deleted

Ref country code: IS

26N No opposition filed

Effective date: 20200603

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20200229

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200208

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200229

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200229

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200229

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200208

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200229

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20220125

Year of fee payment: 10

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20220121

Year of fee payment: 10

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190619

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230527

REG Reference to a national code

Ref country code: NL

Ref legal event code: MM

Effective date: 20230301

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20230208

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230301

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230208

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240123

Year of fee payment: 12