WO2015072194A1 - Display control device, display control method and program - Google Patents
Display control device, display control method and program
- Publication number
- WO2015072194A1 (PCT/JP2014/071386)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display
- display control
- orientation
- record
- Prior art date
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
- G06V10/242—Aligning, centring, orientation detection or correction of the image by image rotation, e.g. by 90 degrees
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
Definitions
- the present disclosure relates to a display control device, a display control method, and a program.
- In augmented reality (AR) technology, an object included in a captured image is recognized, and information related to the recognized object is displayed.
- Such information is also called an annotation, and is visualized as virtual objects in various forms such as text, icons, or animations.
- An example of such an AR technique is described in Patent Document 1.
- However, while Patent Document 1 proposes a technique for appropriately displaying a virtual object related to a real object in accordance with the position or orientation of that object, it does not propose a display technique that utilizes the real object itself. Since the position and orientation of an object can be detected, as described in Patent Document 1, it should be possible to display more useful information by utilizing the real object itself.
- The present disclosure therefore proposes a new and improved display control device, display control method, and program capable of displaying more useful information by utilizing a real object itself, based on the display position or orientation of the real object included in an image.
- According to the present disclosure, there is provided a display control device including: a record reference unit that refers to a record associating an image, an object included in the image, and a display position or posture of the object in the image; and a display control unit that, based on the record, transitions the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
- According to the present disclosure, there is also provided a display control method including: referring to a record that associates an image, an object included in the image, and a display position or posture of the object in the image; and, based on the record, transitioning the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
- According to the present disclosure, there is further provided a program for causing a computer to realize: a function of referring to a record that associates an image, an object included in the image, and a display position or posture of the object in the image; and a function of transitioning, based on the record, the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
- FIG. 5 is a diagram illustrating a first display example according to an embodiment of the present disclosure.
- FIG. 6 is a diagram illustrating a second display example according to an embodiment of the present disclosure.
- FIG. 7 is a diagram illustrating a third display example according to an embodiment of the present disclosure.
- FIG. 8 is a diagram illustrating a fourth display example according to an embodiment of the present disclosure.
- FIG. 9 is a diagram illustrating a fifth display example according to an embodiment of the present disclosure.
- FIG. 10 is a diagram for describing an example of processing for displaying an annotation according to an embodiment of the present disclosure.
- FIG. 11 is a diagram illustrating a sixth display example according to an embodiment of the present disclosure.
- FIG. 12 is a block diagram illustrating a hardware configuration example of an information processing apparatus according to an embodiment of the present disclosure.
- FIG. 1 is a diagram illustrating an outline of a device configuration according to an embodiment of the present disclosure.
- the smartphone 100 images a real space 200 including an object 201.
- a through image 300t generated by imaging is displayed on the display unit 118 of the smartphone 100.
- the through image 300t includes an object 301 corresponding to the actual object 201.
- FIG. 2 is a block diagram illustrating a schematic configuration of a smartphone according to an embodiment of the present disclosure.
- the smartphone 100 includes an imaging unit 102, an object recognition unit 104, an object DB 106, a record generation unit 108, a record reference unit 110, an image-object DB 112, a display control unit 114, an image DB 116, and a display unit 118.
- the smartphone 100 is an example of a display control device according to an embodiment of the present disclosure, and may be realized by, for example, a hardware configuration of an information processing device described later. Hereinafter, each component will be further described.
- the smartphone 100 includes at least a processor such as a CPU (Central Processing Unit), a memory or storage, a camera (imaging device), and a display (output device).
- the object recognition unit 104, the record generation unit 108, the record reference unit 110, and the display control unit 114 can be realized by the processor operating according to a program stored in a memory or storage.
- the object DB 106, the image-object DB 112, and the image DB 116 can be realized by memory or storage.
- the imaging unit 102 can be realized by a camera.
- the display unit 118 can be realized by a display.
- the imaging unit 102 captures an actual space and generates an image.
- the imaging unit 102 may generate a still image or a moving image.
- the image data generated by the imaging unit 102 is provided to the object recognition unit 104 and stored in the image DB 116 as necessary.
- the image data may be provided to the display control unit 114 and displayed on the display unit 118 as a through image or a preview image.
- the object recognition unit 104 performs object recognition on the image generated by the imaging unit 102.
- the object recognition unit 104 may refer to the object DB 106 for object recognition.
- model data relating to the shape or appearance of an object to be recognized is stored in advance.
- the model data includes data defining the shape of each object, image data such as a predetermined symbol mark or text label attached to each object, or feature amount set data extracted from a known image of each object.
- the object recognition unit 104 recognizes which object is included in the input image by using the image generated by the imaging unit 102 as the input image. For example, the object recognition unit 104 collates a set of feature points extracted from the input image with the shape of the object defined by the model data. Further, the object recognition unit 104 may collate image data such as a symbol mark or a text label defined by the model data with the input image. Further, the object recognizing unit 104 may collate the feature amount of the known object image defined by the model data with the feature amount extracted from the input image.
- the object recognition unit 104 may add the feature amount set representing a recognized object to the object DB 106 as feature amount set data extracted from a known image of that object.
- the object recognition unit 104 may also add new model data to the object DB 106 based on the shape, image data, feature amounts, and the like of a recognized object.
- the object recognition unit 104 can recognize the display position and orientation of the object in the image. More specifically, for example, the object recognition unit 104 detects the display position and orientation of the object included in the input image using the image generated by the imaging unit 102 as the input image.
- the posture of the object is expressed in a unified manner by a 4-by-4 homogeneous transformation matrix that represents the transformation between the model coordinate system in the model data stored in the object DB 106 and the coordinate system of the object shown in the input image.
- the object recognition unit 104 can extract the angle of the object with respect to the smartphone 100 from the homogeneous transformation matrix.
- the display position of the object in the image can be represented by, for example, the center coordinates of the object in the image.
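As a rough illustration of the recognition output described above, the in-plane rotation angle and the translation of an object can be read off from a homogeneous transformation matrix. The sketch below uses NumPy; the matrix layout and the choice of extracting the rotation about the viewing axis are illustrative assumptions, not part of the original disclosure.

```python
import numpy as np

def pose_angle_and_position(pose: np.ndarray):
    """Extract an in-plane rotation angle (degrees) and the translation
    from a 4x4 homogeneous transformation matrix (model -> camera).

    The angle is the rotation about the camera's viewing (z) axis,
    recovered from the upper-left 3x3 rotation block; the translation
    is the last column. Names and axis choice are illustrative.
    """
    rotation = pose[:3, :3]
    translation = pose[:3, 3]
    # Rotation about z: atan2 of the first column's y/x components.
    angle = np.degrees(np.arctan2(rotation[1, 0], rotation[0, 0]))
    return angle, translation

# Example: an object rotated 30 degrees in the image plane, centred at (120, 80).
theta = np.radians(30)
pose = np.array([
    [np.cos(theta), -np.sin(theta), 0.0, 120.0],
    [np.sin(theta),  np.cos(theta), 0.0,  80.0],
    [0.0,            0.0,           1.0,   0.0],
    [0.0,            0.0,           0.0,   1.0],
])
angle, position = pose_angle_and_position(pose)
```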
- the record generation unit 108 generates a record in the image-object DB 112 based on the result of object recognition by the object recognition unit 104.
- in the image-object DB 112, for example, a record that associates an image, an object included in the image, and the display position and orientation of the object in the image is generated. For example, as shown in FIG. 3, when the object recognition unit 104 recognizes two objects 301a and 301b in the image 300 generated by the imaging unit 102, the record generation unit 108 adds records such as those shown in FIG. 4 to the image-object DB 112.
- FIG. 4 is a diagram illustrating an example of a record generated in the image-object DB according to an embodiment of the present disclosure.
- FIG. 4 illustrates a record including an image ID, an object ID, a position, and a posture as a record generated in the image-object DB 112.
- the record is unique for the combination of image ID and object ID.
- in the illustrated example, the object recognition unit 104 recognizes two known objects 301a and 301b in the image 300a generated by the imaging unit 102. The record generation unit 108 therefore adds two records 401a and 401b, as shown in FIG. 4, to the image-object DB 112.
- the record of the image-object DB 112 is unique for the combination of the image ID and the object ID. Therefore, as shown in the figure, if a plurality of objects 301a and 301b are found in the same image 300a, a plurality of records 401a and 401b corresponding to the objects can be generated. Furthermore, if at least one of the object 301a or the object 301b is found in another image 300b, a new record 401 is generated again.
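A minimal sketch of such records, assuming a simple in-memory mapping keyed by the (image ID, object ID) pair so that each pair stays unique; the field names are illustrative, not from the original disclosure:

```python
# Minimal sketch of the image-object records: one record per
# (image ID, object ID) pair, holding display position and pose.
image_object_db = {}

def add_record(image_id, object_id, position, pose_angle):
    # Keying by the pair keeps one record per (image, object) combination;
    # re-recognising the same object in the same image overwrites it.
    image_object_db[(image_id, object_id)] = {
        "position": position,   # e.g. centre coordinates of the object
        "pose": pose_angle,     # e.g. in-plane rotation in degrees
    }

# Two objects recognised in image "300a" yield two records,
# and the same object found in image "300b" yields a further record.
add_record("300a", "obj_a", (40, 60), 0.0)
add_record("300a", "obj_b", (200, 90), 15.0)
add_record("300b", "obj_a", (120, 30), -10.0)
```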
- the record reference unit 110 refers to records in the image-object DB 112 based on the result of object recognition by the object recognition unit 104. What is referred to here are records for other images that include the object recognized by the object recognition unit 104. That is, as in the example of FIGS. 3 and 4 above, when the object 301a is recognized in the image 300a, the record reference unit 110 refers to records for images other than the image 300a that include the object 301a.
- the record reference unit 110 may refer to the record in the image-object DB 112 in response to a request from the display control unit 114.
- in this case, records for images including an object designated by the display control unit 114 are referred to.
- by designating an object ID when referring to records, the record reference unit 110 can acquire at once all the records for the plurality of images that include a specific object.
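The reference step can be sketched as a lookup over such records by object ID; the flat list of tuples below is a hypothetical stand-in for the image-object DB 112:

```python
# Each record: (image_id, object_id, display_position). Illustrative data.
records = [
    ("300a", "obj_a", (40, 60)),
    ("300a", "obj_b", (200, 90)),
    ("300b", "obj_a", (120, 30)),
]

def images_containing(object_id):
    """Return every record whose object ID matches, across all images,
    i.e. all the images to which an object-mediated transition is possible."""
    return [r for r in records if r[1] == object_id]

hits = images_containing("obj_a")
```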
- the display control unit 114 controls display of an image on the display unit 118.
- the display control unit 114 may cause the display unit 118 to display an image generated by the imaging unit 102 (hereinafter also referred to as a captured image) as a through image or a preview image.
- the display control unit 114 may read out an image stored in the image DB 116 (hereinafter referred to as a recorded image) in accordance with an operation of the user of the smartphone 100 and display the image on the display unit 118. At this time, the display control unit 114 may process and display the captured image or the recorded image as necessary.
- the display control unit 114 can change the display position of an object included in the captured image or the recorded image (the image itself may be moved), or change the posture of the object.
- the display control unit 114 may transition the display of a first image (a captured image or a recorded image) including a certain object to the display of a second image (a recorded image different from the first image) including the same object.
- the display control unit 114 can transition the display of the image while maintaining the display position or orientation of the object, for example by changing the display position or orientation of the object in the first image or the second image.
- such a display is possible by, based on the record referred to by the record reference unit 110, identifying the second image of the transition destination (an image stored in the image DB that includes the same object as the first image) and recognizing the display position and orientation of the object included in the second image.
- the display control unit 114 recognizes the display position and orientation of the object based on the result of object recognition by the object recognition unit 104 (when the first image is a captured image), or on another record acquired by the record reference unit 110 (when the first image is a recorded image).
- in the image DB 116, an image generated by the imaging unit 102 is stored in response to an operation by the user of the smartphone 100.
- the image DB 116 may store an image acquired from an external device (not shown) via a network.
- the image DB 116 may store an image content entity, or may store link information to the image content entity.
- it is desirable that the image ID given to images stored in the image DB 116 and the image ID in the image-object DB 112 are common, or at least mutually convertible. Similarly, it is desirable that the object ID given to the model data stored in the object DB 106 and the object ID in the image-object DB 112 are common or convertible.
- the display control device may be various devices including a display and an input device such as a desktop or notebook personal computer, a television, a tablet terminal, a media player, or a game machine.
- the display control apparatus does not necessarily include an imaging unit; for example, it may acquire images shared via a network, or acquire images accumulated in storage or read from removable media.
- the smartphone 100 (which may be another terminal device) may include the imaging unit 102 and the display unit 118, while the other components, such as the object recognition unit 104, the object DB 106, the record generation unit 108, the record reference unit 110, the image-object DB 112, the display control unit 114, and the image DB 116, are realized on a server on the network.
- the server is an example of a display control device.
- some or all of the object DB 106, the image-object DB 112, and the image DB 116 may be realized in a storage on a network.
- FIG. 5 is a diagram illustrating a first display example according to an embodiment of the present disclosure.
- in A, an image 310 including objects 311a to 311h is displayed.
- in B, the object 311c is highlighted, and the postures of the objects 311a to 311h change while maintaining their mutual positional relationship, so that the object 311c directly faces the screen.
- in C, when the image 310 zooms in on the directly-facing object 311c, the object 311c is displayed enlarged.
- FIG. 6 is a diagram illustrating a second display example according to an embodiment of the present disclosure.
- first, an image 310 including objects 311a to 311h is displayed (as in FIG. 5).
- next, the object 311h is highlighted, and the postures of the objects 311a to 311h change while maintaining their positional relationship, so that the object 311h directly faces the screen.
- further, in response to a predetermined operation (for example, a drag, pinch-in, or double tap), the image 310 zooms in centered on the directly-facing object 311h, and the object 311h is displayed enlarged.
- in the first and second display examples, the image is displayed such that the orientation of an object designated in the image is changed so that the object directly faces the screen, the display range is moved so that the designated object is displayed at the center, or the image is zoomed in so that the designated object is enlarged.
- such display changes are made possible by the display control unit 114 processing an image acquired from the imaging unit 102 or the image DB 116 based on the records in the image-object DB 112 referred to by the record reference unit 110 (records that associate an image, an object included in the image, and the display position or orientation of the object in the image).
- more specifically, the display control unit 114 deforms the image of each object portion so that the object faces the screen, based on information indicating the posture of each object in the image. The display control unit 114 also rearranges the deformed objects so that the positional relationship before the deformation is reproduced, based on information indicating the display position of each object in the image. Furthermore, based on that display position information, the display control unit 114 moves the display range of the image so that the designated object is displayed at the center, or zooms in on the image so that the designated object is enlarged.
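The "make the object face the screen" step amounts to cancelling the object's recognized rotation. A minimal NumPy sketch, assuming the recognized posture is available as a 3x3 rotation matrix (a full pipeline would additionally warp the object's image patch with the resulting transform):

```python
import numpy as np

def facing_correction(rotation: np.ndarray) -> np.ndarray:
    """Return the transform that cancels the object's recognised 3x3
    rotation so the object directly faces the screen. For a rotation
    matrix the inverse is simply the transpose."""
    return rotation.T

# An object tilted 25 degrees about the vertical axis ...
phi = np.radians(25)
tilt = np.array([
    [np.cos(phi),  0.0, np.sin(phi)],
    [0.0,          1.0, 0.0],
    [-np.sin(phi), 0.0, np.cos(phi)],
])
# ... composed with its correction yields the identity: a directly-facing pose.
corrected = facing_correction(tilt) @ tilt
```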
- FIG. 7 is a diagram illustrating a third display example according to an embodiment of the present disclosure.
- in A, the object 311c is displayed directly facing the screen and enlarged; this display may be reached through the series of displays described above with reference to FIG. 5.
- next, as shown in B, the display of the image 310 transitions to the display of a different image 320 (second image).
- the image 320 includes an object 321c.
- the object 321c is the same object as the object 311c (that is, recorded in the image-object DB 112 with the same object ID).
- the image 320 includes objects 321i and 321j in addition to the object 321c.
- here, the display position and orientation of the objects 311c and 321c are maintained through the display of the image 310 shown in A and the display of the image 320 shown in B: in the image 310 of A the object 311c is arranged near the center, and in the image 320 of B the object 321c is likewise arranged near the center. Thereafter, automatically or in response to a predetermined user operation, the display of the image 320 changes, as shown in C, to a state in which the object 321c is neither directly facing the screen nor displayed at the center; in the illustrated example this is the original state of the image 320. That is, in B the image 320 is processed and displayed so that the posture and display position the object 311c had in the image 310 before the transition are maintained for the object 321c displayed after the transition.
- in this way, the display control unit 114 changes the posture of an object to a predetermined posture (for example, directly facing the screen) in the display of the first image, and maintains that posture, at least temporarily, after the transition to the display of the second image, thereby seamlessly expressing a transition between images that include the same object. For example, the display can transition from the image 310 including the objects 311a to 311h, through the objects 311c and 321c, to the image 320 including the objects 321c, 321i, and 321j, so that information on the objects 321i and 321j, which are not included in the image 310, can be newly obtained.
- a transition to another image including the same object as the object 321j may be executed.
- by repeating such object-mediated image transitions while changing the target object as necessary, links between objects, or between pieces of information included in the images, can be formed based on new contexts.
- as a modification, the display control unit 114 may first change the display position and orientation of the object 311c to the same display position and orientation as the object 321c in the original state of the image 320 (second image; the state shown in FIG. 7C), and then transition the display of the image 310 to the display of the image 320.
- alternatively, the display control unit 114 may leave the display position and orientation of the object 311c unchanged in the display of the image 310 (first image) (without making it directly face the screen or zooming in as in FIG. 5), and instead execute the transition after changing the display position and orientation of the object 321c to the same display position and orientation as the object 311c in the display of the image 310.
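The pose-preserving transition described above can be sketched as computing the transform that maps the object's pose in the second image onto its pose in the first image, then applying that transform to the second image. The 2D homogeneous matrices below are an illustrative simplification of the 4x4 case:

```python
import numpy as np

def pose2d(angle_deg, tx, ty):
    """3x3 homogeneous 2D pose: rotation by angle_deg, then translation."""
    a = np.radians(angle_deg)
    return np.array([
        [np.cos(a), -np.sin(a), tx],
        [np.sin(a),  np.cos(a), ty],
        [0.0,        0.0,       1.0],
    ])

def alignment_transform(pose_first, pose_second):
    """Transform T to apply to the second image so its copy of the object
    takes on the object's pose in the first image:
    pose_first = T @ pose_second, hence T = pose_first @ inv(pose_second)."""
    return pose_first @ np.linalg.inv(pose_second)

# Hypothetical poses of the shared object in the two images.
pose_310 = pose2d(30.0, 100.0, 50.0)   # object 311c in image 310
pose_320 = pose2d(-10.0, 20.0, 20.0)   # object 321c in image 320
T = alignment_transform(pose_310, pose_320)
```

Applying `T` to the whole second image warps it so the object appears exactly where it was before the transition, which is what makes the transition read as seamless.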
- FIG. 8 is a diagram illustrating a fourth display example according to an embodiment of the present disclosure.
- in this display example, first, as shown in A, an image 330 (a web page) including an object 331a displayed as a graphic is displayed.
- the display of the image 330 transitions to a different image 340 (photograph) as shown in B.
- the image 340 includes an object 341a shown in the photograph.
- the object 341a is the same object as the object 331a (that is, recorded in the image-object DB 112 with the same object ID).
- in this way, the images before and after a transition may each be either a virtually composed image such as a web page or a captured image (photograph): a transition may be performed between a virtually composed image (web page) and a captured image (photograph), between virtually composed images, or between captured images.
- in this display example as well, the image 330 and the image 340 can be processed so that the display position or posture of the objects 331a and 341a is maintained before and after the transition.
- FIG. 9 is a diagram illustrating a fifth display example according to an embodiment of the present disclosure.
- in this display example, annotations 352 given by a plurality of users are cumulatively displayed in association with an object 351 (a poster) included in an image 350.
- An annotation can also be treated as an object, but in this example, the object 351 and the annotation 352 are distinguished for the sake of explanation.
- as shown in the figure, the annotations 352 can be scrolled while maintaining the posture in which they are displayed superimposed on the object 351 (whereas annotations 352a to 352e are displayed in A, annotations 352c to 352g are displayed in B).
- the annotations 352 can also be displayed directly facing the screen, independently of the posture of the object 351.
- such a display becomes possible because the annotations 352 input by a plurality of users to the same object 351 included in different images 350 are recorded together with their relative positional relationship to the object 351 in each image 350.
- an example of processing for such display will be further described with reference to FIG.
- FIG. 10 is a diagram for describing an example of processing for displaying an annotation according to an embodiment of the present disclosure.
- the annotation 362a is input by the user A to the object 361a included in the image 360a.
- the annotation 362a is recorded by, for example, the object ID of the object 361a, the relative position of the annotation 362a with respect to the object 361a, the relative angle of the annotation 362a with respect to the object 361a, and the content (text or image) of the annotation 362a.
- another user B is referring to another image 360b including the same object 361b as the object 361a.
- the object 361b is displayed at a display position and orientation different from those of the object 361a in the image 360a.
- nevertheless, the object 361b can be associated with the object 361a because both are recorded in the image-object DB 112 with the same object ID.
- since the annotation 362a input by the user A is recorded together with its relative position and angle with respect to the object 361a, the annotation 362a can be displayed in the image 360b with the object 361b as a reference.
- more specifically, the display control unit 114 calculates the difference in display position and orientation between the object 361a in the image 360a and the object 361b in the image 360b based on the image-object DB records acquired by the record reference unit 110, and, by adding the calculated difference to the relative position and angle of the annotation 362a with respect to the object 361a, can display the annotation 362a in the image 360b with the object 361b as a reference.
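That calculation can be sketched as follows, using 2D homogeneous poses for brevity; the function and field names are hypothetical stand-ins for the recorded relative position and angle of the annotation:

```python
import numpy as np

def pose2d(angle_deg, tx, ty):
    """3x3 homogeneous 2D pose: rotation by angle_deg, then translation."""
    a = np.radians(angle_deg)
    return np.array([
        [np.cos(a), -np.sin(a), tx],
        [np.sin(a),  np.cos(a), ty],
        [0.0,        0.0,       1.0],
    ])

def place_annotation(object_pose, rel_position, rel_angle_deg):
    """Map an annotation recorded relative to its object into an image,
    given the object's pose in that image. Returns (x, y, angle_deg)."""
    x, y = (object_pose @ np.array([*rel_position, 1.0]))[:2]
    # Object's in-plane angle plus the annotation's recorded relative angle.
    angle = np.degrees(np.arctan2(object_pose[1, 0], object_pose[0, 0])) + rel_angle_deg
    return x, y, angle

# Annotation recorded 10 px to the object's right, rotated 5 degrees relative
# to it. In the other image the object appears at (200, 120), rotated 90 deg:
x, y, angle = place_annotation(pose2d(90.0, 200.0, 120.0), (10.0, 0.0), 5.0)
```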
- the user B may additionally input an annotation 362b to the object 361b included in the image 360b.
- the annotation 362b is also recorded with the object ID of the object 361b (the same as that of the object 361a), the relative position of the annotation 362b with respect to the object 361b, the relative angle of the annotation 362b with respect to the object 361b, and the content (text or image) of the annotation 362b.
- the annotation 362b can be displayed with the object 361a as a reference.
- annotations input by each of the users A and B may be displayed in an image 360c (web page) that is referred to by another user C.
- in this case, both the object 361c and the annotations 362a and 362b may be displayed directly facing the screen in the image 360c.
- the processing for displaying the object 361 and the annotation 362 as described above may be executed by a server on a network that provides services to the user A, the user B, and the user C, for example. That is, in this case, the server on the network may be an example of a display control device.
- FIG. 11 is a diagram illustrating a sixth display example according to an embodiment of the present disclosure.
- in this display example, option icons 373 for images to which the display can transition using the object 371 as a medium are displayed.
- a plurality of images including the same object as the object 371 are stored in the image DB 116, and the user can select which one to transition to using the option icon 373.
- four option icons 373a to 373d are displayed.
- the option icon 373 is displayed, for example, when the user performs a predetermined operation (for example, long press) on the object 371 via the touch panel or the like.
- when the user selects one of the option icons 373, the transition from the image 370 to the corresponding image is executed. In this transition as well, the display position or orientation of the object 371 may be maintained between the image 370 before the transition and the image after the transition.
- the option icons 373 may display, for example, a predetermined number of images randomly selected from the images to which the display can transition.
- alternatively, the option icons 373 may display images selected under predetermined conditions.
- in the illustrated example, the option icon 373a displays the image with the highest resolution, the option icon 373b the image with the highest recommendation level, the option icon 373c the image whose shooting location is closest to that of the image 370, and the option icon 373d the image with the highest evaluations from other users.
- to enable such a display, information indicating these attributes is stored in association with each image ID.
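The attribute-based selection of option icons might be sketched as follows; the attribute names and values are invented for illustration of the per-condition picks:

```python
# Candidate transition-destination images with stored attributes.
candidates = [
    {"image_id": "A", "resolution": 1920, "recommendation": 3, "distance_km": 5.0,  "rating": 4.1},
    {"image_id": "B", "resolution": 1280, "recommendation": 5, "distance_km": 0.4,  "rating": 3.2},
    {"image_id": "C", "resolution": 3840, "recommendation": 2, "distance_km": 12.0, "rating": 4.8},
]

# One pick per option icon: best resolution, best recommendation,
# nearest shooting location, best user rating.
option_icons = {
    "highest_resolution": max(candidates, key=lambda c: c["resolution"])["image_id"],
    "most_recommended":   max(candidates, key=lambda c: c["recommendation"])["image_id"],
    "nearest_location":   min(candidates, key=lambda c: c["distance_km"])["image_id"],
    "best_rated":         max(candidates, key=lambda c: c["rating"])["image_id"],
}
```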
- according to this display example, a state in which a plurality of objects are present in the same place can be regarded as one context, and an application that shares that state can be realized.
- in this case as well, the transition is seamlessly executed while the display position or orientation of the object (a book) is maintained, so that the user can easily recognize that the image display transition is executed through the object (the book).
- when a user's viewpoint images are continuously recorded by a camera mounted on a wearable display worn by the user, a huge number of images can be recorded. If, for example, the user feels that an object has been seen somewhere but cannot remember where, the user can capture a sample image of the object, search for and display images containing the same object as the one in the sample image, and thereby accurately recall the situation at the time the object was viewed.
- if the viewpoint images are shared via the network, other users can also access the user's viewpoint images through the object.
- however, the embodiment of the present disclosure is not limited to such an example; at the time of an image transition, only the display position, or only the posture, of the object may be maintained. Even in that case, the user may still be made to recognize that the image display transition is executed through the object.
- in such a case, the image-object DB may store records that associate an image, an object included in the image, and either the display position or the posture of the object in the image.
- FIG. 12 is a block diagram illustrating a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure.
- the illustrated information processing apparatus 900 can realize, for example, a display control apparatus such as a smartphone or a server in the above embodiment.
- the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
- the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
- the information processing apparatus 900 may include an imaging device 933 and a sensor 935 as necessary.
- the information processing apparatus 900 may include a processing circuit such as a DSP (Digital Signal Processor) or an ASIC (Application Specific Integrated Circuit) instead of, or in addition to, the CPU 901.
- the CPU 901 functions as an arithmetic processing device and a control device, and controls all or a part of the operation in the information processing device 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or the removable recording medium 927.
- the ROM 903 stores programs and calculation parameters used by the CPU 901.
- the RAM 905 primarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
- the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. Further, the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
- the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
- the input device 915 may be, for example, a remote control device that uses infrared rays or other radio waves, or may be an external connection device 929 such as a mobile phone that supports the operation of the information processing device 900.
- the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data and instruct processing operations to the information processing device 900.
- the output device 917 is a device that can notify the user of the acquired information visually or audibly.
- the output device 917 can be, for example, a display device such as an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), or an organic EL (Electro-Luminescence) display, an audio output device such as a speaker or headphones, or a printer device.
- the output device 917 outputs the result obtained by the processing of the information processing device 900 as video such as text or an image, or outputs it as audio such as voice or sound.
- the storage device 919 is a data storage device configured as an example of a storage unit of the information processing device 900.
- the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- the storage device 919 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
- the drive 921 is a reader / writer for a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 900.
- the drive 921 reads information recorded on the attached removable recording medium 927 and outputs the information to the RAM 905.
- the drive 921 also writes records to the attached removable recording medium 927.
- the connection port 923 is a port for directly connecting a device to the information processing apparatus 900.
- the connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like.
- the connection port 923 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
- the communication device 925 is a communication interface configured with, for example, a communication device for connecting to the communication network 931.
- the communication device 925 may be, for example, a communication card for wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
- the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
- the communication device 925 transmits and receives signals to and from the Internet and other communication devices using a predetermined protocol such as TCP/IP, for example.
- the communication network 931 connected to the communication device 925 is a wired or wireless network, such as the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
- the imaging device 933 is a device that images real space and generates a captured image, using various members such as an imaging element, for example a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and a lens for controlling the formation of a subject image on the imaging element.
- the imaging device 933 may capture a still image or may capture a moving image.
- the sensor 935 is various sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, and a sound sensor.
- the sensor 935 acquires information about the state of the information processing apparatus 900 itself, such as its posture, and information about the surrounding environment of the information processing apparatus 900, such as the brightness and noise around the apparatus.
- the sensor 935 may include a GPS sensor that receives a GPS (Global Positioning System) signal and measures the latitude, longitude, and altitude of the apparatus.
- Each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Such a configuration can be appropriately changed according to the technical level at the time of implementation.
- embodiments of the present disclosure include, for example, a display control device (such as a smartphone or a server) as described above, a system, a display control method executed by the display control device or the system, a program for causing the display control device to function, and a non-transitory tangible medium on which the program is recorded.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
The description will be made in the following order.
1. Device configuration
2. Display example
2-1. Change in object display position or orientation
2-2. Transition between images
2-3. Annotation display
2-4. Display of transition destination options
3. Hardware configuration
4. Supplement
(1. Device configuration)
FIG. 1 is a diagram illustrating an outline of a device configuration according to an embodiment of the present disclosure. Referring to FIG. 1, in this embodiment, a smartphone 100 captures an image of a real space 200 that includes an object 201. A through image 300t generated by the imaging is displayed on a display unit 118 of the smartphone 100. The through image 300t includes an object 301 corresponding to the real object 201.
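The flow above, recognizing a real object in a captured frame and associating it with the displayed through image, can be sketched as follows. This is a minimal illustration under assumed names (`ObjectRecord`, `recognize`, the field layout), not the disclosed implementation; it only shows the kind of record that ties an image to an object and its on-screen pose.

```python
from dataclasses import dataclass

@dataclass
class ObjectRecord:
    """One row of an image-object DB: an image, an object it contains,
    and the object's display position and orientation in that image."""
    image_id: str
    object_id: str
    position: tuple   # (x, y) in screen coordinates
    orientation: float  # rotation in degrees relative to a directly-facing pose

def recognize(image_id, detections):
    """Turn raw detections for one frame into image-object records."""
    return [ObjectRecord(image_id, obj_id, pos, ori)
            for obj_id, pos, ori in detections]

# A through image 300t containing object 301 near the centre, rotated 30 degrees.
records = recognize("image_300t", [("object_301", (160, 120), 30.0)])
print(records[0].object_id)  # object_301
```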
(2. Display example)
(2-1. Changes in object display position or orientation)
FIG. 5 is a diagram illustrating a first display example according to an embodiment of the present disclosure. In this display example, first, as shown in A, an image 310 including objects 311a to 311h is displayed. When the user designates the object 311c by an operation via a touch panel or the like (for example, a double tap), the object 311c is highlighted as shown in B, and the orientations of the objects 311a to 311h change while maintaining their mutual positional relationship, so that the object 311c directly faces the screen. Further, as shown in C, the image 310 zooms in around the directly-facing object 311c, and the object 311c is displayed in an enlarged manner.
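The change of pose in B and the zoom in C can be thought of as interpolating the designated object's orientation toward an identity (directly-facing) rotation while scaling up around it. A minimal 2D sketch under a simple angle-and-scale model; the function name and the 2x target scale are assumptions for illustration:

```python
def face_and_zoom(angle_deg, scale, steps=4):
    """Interpolate an object's pose from (angle_deg, scale) to a
    directly-facing, zoomed-in pose (0 degrees, 2x scale)."""
    target_scale = 2.0
    frames = []
    for i in range(1, steps + 1):
        t = i / steps
        frames.append((angle_deg * (1 - t),                  # rotate toward facing
                       scale + (target_scale - scale) * t))  # zoom in
    return frames

# Object 311c starts 30 degrees off-axis at normal scale.
frames = face_and_zoom(30.0, 1.0)
print(frames[-1])  # (0.0, 2.0): directly facing and enlarged
```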
(2-2. Transition between images)
FIG. 7 is a diagram illustrating a third display example according to an embodiment of the present disclosure. In this display example, first, as shown in A, the object 311c is displayed directly facing and enlarged in the image 310 (first image). This display may be reached, for example, through the series of displays described above with reference to FIG. 5. Here, automatically or in response to a predetermined user operation, the display of the image 310 transitions to the display of a different image 320 (second image), as shown in B. The image 320 includes an object 321c, which is the same object as the object 311c (that is, recorded in the image-object DB 112 with the same object ID). As shown in C described later, the image 320 includes objects 321i and 321j in addition to the object 321c.
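The key to this transition is that both images carry a record for the same object ID, so the second image can be laid out so the object's on-screen pose does not jump. A hedged sketch; the dictionary layout and names below are assumptions for illustration, not the actual image-object DB 112:

```python
# Assumed image-object DB layout: image_id -> {object_id: (position, orientation)}
IMAGE_OBJECT_DB = {
    "image_310": {"obj_c": ((100, 80), 0.0)},   # object shown directly facing
    "image_320": {"obj_c": ((40, 200), 25.0)},  # same object, different pose
}

def transition_offset(src_image, dst_image, object_id, db=IMAGE_OBJECT_DB):
    """Translation and rotation to apply to dst_image so that object_id
    keeps the display position and orientation it had in src_image."""
    (sx, sy), s_ori = db[src_image][object_id]
    (dx, dy), d_ori = db[dst_image][object_id]
    return (sx - dx, sy - dy), s_ori - d_ori

shift, rotate = transition_offset("image_310", "image_320", "obj_c")
print(shift, rotate)  # (60, -120) -25.0
```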
(2-3. Annotation display)
FIG. 9 is a diagram illustrating a fifth display example according to an embodiment of the present disclosure. In this display example, as shown in A, annotations 352 given by a plurality of users are cumulatively displayed in association with an object 351 (a poster) included in an image 350. Although annotations can themselves be treated as objects, the object 351 and the annotations 352 are distinguished here for the sake of explanation. As shown in B, the annotations 352 can be scrolled while maintaining the orientation in which they are superimposed on the object 351 (annotations 352a to 352e are displayed in A, whereas annotations 352c to 352g are displayed in B). As shown in C, the annotations 352 can also be displayed directly facing the screen, independently of the object 351.
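Cumulative annotation display amounts to collecting every annotation ever attached to the object's ID, regardless of which image it was annotated in. A minimal sketch; the tuple layout and all names are illustrative assumptions:

```python
# Annotations given to objects across many images: (image_id, object_id, text).
ANNOTATIONS = [
    ("image_350", "poster_351", "great show"),
    ("image_360", "poster_351", "sold out"),
    ("image_350", "other_obj", "ignore me"),
]

def annotations_for(object_id, records=ANNOTATIONS):
    """Cumulatively collect every annotation given to the object,
    regardless of which image it was annotated in."""
    return [text for _, obj, text in records if obj == object_id]

notes = annotations_for("poster_351")
print(notes)  # ['great show', 'sold out']
```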
(2-4. Display of transition destination options)
FIG. 11 is a diagram illustrating a sixth display example according to an embodiment of the present disclosure. In this display example, option icons 373 for images to which a transition can be made via the object 371 are displayed in an image 370. That is, in the illustrated example, a plurality of images including the same object as the object 371 are stored in the image DB 116, and the user can use the option icons 373 to select which of them to transition to. In the illustrated example, four option icons 373a to 373d are displayed.
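Building the option icons reduces to querying the image DB for images that contain the object, optionally grouped by an image attribute (as configuration (9) below suggests). A sketch under assumed names and DB layout, not the actual image DB 116:

```python
# Assumed image DB: each entry lists the objects an image contains and an
# attribute used to group the transition options.
IMAGE_DB = [
    {"image_id": "img_a", "objects": ["obj_371"], "attribute": "my photos"},
    {"image_id": "img_b", "objects": ["obj_371"], "attribute": "shared"},
    {"image_id": "img_c", "objects": ["obj_371"], "attribute": "shared"},
    {"image_id": "img_d", "objects": ["other"], "attribute": "shared"},
]

def transition_options(object_id, db=IMAGE_DB):
    """Group candidate destination images (those containing the object)
    by attribute, as the option icons 373 would present them."""
    options = {}
    for entry in db:
        if object_id in entry["objects"]:
            options.setdefault(entry["attribute"], []).append(entry["image_id"])
    return options

opts = transition_options("obj_371")
print(opts)  # {'my photos': ['img_a'], 'shared': ['img_b', 'img_c']}
```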
(3. Hardware configuration)
Next, a hardware configuration of the information processing apparatus according to the embodiment of the present disclosure will be described with reference to FIG. 12. FIG. 12 is a block diagram illustrating a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure. The illustrated information processing apparatus 900 can realize, for example, a display control apparatus such as the smartphone or the server in the above embodiment.
(4. Supplement)
Embodiments of the present disclosure may include, for example, a display control device (such as a smartphone or a server) as described above, a system, a display control method executed by the display control device or the system, a program for causing the display control device to function, and a non-transitory tangible medium on which the program is recorded.
The following configurations also belong to the technical scope of the present disclosure.
(1) A display control device including: a record reference unit that refers to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and a display control unit that, based on the record, transitions the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
(2) The display control device according to (1), wherein the display control unit changes the orientation of the object to a predetermined orientation in the display of the first image, and at least temporarily maintains the predetermined orientation even after the transition to the display of the second image.
(3) The display control device according to (2), wherein the predetermined orientation is a directly-facing orientation.
(4) The display control device according to (1), wherein the display control unit changes the display position or orientation of the object in the display of the first image so as to correspond to the display position or orientation of the object in the display of the second image, and then transitions the display of the first image to the display of the second image.
(5) The display control device according to (1), wherein the display control unit transitions the display of the first image to a display of the second image in which the display position or orientation of the object has been changed so as to correspond to the display position or orientation of the object in the display of the first image.
(6) The display control device according to any one of (1) to (5), wherein at least one of the first image and the second image is a captured image.
(7) The display control device according to any one of (1) to (6), wherein at least one of the first image and the second image is a virtually constructed image.
(8) The display control device according to any one of (1) to (7), wherein the display control unit displays, in the first image, options of images to which a transition can be made as the second image.
(9) The display control device according to (8), wherein the display control unit displays the options for each attribute of the image.
(10) The display control device according to any one of (1) to (8), wherein the display control unit, based on the record, cumulatively displays annotations given to the object in each of the first image and the second image in association with the object.
(11) A display control method including: referring to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and, based on the record, transitioning the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
(12) A program for causing a computer to realize: a function of referring to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and a function of, based on the record, transitioning the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
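Configurations (1), (11), and (12) share one structure: a record reference unit over image-object records and a display control unit that uses a looked-up record to transition while preserving the object's pose. The class shapes and field names below are assumptions sketched for illustration, not the disclosed implementation:

```python
class RecordReferenceUnit:
    """Refers to records associating an image, an object, and the object's
    display position/orientation in that image (per configuration (1))."""
    def __init__(self, records):
        self._records = {(r["image"], r["object"]): r for r in records}

    def lookup(self, image, obj):
        return self._records[(image, obj)]

class DisplayControlUnit:
    """Transitions from a first image to a second image containing the same
    object while maintaining the object's display position and orientation."""
    def __init__(self, refs):
        self.refs = refs

    def transition(self, first, second, obj):
        pose = self.refs.lookup(first, obj)
        # The second image is displayed with the object locked to the
        # pose it had in the first image.
        return {"image": second, "object": obj,
                "position": pose["position"],
                "orientation": pose["orientation"]}

refs = RecordReferenceUnit([
    {"image": "img1", "object": "o", "position": (10, 20), "orientation": 0.0},
    {"image": "img2", "object": "o", "position": (70, 5), "orientation": 40.0},
])
state = DisplayControlUnit(refs).transition("img1", "img2", "o")
print(state["position"], state["orientation"])  # (10, 20) 0.0
```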
Description of reference numerals
102 Imaging unit
104 Recognition unit
108 Generation unit
110 Reference unit
114 Display control unit
118 Display unit
Claims (12)
- A display control device comprising: a record reference unit that refers to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and a display control unit that, based on the record, transitions the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
- The display control device according to claim 1, wherein the display control unit changes the orientation of the object to a predetermined orientation in the display of the first image, and at least temporarily maintains the predetermined orientation even after the transition to the display of the second image.
- The display control device according to claim 2, wherein the predetermined orientation is a directly-facing orientation.
- The display control device according to claim 1, wherein the display control unit changes the display position or orientation of the object in the display of the first image so as to correspond to the display position or orientation of the object in the display of the second image, and then transitions the display of the first image to the display of the second image.
- The display control device according to claim 1, wherein the display control unit transitions the display of the first image to a display of the second image in which the display position or orientation of the object has been changed so as to correspond to the display position or orientation of the object in the display of the first image.
- The display control device according to claim 1, wherein at least one of the first image and the second image is a captured image.
- The display control device according to claim 1, wherein at least one of the first image and the second image is a virtually constructed image.
- The display control device according to claim 1, wherein the display control unit displays, in the first image, options of images to which a transition can be made as the second image.
- The display control device according to claim 8, wherein the display control unit displays the options for each attribute of the image.
- The display control device according to claim 1, wherein the display control unit, based on the record, cumulatively displays annotations given to the object in each of the first image and the second image in association with the object.
- A display control method comprising: referring to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and, based on the record, transitioning the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
- A program for causing a computer to realize: a function of referring to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and a function of, based on the record, transitioning the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP14861731.9A EP3070681A4 (en) | 2013-11-13 | 2014-08-13 | Display control device, display control method and program |
JP2015547663A JP6337907B2 (en) | 2013-11-13 | 2014-08-13 | Display control apparatus, display control method, and program |
US15/025,500 US10074216B2 (en) | 2013-11-13 | 2014-08-13 | Information processing to display information based on position of the real object in the image |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-234930 | 2013-11-13 | ||
JP2013234930 | 2013-11-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015072194A1 true WO2015072194A1 (en) | 2015-05-21 |
Family
ID=53057140
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/071386 WO2015072194A1 (en) | 2013-11-13 | 2014-08-13 | Display control device, display control method and program |
Country Status (4)
Country | Link |
---|---|
US (1) | US10074216B2 (en) |
EP (1) | EP3070681A4 (en) |
JP (1) | JP6337907B2 (en) |
WO (1) | WO2015072194A1 (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10163261B2 (en) | 2014-03-19 | 2018-12-25 | Matterport, Inc. | Selecting two-dimensional imagery data for display within a three-dimensional model |
US10139985B2 (en) | 2012-06-22 | 2018-11-27 | Matterport, Inc. | Defining, displaying and interacting with tags in a three-dimensional model |
US9786097B2 (en) | 2012-06-22 | 2017-10-10 | Matterport, Inc. | Multi-modal method for interacting with 3D models |
US10127722B2 (en) | 2015-06-30 | 2018-11-13 | Matterport, Inc. | Mobile capture visualization incorporating three-dimensional and two-dimensional imagery |
US10430924B2 (en) * | 2017-06-30 | 2019-10-01 | Quirklogic, Inc. | Resizable, open editable thumbnails in a computing device |
US11287947B2 (en) | 2019-05-15 | 2022-03-29 | Microsoft Technology Licensing, Llc | Contextual input in a three-dimensional environment |
US11164395B2 (en) | 2019-05-15 | 2021-11-02 | Microsoft Technology Licensing, Llc | Structure switching in a three-dimensional environment |
US11030822B2 (en) * | 2019-05-15 | 2021-06-08 | Microsoft Technology Licensing, Llc | Content indicators in a 3D environment authoring application |
US11039061B2 (en) | 2019-05-15 | 2021-06-15 | Microsoft Technology Licensing, Llc | Content assistance in a three-dimensional environment |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10275161A (en) * | 1997-01-28 | 1998-10-13 | Dainippon Screen Mfg Co Ltd | Image retrieving method, and recording medium recorded with program for performing retrieving process |
JP2005055743A (en) * | 2003-08-06 | 2005-03-03 | Canon Inc | Image display method |
JP2007280212A (en) * | 2006-04-10 | 2007-10-25 | Sony Corp | Display control device, display control method and display control program |
JP2012128485A (en) * | 2010-12-13 | 2012-07-05 | Yahoo Japan Corp | Image search device, image search method and image search program |
JP2013105253A (en) | 2011-11-11 | 2013-05-30 | Sony Corp | Information processing apparatus, information processing method, and program |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6012069A (en) * | 1997-01-28 | 2000-01-04 | Dainippon Screen Mfg. Co., Ltd. | Method and apparatus for retrieving a desired image from an image database using keywords |
US5995119A (en) * | 1997-06-06 | 1999-11-30 | At&T Corp. | Method for generating photo-realistic animated characters |
US6278466B1 (en) * | 1998-06-11 | 2001-08-21 | Presenter.Com, Inc. | Creating animation from a video |
US7027054B1 (en) * | 2002-08-14 | 2006-04-11 | Avaworks, Incorporated | Do-it-yourself photo realistic talking head creation system and method |
EP1768011B1 (en) * | 2004-07-15 | 2012-07-11 | Nippon Telegraph And Telephone Corporation | Inner force sense presentation device, inner force sense presentation method, and inner force sense presentation program |
CA2654960A1 (en) * | 2006-04-10 | 2008-12-24 | Avaworks Incorporated | Do-it-yourself photo realistic talking head creation system and method |
US20080002225A1 (en) * | 2006-06-27 | 2008-01-03 | Masajiro Iwasaki | Printing control method, printing control device, printing sytem, terminal device, program, and recording medium |
JP4934843B2 (en) * | 2006-11-29 | 2012-05-23 | 株式会社リコー | Information processing apparatus, image registration method, and program |
JP5167821B2 (en) * | 2008-01-11 | 2013-03-21 | 株式会社リコー | Document search apparatus, document search method, and document search program |
US20130215116A1 (en) * | 2008-03-21 | 2013-08-22 | Dressbot, Inc. | System and Method for Collaborative Shopping, Business and Entertainment |
US9354718B2 (en) * | 2010-12-22 | 2016-05-31 | Zspace, Inc. | Tightly coupled interactive stereo display |
JP5776201B2 (en) * | 2011-02-10 | 2015-09-09 | ソニー株式会社 | Information processing apparatus, information sharing method, program, and terminal apparatus |
JP5741160B2 (en) * | 2011-04-08 | 2015-07-01 | ソニー株式会社 | Display control apparatus, display control method, and program |
JP5732988B2 (en) * | 2011-04-08 | 2015-06-10 | ソニー株式会社 | Image processing apparatus, display control method, and program |
US20130249948A1 (en) * | 2011-08-26 | 2013-09-26 | Reincloud Corporation | Providing interactive travel content at a display device |
US9405463B2 (en) * | 2011-11-25 | 2016-08-02 | Samsung Electronics Co., Ltd. | Device and method for gesturally changing object attributes |
JP2013165366A (en) * | 2012-02-10 | 2013-08-22 | Sony Corp | Image processing device, image processing method, and program |
GB201208088D0 (en) * | 2012-05-09 | 2012-06-20 | Ncam Sollutions Ltd | Ncam |
US10223859B2 (en) * | 2012-10-30 | 2019-03-05 | Bally Gaming, Inc. | Augmented reality gaming eyewear |
US20150123966A1 (en) * | 2013-10-03 | 2015-05-07 | Compedia - Software And Hardware Development Limited | Interactive augmented virtual reality and perceptual computing platform |
-
2014
- 2014-08-13 US US15/025,500 patent/US10074216B2/en not_active Expired - Fee Related
- 2014-08-13 WO PCT/JP2014/071386 patent/WO2015072194A1/en active Application Filing
- 2014-08-13 JP JP2015547663A patent/JP6337907B2/en not_active Expired - Fee Related
- 2014-08-13 EP EP14861731.9A patent/EP3070681A4/en not_active Withdrawn
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10275161A (en) * | 1997-01-28 | 1998-10-13 | Dainippon Screen Mfg Co Ltd | Image retrieving method, and recording medium recorded with program for performing retrieving process |
JP2005055743A (en) * | 2003-08-06 | 2005-03-03 | Canon Inc | Image display method |
JP2007280212A (en) * | 2006-04-10 | 2007-10-25 | Sony Corp | Display control device, display control method and display control program |
JP2012128485A (en) * | 2010-12-13 | 2012-07-05 | Yahoo Japan Corp | Image search device, image search method and image search program |
JP2013105253A (en) | 2011-11-11 | 2013-05-30 | Sony Corp | Information processing apparatus, information processing method, and program |
Non-Patent Citations (1)
Title |
---|
See also references of EP3070681A4 |
Also Published As
Publication number | Publication date |
---|---|
EP3070681A1 (en) | 2016-09-21 |
EP3070681A4 (en) | 2017-07-12 |
US20160210788A1 (en) | 2016-07-21 |
JPWO2015072194A1 (en) | 2017-03-16 |
JP6337907B2 (en) | 2018-06-06 |
US10074216B2 (en) | 2018-09-11 |
Similar Documents
Publication | Title
---|---
JP6337907B2 (en) | Display control apparatus, display control method, and program
JP6102588B2 (en) | Information processing apparatus, information processing method, and program
JP6121647B2 (en) | Information processing apparatus, information processing method, and program
JP6167703B2 (en) | Display control device, program, and recording medium
JP6135783B2 (en) | Information processing apparatus, information processing method, and program
WO2013145566A1 (en) | Information processing apparatus, information processing method, and program
US20150070247A1 (en) | Information processing apparatus, information processing method, and program
JP2015095147A (en) | Display control device, display control method, and program
JP6149862B2 (en) | Display control device, display control system, and display control method
JP2017211811A (en) | Display control program, display control method and display control device
JP6686547B2 (en) | Image processing system, program, image processing method
US20140181709A1 (en) | Apparatus and method for using interaction history to manipulate content
JP5446700B2 (en) | Information processing apparatus, information processing method, and program
US20230043683A1 (en) | Determining a change in position of displayed digital content in subsequent frames via graphics processing circuitry
GB2513865A (en) | A method for interacting with an augmented reality scene
JP2017108356A (en) | Image management system, image management method and program
JP2015156187A (en) | Information processing apparatus, system, information processing method, and program
JP6065084B2 (en) | Information processing apparatus, information processing method, and program
KR20120035321A (en) | System and method for playing contents of augmented reality
JP6443505B2 (en) | Program, display control apparatus, and display control method
JP2019096305A (en) | Electronic apparatus, control method, program, and recording medium thereof
US20240080543A1 (en) | User interfaces for camera management
WO2024057650A1 (en) | Electronic device
WO2019102885A1 (en) | Electronic device with changeable image display section
JP2021081937A (en) | User terminal, control method and computer program
Legal Events
Code | Title | Description
---|---|---
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14861731; Country of ref document: EP; Kind code of ref document: A1
ENP | Entry into the national phase | Ref document number: 2015547663; Country of ref document: JP; Kind code of ref document: A
WWE | Wipo information: entry into national phase | Ref document number: 15025500; Country of ref document: US
REEP | Request for entry into the european phase | Ref document number: 2014861731; Country of ref document: EP
WWE | Wipo information: entry into national phase | Ref document number: 2014861731; Country of ref document: EP
NENP | Non-entry into the national phase | Ref country code: DE