WO2015072194A1 - Display control device, display control method and program - Google Patents

Display control device, display control method and program

Info

Publication number
WO2015072194A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
display control
orientation
record
Prior art date
Application number
PCT/JP2014/071386
Other languages
French (fr)
Japanese (ja)
Inventor
Shunichi Kasahara
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Priority to EP14861731.9A (EP3070681A4)
Priority to JP2015547663A (JP6337907B2)
Priority to US15/025,500 (US10074216B2)
Publication of WO2015072194A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/24 Aligning, centring, orientation detection or correction of the image
    • G06V10/242 Aligning, centring, orientation detection or correction of the image by image rotation, e.g. by 90 degrees
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes

Definitions

  • the present disclosure relates to a display control device, a display control method, and a program.
  • in recent years, a technique called augmented reality (AR), which superimposes additional information on a real image and presents it to a user, has attracted attention.
  • an object included in a captured image is recognized, and information related to the recognized object is displayed.
  • Such information is also called annotation, and is visualized as various forms of virtual objects such as text, icons, or animations.
  • an example of such an AR technique is described in Patent Document 1.
  • however, although Patent Document 1 proposes a technique for appropriately displaying a virtual object related to a real object in accordance with the position or orientation of the object, it does not propose a technique for display that utilizes the real object itself. As described in Patent Document 1, it is possible to detect the position and orientation of an object; using this, it should also be possible to display more useful information by utilizing the real object itself.
  • therefore, the present disclosure proposes a new and improved display control device, display control method, and program capable of displaying more useful information by utilizing a real object itself, based on the display position or orientation of the real object included in an image.
  • according to the present disclosure, there is provided a display control device including: a record reference unit that refers to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and a display control unit that, based on the record, transitions the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
  • according to the present disclosure, there is also provided a display control method including: referring to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and, based on the record, transitioning the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
  • according to the present disclosure, there is further provided a program for causing a computer to realize: a function of referring to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and a function of, based on the record, transitioning the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
  • FIG. 5 is a diagram illustrating a first display example according to an embodiment of the present disclosure.
  • FIG. 6 is a diagram illustrating a second display example according to an embodiment of the present disclosure.
  • FIG. 7 is a diagram illustrating a third display example according to an embodiment of the present disclosure.
  • FIG. 8 is a diagram illustrating a fourth display example according to an embodiment of the present disclosure.
  • FIG. 9 is a diagram illustrating a fifth display example according to an embodiment of the present disclosure.
  • FIG. 10 is a diagram for describing an example of processing for displaying an annotation according to an embodiment of the present disclosure.
  • FIG. 11 is a diagram illustrating a sixth display example according to an embodiment of the present disclosure.
  • FIG. 12 is a block diagram illustrating a hardware configuration example of an information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 1 is a diagram illustrating an outline of a device configuration according to an embodiment of the present disclosure.
  • the smartphone 100 images a real space 200 including an object 201.
  • a through image 300t generated by imaging is displayed on the display unit 118 of the smartphone 100.
  • the through image 300t includes an object 301 corresponding to the actual object 201.
  • FIG. 2 is a block diagram illustrating a schematic configuration of a smartphone according to an embodiment of the present disclosure.
  • the smartphone 100 includes an imaging unit 102, an object recognition unit 104, an object DB 106, a record generation unit 108, a record reference unit 110, an image-object DB 112, a display control unit 114, an image DB 116, and a display unit 118.
  • the smartphone 100 is an example of a display control device according to an embodiment of the present disclosure, and may be realized by, for example, a hardware configuration of an information processing device described later. Hereinafter, each component will be further described.
  • the smartphone 100 includes at least a processor such as a CPU (Central Processing Unit), a memory or storage, a camera (imaging device), and a display (output device).
  • the object recognition unit 104, the record generation unit 108, the record reference unit 110, and the display control unit 114 can be realized by the processor operating according to a program stored in a memory or storage.
  • the object DB 106, the image-object DB 112, and the image DB 116 can be realized by memory or storage.
  • the imaging unit 102 can be realized by a camera.
  • the display unit 118 can be realized by a display.
  • the imaging unit 102 captures an actual space and generates an image.
  • the imaging unit 102 may generate a still image or a moving image.
  • the image data generated by the imaging unit 102 is provided to the object recognition unit 104 and stored in the image DB 116 as necessary.
  • the image data may be provided to the display control unit 114 and displayed on the display unit 118 as a through image or a preview image.
  • the object recognition unit 104 performs object recognition on the image generated by the imaging unit 102.
  • the object recognition unit 104 may refer to the object DB 106 for object recognition.
  • model data relating to the shape or appearance of an object to be recognized is stored in advance.
  • the model data includes data defining the shape of each object, image data such as a predetermined symbol mark or text label attached to each object, or feature amount set data extracted from known images of each object.
  • the object recognition unit 104 recognizes which object is included in the input image by using the image generated by the imaging unit 102 as the input image. For example, the object recognition unit 104 collates a set of feature points extracted from the input image with the shape of the object defined by the model data. Further, the object recognition unit 104 may collate image data such as a symbol mark or a text label defined by the model data with the input image. Further, the object recognizing unit 104 may collate the feature amount of the known object image defined by the model data with the feature amount extracted from the input image.
  • when a known object is recognized in a new input image, the object recognition unit 104 may add the feature amount set representing the recognized object to the feature amount set data extracted from known images of that object.
  • when a new object is recognized in a new input image, the object recognition unit 104 may add new model data to the object DB 106 based on the shape, image data, feature amounts, etc. of the recognized object.
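A minimal sketch of the feature-amount collation described above, assuming an OpenCV-style local-feature pipeline; the object DB layout, the descriptor-distance cutoff, and the match threshold are illustrative assumptions, not values specified by the disclosure:

```python
# Sketch: recognize which known object appears in an input image by matching
# local feature descriptors against per-object descriptor sets from known images.
import cv2

orb = cv2.ORB_create()
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def recognize(input_image, object_descriptors, min_matches=30):
    """object_descriptors: dict mapping object ID -> stored ORB descriptors."""
    _, query = orb.detectAndCompute(input_image, None)
    if query is None:
        return None
    best_id, best_count = None, 0
    for object_id, model in object_descriptors.items():
        good = [m for m in matcher.match(query, model) if m.distance < 40]
        if len(good) > best_count:
            best_id, best_count = object_id, len(good)
    return best_id if best_count >= min_matches else None
```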
  • the object recognition unit 104 can recognize the display position and orientation of the object in the image. More specifically, for example, the object recognition unit 104 detects the display position and orientation of the object included in the input image using the image generated by the imaging unit 102 as the input image.
  • for example, the posture of the object is expressed in an integrated manner by a single 4-by-4 homogeneous transformation matrix that represents the transformation between the model coordinate system of the model data stored in the object DB 106 and the coordinate system of the object shown in the input image.
  • the object recognition unit 104 can extract the angle of the object with respect to the smartphone 100 from the homogeneous transformation matrix.
  • the display position of the object in the image can be represented by, for example, the center coordinates of the object in the image.
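As an illustration of how the object's angle might be read out of such a matrix, consider the sketch below; the Euler-angle convention and the NumPy representation are assumptions made for exposition:

```python
# Sketch: reading rotation and translation out of the 4x4 homogeneous
# transformation matrix that represents an object's posture.
import numpy as np

def decompose_pose(M):
    """Split a 4x4 homogeneous transform into rotation R (3x3) and translation t (3,)."""
    M = np.asarray(M, dtype=float)
    return M[:3, :3], M[:3, 3]

def object_angles(M):
    """Yaw/pitch/roll (radians) of the object relative to the camera,
    assuming a Z-Y-X rotation order."""
    R, _ = decompose_pose(M)
    yaw = np.arctan2(R[1, 0], R[0, 0])
    pitch = np.arcsin(np.clip(-R[2, 0], -1.0, 1.0))
    roll = np.arctan2(R[2, 1], R[2, 2])
    return yaw, pitch, roll
```

The display position, by contrast, needs no decomposition: it is stored directly as the object's center coordinates in the image.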
  • the record generation unit 108 generates a record in the image-object DB 112 based on the result of object recognition by the object recognition unit 104.
  • in the image-object DB 112, for example, a record is generated that associates an image, an object included in the image, and the display position and orientation of the object in the image. For example, as shown in FIG. 3, when the object recognition unit 104 recognizes two objects 301a and 301b in the image 300 generated by the imaging unit 102, the record generation unit 108 adds records like those shown in FIG. 4 to the image-object DB 112.
  • FIG. 4 is a diagram illustrating an example of a record generated in the image-object DB according to an embodiment of the present disclosure.
  • FIG. 4 illustrates a record including an image ID, an object ID, a position, and a posture as a record generated in the image-object DB 112.
  • the record is unique for the combination of image ID and object ID.
  • as described above, in the example shown in FIG. 3, the object recognition unit 104 recognizes two known objects 301a and 301b in the image 300a generated by the imaging unit 102. Therefore, the record generation unit 108 adds the two records 401a and 401b shown in FIG. 4 to the image-object DB 112.
  • the record of the image-object DB 112 is unique for the combination of the image ID and the object ID. Therefore, as shown in the figure, if a plurality of objects 301a and 301b are found in the same image 300a, a plurality of records 401a and 401b corresponding to the objects can be generated. Furthermore, if at least one of the object 301a or the object 301b is found in another image 300b, a new record 401 is generated again.
  • the record reference unit 110 refers to the records in the image-object DB 112 based on the result of object recognition by the object recognition unit 104. What is referred to here are records for other images that include the object recognized by the object recognition unit 104. That is, as in the example of FIGS. 3 and 4 above, when the object 301a is recognized in the image 300a, the record reference unit 110 refers to records for images other than the image 300a that include the object 301a.
  • the record reference unit 110 may refer to the record in the image-object DB 112 in response to a request from the display control unit 114.
  • in this case, records about images including an object designated by the display control unit 114 are referred to. For example, by designating an object ID when referring to the records, the record reference unit 110 can acquire at once all the records for the plurality of images that include the designated object.
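The record structure of FIG. 4 and the object-ID lookup could be modeled as in the following sketch; the in-memory index is an assumed stand-in for whatever storage the image-object DB actually uses:

```python
# Sketch: one record per (image ID, object ID) pair, indexed by object ID so
# that all images containing a given object can be fetched in one reference.
from dataclasses import dataclass
from collections import defaultdict
import numpy as np

@dataclass
class ImageObjectRecord:
    image_id: str        # e.g. "img_000001"
    object_id: str       # e.g. "obj_000001"
    position: tuple      # object center in image coordinates, e.g. (0.276, 0.843)
    pose: np.ndarray     # 4x4 homogeneous transformation matrix

class ImageObjectDB:
    def __init__(self):
        self._by_object = defaultdict(dict)  # object_id -> {image_id: record}

    def add(self, record):
        # Unique per (image_id, object_id): re-adding overwrites the old record.
        self._by_object[record.object_id][record.image_id] = record

    def records_for_object(self, object_id):
        """Record-reference-unit style lookup: every image containing the object."""
        return list(self._by_object[object_id].values())
```

For the image 300a containing the objects 301a and 301b, the record generation unit would call add() twice, producing the two records 401a and 401b; records_for_object() then corresponds to designating an object ID and acquiring all matching records at once.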
  • the display control unit 114 controls display of an image on the display unit 118.
  • the display control unit 114 may cause the display unit 118 to display an image generated by the imaging unit 102 (hereinafter also referred to as a captured image) as a through image or a preview image.
  • the display control unit 114 may read out an image stored in the image DB 116 (hereinafter referred to as a recorded image) in accordance with an operation of the user of the smartphone 100 and display the image on the display unit 118. At this time, the display control unit 114 may process and display the captured image or the recorded image as necessary.
  • for example, the display control unit 114 can change the display position of an object included in the captured image or the recorded image (the image itself may be moved), or can change the posture of the object.
  • further, the display control unit 114 may transition the display of a first image (a captured image or recorded image) including a certain object to the display of a second image (a recorded image different from the first image) including the same object.
  • at this time, the display control unit 114 can transition the display of the images while maintaining the display position or orientation of the object, for example, by changing the display position or orientation of the object in the first image or the second image.
  • such a display is possible by, for example, identifying the second image of the transition destination (an image stored in the image DB that includes the same object as the first image) based on the record referred to by the record reference unit 110, and recognizing the display position and orientation of the object included in the second image.
  • note that the display position and orientation of the object in the first image can be identified based on the result of object recognition by the object recognition unit 104 (when the first image is a captured image) or on another record acquired by the record reference unit 110 (when the first image is a recorded image).
  • in the image DB 116, for example, images generated by the imaging unit 102 are stored in response to operations of the user of the smartphone 100.
  • the image DB 116 may store an image acquired from an external device (not shown) via a network.
  • the image DB 116 may store an image content entity, or may store link information to the image content entity.
  • it is desirable that the image ID given to images stored in the image DB 116 and the image ID in the image-object DB 112 be common, or at least mutually convertible. Similarly, it is desirable that the object ID given to the model data stored in the object DB 106 and the object ID in the image-object DB 112 be common or mutually convertible.
  • the display control device may be any of various devices that include a display and an input device, such as a desktop or notebook personal computer, a television, a tablet terminal, a media player, or a game machine.
  • the display control device does not necessarily include an imaging unit; for example, it may acquire images exclusively via sharing over a network, or acquire images accumulated in storage by reading them from a removable medium.
  • the smartphone 100 (which may be another terminal device) may include the imaging unit 102 and the display unit 118, while the other components, such as the object recognition unit 104, the object DB 106, the record generation unit 108, the record reference unit 110, the image-object DB 112, the display control unit 114, and the image DB 116, are realized on a server on the network.
  • the server is an example of a display control device.
  • some or all of the object DB 106, the image-object DB 112, and the image DB 116 may be realized in a storage on a network.
  • FIG. 5 is a diagram illustrating a first display example according to an embodiment of the present disclosure.
  • as shown in A, an image 310 including objects 311a to 311h is displayed.
  • as shown in B, the object 311c is highlighted, and the postures of the objects 311a to 311h change while maintaining their mutual positional relationship, so that the object 311c comes to directly face the screen.
  • as shown in C, when the image 310 is zoomed in on the object 311c in the directly-facing state, the object 311c is displayed enlarged.
  • FIG. 6 is a diagram illustrating a second display example according to an embodiment of the present disclosure.
  • an image 310 including objects 311a to 311h is displayed (similar to FIG. 5).
  • the object 311h is highlighted, and the postures of the objects 311a to 311h change while maintaining their positional relationship, until the object 311h is in a directly-facing state.
  • when the user performs a predetermined operation (for example, drag, pinch-in, or double tap), the image 310 is zoomed in centered on the object 311h in the directly-facing state, and the object 311h is displayed enlarged.
  • as in these display examples, in the present embodiment the orientation of a designated object in an image can be changed so that the object directly faces the screen, the display range can be moved so that the designated object is displayed at the center, or the image can be zoomed in so that the designated object is enlarged.
  • such display changes are made possible by the display control unit 114 processing an image acquired from the imaging unit 102 or the image DB 116, based on the records in the image-object DB 112 referred to by the record reference unit 110 (records that associate an image, an object included in the image, and the display position or orientation of the object in the image).
  • more specifically, the display control unit 114 deforms the image of each object portion so that the object directly faces the screen, based on information indicating the posture of each object included in the image. Further, the display control unit 114 rearranges the deformed objects so that the positional relationship before the deformation is reproduced, based on information indicating the display position of each object in the image. Furthermore, based on that same positional information, the display control unit 114 moves the display range of the image so that a designated object is displayed at the center, or zooms in on the image so that the designated object is enlarged.
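A sketch of the posture change behind these operations, under the simplifying assumption that "directly facing" means replacing the object's rotation with the identity while keeping its translation; the blending scheme is a naive illustrative choice, not one mandated by the disclosure:

```python
# Sketch: target pose for a "directly facing" object, plus a simple blend
# used to animate the change of posture.
import numpy as np

def facing_pose(M):
    """Rotation replaced by identity (object parallel to the screen);
    translation, and hence the display position, is preserved."""
    target = np.eye(4)
    target[:3, 3] = np.asarray(M, dtype=float)[:3, 3]
    return target

def blend_pose(M_from, M_to, t):
    """Per-element interpolation (t in [0, 1]); adequate for small rotations,
    though a quaternion slerp would be more robust for large ones."""
    return (1.0 - t) * np.asarray(M_from, float) + t * np.asarray(M_to, float)
```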
  • FIG. 7 is a diagram illustrating a third display example according to an embodiment of the present disclosure.
  • as shown in A, the object 311c is displayed directly facing the screen and enlarged. This display may be reached through the series of displays described above with reference to FIG. 5.
  • the display of the image 310 transitions to the display of a different image 320 (second image).
  • the image 320 includes an object 321c.
  • the object 321c is the same object as the object 311c (that is, recorded in the image-object DB 112 with the same object ID).
  • the image 320 includes objects 321i and 321j in addition to the object 321c.
  • the display positions and orientations of the objects 311c and 321c are maintained through the display of the image 310 shown in A and the display of the image 320 shown in B. That is, whereas the object 311c is arranged near the center in the image 310 of A, the object 321c is likewise arranged near the center in the image 320 of B. Thereafter, automatically or in response to a predetermined operation by the user, as shown in C, the display of the image 320 changes to a state where the object 321c is not directly facing the screen and is not displayed at the center. In the illustrated example, this is the original state of the image 320. That is, in B, the image 320 is processed and displayed so that the posture and display position that the object 311c had in the image 310 before the transition are also maintained by the object 321c displayed after the transition.
  • the display control unit 114 changes the posture of the object to a predetermined posture (for example, directly facing) in the display of the first image, and maintains that predetermined posture at least temporarily even after the transition to the display of the second image, thereby seamlessly expressing a transition between images that include the same object. Accordingly, for example, the display can transition from the image 310 including the objects 311a to 311h to the image 320 including the objects 321c, 321i, and 321j through the objects 311c and 321c, and information regarding the objects 321i and 321j, which is not included in the image 310, can be newly obtained.
  • a transition to another image including the same object as the object 321j may be executed.
  • in this way, by repeating image transitions using an object as a medium while changing the target object as necessary, links can be formed between objects, or between the pieces of information contained in the images, based on new contexts.
  • as a modified example, the display control unit 114 may change the display position and orientation of the object 311c to the same display position and orientation as the object 321c in the original state of the image 320 (second image) (the state shown in C of FIG. 7), and then transition the display of the image 310 to the display of the image 320. Alternatively, the display control unit 114 may execute the transition without changing the display position and orientation of the object 311c in the display of the image 310 (first image) (that is, without the directly-facing display or zoom shown in FIG. 5), instead changing the display position and orientation of the object 321c in the display of the image 320 to the same display position and orientation as the object 311c had in the display of the image 310.
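One way to realize either variant is to compute, from the two DB records for the shared object, the correction that aligns its display position and pose across the transition; a sketch under the assumptions of the earlier record structure:

```python
# Sketch: correction applied at the moment of transition so that the shared
# object keeps the display position and posture it had in the first image.
import numpy as np

def transition_correction(rec_first, rec_second):
    """rec_first, rec_second: ImageObjectRecord entries for the same object ID
    in the first and second images."""
    dx = rec_first.position[0] - rec_second.position[0]
    dy = rec_first.position[1] - rec_second.position[1]
    # Relative transform taking the object's pose in the second image to its
    # pose in the first; applying it (or its inverse) selects which image is warped.
    M_rel = rec_first.pose @ np.linalg.inv(rec_second.pose)
    return (dx, dy), M_rel
```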
  • FIG. 8 is a diagram illustrating a fourth display example according to an embodiment of the present disclosure.
  • in this display example, first, as shown in A, an image 330 (a web page) including an object 331a displayed as a graphic is displayed.
  • the display of the image 330 transitions to a different image 340 (photograph) as shown in B.
  • the image 340 includes an object 341a shown in the photograph.
  • the object 341a is the same object as the object 331a (that is, recorded in the image-object DB 112 with the same object ID).
  • the images before and after the transition may be either virtual images such as web pages or captured images (photos).
  • that is, a transition may be performed between a virtually configured image (web page) and a captured image (photograph), between virtually configured images, or between captured images.
  • in this example as well, the image 330 and the image 340 can be processed so that the display positions or postures of the objects 331a and 341a are maintained before and after the transition.
  • FIG. 9 is a diagram illustrating a fifth display example according to an embodiment of the present disclosure.
  • annotations 352 given by a plurality of users are displayed in association with the object 351 (poster) included in the image 350 in a cumulative manner.
  • An annotation can also be treated as an object, but in this example, the object 351 and the annotation 352 are distinguished for the sake of explanation.
  • for example, as shown in A and B, the annotations 352 can be scrolled while maintaining the posture in which they are displayed superimposed on the object 351 (in contrast to the annotations 352a to 352e displayed in A, annotations 352c to 352g are displayed in B).
  • further, as shown in C, the annotations 352 can be displayed directly facing the screen, independently of the object 351.
  • such a display becomes possible because the annotations 352 input by a plurality of users to the same object 351 included in different images 350 are recorded together with their relative positional relationships to the object 351 in each image 350.
  • an example of processing for such a display will be further described with reference to FIG. 10.
  • FIG. 10 is a diagram for describing an example of processing for displaying an annotation according to an embodiment of the present disclosure.
  • the annotation 362a is input by the user A to the object 361a included in the image 360a.
  • the annotation 362a is recorded by, for example, the object ID of the object 361a, the relative position of the annotation 362a with respect to the object 361a, the relative angle of the annotation 362a with respect to the object 361a, and the content (text or image) of the annotation 362a.
  • another user B is referring to another image 360b including the same object 361b as the object 361a.
  • the object 361b is displayed at a different display position and orientation from the object 361a in the image 360a.
  • even in this case, the object 361b can be associated with the object 361a because both are recorded in the image-object DB 112 with the same object ID.
  • since the annotation 362a input by the user A is recorded together with its relative position and angle with respect to the object 361a, the annotation 362a can be displayed in the image 360b with the object 361b as a reference.
  • more specifically, the display control unit 114 calculates the difference in display position and orientation between the object 361a in the image 360a and the object 361b in the image 360b, based on the image-object DB records acquired by the record reference unit 110, and adds the calculated difference to the relative position and angle of the annotation 362a with respect to the object 361a; the annotation 362a can thereby be displayed in the image 360b with the object 361b as a reference.
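A sketch of that difference-and-add computation, assuming the annotation record holds an object-relative offset and an in-plane angle (the full pose difference is reduced to a 2D rotation here for brevity):

```python
# Sketch: re-anchoring an annotation recorded relative to an object, so that it
# follows the object's display position and angle in whichever image is shown.
import numpy as np

def place_annotation(rel_position, rel_angle, obj_position, obj_angle):
    """rel_position/rel_angle: stored relative to the object;
    obj_position/obj_angle: the object's display position and in-plane angle
    in the image currently being viewed (extractable from its pose matrix)."""
    c, s = np.cos(obj_angle), np.sin(obj_angle)
    rx, ry = rel_position
    x = obj_position[0] + c * rx - s * ry   # rotate the offset by the object's angle
    y = obj_position[1] + s * rx + c * ry
    return (x, y), obj_angle + rel_angle    # annotation inherits the object's angle
```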
  • the user B may additionally input an annotation 362b to the object 361b included in the image 360b.
  • the annotation 362b is also recorded with the object ID of the object 361b (the same as that of the object 361a), the relative position of the annotation 362b with respect to the object 361b, the relative angle of the annotation 362b with respect to the object 361b, and the content (text or image) of the annotation 362b.
  • therefore, in the image 360a referred to by the user A as well, the annotation 362b can be displayed with the object 361a as a reference.
  • further, the annotations input by each of the users A and B may be displayed in an image 360c (a web page) that is referred to by yet another user C.
  • in this case, both the object 361c and the annotations 362a and 362b may be displayed directly facing the screen in the image 360c.
  • the processing for displaying the object 361 and the annotation 362 as described above may be executed by a server on a network that provides services to the user A, the user B, and the user C, for example. That is, in this case, the server on the network may be an example of a display control device.
  • FIG. 11 is a diagram illustrating a sixth display example according to an embodiment of the present disclosure.
  • in this display example, option icons 373 for images to which the display can transition using the object 371 as a medium are displayed.
  • a plurality of images including the same object as the object 371 are stored in the image DB 116, and the user can select which one to transition to using the option icon 373.
  • four option icons 373a to 373d are displayed.
  • the option icons 373 are displayed, for example, when the user performs a predetermined operation (for example, a long press) on the object 371 via the touch panel or the like.
  • when the user selects one of the option icons 373, a transition from the image 370 to the corresponding image is executed. In this transition as well, the display position or orientation of the object 371 may be maintained between the image 370 before the transition and the image after the transition.
  • the option icons 373 may display, for example, a predetermined number of images randomly selected from the images to which a transition is possible.
  • alternatively, the option icons 373 may display images selected under predetermined conditions.
  • in the illustrated example, the option icon 373a displays the image with the highest resolution, the option icon 373b displays the image with the highest recommendation level, the option icon 373c displays the image whose shooting location is closest to that of the image 370, and the option icon 373d displays the image with the highest evaluations from other users.
  • information indicating these attributes is stored in association with each image ID.
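The selection of candidates under such conditions might look like the following sketch; the attribute fields (resolution, recommendation, shooting location, rating) are assumptions about what is stored per image ID:

```python
# Sketch: choosing the images shown in the option icons 373a to 373d for an
# object-mediated transition.
import math
import random

def option_candidates(records, attrs, current_image_id, k=4, random_pick=False):
    """records: image-object DB records for the shared object;
    attrs: image_id -> {"resolution", "recommendation", "location", "rating"}."""
    ids = [r.image_id for r in records if r.image_id != current_image_id]
    if not ids:
        return []
    if random_pick:
        return random.sample(ids, min(k, len(ids)))
    here = attrs[current_image_id]["location"]  # (latitude, longitude)
    def dist(i):
        loc = attrs[i]["location"]
        return math.hypot(loc[0] - here[0], loc[1] - here[1])
    return [
        max(ids, key=lambda i: attrs[i]["resolution"]),      # icon 373a
        max(ids, key=lambda i: attrs[i]["recommendation"]),  # icon 373b
        min(ids, key=dist),                                  # icon 373c
        max(ids, key=lambda i: attrs[i]["rating"]),          # icon 373d
    ]
```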
  • for example, a state in which a plurality of objects are present at the same place can be regarded as one context, and an application that shares this state can be realized.
  • in this case as well, the transition is executed seamlessly while the display position or orientation of the object (book) is maintained, so that the user can easily recognize that the image display transition is executed through the object (book).
  • if a user's viewpoint images are continuously recorded by a camera mounted on a wearable display worn by the user, a huge number of images can accumulate. In such a case, if the user feels that an object has been seen before but cannot remember where, the user can capture a sample image of the object and search for and display images that include the same object as the one in the sample image; the user's situation at the time of viewing that object can thereby be accurately recalled.
  • furthermore, if the viewpoint images are shared via the network, other users can also access the user's viewpoint images through the object.
  • the embodiment of the present disclosure is not limited to the examples above: at the time of the image transition, only the display position, or only the posture, may be maintained. Even in that case, it may be possible to make the user recognize that the image display transitions through the object.
  • in this case, the image-object DB may store records that associate the image, the object included in the image, and only one of the display position and the posture of the object in the image.
  • FIG. 12 is a block diagram illustrating a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure.
  • the illustrated information processing apparatus 900 can realize, for example, a display control apparatus such as a smartphone or a server in the above embodiment.
  • the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
  • the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
  • the information processing apparatus 900 may include an imaging device 933 and a sensor 935 as necessary.
  • the information processing apparatus 900 may include a processing circuit called a DSP (Digital Signal Processor) or an ASIC (Application Specific Integrated Circuit) instead of, or in addition to, the CPU 901.
  • the CPU 901 functions as an arithmetic processing device and a control device, and controls all or a part of the operation in the information processing device 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or the removable recording medium 927.
  • the ROM 903 stores programs and calculation parameters used by the CPU 901.
  • the RAM 905 primarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
  • the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. The host bus 907 is further connected to an external bus 911 such as a PCI (Peripheral Component Interconnect/Interface) bus via a bridge 909.
  • the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
  • the input device 915 may be, for example, a remote control device that uses infrared rays or other radio waves, or may be an external connection device 929 such as a mobile phone that supports the operation of the information processing device 900.
  • the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data and instruct processing operations to the information processing device 900.
  • the output device 917 is a device that can notify the user of the acquired information visually or audibly.
  • the output device 917 can be, for example, a display device such as an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), an organic EL (Electro-Luminescence) display, an audio output device such as a speaker and headphones, and a printer device.
  • the output device 917 outputs the result obtained by the processing of the information processing device 900 as video such as text or an image, or outputs it as audio such as voice or sound.
  • the storage device 919 is a data storage device configured as an example of a storage unit of the information processing device 900.
  • the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • the storage device 919 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
  • the drive 921 is a reader / writer for a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 900.
  • the drive 921 reads information recorded on the attached removable recording medium 927 and outputs the information to the RAM 905.
  • the drive 921 writes a record in the attached removable recording medium 927.
  • the connection port 923 is a port for directly connecting a device to the information processing apparatus 900.
  • the connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like.
  • the connection port 923 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
  • the communication device 925 is a communication interface configured with, for example, a communication device for connecting to the communication network 931.
  • the communication device 925 may be, for example, a communication card for wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
  • the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
  • the communication device 925 transmits and receives signals and the like using a predetermined protocol such as TCP / IP with the Internet and other communication devices, for example.
  • the communication network 931 connected to the communication device 925 is a wired or wireless network, such as the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
  • the imaging device 933 is an apparatus that images a real space and generates a captured image, using various members such as an imaging element, for example a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and a lens for controlling the formation of a subject image on the imaging element.
  • the imaging device 933 may capture a still image or may capture a moving image.
  • the sensor 935 is various sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, and a sound sensor.
  • the sensor 935 acquires, for example, information about the state of the information processing apparatus 900 itself, such as the posture of the information processing apparatus 900, and information about the surrounding environment of the information processing apparatus 900, such as the brightness and noise around the information processing apparatus 900.
  • the sensor 935 may include a GPS (Global Positioning System) sensor that receives a GPS signal and measures the latitude, longitude, and altitude of the apparatus.
  • Each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Such a configuration can be appropriately changed according to the technical level at the time of implementation.
  • embodiments of the present disclosure include, for example, a display control device (such as a smartphone or a server) as described above, a system, a display control method executed by the display control device or the system, a program for causing the display control device to function, and a non-transitory tangible medium on which the program is recorded.
  • (1) A display control device including: a record reference unit that refers to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and a display control unit that, based on the record, transitions the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
  • (2) The display control device according to (1), wherein the display control unit changes the posture of the object to a predetermined posture in the display of the first image, and maintains the predetermined posture at least temporarily even after the transition to the display of the second image.
  • (3) The display control device according to (2), wherein the predetermined posture is a directly-facing posture.
  • (4) The display control device according to (1), wherein the display control unit changes the display position or orientation of the object in the display of the first image so as to correspond to the display position or orientation of the object in the display of the second image, and then transitions the display of the first image to the display of the second image.
  • (5) The display control device according to (1), wherein the display control unit changes the display of the second image so that the display position or orientation of the object corresponds to the display position or orientation of the object in the display of the first image.
  • The display control device according to any one of (1) to (5), wherein at least one of the first image and the second image is a captured image.
  • The display control device according to any one of (1) to (8), wherein the display control unit causes annotations given to the object in each of the first image and the second image to be displayed cumulatively in association with the object, based on the record.
  • (11) A display control method including: referring to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and, based on the record, transitioning the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
  • (12) A program for causing a computer to realize: a function of referring to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and a function of, based on the record, transitioning the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

[Problem] To display more useful information by utilizing a real object contained in an image, on the basis of the display position or orientation of the real object. [Solution] This display control device is provided with: a record reference unit which refers to records that associate an image, an object contained in the image, and the display position or orientation of the object in the image; and a display control unit which, on the basis of the records, transitions the display of a first image containing the object to the display of a second image which is different from the first image and contains the object, while maintaining the display position or orientation of the object.

Description

Display control device, display control method, and program

 The present disclosure relates to a display control device, a display control method, and a program.

 In recent years, a technique called augmented reality (AR), which superimposes additional information on a real image and presents it to a user, has attracted attention. In the AR technique, for example, an object included in a captured image is recognized, and information related to the recognized object is displayed. Such information is also called an annotation, and is visualized as various forms of virtual objects such as text, icons, or animations. An example of such an AR technique is described in Patent Document 1, for example.

JP 2013-105253 A

 However, although Patent Document 1 proposes a technique for appropriately displaying a virtual object related to a real object in accordance with the position or orientation of the object, it does not propose a technique for display that utilizes the real object itself. As described in Patent Document 1, it is possible to detect the position and orientation of an object; using this, it should also be possible to display more useful information by utilizing the real object itself.

 Therefore, the present disclosure proposes a new and improved display control device, display control method, and program capable of displaying more useful information by utilizing a real object itself, based on the display position or orientation of the real object included in an image.

 According to the present disclosure, there is provided a display control device including: a record reference unit that refers to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and a display control unit that, based on the record, transitions the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.

 According to the present disclosure, there is also provided a display control method including: referring to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and, based on the record, transitioning the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.

 According to the present disclosure, there is further provided a program for causing a computer to realize: a function of referring to a record associating an image, an object included in the image, and a display position or orientation of the object in the image; and a function of, based on the record, transitioning the display of a first image including the object to the display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.

 As described above, according to the present disclosure, more useful information can be displayed by utilizing a real object itself, based on the display position or orientation of the real object included in an image.

 Note that the above effects are not necessarily limitative; together with or instead of the above effects, any of the effects shown in this specification, or other effects that can be grasped from this specification, may be achieved.
 FIG. 1 is a diagram illustrating an outline of a device configuration according to an embodiment of the present disclosure.
 FIG. 2 is a block diagram illustrating a schematic configuration of a smartphone according to an embodiment of the present disclosure.
 FIG. 3 is a diagram illustrating an example of an image generated by an imaging unit according to an embodiment of the present disclosure.
 FIG. 4 is a diagram illustrating an example of records generated in an image-object DB according to an embodiment of the present disclosure.
 FIG. 5 is a diagram illustrating a first display example according to an embodiment of the present disclosure.
 FIG. 6 is a diagram illustrating a second display example according to an embodiment of the present disclosure.
 FIG. 7 is a diagram illustrating a third display example according to an embodiment of the present disclosure.
 FIG. 8 is a diagram illustrating a fourth display example according to an embodiment of the present disclosure.
 FIG. 9 is a diagram illustrating a fifth display example according to an embodiment of the present disclosure.
 FIG. 10 is a diagram for describing an example of processing for displaying an annotation according to an embodiment of the present disclosure.
 FIG. 11 is a diagram illustrating a sixth display example according to an embodiment of the present disclosure.
 FIG. 12 is a block diagram illustrating a hardware configuration example of an information processing apparatus according to an embodiment of the present disclosure.
 Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.
 The description will be given in the following order.
 1. Device configuration
 2. Display examples
  2-1. Change in object display position or orientation
  2-2. Transition between images
  2-3. Annotation display
  2-4. Display of transition destination options
 3. Hardware configuration
 4. Supplement
 (1.装置構成)
 図1は、本開示の一実施形態における装置構成の概要を示す図である。図1を参照すると、本実施形態では、スマートフォン100が、オブジェクト201を含む現実の空間200を撮像している。スマートフォン100の表示部118には、撮像によって生成されたスルー画像300tが表示されている。スルー画像300tには、現実のオブジェクト201に対応するオブジェクト301が含まれている。
(1. Device configuration)
FIG. 1 is a diagram illustrating an outline of a device configuration according to an embodiment of the present disclosure. Referring to FIG. 1, in this embodiment, the smartphone 100 images a real space 200 including an object 201. On the display unit 118 of the smartphone 100, a through image 300t generated by imaging is displayed. The through image 300t includes an object 301 corresponding to the actual object 201.
 図2は、本開示の一実施形態に係るスマートフォンの概略的な構成を示すブロック図である。図2を参照すると、スマートフォン100は、撮像部102と、オブジェクト認識部104と、オブジェクトDB106と、レコード生成部108と、レコード参照部110と、画像-オブジェクトDB112と、表示制御部114と、画像DB116と、表示部118とを含む。スマートフォン100は、本開示の実施形態に係る表示制御装置の一例であり、例えば後述する情報処理装置のハードウェア構成によって実現されうる。以下、それぞれの構成要素についてさらに説明する。 FIG. 2 is a block diagram illustrating a schematic configuration of a smartphone according to an embodiment of the present disclosure. Referring to FIG. 2, the smartphone 100 includes an imaging unit 102, an object recognition unit 104, an object DB 106, a record generation unit 108, a record reference unit 110, an image-object DB 112, a display control unit 114, and an image. DB 116 and display unit 118 are included. The smartphone 100 is an example of a display control device according to an embodiment of the present disclosure, and may be realized by, for example, a hardware configuration of an information processing device described later. Hereinafter, each component will be further described.
 なお、本実施形態では、スマートフォン100が、少なくとも、CPU(Central Processing)などのプロセッサと、メモリまたはストレージと、カメラ(撮像装置)と、ディスプレイ(出力装置)とを備える。例えば、オブジェクト認識部104、レコード生成部108、レコード参照部110、および表示制御部114は、プロセッサがメモリまたはストレージに格納されたプログラムに従って動作することによって実現されうる。オブジェクトDB106、画像-オブジェクトDB112、および画像DB116は、メモリまたはストレージによって実現されうる。撮像部102は、カメラによって実現されうる。表示部118は、ディスプレイによって実現されうる。 In the present embodiment, the smartphone 100 includes at least a processor such as a CPU (Central Processing), a memory or storage, a camera (imaging device), and a display (output device). For example, the object recognition unit 104, the record generation unit 108, the record reference unit 110, and the display control unit 114 can be realized by the processor operating according to a program stored in a memory or storage. The object DB 106, the image-object DB 112, and the image DB 116 can be realized by memory or storage. The imaging unit 102 can be realized by a camera. The display unit 118 can be realized by a display.
 撮像部102は、現実の空間を撮像して画像を生成する。撮像部102は、静止画像を生成してもよいし、動画像を生成してもよい。撮像部102が生成した画像のデータは、オブジェクト認識部104に提供される他、必要に応じて画像DB116に格納される。また、画像のデータは、表示制御部114に提供されて、スルー画像またはプレビュー画像として表示部118に表示されてもよい。 The imaging unit 102 captures an actual space and generates an image. The imaging unit 102 may generate a still image or a moving image. The image data generated by the imaging unit 102 is provided to the object recognition unit 104 and stored in the image DB 116 as necessary. The image data may be provided to the display control unit 114 and displayed on the display unit 118 as a through image or a preview image.
 オブジェクト認識部104は、撮像部102が生成した画像について、オブジェクト認識を実行する。オブジェクト認識部104は、オブジェクト認識のためにオブジェクトDB106を参照してもよい。オブジェクトDB106には、例えば、認識の対象になるオブジェクトの形状または外観に関するモデルデータが予め蓄積されている。モデルデータは、各オブジェクトの形状を定義するデータ、各オブジェクトに付される所定のシンボルマークもしくはテキストラベルなどの画像データ、または各オブジェクトについて既知の画像から抽出された特徴量セットのデータなどを含む。 The object recognition unit 104 performs object recognition on the image generated by the imaging unit 102. The object recognition unit 104 may refer to the object DB 106 for object recognition. In the object DB 106, for example, model data relating to the shape or appearance of an object to be recognized is stored in advance. The model data includes data defining the shape of each object, image data such as a predetermined symbol mark or text label attached to each object, or feature set data extracted from a known image for each object. .
 より具体的には、例えば、オブジェクト認識部104は、撮像部102によって生成された画像を入力画像として用いて、入力画像にどのオブジェクトが含まれるかを認識する。オブジェクト認識部104は、例えば、入力画像から抽出される特徴点のセットを、モデルデータによって定義されるオブジェクトの形状と照合する。また、オブジェクト認識部104は、モデルデータによって定義されるシンボルマークまたはテキストラベルなどの画像データを、入力画像と照合してもよい。さらに、オブジェクト認識部104は、モデルデータによって定義される既知の物体の画像の特徴量を、入力画像から抽出される特徴量と照合してもよい。 More specifically, for example, the object recognition unit 104 recognizes which object is included in the input image by using the image generated by the imaging unit 102 as the input image. For example, the object recognition unit 104 collates a set of feature points extracted from the input image with the shape of the object defined by the model data. Further, the object recognition unit 104 may collate image data such as a symbol mark or a text label defined by the model data with the input image. Further, the object recognizing unit 104 may collate the feature amount of the known object image defined by the model data with the feature amount extracted from the input image.
 ここで、新たな入力画像において既知のオブジェクトが認識された場合、オブジェクト認識部104は、認識されたオブジェクトを表す特徴量セットを、当該オブジェクトについて既知の画像から抽出された特徴量セットのデータとして追加してもよい。また、新たな入力画像において新たなオブジェクトが認識された場合、オブジェクト認識部104は、認識されたオブジェクトの形状や画像データ、特徴量などに基づいて、オブジェクトDB106に新たなモデルデータを追加してもよい。 Here, when a known object is recognized in the new input image, the object recognition unit 104 uses the feature amount set representing the recognized object as data of the feature amount set extracted from the known image for the object. May be added. When a new object is recognized in the new input image, the object recognition unit 104 adds new model data to the object DB 106 based on the shape, image data, feature amount, etc. of the recognized object. Also good.
 Furthermore, the object recognition unit 104 can recognize the display position and orientation of an object in the image. More specifically, using the image generated by the imaging unit 102 as an input image, the object recognition unit 104 detects the display position and orientation of each object contained in the input image. The orientation of an object is represented in an integrated manner by a single 4-by-4 homogeneous transformation matrix expressing the transformation between the model coordinate system of the model data stored in the object DB 106 and the coordinate system of the object as it appears in the input image. From this homogeneous transformation matrix, the object recognition unit 104 can extract the angle of the object with respect to the smartphone 100. The display position of the object in the image can be represented, for example, by the center coordinates of the object within the image.
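 As an illustrative sketch of the pose handling described above, the following assumes the 4-by-4 homogeneous transformation matrix holds a 3-by-3 rotation block and a translation column, and uses one common Euler-angle convention; the decomposition shown is an assumption, not necessarily the one used in the embodiment.

```python
import numpy as np

def object_angles(M):
    """Extract the object's rotation relative to the device from the
    4x4 homogeneous transformation matrix M (model coordinates ->
    camera coordinates), using the Z-Y-X Euler convention as an
    illustrative choice. Returns (yaw, pitch, roll) in radians."""
    R = M[:3, :3]                       # rotation block
    yaw = np.arctan2(R[1, 0], R[0, 0])
    pitch = np.arcsin(-R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    return yaw, pitch, roll

def object_center(M, K):
    """Project the object's origin into the image with the camera
    intrinsics K to obtain the display position (center coordinates
    of the object in the image)."""
    t = M[:3, 3]                        # object origin in camera space
    u, v, w = K @ t
    return u / w, v / w
```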
 The record generation unit 108 generates records in the image-object DB 112 based on the results of object recognition by the object recognition unit 104. In the image-object DB 112, records are generated that associate, for example, an image, the objects contained in the image, and the display position and orientation of each object in the image. For example, as shown in FIG. 3, when the object recognition unit 104 recognizes two objects 301a and 301b in an image 300 generated by the imaging unit 102, the record generation unit 108 adds records such as those shown in FIG. 4 to the image-object DB 112.
 FIG. 4 is a diagram illustrating an example of records generated in the image-object DB according to an embodiment of the present disclosure. FIG. 4 illustrates records generated in the image-object DB 112 that each include an image ID, an object ID, a position, and an orientation. In the illustrated example, each record is unique for a combination of image ID and object ID. As described above, in the example shown in FIG. 3, the object recognition unit 104 recognizes the two known objects 301a and 301b in the image 300a generated by the imaging unit 102. The record generation unit 108 therefore adds the two records 401a and 401b shown in FIG. 4 to the image-object DB 112.
 For example, the record 401a is the record for the object 301a (object ID = obj_000001) contained in the image 300a (image ID = img_000001). It records, in association with one another, the fact that the image 300a contains the object 301a, the display position of the object 301a in the image 300a (in image coordinates, (X, Y) = (0.276, 0.843)), and the orientation of the object 301a in the image 300a (represented by a homogeneous transformation matrix M1).
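 To make the record layout concrete, here is a sketch of the kind of structure FIG. 4 describes; the field names and types are assumptions for illustration.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class ImageObjectRecord:
    """One row of the image-object DB 112, unique per
    (image_id, object_id) combination."""
    image_id: str      # e.g. "img_000001"
    object_id: str     # e.g. "obj_000001"
    position: tuple    # (X, Y) center of the object in image coordinates
    pose: np.ndarray   # 4x4 homogeneous transformation matrix

# The record 401a described above might then look like:
record_401a = ImageObjectRecord(
    image_id="img_000001",
    object_id="obj_000001",
    position=(0.276, 0.843),
    pose=np.eye(4),    # placeholder for the matrix M1
)
```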
 As described above, each record in the image-object DB 112 is unique for a combination of image ID and object ID. Accordingly, as illustrated, if the plurality of objects 301a and 301b are found within the same image 300a, a plurality of records 401a and 401b corresponding to the respective objects can be generated. Furthermore, if at least one of the objects 301a and 301b is found in another image 300b, yet another new record 401 is generated.
 Referring again to FIG. 2, the record reference unit 110 refers to records in the image-object DB 112 based on the results of object recognition by the object recognition unit 104. What is referred to here are the records for other images containing the object recognized by the object recognition unit 104. That is, as in the example of FIGS. 3 and 4 above, when the object 301a is recognized in the image 300a, the record reference unit 110 refers to the records for images other than the image 300a that contain the object 301a. For example, the record reference unit 110 may refer to the records by issuing a query such as "image ID ≠ img_000001 AND object ID = obj_000001" against the image-object DB 112 containing the records shown in FIG. 4.
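 Over the record structure sketched earlier, such a lookup reduces to a simple filter; the function below is an illustrative stand-in for the query issued by the record reference unit 110.

```python
def other_images_containing(records, image_id, object_id):
    """Return the records for images other than image_id that contain
    object_id, i.e. the query "image ID != img_000001 AND
    object ID = obj_000001" in the example above."""
    return [r for r in records
            if r.object_id == object_id and r.image_id != image_id]
```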
 Alternatively, the record reference unit 110 may refer to records in the image-object DB 112 in response to a request from the display control unit 114. What is referred to in this case is, for example, the records for images containing an object designated by the display control unit 114. When records such as those in FIG. 4 have been generated in the image-object DB 112, the record reference unit 110 can collectively acquire the records for a plurality of images containing a specific object by designating its object ID.
 The display control unit 114 controls the display of images on the display unit 118. For example, the display control unit 114 may cause the display unit 118 to display an image generated by the imaging unit 102 (hereinafter also referred to as a captured image) as a through image or a preview image. The display control unit 114 may also read out an image stored in the image DB 116 (hereinafter, a recorded image) in response to an operation by the user of the smartphone 100 and display it on the display unit 118. At this time, the display control unit 114 may process the captured image or the recorded image as necessary before displaying it. In the present embodiment, since the objects contained in captured images and recorded images have been recognized, for example by the object recognition unit 104, the display control unit 114 can change the display position of an object contained in a captured or recorded image (the image itself may be moved) or change the orientation of the object.
 For example, the display control unit 114 may cause the display of a first image (a captured image or a recorded image) containing a certain object to transition to the display of a second image (a recorded image different from the first image) containing the same object. At this time, the display control unit 114 can transition the display while maintaining the display position or orientation of the object, for example by changing the display position or orientation of the object in the first image or the second image.
 Such a display becomes possible when the display control unit 114, based on the records referred to by the record reference unit 110, identifies the second image at the transition destination (an image stored in the image DB that contains the same object as the first image) and further recognizes the display position and orientation of the object contained in the second image. For the first image, the display control unit 114 recognizes the display position and orientation based on the result of object recognition by the object recognition unit 104 (when the first image is a captured image) or on another record acquired by the record reference unit 110 (when the first image is a recorded image).
 In the image DB 116, for example, images generated by the imaging unit 102 are stored in response to operations by the user of the smartphone 100. The image DB 116 may also store images acquired from an external device (not shown) via a network. The image DB 116 may store the content entities of the images themselves, or link information to those content entities.
 Here, to enable the processing of the record reference unit 110 and the display control unit 114 described above, it is desirable that the image IDs given to the images stored in the image DB 116 and the image IDs in the image-object DB 112 be common or mutually convertible. Similarly, it is desirable that the object IDs given to the model data stored in the object DB 106 and the object IDs in the image-object DB 112 be common or mutually convertible.
 The device configuration according to an embodiment of the present disclosure has been described above. Although a smartphone was given as an example of the display control device, embodiments of the present disclosure are not limited to such an example. For example, the display control device may be any of various devices having a display and an input device, such as a desktop or notebook personal computer, a television, a tablet terminal, a media player, or a game console. The display control device need not necessarily include an imaging unit; for example, it may exclusively acquire images shared via a network, or acquire images accumulated in storage by reading them from removable media.
 As another device configuration according to an embodiment of the present disclosure, at least part of the above configuration may be realized in a server on a network. For example, the smartphone 100 (or another terminal device) may include the imaging unit 102 and the display unit 118, while the other components, such as the object recognition unit 104, the object DB 106, the record generation unit 108, the record reference unit 110, the image-object DB 112, the display control unit 114, and the image DB 116, are realized in a server on the network. In this case, the server is an example of the display control device. In a similar device configuration, some or all of the object DB 106, the image-object DB 112, and the image DB 116 may be realized in storage on the network.
 (2. Display examples)
 (2-1. Changing the display position or orientation of an object)
 FIG. 5 is a diagram illustrating a first display example according to an embodiment of the present disclosure. In this display example, first, as shown in A, an image 310 containing objects 311a to 311h is displayed. When the user designates the object 311c by an operation via the touch panel or the like (for example, a double tap), then, as shown in B, the object 311c is highlighted and the orientations of the objects 311a to 311h change while their mutual positional relationship is maintained, so that the object 311c comes to face the screen directly. Further, as shown in C, the image 310 zooms in centered on the directly facing object 311c, so that the object 311c is displayed enlarged.
 FIG. 6 is a diagram illustrating a second display example according to an embodiment of the present disclosure. In this display example, first, as shown in A, the image 310 containing the objects 311a to 311h is displayed (as in FIG. 5). When the user designates the object 311h by an operation via the touch panel or the like (for example, a double tap), then, as shown in B, the object 311h is highlighted and the orientations of the objects 311a to 311h change while their positional relationship is maintained, so that the object 311h comes to face the screen directly. Further, when a predetermined additional operation on the object 311h (for example, a drag/pinch-in or a double tap) is acquired in this state, then, as shown in C, the image 310 zooms in centered on the directly facing object 311h, so that the object 311h is enlarged.
 As these two examples show, in the present embodiment it is possible to change the orientation of an object designated in an image so that the object faces the screen directly, to move the display range of the image so that the designated object is displayed at the center, and to zoom in on the image so that the designated object is displayed enlarged.
 Such display changes become possible when the display control unit 114 processes the image acquired from the imaging unit 102 or the image DB 116 based on the records in the image-object DB 112 referred to by the record reference unit 110 (records associating an image, the objects contained in the image, and the display position or orientation of each object in the image).
 More specifically, for example, the display control unit 114 deforms the image of each object region so that the object faces the screen directly, based on the information indicating the orientation of each object contained in the image. Based on the information indicating the display position of each object in the image, the display control unit 114 also rearranges the deformed objects so that their pre-deformation positional relationship is reproduced. Furthermore, based on the information indicating the display position of each object in the image, the display control unit 114 moves the display range of the image so that the designated object is displayed at the center, or zooms in on the image so that the designated object is enlarged.
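 One way to realize the deformation step is to treat the object as planar and warp its region with a homography derived from the recorded pose. The sketch below is a hypothetical illustration under that planarity assumption; forming the homography from intrinsics K and pose M, and the plane_size parameter, are assumptions for illustration, not the embodiment's specified method.

```python
import numpy as np
import cv2

def frontalize(image, M, K, plane_size, out_size=(256, 256)):
    """Warp the region of a planar object so that it faces the screen.
    M: 4x4 pose (model -> camera), K: 3x3 camera intrinsics,
    plane_size: (w, h) of the object plane in model units. For a plane
    at z = 0 in model coordinates, the homography mapping the plane
    into the image is K @ [r1 r2 t]."""
    R, t = M[:3, :3], M[:3, 3]
    H = K @ np.column_stack((R[:, 0], R[:, 1], t))  # plane -> image
    # Scale model-plane units to output pixels, then invert the
    # mapping so image pixels land on a fronto-parallel view.
    w, h = plane_size
    S = np.array([[out_size[0] / w, 0.0, 0.0],
                  [0.0, out_size[1] / h, 0.0],
                  [0.0, 0.0, 1.0]])
    return cv2.warpPerspective(image, S @ np.linalg.inv(H), out_size)
```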
 (2-2. Transitions between images)
 FIG. 7 is a diagram illustrating a third display example according to an embodiment of the present disclosure. In this display example, first, as shown in A, the object 311c is displayed in the image 310 (first image), facing the screen directly and enlarged. This display may, for example, have been reached through the series of displays described above with reference to FIG. 5. Then, automatically or in response to a predetermined operation by the user, as shown in B, the display of the image 310 transitions to the display of a different image 320 (second image). The image 320 contains an object 321c, which is the same object as the object 311c (that is, recorded in the image-object DB 112 with the same object ID). As shown in C, described later, the image 320 contains objects 321i and 321j in addition to the object 321c.
 At this time, the display position and orientation of the objects 311c and 321c are maintained across the display of the image 310 shown in A and the display of the image 320 shown in B. That is, just as the object 311c faces the screen directly and is placed near the center in the image 310 of A, the object 321c also faces the screen directly and is placed near the center in the image 320 of B. Thereafter, automatically or in response to a predetermined operation by the user, as shown in C, the display of the image 320 changes to a state in which the object 321c neither faces the screen directly nor is displayed at the center. In the illustrated example, this is the original state of the image 320. In other words, in B, the image 320 is displayed after being processed so that the orientation and display position of the object 311c in the pre-transition image 310 are maintained in the object 321c displayed after the transition.
 Thus, in the present embodiment, the display control unit 114 changes the orientation of an object to a predetermined orientation (for example, directly facing the screen) in the display of the first image and maintains that predetermined orientation at least temporarily after the transition to the display of the second image, which makes it possible to express transitions between images containing the same object seamlessly. In this way, for example, with the objects 311c and 321c as the mediating link, the display can transition from the image 310 containing the objects 311a to 311h to the image 320 containing the objects 321c, 321i, and 321j, and new information can be obtained about the objects 321i and 321j, which were not contained in the image 310.
 For example, in the example of FIG. 7 above, selecting the object 321j in the state shown in C may then execute a transition to another image containing the same object as the object 321j. In the present embodiment, by repeating such object-mediated image transitions while changing the target object as needed, links of information between objects or between images can be formed in new contexts.
 As another example, in the display of the image 310 (first image), the display control unit 114 may change the display position and orientation of the object 311c to the same display position and orientation as those of the object 321c in the original state of the image 320 (second image; the state shown in C of FIG. 7), and then transition the display of the image 310 to the display of the image 320. Conversely, the display control unit 114 may leave the display position and orientation of the object 311c unchanged in the display of the image 310 (without the facing or zooming shown in FIG. 5) and instead, in the display of the image 320, change the display position and orientation of the object 321c to the same display position and orientation as those of the object 311c in the display of the image 310 before executing the transition.
 FIG. 8 is a diagram illustrating a fourth display example according to an embodiment of the present disclosure. In this display example, first, as shown in A, an image 330 (a web page) containing an object 331a displayed as a graphic is displayed. In this state, for example, when the user designates the object 331a and executes a predetermined operation, the display of the image 330 transitions to a different image 340 (a photograph), as shown in B. The image 340 contains an object 341a shown in the photograph, which is the same object as the object 331a (that is, recorded in the image-object DB 112 with the same object ID).
 Thus, in the present embodiment, the images before and after a transition may each be either a virtually constructed image such as a web page or a captured image (photograph). As in the above display example, transitions may be executed between a virtually constructed image (web page) and a captured image (photograph) in either direction, between virtually constructed images, or between captured images. Although not shown in FIG. 8, in this example as well, either or both of the image 330 and the image 340 may be processed so that the display position or orientation of the objects 331a and 341a is maintained before and after the transition.
 (2-3. Displaying annotations)
 FIG. 9 is a diagram illustrating a fifth display example according to an embodiment of the present disclosure. In this display example, as shown in A, annotations 352 given by a plurality of users are displayed cumulatively in association with an object 351 (a poster) contained in an image 350. Although annotations can themselves be treated as objects, in this example the object 351 and the annotations 352 are distinguished for the sake of explanation. As shown in B, the annotations 352 can be scrolled through while the orientation in which they are superimposed on the object 351 is maintained (annotations 352a to 352e are displayed in A, whereas annotations 352c to 352g are displayed in B). Further, as shown in C, the annotations 352 can also be displayed facing the screen directly, independently of the object 351.
 In the present embodiment, such a display becomes possible because the annotations 352 that a plurality of users have input for the same object 351 contained in their respective, different images 350 are recorded together with their positional relationships relative to the object 351 in each image 350. An example of processing for such a display is described further below with reference to FIG. 10.
 FIG. 10 is a diagram for describing an example of processing for displaying annotations according to an embodiment of the present disclosure. In the illustrated example, a user A inputs an annotation 362a for an object 361a contained in an image 360a. Here, the annotation 362a is recorded by means of, for example, the object ID of the object 361a, the position of the annotation 362a relative to the object 361a, the angle of the annotation 362a relative to the object 361a, and the content of the annotation 362a (text or an image).
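 A sketch of such an annotation record, with illustrative field names:

```python
from dataclasses import dataclass

@dataclass
class AnnotationRecord:
    object_id: str        # object the annotation is attached to
    rel_position: tuple   # position relative to the object
    rel_angle: float      # angle relative to the object, in radians
    content: str          # text (or a reference to image content)
```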
 Meanwhile, another user B is viewing another image 360b containing an object 361b, which is the same object as the object 361a. In the image 360b, the object 361b is displayed at a display position and orientation different from those of the object 361a in the image 360a, but it can be associated with the object 361a, for example by being recorded in the image-object DB 112 with the same object ID. Furthermore, because the annotation 362a input by the user A is recorded together with its position and angle relative to the object 361a, the annotation 362a can also be displayed in the image 360b with the object 361b as the reference.
 More specifically, the display control unit 114 calculates the difference in display position and orientation between the object 361a in the image 360a and the object 361b in the image 360b based on the image-object DB records acquired by the record reference unit 110, and by adding the calculated difference to the position and angle of the annotation 362a relative to the object 361a, can display the annotation 362a in the image 360b with the object 361b as the reference.
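 Because the annotation is stored relative to the object, carrying it into another image is a change of reference frame. A minimal sketch, assuming both the object poses and the annotation's object-relative placement are expressed as 4x4 homogeneous matrices (an assumption for illustration):

```python
import numpy as np

def annotation_pose_in_image(M_obj, M_ann_rel):
    """Compose the object's pose in a given image (M_obj) with the
    annotation's object-relative pose (M_ann_rel). Since M_ann_rel is
    stored relative to the object, the same annotation record renders
    correctly in any image where the object's pose is known."""
    return M_obj @ M_ann_rel

# For user A's image 360a and user B's image 360b:
#   annotation_pose_in_image(M_361a, M_ann)  -> placement in image 360a
#   annotation_pose_in_image(M_361b, M_ann)  -> placement in image 360b
```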
 Here, the user B may additionally input an annotation 362b for the object 361b contained in the image 360b. Like the annotation 362a, the annotation 362b can be recorded by means of the object ID of the object 361b (the same as that of the object 361a), the position of the annotation 362b relative to the object 361b, the angle of the annotation 362b relative to the object 361b, and the content of the annotation 362b (text or an image). As illustrated, this allows the annotation 362b to be displayed with the object 361a as the reference in the image 360a that the user A is viewing.
 Furthermore, the annotations input by each of the users A and B may be displayed in an image 360c (a web page) viewed by yet another user C. In this case, the object 361c and the annotations 362a and 362b may all be displayed facing the screen directly in the image 360c.
 The processing for displaying the object 361 and the annotations 362 as described above may be executed, for example, by a server on a network that provides services to the users A, B, and C. That is, in this case, the server on the network may be an example of the display control device.
 (2-4. Displaying transition destination options)
 FIG. 11 is a diagram illustrating a sixth display example according to an embodiment of the present disclosure. In this display example, option icons 373 are displayed in an image 370 for images to which the display can transition with an object 371 as the mediating link. That is, in the illustrated example, a plurality of images containing the same object as the object 371 are stored in the image DB 116, and the user can select which of them to transition to by using the option icons 373. In the illustrated example, four option icons 373a to 373d are displayed.
 The option icons 373 are displayed, for example, when the user executes a predetermined operation on the object 371 (for example, a long press) via the touch panel or the like. When the user selects one of the plurality of option icons 373, a transition from the image 370 to the corresponding image is executed. In the transition in this case as well, the display position or orientation of the object 371 may be maintained between the image 370 before the transition and the image after the transition.
 Here, the option icons 373 may display, for example, a predetermined number of images selected at random from the images to which a transition is possible. Alternatively, the option icons 373 may display images selected under predetermined conditions. In the illustrated example, the option icon 373a displays the image with the highest resolution, the option icon 373b the image with the highest degree of recommendation, the option icon 373c the image whose shooting location is closest to that of the image 370, and the option icon 373d the image rated most highly by other users.
 When the images to be displayed as the option icons 373 are determined based on attributes such as resolution, degree of recommendation, shooting location, or ratings by other users as in the above example, information indicating the attributes of each image is stored in the image DB 116, for example, in association with each image ID.
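 A sketch of how the four option icons might be chosen from the candidate images; the attribute keys are assumptions for illustration.

```python
def pick_option_images(candidates):
    """Pick one transition candidate per attribute, mirroring the
    option icons 373a-373d. Each candidate is assumed to be a dict
    with keys: resolution, recommendation, distance (to the shooting
    location of the current image), and rating."""
    return {
        "highest_resolution": max(candidates, key=lambda c: c["resolution"]),
        "most_recommended": max(candidates, key=lambda c: c["recommendation"]),
        "nearest_location": min(candidates, key=lambda c: c["distance"]),
        "best_rated": max(candidates, key=lambda c: c["rating"]),
    }
```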
 By displaying, as the option icons 373, images to which the display can transition with the object 371 contained in the image 370 as the mediating link, it can become easier, for example, for the user to reach desired information through image transitions.
 According to the several display examples of the embodiments of the present disclosure described above, an application can be realized that treats the state in which a plurality of objects are present in the same place as one context and shares that state.
 As an example, when a user executes a transition from an image the user has captured of a book placed on a desk to another image containing the same book as an object, transitioning to an image of that book lined up on a bookshelf in a bookstore (which may be an image captured by another user and shared on the network) makes it possible to understand the context of which books have been selected as books related to that book.
 Moreover, if the display can transition one after another to images in which the book is arranged in multiple bookstores, offices, personal bookshelves, and so on, information can be obtained about other books related to that book, as selected from various viewpoints. In such transitions, the transition is executed seamlessly while the display position or orientation of the object (the book) is maintained, so the user can easily recognize that the display of images is being executed with the object (the book) as the mediating link.
 Also, for example, when the user's viewpoint images are continuously recorded by a camera mounted on a wearable display worn by the user, an enormous number of images can be recorded. For example, when the user feels that he or she has seen a certain object somewhere but cannot remember where, capturing a sample image of the object and then searching for and displaying images containing the same object as the one in the sample image allows the user to accurately recall the situation in which the object was seen. Furthermore, when the viewpoint images are shared via a network, other users can access the user's viewpoint images with the object as the mediating link.
 In the above description of the embodiment and the display examples, both the display position and the orientation of the object are maintained during image transitions, but embodiments of the present disclosure are not limited to such an example; only the display position, or only the orientation, may be maintained during an image transition. Even in that case, it may be fully possible to make the user recognize that the display of images is being executed with the object as the mediating link. In that case, the image-object DB may store records associating an image, the objects contained in the image, and either the display position or the orientation of each object in the image.
 (3. Hardware configuration)
 Next, the hardware configuration of an information processing device according to an embodiment of the present disclosure will be described with reference to FIG. 12. FIG. 12 is a block diagram illustrating a hardware configuration example of an information processing device according to an embodiment of the present disclosure. The illustrated information processing device 900 can realize, for example, a display control device such as the smartphone or the server in the above embodiments.
 The information processing device 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905. The information processing device 900 may also include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925. Furthermore, the information processing device 900 may include an imaging device 933 and a sensor 935 as necessary. Instead of or in addition to the CPU 901, the information processing device 900 may have a processing circuit such as a DSP (Digital Signal Processor) or an ASIC (Application Specific Integrated Circuit).
 The CPU 901 functions as an arithmetic processing device and a control device, and controls all or part of the operation within the information processing device 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or a removable recording medium 927. The ROM 903 stores programs, operation parameters, and the like used by the CPU 901. The RAM 905 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during that execution, and the like. The CPU 901, the ROM 903, and the RAM 905 are connected to one another by the host bus 907, which is configured from an internal bus such as a CPU bus. The host bus 907 is further connected via the bridge 909 to the external bus 911, such as a PCI (Peripheral Component Interconnect/Interface) bus.
 The input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, buttons, switches, and levers. The input device 915 may be, for example, a remote control device that uses infrared or other radio waves, or an externally connected device 929 such as a mobile phone that supports operation of the information processing device 900. The input device 915 includes an input control circuit that generates an input signal based on the information input by the user and outputs it to the CPU 901. By operating the input device 915, the user inputs various data to the information processing device 900 and instructs it to perform processing operations.
 The output device 917 is configured from a device capable of notifying the user of acquired information visually or audibly. The output device 917 can be, for example, a display device such as an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), or an organic EL (Electro-Luminescence) display, an audio output device such as speakers or headphones, or a printer device. The output device 917 outputs the results obtained by the processing of the information processing device 900 as video such as text or images, or as sound such as voice or audio.
 The storage device 919 is a data storage device configured as an example of the storage unit of the information processing device 900. The storage device 919 is configured from, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 919 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
 The drive 921 is a reader/writer for a removable recording medium 927 such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory, and is built into or externally attached to the information processing device 900. The drive 921 reads out information recorded on the mounted removable recording medium 927 and outputs it to the RAM 905. The drive 921 also writes records onto the mounted removable recording medium 927.
 The connection port 923 is a port for connecting devices directly to the information processing device 900. The connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, or a SCSI (Small Computer System Interface) port. The connection port 923 may also be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like. By connecting the externally connected device 929 to the connection port 923, various data can be exchanged between the information processing device 900 and the externally connected device 929.
 The communication device 925 is a communication interface configured from, for example, a communication device for connecting to a communication network 931. The communication device 925 can be, for example, a communication card for wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB). The communication device 925 may also be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various kinds of communication. The communication device 925 transmits and receives signals and the like to and from the Internet and other communication devices using a predetermined protocol such as TCP/IP. The communication network 931 connected to the communication device 925 is a network connected by wire or wirelessly, and is, for example, the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
 The imaging device 933 is a device that images real space and generates captured images, using various members such as an imaging element, for example a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and a lens for controlling the formation of a subject image on the imaging element. The imaging device 933 may capture still images or moving images.
 The sensor 935 is any of various sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, or a sound sensor. The sensor 935 acquires information about the state of the information processing device 900 itself, such as the attitude of its housing, and information about the surrounding environment of the information processing device 900, such as the brightness and noise around it. The sensor 935 may also include a GPS sensor that receives GPS (Global Positioning System) signals and measures the latitude, longitude, and altitude of the device.
 An example of the hardware configuration of the information processing device 900 has been shown above. Each of the components described above may be configured using general-purpose members, or may be configured from hardware specialized for the function of that component. Such a configuration can be changed as appropriate according to the technical level at the time of implementation.
 (4. Supplement)
 Embodiments of the present disclosure can include, for example, a display control device (such as a smartphone or a server) as described above, a system, a display control method executed by the display control device or the system, a program for causing the display control device to function, and a non-transitory tangible medium on which the program is recorded.
 The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is clear that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various alterations or modifications within the scope of the technical ideas described in the claims, and it is understood that these naturally belong to the technical scope of the present disclosure.
 The effects described in this specification are merely explanatory or illustrative, and are not limiting. That is, the technology according to the present disclosure can exhibit other effects that are apparent to those skilled in the art from the description of this specification, in addition to or instead of the above effects.
 The following configurations also belong to the technical scope of the present disclosure.
 (1) A display control device including: a record reference unit configured to refer to records each associating an image, an object contained in the image, and a display position or orientation of the object in the image; and a display control unit configured to, based on the records, cause a display of a first image containing the object to transition to a display of a second image that contains the object and is different from the first image, while maintaining the display position or orientation of the object.
 (2) The display control device according to (1), wherein the display control unit changes the orientation of the object to a predetermined orientation in the display of the first image and maintains the predetermined orientation at least temporarily after the transition to the display of the second image.
 (3) The display control device according to (2), wherein the predetermined orientation is an orientation directly facing the screen.
 (4) The display control device according to (1), wherein the display control unit changes the display position or orientation of the object in the display of the first image so as to correspond to the display position or orientation of the object in the display of the second image, and then causes the display of the first image to transition to the display of the second image.
 (5) The display control device according to (1), wherein the display control unit causes the display of the first image to transition to a display of the second image in which the display position or orientation of the object has been changed so as to correspond to the display position or orientation of the object in the display of the first image.
 (6) The display control device according to any one of (1) to (5), wherein at least one of the first image and the second image is a captured image.
 (7) The display control device according to any one of (1) to (6), wherein at least one of the first image and the second image is a virtually constructed image.
 (8) The display control device according to any one of (1) to (7), wherein the display control unit causes options of images to which a transition as the second image is possible to be displayed in the first image.
 (9) The display control device according to (8), wherein the display control unit causes the options to be displayed for each attribute of the images.
 (10) The display control device according to any one of (1) to (8), wherein the display control unit causes annotations given to the object in each of the first image and the second image to be displayed cumulatively in association with the object, based on the records.
 (11) A display control method including: referring to records each associating an image, an object contained in the image, and a display position or orientation of the object in the image; and, based on the records, causing a display of a first image containing the object to transition to a display of a second image that contains the object and is different from the first image, while maintaining the display position or orientation of the object.
 (12) A program for causing a computer to realize: a function of referring to records each associating an image, an object contained in the image, and a display position or orientation of the object in the image; and a function of, based on the records, causing a display of a first image containing the object to transition to a display of a second image that contains the object and is different from the first image, while maintaining the display position or orientation of the object.
 Description of Symbols
 100  Smartphone
 102  Imaging unit
 104  Recognition unit
 108  Generation unit
 110  Reference unit
 114  Display control unit
 118  Display unit

Claims (12)

   1.  A display control device comprising:
      a record reference unit configured to refer to records each associating an image, an object contained in the image, and a display position or orientation of the object in the image; and
      a display control unit configured to, based on the records, cause a display of a first image containing the object to transition to a display of a second image that contains the object and is different from the first image, while maintaining the display position or orientation of the object.
   2.  The display control device according to claim 1, wherein the display control unit changes the orientation of the object to a predetermined orientation in the display of the first image and maintains the predetermined orientation at least temporarily after the transition to the display of the second image.
   3.  The display control device according to claim 2, wherein the predetermined orientation is an orientation directly facing the screen.
   4.  The display control device according to claim 1, wherein the display control unit changes the display position or orientation of the object in the display of the first image so as to correspond to the display position or orientation of the object in the display of the second image, and then causes the display of the first image to transition to the display of the second image.
   5.  The display control device according to claim 1, wherein the display control unit causes the display of the first image to transition to a display of the second image in which the display position or orientation of the object has been changed so as to correspond to the display position or orientation of the object in the display of the first image.
   6.  The display control device according to claim 1, wherein at least one of the first image and the second image is a captured image.
   7.  The display control device according to claim 1, wherein at least one of the first image and the second image is a virtually constructed image.
  8.  The display control device according to claim 1, wherein the display control unit causes options of images to which the display can transition as the second image to be displayed in the first image.
  9.  The display control device according to claim 8, wherein the display control unit causes the options to be displayed for each attribute of the images.
  10.  The display control device according to claim 1, wherein the display control unit causes, based on the record, annotations given to the object in each of the first image and the second image to be displayed cumulatively in association with the object.
  11.  A display control method comprising:
     referring to a record that associates an image, an object included in the image, and a display position or orientation of the object in the image; and
     causing, based on the record, a transition from display of a first image including the object to display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
  12.  A program for causing a computer to realize:
     a function of referring to a record that associates an image, an object included in the image, and a display position or orientation of the object in the image; and
     a function of causing, based on the record, a transition from display of a first image including the object to display of a second image that includes the object and is different from the first image, while maintaining the display position or orientation of the object.
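Claims 2 through 5 describe pose adjustments around the transition: snapping the object to a predetermined (for example, directly facing) orientation, or matching its pose between the two images before or after the switch. The sketch below renders claim 4's variant, reusing the assumed PoseRecord type from the earlier sketch; the linear interpolation scheme is an illustrative assumption, not mandated by the claims.

```python
def lerp_pose(src: "PoseRecord", dst: "PoseRecord", t: float):
    """Linearly interpolate display position and orientation.
    Naive angle interpolation; a real implementation would handle
    wraparound at 360 degrees."""
    pos = tuple(a + (b - a) * t for a, b in zip(src.position, dst.position))
    ori = src.orientation + (dst.orientation - src.orientation) * t
    return pos, ori

def align_then_switch(first_rec, second_rec, steps: int = 10):
    """Claim 4 as a generator of display states: first move the object,
    within the first image, toward its pose in the second image; only
    when the poses match does the displayed image change."""
    for i in range(1, steps + 1):
        pos, ori = lerp_pose(first_rec, second_rec, i / steps)
        yield (first_rec.image_id, pos, ori)
    yield (second_rec.image_id, second_rec.position, second_rec.orientation)

# e.g., using the records from the earlier sketch:
# for state in align_then_switch(records[0], records[1]): print(state)
```

Claim 5 is the mirror image of this: the same pose matching, but applied to the object in the second image after the switch.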
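Claims 8 and 9 present the candidate second images as selectable options, grouped per image attribute. A minimal sketch of such grouping follows; the attribute values and record shape are assumptions for illustration.

```python
from collections import defaultdict

def options_by_attribute(candidates):
    """Group candidate second images by an attribute (claim 9)."""
    groups = defaultdict(list)
    for image in candidates:
        groups[image["attribute"]].append(image["image_id"])
    return dict(groups)

candidates = [
    {"image_id": "img-2", "attribute": "captured"},  # cf. claim 6
    {"image_id": "img-3", "attribute": "captured"},
    {"image_id": "img-4", "attribute": "virtual"},   # cf. claim 7
]
print(options_by_attribute(candidates))
# -> {'captured': ['img-2', 'img-3'], 'virtual': ['img-4']}
```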
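Claim 10 accumulates the annotations given to the object in each of the images and displays them together, keyed by the object rather than by any single image. A sketch under the assumption that annotations are stored per image/object pair:

```python
from collections import defaultdict

def cumulative_annotations(annotation_records):
    """Merge annotations across images, keyed by object (claim 10)."""
    merged = defaultdict(list)
    for rec in annotation_records:
        merged[rec["object_id"]].extend(rec["annotations"])
    return dict(merged)

annotation_records = [
    {"image_id": "img-1", "object_id": "cup", "annotations": ["chip on rim"]},
    {"image_id": "img-2", "object_id": "cup", "annotations": ["logo faded"]},
]
print(cumulative_annotations(annotation_records))
# -> {'cup': ['chip on rim', 'logo faded']}
```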
PCT/JP2014/071386 2013-11-13 2014-08-13 Display control device, display control method and program WO2015072194A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP14861731.9A EP3070681A4 (en) 2013-11-13 2014-08-13 Display control device, display control method and program
JP2015547663A JP6337907B2 (en) 2013-11-13 2014-08-13 Display control apparatus, display control method, and program
US15/025,500 US10074216B2 (en) 2013-11-13 2014-08-13 Information processing to display information based on position of the real object in the image

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-234930 2013-11-13
JP2013234930 2013-11-13

Publications (1)

Publication Number Publication Date
WO2015072194A1 true WO2015072194A1 (en) 2015-05-21

Family

ID=53057140

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/071386 WO2015072194A1 (en) 2013-11-13 2014-08-13 Display control device, display control method and program

Country Status (4)

Country Link
US (1) US10074216B2 (en)
EP (1) EP3070681A4 (en)
JP (1) JP6337907B2 (en)
WO (1) WO2015072194A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10163261B2 (en) 2014-03-19 2018-12-25 Matterport, Inc. Selecting two-dimensional imagery data for display within a three-dimensional model
US10139985B2 (en) 2012-06-22 2018-11-27 Matterport, Inc. Defining, displaying and interacting with tags in a three-dimensional model
US9786097B2 (en) 2012-06-22 2017-10-10 Matterport, Inc. Multi-modal method for interacting with 3D models
US10127722B2 (en) 2015-06-30 2018-11-13 Matterport, Inc. Mobile capture visualization incorporating three-dimensional and two-dimensional imagery
US10430924B2 (en) * 2017-06-30 2019-10-01 Quirklogic, Inc. Resizable, open editable thumbnails in a computing device
US11287947B2 (en) 2019-05-15 2022-03-29 Microsoft Technology Licensing, Llc Contextual input in a three-dimensional environment
US11164395B2 (en) 2019-05-15 2021-11-02 Microsoft Technology Licensing, Llc Structure switching in a three-dimensional environment
US11030822B2 (en) * 2019-05-15 2021-06-08 Microsoft Technology Licensing, Llc Content indicators in a 3D environment authoring application
US11039061B2 (en) 2019-05-15 2021-06-15 Microsoft Technology Licensing, Llc Content assistance in a three-dimensional environment

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6012069A (en) * 1997-01-28 2000-01-04 Dainippon Screen Mfg. Co., Ltd. Method and apparatus for retrieving a desired image from an image database using keywords
US5995119A (en) * 1997-06-06 1999-11-30 At&T Corp. Method for generating photo-realistic animated characters
US6278466B1 (en) * 1998-06-11 2001-08-21 Presenter.Com, Inc. Creating animation from a video
US7027054B1 (en) * 2002-08-14 2006-04-11 Avaworks, Incorporated Do-it-yourself photo realistic talking head creation system and method
EP1768011B1 (en) * 2004-07-15 2012-07-11 Nippon Telegraph And Telephone Corporation Inner force sense presentation device, inner force sense presentation method, and inner force sense presentation program
CA2654960A1 (en) * 2006-04-10 2008-12-24 Avaworks Incorporated Do-it-yourself photo realistic talking head creation system and method
US20080002225A1 (en) * 2006-06-27 2008-01-03 Masajiro Iwasaki Printing control method, printing control device, printing system, terminal device, program, and recording medium
JP4934843B2 (en) * 2006-11-29 2012-05-23 株式会社リコー Information processing apparatus, image registration method, and program
JP5167821B2 (en) * 2008-01-11 2013-03-21 株式会社リコー Document search apparatus, document search method, and document search program
US20130215116A1 (en) * 2008-03-21 2013-08-22 Dressbot, Inc. System and Method for Collaborative Shopping, Business and Entertainment
US9354718B2 (en) * 2010-12-22 2016-05-31 Zspace, Inc. Tightly coupled interactive stereo display
JP5776201B2 (en) * 2011-02-10 2015-09-09 ソニー株式会社 Information processing apparatus, information sharing method, program, and terminal apparatus
JP5741160B2 (en) * 2011-04-08 2015-07-01 ソニー株式会社 Display control apparatus, display control method, and program
JP5732988B2 (en) * 2011-04-08 2015-06-10 ソニー株式会社 Image processing apparatus, display control method, and program
US20130249948A1 (en) * 2011-08-26 2013-09-26 Reincloud Corporation Providing interactive travel content at a display device
US9405463B2 (en) * 2011-11-25 2016-08-02 Samsung Electronics Co., Ltd. Device and method for gesturally changing object attributes
JP2013165366A (en) * 2012-02-10 2013-08-22 Sony Corp Image processing device, image processing method, and program
GB201208088D0 (en) * 2012-05-09 2012-06-20 Ncam Sollutions Ltd Ncam
US10223859B2 (en) * 2012-10-30 2019-03-05 Bally Gaming, Inc. Augmented reality gaming eyewear
US20150123966A1 (en) * 2013-10-03 2015-05-07 Compedia - Software And Hardware Development Limited Interactive augmented virtual reality and perceptual computing platform

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10275161A (en) * 1997-01-28 1998-10-13 Dainippon Screen Mfg Co Ltd Image retrieving method, and recording medium recorded with program for performing retrieving process
JP2005055743A (en) * 2003-08-06 2005-03-03 Canon Inc Image display method
JP2007280212A (en) * 2006-04-10 2007-10-25 Sony Corp Display control device, display control method and display control program
JP2012128485A (en) * 2010-12-13 2012-07-05 Yahoo Japan Corp Image search device, image search method and image search program
JP2013105253A (en) 2011-11-11 2013-05-30 Sony Corp Information processing apparatus, information processing method, and program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3070681A4

Also Published As

Publication number Publication date
EP3070681A1 (en) 2016-09-21
EP3070681A4 (en) 2017-07-12
US20160210788A1 (en) 2016-07-21
JPWO2015072194A1 (en) 2017-03-16
JP6337907B2 (en) 2018-06-06
US10074216B2 (en) 2018-09-11

Similar Documents

Publication Publication Date Title
JP6337907B2 (en) Display control apparatus, display control method, and program
JP6102588B2 (en) Information processing apparatus, information processing method, and program
JP6121647B2 (en) Information processing apparatus, information processing method, and program
JP6167703B2 (en) Display control device, program, and recording medium
JP6135783B2 (en) Information processing apparatus, information processing method, and program
WO2013145566A1 (en) Information processing apparatus, information processing method, and program
US20150070247A1 (en) Information processing apparatus, information processing method, and program
JP2015095147A (en) Display control device, display control method, and program
JP6149862B2 (en) Display control device, display control system, and display control method
JP2017211811A (en) Display control program, display control method and display control device
JP6686547B2 (en) Image processing system, program, image processing method
US20140181709A1 (en) Apparatus and method for using interaction history to manipulate content
JP5446700B2 (en) Information processing apparatus, information processing method, and program
US20230043683A1 (en) Determining a change in position of displayed digital content in subsequent frames via graphics processing circuitry
GB2513865A (en) A method for interacting with an augmented reality scene
JP2017108356A (en) Image management system, image management method and program
JP2015156187A (en) Information processing apparatus, system, information processing method, and program
JP6065084B2 (en) Information processing apparatus, information processing method, and program
KR20120035321A (en) System and method for playing contents of augmented reality
JP6443505B2 (en) Program, display control apparatus, and display control method
JP2019096305A (en) Electronic apparatus and control method, program, and recording medium thereof
US20240080543A1 (en) User interfaces for camera management
WO2024057650A1 (en) Electronic device
WO2019102885A1 (en) Electronic device with changeable image display section
JP2021081937A (en) User terminal, control method and computer program

Legal Events

Date Code Title Description
121    Ep: the epo has been informed by wipo that ep was designated in this application
       Ref document number: 14861731
       Country of ref document: EP
       Kind code of ref document: A1

ENP    Entry into the national phase
       Ref document number: 2015547663
       Country of ref document: JP
       Kind code of ref document: A

WWE    Wipo information: entry into national phase
       Ref document number: 15025500
       Country of ref document: US

REEP   Request for entry into the european phase
       Ref document number: 2014861731
       Country of ref document: EP

WWE    Wipo information: entry into national phase
       Ref document number: 2014861731
       Country of ref document: EP

NENP   Non-entry into the national phase
       Ref country code: DE