JP2016014977A - Augmented reality information providing system, augmented reality information providing method, and augmented reality information providing program - Google Patents


Info

Publication number
JP2016014977A
Authority
JP
Japan
Prior art keywords
augmented reality
reality information
image data
server
sharer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2014136071A
Other languages
Japanese (ja)
Other versions
JP6331777B2 (en)
Inventor
Yoshiaki Shibuta (澁田義明)
Takeshi Minami (南猛)
Kenichi Takahashi (高橋健一)
Kenji Matsubara (松原賢士)
Kaitaku Ozawa (小澤開拓)
Yosuke Taniguchi (谷口陽介)
Atsushi Tamura (田村敦史)
Original Assignee
Konica Minolta Inc (コニカミノルタ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Inc
Priority to JP2014136071A
Publication of JP2016014977A
Application granted
Publication of JP6331777B2
Status: Active

Abstract

PROBLEM TO BE SOLVED: To provide an augmented reality information providing system that allows writing on or modification of an AR (augmented reality) tag and immediate sharing of the changed information with other users.

SOLUTION: A terminal device 110 comprises: an acquisition unit 111 that captures an image on a two-dimensional medium to acquire image data; a storage unit 112 that stores the image data; a recognition unit 113 that recognizes a marker in the image; a reception unit 114 that receives augmented reality information stored, in association with the marker, in an AR tag server 120; a display unit 115 that displays the image and displays the augmented reality information in association with the recognized marker; an extraction unit 117 that causes the acquisition unit to acquire image data after an additional description is made, and extracts the image data of the additionally described part by comparing the image data stored before the additional description with the image data acquired after it; and an update request unit 119 that requests the server to update the augmented reality information so as to include the extracted image data.

Description

  The present invention relates to a technique for displaying augmented reality information in association with a marker described on a two-dimensional medium.

A technique called augmented reality ("AR" for short), in which a part of reality is expanded or modified by a computer by adding information to the real world as seen by a person, has been put into practical use.
For example, a technique using augmented reality is disclosed that captures an AR marker printed on a two-dimensional medium such as paper and displays the corresponding spot content on the AR marker (see, for example, Patent Document 1).

In addition, a technique is disclosed in which AR information is shared among a plurality of user terminals by transmitting video and object recognition information from a remote terminal to each user terminal (see, for example, Patent Document 2).
In addition, a technique is disclosed in which the marker is realized as a digital marker, and the type and operation of the AR tag are changed by editing the marker (see, for example, Patent Document 3).

  A technique is also disclosed that generates content data by moving a first terminal and treating the movement trajectory as a drawn line, and displays that content superimposed at the position corresponding to the first terminal in the video captured by a second terminal (see, for example, Patent Document 4).

Patent Document 1: JP 2012-014606 A
Patent Document 2: JP 2012-043435 A
Patent Document 3: JP 2011-159274 A
Patent Document 4: JP 2011-128977 A

When a user views, through a mobile terminal, an AR tag displayed over a two-dimensional medium such as paper or a tablet terminal, the user may wish to write on or modify the AR tag and immediately share the changed information with other users.
However, when information is added to or modified in an AR tag, the AR tag data stored in the server or the like must be updated via an administrator, so there is a time lag until the change is completed and the information cannot be shared in real time. Furthermore, if the original data stored in the server or the like is updated, the updated contents become visible to all users, so personal memos, or information that the user wants only specific members to see, cannot be added to the AR tag.

  The present invention has been made in view of the above circumstances. A first object is to provide an augmented reality information providing system, an augmented reality information providing method, and an augmented reality information providing program with which an AR tag can be written on or modified and the changed information can be shared with other users immediately. A second object is to allow the augmented reality information providing system, method, and program to appropriately share information that is to be disclosed only to specific members.

  In order to achieve the above objects, the augmented reality information providing system of the present invention includes a plurality of terminal devices that display augmented reality information in association with a marker described on a two-dimensional medium, and a server that provides the augmented reality information to each of the terminal devices. The server stores the augmented reality information in association with the marker. Each of the plurality of terminal devices includes: acquisition means for capturing an image described on a two-dimensional medium and acquiring image data; storage means for storing the acquired image data; recognition means for recognizing the marker in the image based on the acquired image data; means for receiving from the server the augmented reality information linked to the recognized marker and stored in the server; display means for displaying the image using the image data and displaying the augmented reality information received from the server in association with the recognized marker; extraction means for causing the acquisition means, after an additional description is made on the two-dimensional medium, to acquire the image data of the image after the additional description, and extracting the image data of the additionally described part by comparing the image data stored in the storage means before the additional description with the image data acquired after the additional description; and update request means for requesting the server to update the augmented reality information stored in the server so as to include the extracted image data. The server further updates the augmented reality information in response to an update request from the update request means.

Here, each of the plurality of terminal devices may further include voice recognition means for recognizing a user's utterance, and the extraction means may cause the acquisition means to acquire the image data when a predetermined utterance is recognized by the voice recognition means after the additional description is made on the two-dimensional medium.
Here, each of the plurality of terminal devices may further include accepting means for accepting, from the user, designation of a sharer with whom the additionally described image data is to be shared, and acquiring sharer information identifying the sharer. The update request means requests the server to store the augmented reality information updated to include the additionally described image data in association with the acquired sharer information. The server receives the update request from the update request means, updates the augmented reality information in association with the sharer information, and provides the augmented reality information only to the terminal devices used by the sharers identified by the sharer information.

Here, each of the plurality of terminal devices may include voice recognition means for recognizing a user's utterance; the extraction means may cause the acquisition means to acquire the image data when a predetermined utterance is recognized by the voice recognition means, and the accepting means may acquire the sharer information based on an utterance, recognized by the voice recognition means, that indicates the sharer.
In order to achieve the above objects, the augmented reality information providing method of the present invention is a method in a system including a plurality of terminal devices that display augmented reality information in association with a marker described on a two-dimensional medium, and a server that provides the augmented reality information to each of the terminal devices and stores the augmented reality information in association with the marker. Each of the plurality of terminal devices performs: an acquisition step of capturing an image described on a two-dimensional medium and acquiring image data; a storing step of storing the acquired image data; a recognition step of recognizing the marker in the image based on the acquired image data; a step of receiving from the server the augmented reality information linked to the recognized marker and stored in the server; a display step of displaying the image using the acquired image data and displaying the augmented reality information received from the server in association with the recognized marker; an extraction step of acquiring, after an additional description is made on the two-dimensional medium, the image data of the image after the additional description, and extracting the image data of the additionally described part by comparing the image data stored in the storing step before the additional description with the image data acquired after the additional description; and an update request step of requesting the server to update the augmented reality information stored in the server so as to include the extracted image data. The server receives the update request made in the update request step and updates the augmented reality information.

Here, the augmented reality information providing method may further perform, in each of the plurality of terminal devices, a voice recognition step of recognizing a user's utterance, and the extraction step may acquire the image data when a predetermined utterance is recognized by the voice recognition step after the additional description is made on the two-dimensional medium.
Here, in the augmented reality information providing method, each of the plurality of terminal devices may perform an accepting step of accepting, from the user, designation of a sharer with whom the additionally described image data is to be shared, and acquiring sharer information identifying the sharer. The update request step requests the server to store the augmented reality information updated to include the additionally described image data in association with the acquired sharer information; the server receives the update request made in the update request step, updates the augmented reality information in association with the sharer information, and provides the augmented reality information only to the terminal devices used by the sharers identified by the sharer information.

  Here, the augmented reality information providing method may further perform, in each of the plurality of terminal devices, a voice recognition step of recognizing a user's utterance; the extraction step acquires the image data when a predetermined utterance is recognized by the voice recognition step, and the accepting step may acquire the sharer information based on an utterance, recognized by the voice recognition step, that indicates the sharer.

  In order to achieve the above objects, the augmented reality information providing program of the present invention causes a terminal device to execute a process of displaying augmented reality information in association with a marker described on a two-dimensional medium, and causes the terminal device to perform: an acquisition step of capturing an image described on a two-dimensional medium and acquiring image data; a storing step of storing the acquired image data; a recognition step of recognizing the marker in the image based on the acquired image data; a step of receiving from the server the augmented reality information linked to the recognized marker and stored in the server; a display step of displaying the image using the acquired image data and displaying the augmented reality information received from the server in association with the recognized marker; an extraction step of acquiring, after an additional description is made on the two-dimensional medium, the image data of the image after the additional description, and extracting the image data of the additionally described part by comparing the image data stored in the storing step before the additional description with the image data acquired after the additional description; and an update request step of requesting the server to update the augmented reality information stored in the server so as to include the extracted image data.

Here, the augmented reality information providing program may further include a voice recognition step of recognizing a user's utterance, and the extraction step may acquire the image data when a predetermined utterance is recognized by the voice recognition step after the additional description is made on the two-dimensional medium.
Here, the augmented reality information providing program may further include an accepting step of accepting, from the user, designation of a sharer with whom the additionally described image data is to be shared, and acquiring sharer information identifying the sharer; the update request step requests the server to store the augmented reality information updated to include the additionally described image data in association with the acquired sharer information.

  Here, the augmented reality information providing program may further include a voice recognition step of recognizing a user's utterance; the extraction step acquires the image data when a predetermined utterance is recognized by the voice recognition step, and the accepting step may acquire the sharer information based on the utterance, recognized by the voice recognition step, that indicates the sharer.

  According to the augmented reality information providing system, method, and program configured as described above, the image data of the additionally described part is extracted by comparing the image data before and after the additional description, so that only the written description is added to the augmented reality information. The AR tag can therefore be written on and corrected accurately, and the changed information can be shared with other users immediately. Furthermore, by accepting the designation of a sharer from the user and updating the augmented reality information in association with the sharer information, information intended only for specific members can be shared appropriately.

FIG. 1 is a diagram showing an overview of the augmented reality information providing system according to the present embodiment. FIG. 2 is a diagram showing an example of an original image on which the AR marker captured by the acquisition unit is described, and the display image displayed on a built-in monitor or the like by the display unit. FIG. 3 is a diagram showing an example of the display screen when the accepting unit acquires sharer information. FIG. 4 is a diagram showing how the updated augmented reality information and the sharer information are stored in association with each other. FIG. 5 is a diagram showing an example of the operation when augmented reality information is updated in the augmented reality information providing system. FIG. 6 is a diagram showing another example of the operation when augmented reality information is updated in the augmented reality information providing system. FIG. 7 is a diagram showing the sequence in the augmented reality information providing system. FIG. 8 is a diagram (part 1) showing the processing procedure of the update/provision program that causes the terminal device to execute the process of updating augmented reality information and the process of providing augmented reality information. FIG. 9 is a diagram (part 2) showing the processing procedure of the same program. FIG. 10 is a diagram showing the processing procedure of the provision program that causes the AR tag server to execute the process of providing augmented reality information.

<Embodiment>
<Configuration>
FIG. 1 is a diagram showing an overview of an augmented reality information providing system 100 according to the present embodiment.
The augmented reality information providing system 100 includes a plurality of terminal devices 110 and an AR tag server 120.

Each terminal device 110 can display augmented reality information in association with an AR marker written on a two-dimensional medium such as paper.
The AR tag server 120 stores augmented reality information in association with an AR marker, and can provide augmented reality information to each of the terminal devices 110.
The terminal device 110 includes an acquisition unit 111, a storage unit 112, a recognition unit 113, a receiving unit 114, a display unit 115, a voice recognition unit 116, an extraction unit 117, an accepting unit 118, and an update request unit 119.

The acquisition unit 111 captures an image described on a two-dimensional medium using a built-in camera or the like and acquires image data.
The storage unit 112 stores the image data acquired by the acquisition unit 111.
The recognition unit 113 recognizes the AR marker in the image based on the image data acquired by the acquisition unit 111. In the present embodiment, the AR marker is recognized on the terminal device 110 side; however, the terminal device 110 may instead transmit the image data acquired by the acquisition unit 111 to the AR tag server 120, and the AR marker may be recognized on the AR tag server 120 side using that image data.

The receiving unit 114 receives augmented reality information associated with the AR marker recognized by the recognition unit 113 and stored in the AR tag server 120 from the AR tag server 120.
The display unit 115 displays an image on a built-in monitor or the like using the image data acquired by the acquisition unit 111, and replaces the AR marker at its recognized position with the augmented reality information received from the AR tag server 120, displaying it as an AR tag.

  FIG. 2 is a diagram showing an example of an original image (left side of FIG. 2) on which an AR marker captured by the acquisition unit 111 is described, and the display image (right side of FIG. 2) displayed on a built-in monitor or the like by the display unit 115. As shown in FIG. 2, when an AR marker (A in FIG. 2) is described in the original image, augmented reality information (B in FIG. 2) is displayed as an AR tag at the position corresponding to the AR marker in the display image.

The voice recognition unit 116 recognizes a user's utterance using a built-in microphone or the like.
The extraction unit 117 causes the acquisition unit 111, after the user makes an additional description on the two-dimensional medium, to acquire the image data of the image after the additional description, and extracts the image data of the additionally described part, together with description position information indicating its position, by comparing the image data stored in the storage unit 112 before the additional description with the image data acquired after it. Preferably, after the user makes the additional description on the two-dimensional medium, the extraction unit 117 causes the acquisition unit 111 to acquire the image data of the image after the additional description when it receives from the user an instruction to share the additionally described image with other users, for example by touching a "Share" button on the touch panel. The extraction unit 117 also preferably causes the acquisition unit 111 to acquire the image data when a predetermined utterance such as "share" is recognized by the voice recognition unit 116. In the present embodiment, the image data of the additionally described part is extracted on the terminal device 110 side; however, the image data acquired before the additional description and the image data acquired after it may instead be transmitted from the terminal device 110 to the AR tag server 120, and the image data of the additionally described part may be extracted on the AR tag server 120 side using them.
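The before/after comparison performed by the extraction unit can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes a hypothetical representation in which an image is a 2D grid of grayscale values (255 = white paper), and it takes pixels that darkened after the additional description as the newly written part, using their bounding box as the description position information.

```python
THRESHOLD = 40  # hypothetical darkening threshold for "newly written" pixels

def extract_addition(before, after):
    """Return (patch, (top, left)) for the additionally described part,
    or (None, None) when nothing was added."""
    rows, cols = len(before), len(before[0])
    # Pixels that became darker than the threshold are treated as the addition.
    changed = {(r, c) for r in range(rows) for c in range(cols)
               if before[r][c] - after[r][c] > THRESHOLD}
    if not changed:
        return None, None
    top = min(r for r, _ in changed)
    bottom = max(r for r, _ in changed)
    left = min(c for _, c in changed)
    right = max(c for _, c in changed)
    # Cut the bounding box out of the after-image; unchanged pixels stay white,
    # so only the written description itself ends up in the patch.
    patch = [[after[r][c] if (r, c) in changed else 255
              for c in range(left, right + 1)]
             for r in range(top, bottom + 1)]
    return patch, (top, left)
```

A real implementation would first align the two captures (camera pose differs between shots) and denoise before differencing; the sketch assumes perfectly registered images.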

The accepting unit 118, when the user wants to disclose the additionally described content only to some specific users, accepts from the user the designation of the sharers with whom the additionally described image data is to be shared, and acquires sharer information identifying those sharers.
FIG. 3 is a diagram showing an example of the display screen when the accepting unit 118 acquires sharer information. As shown in FIG. 3, for example, when the user utters "share", a share destination list window (A in FIG. 3) listing pre-registered group names and job titles is displayed, and a sharer is selected by touching the display position of the desired sharer in the share destination list. Preferably, the accepting unit 118 accepts the designation of a sharer from the user when the voice recognition unit 116 recognizes an utterance indicating the sharer, such as "group 1". In the present embodiment, the sharer information is acquired from the user at the terminal device 110, but it may instead be acquired at the AR tag server 120 or at another external terminal.

  The update request unit 119 requests the AR tag server 120 to update the augmented reality information stored in the AR tag server 120 so as to include the image data extracted by the extraction unit 117. More specifically, the update request unit 119 creates updated augmented reality information by adding the image data extracted by the extraction unit 117 at the described position in the augmented reality information received by the receiving unit 114, transmits the updated augmented reality information to the AR tag server 120, and requests the AR tag server 120 to store it and to supply it to each of the terminal devices 110 thereafter. Alternatively, the update request unit 119 transmits the image data of the part extracted by the extraction unit 117 together with its description position information to the AR tag server 120, and requests the AR tag server 120 to add the transmitted image data to the corresponding augmented reality information at the position indicated by the description position information, store the result, and supply it to each of the terminal devices 110 thereafter. Further, when sharer information has been acquired by the accepting unit 118, the update request unit 119 also transmits the sharer information to the AR tag server 120 and requests it to store the augmented reality information updated to include the extracted image data in association with the sharer information.
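The second variant above, where the terminal sends only the extracted patch and its description position and the server splices it into the stored AR tag, might be sketched like this. The function names, the JSON wire format, and the pixel-grid tag representation are all assumptions for illustration, not the patent's actual protocol.

```python
import json

def build_update_request(marker_id, patch, position, sharers=None):
    """Terminal side: package the extracted part, its description position
    information, and (optionally) the sharer information."""
    request = {
        "marker": marker_id,         # AR marker the tag is linked to
        "patch": patch,              # image data of the additionally described part
        "position": list(position),  # description position (top, left)
    }
    if sharers:                      # present only when sharers were designated
        request["sharers"] = sharers
    return json.dumps(request)

def apply_update(stored_tag, request_json):
    """Server side: merge the patch into the stored augmented reality
    information at the indicated description position."""
    req = json.loads(request_json)
    top, left = req["position"]
    for r, row in enumerate(req["patch"]):
        for c, px in enumerate(row):
            if px != 255:            # copy only written (non-white) pixels
                stored_tag[top + r][left + c] = px
    return stored_tag
```

Sending only the patch rather than the whole updated tag keeps the request small and lets the server serialize concurrent updates from several terminals against its own stored copy.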

When the AR tag server 120 receives from the update request unit 119 a request to update the stored augmented reality information, it updates the augmented reality information. When the AR tag server 120 receives from the update request unit 119 a request to store augmented reality information in association with sharer information, it updates the augmented reality information in association with the sharer information.
Further, when sharer information is associated with the augmented reality information stored in association with the AR marker recognized by the recognition unit 113, the AR tag server 120 provides that augmented reality information only to the terminal devices 110 used by the sharers matching the sharer information. Specifically, when delivering augmented reality information to the receiving unit 114, the AR tag server 120 receives the user's identification information from the terminal device 110, uses it to determine for each piece of augmented reality information whether the user is a sharer, and transmits to the receiving unit 114 only the augmented reality information for which the user is determined to be a sharer.

FIG. 4 is a diagram illustrating a state in which the updated augmented reality information and the sharer information are stored in association with each other.
For the augmented reality information "AR tag 1" corresponding to the AR marker "marker 1" described in the first line of FIG. 4, no updated augmented reality information is registered. Therefore, the AR tag server 120 provides "AR tag 1" to all users as the augmented reality information corresponding to "marker 1".

  The augmented reality information "AR tag 2" corresponding to the AR marker "marker 2" described in the second line of FIG. 4 has the updated augmented reality information "AR tag 2-1", which is associated with the sharer information "user 1". Therefore, the AR tag server 120 provides "AR tag 2-1" as the augmented reality information corresponding to "marker 2" only to user 1, and provides "AR tag 2" to the other users.

  The augmented reality information "AR tag 3" corresponding to the AR marker "marker 3" described in the third and fourth lines of FIG. 4 has the updated augmented reality information "AR tag 3-1" and "AR tag 3-2"; the former is associated with the sharer information "user 2" and "group 1", and the latter with "user 3". Therefore, the AR tag server 120 provides "AR tag 3-1" as the augmented reality information corresponding to "marker 3" only to user 2 and the members of "group 1", provides "AR tag 3-2" only to user 3, and provides "AR tag 3" to the other users. The members of "group 1" must be registered in advance.

If user 3 is included in the members of "group 1", user 3 has access rights to a plurality of pieces of augmented reality information. In response to a request to provide augmented reality information, the plurality of pieces may be provided so that one can be selected, or they may be combined and provided together.
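The server-side lookup behind FIG. 4 can be sketched as follows. The table layout, names, and group registry are hypothetical stand-ins for whatever storage the AR tag server actually uses; the logic mirrors the FIG. 4 examples, including the fall-back to the base tag and the multiple-tag case just described.

```python
# Marker -> base AR tag plus updated tags, each updated tag stored in
# association with its sharer information (users and/or group names).
TAG_TABLE = {
    "marker 1": {"base": "AR tag 1", "updates": []},
    "marker 2": {"base": "AR tag 2",
                 "updates": [("AR tag 2-1", {"user 1"})]},
    "marker 3": {"base": "AR tag 3",
                 "updates": [("AR tag 3-1", {"user 2", "group 1"}),
                             ("AR tag 3-2", {"user 3"})]},
}
GROUPS = {"group 1": {"user 4", "user 5"}}  # members registered in advance

def tags_for(marker, user):
    """Return every AR tag the identified user may see for this marker;
    a user with several access rights gets all matching tags."""
    entry = TAG_TABLE[marker]
    visible = [tag for tag, sharers in entry["updates"]
               if user in sharers
               or any(user in GROUPS.get(g, set()) for g in sharers)]
    # Users outside every sharer list fall back to the base tag.
    return visible or [entry["base"]]
```

When `tags_for` returns more than one tag, the server can either offer them for selection or combine them, as described above.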
<Operation>
FIG. 5 is a diagram illustrating an example of an operation when the augmented reality information provision is updated in the augmented reality information providing system 100.

(1) When the user wishes to add something to the display image (A in FIG. 5) displayed by the display unit 115 and share it with other users, the user writes characters or a picture directly on the original image (B in FIG. 5). Here, "important" and an arrow are written (C in FIG. 5).
(2) When writing is completed, the user utters “share” (D in FIG. 5).

(3) The extraction unit 117 extracts the image data of the part additionally described together with the description position information. Here, “important” and an arrow are extracted (E in FIG. 5).
(4) The update request unit 119 requests the AR tag server 120 to update the augmented reality information so as to include the additionally described image data. Here, "important" and the arrow are added to the original augmented reality information (F in FIG. 5).

FIG. 6 is a diagram illustrating another example of the operation when the augmented reality information is updated in the augmented reality information providing system 100.
(1) When the user wishes to add something to the image (A in FIG. 6) displayed by the display unit 115 and share it with other users, the user writes characters or a picture directly on the original image (B in FIG. 6). Here, "−" is written over the part of the AR tag where "Osaka" is displayed, and "Fukuoka" is written below it (C in FIG. 6).

(2) When writing is completed, the user utters “share” (D in FIG. 6).
(3) The extraction unit 117 extracts the image data of the part additionally described together with the description position information. Here, “-” and “Fukuoka” are extracted (E in FIG. 6).
(4) The update request unit 119 requests the AR tag server 120 to update the augmented reality information so as to include the additionally described image data. Here, "−" is treated as a symbol indicating correction: the display of "Osaka", which "−" overlaps, is deleted from the original augmented reality information, and "Fukuoka" is written into the area where "Osaka" was deleted (F in FIG. 6). The symbol indicating correction may be any identifiable symbol, for example "/" or "×". Which image in the AR tag the correction symbol overlaps can be determined by comparing the positional relationship between the symbol in the extracted image data and the AR marker on the two-dimensional medium with the positional relationship between the AR marker and each image of the AR tag on the virtual display.
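The overlap determination in step (4) can be sketched as a coordinate mapping. This is an illustration under simplifying assumptions (axis-aligned boxes, a single uniform scale factor between medium and display; all names are hypothetical): the correction symbol's position relative to the AR marker on the two-dimensional medium is mapped into virtual-display coordinates and then tested against the box of each image in the AR tag.

```python
def to_display(point, marker_medium, marker_display, scale):
    """Map a point on the medium into display coordinates, using the AR
    marker's position in both spaces as the common reference."""
    dx = point[0] - marker_medium[0]
    dy = point[1] - marker_medium[1]
    return (marker_display[0] + dx * scale, marker_display[1] + dy * scale)

def overlapped_images(symbol_box, marker_medium, marker_display, scale,
                      tag_images):
    """Return the names of AR-tag images the correction symbol overlaps.
    symbol_box: ((x0, y0), (x1, y1)) on the medium;
    tag_images: name -> (x0, y0, x1, y1) boxes on the virtual display."""
    (x0, y0), (x1, y1) = symbol_box
    sx0, sy0 = to_display((x0, y0), marker_medium, marker_display, scale)
    sx1, sy1 = to_display((x1, y1), marker_medium, marker_display, scale)
    hits = []
    for name, (ix0, iy0, ix1, iy1) in tag_images.items():
        # Standard axis-aligned box intersection test.
        if sx0 <= ix1 and ix0 <= sx1 and sy0 <= iy1 and iy0 <= sy1:
            hits.append(name)  # this image is deleted and then overwritten
    return hits
```

In the FIG. 6 example, the box around the handwritten "−" would intersect only the "Osaka" image, so only that image is deleted before "Fukuoka" is written into its place.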

FIG. 7 is a diagram showing a sequence in the augmented reality information providing system 100. In FIG. 7, the terminal device 110 used by the user 1 is described as “terminal X”, the terminal device 110 used by the user 2 is described as “terminal Y”, and the AR tag server 120 is described as “server Z”.
The sequence shown in FIG. 7 will be described below.
(1) When an image is captured by the terminal X, whether an AR marker is present in the image is recognized; if there is an AR marker, the terminal X connects to the server Z and requests the augmented reality information linked to the AR marker (A in FIG. 7).

(2) When the requested augmented reality information includes information associated with sharer information, the server Z requests user identification information from the terminal X (B in FIG. 7).
(3) The identification information of the user 1 is transmitted from the terminal X to the server Z (C in FIG. 7). If none of the requested augmented reality information is associated with sharer information, the request for identification information in (2) and the transmission of identification information in (3) are omitted.

(4) Of the augmented reality information linked to the AR marker, the augmented reality information not associated with sharer information, and the augmented reality information whose sharers include the user 1, are transmitted from the server Z to the terminal X (D in FIG. 7).
(5) The terminal X replaces the AR marker with the augmented reality information and displays the image with the AR tag. After the user makes an additional description on the two-dimensional medium, the terminal X requests the server Z for a list of sharing destinations (E in FIG. 7).

(6) A sharing destination list is transmitted from the server Z to the terminal X (F in FIG. 7). When the user does not specify a sharing destination, the request for the sharing destination list in (5) and the transmission of the sharing destination list in (6) are omitted.
(7) When the terminal X displays the sharing destination list and the sharing destination is selected by the user, the terminal X requests the server Z to update the augmented reality information (G in FIG. 7).

(8) When the update of the augmented reality information is completed in the server Z, a notification that the update is complete is transmitted from the server Z to the terminal X (H in FIG. 7).
(9) Similarly, when an image is captured by the terminal Y, it is recognized whether there is an AR marker in the image. If there is an AR marker, the terminal Y connects to the server Z and requests the augmented reality information linked to the AR marker (I in FIG. 7).

(10) When the requested augmented reality information includes information associated with sharer information, the server Z requests user identification information from the terminal Y (J in FIG. 7).
(11) The identification information of the user 2 is transmitted from the terminal Y to the server Z (K in FIG. 7). If none of the requested augmented reality information is associated with sharer information, the request for identification information in (10) and the transmission of identification information in (11) are omitted.

(12) Of the augmented reality information linked to the AR marker, the augmented reality information not associated with sharer information, and the augmented reality information whose sharers include the user 2, are transmitted from the server Z to the terminal Y (L in FIG. 7).
(13) The terminal Y replaces the AR marker with the augmented reality information and displays the image with the AR tag.
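The filtering the server Z performs in (4) and (12) — returning entries with no sharer restriction plus entries whose sharers include the requesting user — can be sketched as below. The in-memory dictionary store, the entry layout, and the user/marker identifiers are all hypothetical; the publication does not specify a storage format.

```python
# Hypothetical in-memory store: marker id -> list of AR-tag entries.
# 'sharers' is None for entries not associated with sharer information.
AR_STORE = {
    "marker-1": [
        {"content": "meeting notes", "sharers": None},
        {"content": "Fukuoka",       "sharers": {"user1", "user2"}},
        {"content": "budget memo",   "sharers": {"user3"}},
    ],
}

def needs_identification(marker_id):
    # Server Z asks the terminal for identification (B/J in FIG. 7) only
    # when at least one entry for the marker carries sharer information.
    return any(e["sharers"] is not None for e in AR_STORE.get(marker_id, []))

def provide(marker_id, user_id):
    # (4)/(12): entries without sharer information, plus entries whose
    # sharers include the requesting user (D/L in FIG. 7).
    return [e["content"] for e in AR_STORE.get(marker_id, [])
            if e["sharers"] is None or user_id in e["sharers"]]
```

Under this sketch, user 1 receives the unrestricted note and "Fukuoka", while user 3 receives the unrestricted note and the "budget memo" entry shared only with that user.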

FIGS. 8 and 9 are diagrams illustrating the processing procedure of an update providing program that causes the terminal device 110 to execute the processing for updating augmented reality information and the processing for providing augmented reality information.
(1) It is determined whether or not there is an AR marker in the captured image (step S1). If there is no AR marker, an image is displayed and the process ends.
(2) When there is an AR marker, the augmented reality information linked to the AR marker is requested to the AR tag server 120 (step S2).

(3) It is determined whether or not user identification information is requested from the AR tag server 120 (step S3).
(4) When the identification information of the user is requested, the identification information of the user using the terminal device 110 is transmitted to the AR tag server 120 (step S4).
(5) It is determined whether or not augmented reality information is transmitted from the AR tag server 120 (step S5).

(6) The augmented reality information is acquired from the AR tag server 120, the augmented reality information is replaced with the AR marker, the image is displayed, and the image data is stored (step S6).
(7) It is determined whether or not a predetermined utterance such as “share” is recognized (step S7).
(8) The image after the additional description is captured (step S8).
(9) The difference between the image data before the additional description and the image data after the additional description is taken, and the image data of the additionally described portion and its description position information are extracted (step S9).

(10) It is determined whether or not there is a symbol for instructing correction in the extracted image data (step S10).
(11) If there is no symbol for instructing correction, the additionally described image data is added to the augmented reality information (step S11).
(12) If there is a symbol for instructing correction, the display of the image overlapping the symbol is deleted from the augmented reality information, and the description other than the symbol is overwritten in the deleted region (step S12).

(13) Let the user select whether or not to specify a sharing destination (step S13).
(14) When designating the sharing destination, the AR tag server 120 is requested for a sharing destination list (step S14).
(15) It is determined whether or not the sharing destination list is transmitted from the AR tag server 120 (step S15).

(16) The shared destination list is received and displayed (step S16).
(17) It is determined whether or not an utterance indicating a sharer in the sharing destination list is recognized (step S17). When no sharer is to be specified, the user causes this to be recognized by saying, for example, "not specified".
(18) When an utterance indicating a sharer is recognized, the AR tag server 120 is requested to receive the augmented reality information generated in (11) or (12) together with the sharer information identifying the sharer recognized in (17), and to store them in association with each other (step S18). When no sharing destination is designated in step S13, or no sharer is specified in step S17, the AR tag server 120 is requested to store the augmented reality information generated in (11) or (12) without restricting the sharers.

(19) Wait for notification from the AR tag server 120 that the update has been completed (step S19).
(20) A message indicating that the update has been completed is displayed, and the process ends (step S20).
(21) It is determined whether a timeout has occurred (step S21).
(22) A message indicating that a timeout has occurred is displayed, and the process ends (step S22).
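The core of step S9 — taking the difference between the image data stored before the additional description and the image data captured afterwards, and extracting the added part with its description position — can be sketched on plain nested lists. This is a toy monochrome-pixel model for illustration, not the publication's actual image processing.

```python
def extract_addition(before, after):
    """Step S9 sketch: diff the stored image against the re-captured one and
    return the added pixels plus their description position (top-left of the
    changed bounding box, in row/column coordinates)."""
    changed = [(r, c)
               for r in range(len(before))
               for c in range(len(before[0]))
               if before[r][c] != after[r][c]]
    if not changed:
        return None, None  # nothing was additionally described
    rows = [r for r, _ in changed]
    cols = [c for _, c in changed]
    top, bottom = min(rows), max(rows)
    left, right = min(cols), max(cols)
    # Crop the changed region out of the post-description image.
    patch = [row[left:right + 1] for row in after[top:bottom + 1]]
    return patch, (top, left)
```

For example, if a stroke sets two pixels in the top row of an otherwise unchanged image, the function returns that two-pixel patch and its position, which is then passed to the correction-symbol check of steps S10 to S12.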

FIG. 10 is a diagram illustrating a processing procedure of a providing program that causes the AR tag server 120 to execute processing for providing augmented reality information.
(1) It is determined whether there is a request for providing augmented reality information from the terminal device 110 (step S31).
(2) When there is a request for providing augmented reality information, it is determined whether there is any augmented reality information associated with the AR marker that is associated with the sharer information (step S32).

(3) If there is something associated with the sharer information, the terminal device 110 is requested for user identification information (step S33).
(4) It is determined whether user identification information has been transmitted from the terminal device 110 (step S34).
(5) The user identification information is received from the terminal device 110, and of the augmented reality information linked to the AR marker, the augmented reality information whose sharers include the user is transmitted to the terminal device 110 (step S35).

(6) The augmented reality information associated with the AR marker is transmitted to the terminal device 110 (step S36).
(7) It is determined whether a sharing destination list is requested from the terminal device 110 (step S37).
(8) The requested sharing destination list is transmitted to the terminal device 110 (step S38).

(9) It is determined whether a request to update augmented reality information is received from the terminal device 110 (step S39).
(10) The augmented reality information is updated (step S40).
(11) The terminal device 110 is notified that the update has been completed (step S41).
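The server-side handling of an update request (steps S39 to S41) can be sketched as below. The class name, the dictionary-based store, and the return-value notification are assumptions made for illustration only.

```python
class ARTagServer:
    """Minimal sketch of the AR tag server's update handling (S39-S41)."""

    def __init__(self):
        self.store = {}  # marker id -> list of augmented reality entries

    def update(self, marker_id, image_data, sharer_info=None):
        # S40: store the updated augmented reality information; when sharer
        # information accompanies the request, associate it with the entry
        # so that it is later provided only to the identified sharers.
        entry = {"content": image_data,
                 "sharers": set(sharer_info) if sharer_info else None}
        self.store.setdefault(marker_id, []).append(entry)
        return "update complete"  # S41: notification back to the terminal
```

When the terminal omits sharer information (no sharing destination designated in steps S13/S17), the entry is stored without restricting the sharers.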
As described above, according to the augmented reality information providing system 100, the image data of the portion additionally described by the user can be extracted and the augmented reality information can be updated to include it, so that the changed information can be immediately shared with other users. In addition, since a sharer with whom the additionally described image data is to be shared can be designated, information to be notified only to specific members can be shared appropriately.

  Note that a program that causes a computer to execute the operations of this embodiment may be recorded on a computer-readable recording medium, and the recording medium may be distributed and traded. The program may also be distributed through a network or the like and traded, or may be displayed on a display device, printed, and presented to a user.

  Here, the computer-readable recording medium includes, for example, removable recording media such as a floppy disk, CD, MO, DVD, and memory card, and fixed recording media such as a hard disk and a semiconductor memory, but is not limited to these.

  The present invention can be widely applied as convenient application software usable on a portable external terminal device. According to the present invention, an AR tag can be written to or modified and the changed information can be immediately shared with other users; its industrial utility value is therefore extremely high.

DESCRIPTION OF SYMBOLS: 100 Augmented reality information providing system; 110 Terminal device; 111 Acquisition unit; 112 Storage unit; 113 Recognition unit; 114 Receiving unit; 115 Display unit; 116 Voice recognition unit; 117 Extraction unit; 118 Reception unit; 119 Update request unit; 120 AR tag server

Claims (12)

  1. An augmented reality information providing system including a plurality of terminal devices that display augmented reality information in association with markers described on a two-dimensional medium, and a server that provides the augmented reality information to each of the terminal devices. ,
    The server
    The augmented reality information is stored in association with the marker,
    Each of the plurality of terminal devices is
    An acquisition means for capturing an image described on a two-dimensional medium and acquiring image data;
    Storage means for storing the acquired image data;
    Recognition means for recognizing the marker in the image based on the acquired image data;
    Means for receiving augmented reality information associated with the recognized marker and stored in the server from the server;
    Display means for displaying the image using the acquired image data and displaying augmented reality information received from the server in association with the recognized marker;
    After the additional description on the two-dimensional medium, the acquisition means acquires the image data of the image after the additional description described on the two-dimensional medium, and the image data stored in the storage means before the additional description And extracting means for extracting the image data of the additionally described portion by comparing the image data acquired after the additional description,
    Update request means for requesting the server to update the augmented reality information stored in the server, including the extracted image data,
    The server further includes:
    The augmented reality information providing system, wherein the augmented reality information is updated in response to an update request from the update request means.
  2. The plurality of terminal devices further includes:
    With voice recognition means to recognize user utterances,
    The extraction means includes
    After the additional description on the two-dimensional medium, when the predetermined utterance is recognized by the voice recognition unit, the acquisition unit acquires the image data of the image after the additional description described on the two-dimensional medium. The augmented reality information providing system according to claim 1.
  3. The plurality of terminal devices further includes:
    Receiving a designation of a sharer to be shared with respect to the additionally described image data from a user, and receiving means for acquiring sharer information for identifying the sharer,
    The update request means includes
    Requesting the server to store the augmented reality information updated including the additionally described image data and the acquired sharer information in association with each other;
    The server
    In response to an update request from the update request means, the augmented reality information is updated in association with the sharer information, and the augmented reality information is provided only to the terminal device used by the sharer identified by the sharer information. The augmented reality information providing system according to claim 1.
  4. Each of the plurality of terminal devices is
    With voice recognition means to recognize user utterances,
    The extraction means includes
    When a predetermined utterance is recognized by the voice recognition unit, the acquisition unit acquires image data,
    The accepting means is
    The augmented reality information providing system according to claim 3, wherein the sharer information is acquired based on an utterance indicating the sharer recognized by the voice recognition unit.
  5. An augmented reality information providing method in a system including a plurality of terminal devices that display augmented reality information in association with markers described on a two-dimensional medium, and a server that provides the augmented reality information to each of the terminal devices. There,
    The server
    The augmented reality information is stored in association with the marker,
    In each of the plurality of terminal devices,
    An acquisition step of capturing an image described on a two-dimensional medium and acquiring image data;
    A storage step of storing the acquired image data;
    A recognition step for recognizing the marker in the image based on the acquired image data;
    Receiving augmented reality information associated with the recognized marker and stored in the server from the server;
    Displaying the image using the acquired image data and displaying the augmented reality information received from the server in association with the recognized marker;
    After the additional description on the two-dimensional medium, the image data of the image after the additional description described on the two-dimensional medium is acquired, and the image data stored in the storage step before the additional description and the additional description An extraction step of extracting the image data of the additionally described part by comparing with image data acquired later;
    An update request step for requesting the server to update the augmented reality information stored in the server including the extracted image data;
    The augmented reality information providing method, wherein the server updates the augmented reality information in response to an update request in the update request step.
  6. The augmented reality information providing method further includes:
    In each of the plurality of terminal devices, perform a speech recognition step of recognizing the user's utterance,
    The extraction step includes
    6. The augmented reality information providing method according to claim 5, further comprising: acquiring image data when a predetermined utterance is recognized by the voice recognition step after additional description on the two-dimensional medium.
  7. The augmented reality information providing method further includes:
    In each of the plurality of terminal devices, a designation of a sharer who wants to share the additional described image data is accepted from a user, and a reception step of acquiring sharer information for identifying the sharer is performed.
    The update request step includes:
    Requesting the server to store the augmented reality information updated including the additionally described image data and the acquired sharer information in association with each other;
    In the server, in response to the update request in the update request step, the augmented reality information is updated in association with the sharer information, and the augmented reality is only applied to the terminal device used by the sharer identified by the sharer information. The augmented reality information providing method according to claim 5, wherein information is provided.
  8. The augmented reality information providing method further includes:
    In each of the plurality of terminal devices, perform a speech recognition step of recognizing the user's utterance,
    The extraction step includes
    When a predetermined utterance is recognized by the voice recognition step, image data is acquired,
    The reception step includes
    The augmented reality information providing method according to claim 7, wherein the sharer information is acquired based on an utterance indicating the sharer recognized by the voice recognition step.
  9. An augmented reality information providing program for causing a terminal device to execute a process of displaying augmented reality information in association with a marker described on a two-dimensional medium,
    An acquisition step of capturing an image described on a two-dimensional medium and acquiring image data;
    A storage step of storing the acquired image data;
    A recognition step for recognizing the marker in the image based on the acquired image data;
    Receiving augmented reality information associated with the recognized marker and stored in the server from the server;
    Displaying the image using the acquired image data and displaying the augmented reality information received from the server in association with the recognized marker;
    After the additional description on the two-dimensional medium, the image data of the image after the additional description described on the two-dimensional medium is acquired, and the image data stored in the storage step before the additional description and the additional description An extraction step of extracting the image data of the additionally described part by comparing with image data acquired later;
    An augmented reality information providing program for requesting the server to update the augmented reality information stored in the server to include the extracted image data.
  10. The augmented reality information providing program further includes:
    Each of the plurality of terminal devices includes a speech recognition step of recognizing a user's utterance,
    The extraction step includes
    The augmented reality information providing program according to claim 9, further comprising: acquiring image data when a predetermined utterance is recognized by the voice recognition step after additional description on the two-dimensional medium.
  11. The augmented reality information providing program further includes:
    In the plurality of terminal devices, including an accepting step of accepting from the user designation of a sharer to be shared with the additional described image data, and acquiring sharer information for identifying the sharer,
    The update request step includes:
    Requesting the server to store, in association with each other, the augmented reality information updated to include the additionally described image data and the acquired sharer information. The augmented reality information providing program according to claim 9.
  12. The augmented reality information providing program further includes:
    A speech recognition step for recognizing the user's utterance,
    The extraction step includes
    When a predetermined utterance is recognized by the voice recognition step, image data is acquired,
    The reception step includes
    The augmented reality information providing program according to claim 11, wherein the sharer information is acquired based on an utterance indicating the sharer recognized by the voice recognition step.
JP2014136071A 2014-07-01 2014-07-01 Augmented reality information providing system, augmented reality information providing method, and augmented reality information providing program Active JP6331777B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2014136071A JP6331777B2 (en) 2014-07-01 2014-07-01 Augmented reality information providing system, augmented reality information providing method, and augmented reality information providing program


Publications (2)

Publication Number Publication Date
JP2016014977A true JP2016014977A (en) 2016-01-28
JP6331777B2 JP6331777B2 (en) 2018-05-30

Family

ID=55231128

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2014136071A Active JP6331777B2 (en) 2014-07-01 2014-07-01 Augmented reality information providing system, augmented reality information providing method, and augmented reality information providing program

Country Status (1)

Country Link
JP (1) JP6331777B2 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006195668A (en) * 2005-01-12 2006-07-27 Ricoh Co Ltd Teleconference system
JP2010205138A (en) * 2009-03-05 2010-09-16 Fuji Xerox Co Ltd Document management device and program
JP2012043435A (en) * 2010-08-18 2012-03-01 Pantech Co Ltd Augmented reality service sharing method, and user terminal, remote terminal and system used for sharing augmented reality service
JP2012108282A (en) * 2010-11-17 2012-06-07 Nikon Corp Electronic apparatus
JP2013026922A (en) * 2011-07-22 2013-02-04 Kyocera Document Solutions Inc Image formation system, information processing device, image formation device, and computer program
WO2013068429A1 (en) * 2011-11-08 2013-05-16 Vidinoti Sa Image annotation method and system
JP2014120805A (en) * 2012-12-13 2014-06-30 Canon Inc Information processing device, information processing method, and program


Also Published As

Publication number Publication date
JP6331777B2 (en) 2018-05-30


Legal Events

Date Code Title Description
A621 Written request for application examination; Free format text: JAPANESE INTERMEDIATE CODE: A621; Effective date: 20170419
A977 Report on retrieval; Free format text: JAPANESE INTERMEDIATE CODE: A971007; Effective date: 20180124
A131 Notification of reasons for refusal; Free format text: JAPANESE INTERMEDIATE CODE: A131; Effective date: 20180213
A521 Written amendment; Free format text: JAPANESE INTERMEDIATE CODE: A523; Effective date: 20180308
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model); Free format text: JAPANESE INTERMEDIATE CODE: A01; Effective date: 20180403
A61 First payment of annual fees (during grant procedure); Free format text: JAPANESE INTERMEDIATE CODE: A61; Effective date: 20180416
R150 Certificate of patent or registration of utility model; Ref document number: 6331777; Country of ref document: JP; Free format text: JAPANESE INTERMEDIATE CODE: R150