WO2019189768A1 - Communication method, communication device, transmitter, and program - Google Patents

Communication method, communication device, transmitter, and program

Info

Publication number
WO2019189768A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
receiver
visible light
time
information
Application number
PCT/JP2019/014013
Other languages
English (en)
Japanese (ja)
Inventor
秀紀 青山
大嶋 光昭
Original Assignee
Panasonic Intellectual Property Corporation of America
Application filed by Panasonic Intellectual Property Corporation of America
Priority to JP2020511094A (granted as JP7287950B2)
Publication of WO2019189768A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K19/00 - Record carriers for use with machines and with at least a part designed to carry digital markings
    • G06K19/06 - Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 - Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04B - TRANSMISSION
    • H04B10/00 - Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/11 - Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
    • H04B10/114 - Indoor or close-range type systems
    • H04B10/116 - Visible light communication
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules

Definitions

  • The present invention relates to a communication method, a communication device, a transmitter, a program, and the like.
  • Patent Literature 1 describes a technique for efficiently realizing communication between devices, in an optical space transmission device that transmits information into free space using light, by performing communication using a plurality of monochromatic light sources as illumination light.
  • However, the conventional method is limited to cases where the device to which it is applied has a three-color light source, such as a luminaire. Moreover, a receiver that receives the transmitted information cannot display an image useful to the user.
  • The present invention solves such problems and provides a communication method that enables communication between a variety of devices.
  • A communication method according to one aspect of the present invention is a communication method using a terminal that includes an image sensor, in which it is determined whether or not the terminal is capable of visible light communication. When the terminal is determined to be capable of visible light communication, the image sensor captures a subject whose luminance changes to obtain a decoding image, and first identification information transmitted by the subject is acquired from the striped pattern appearing in the decoding image.
  • When the terminal is determined not to be capable of visible light communication, the image sensor captures the subject to obtain a captured image; edge detection is then performed on the captured image to extract at least one contour, a predetermined specific region is identified from the at least one contour, and second identification information transmitted by the subject is acquired from the line pattern of the specific region, as sketched below.
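  • As an illustration of the fallback path, the following is a minimal Python/OpenCV sketch (an assumption of this description, not a normative implementation): it performs edge detection, extracts contours, and selects a quadrangular region of at least a predetermined size. The thresholds and minimum size are illustrative only.

        import cv2

        def find_specific_region(captured_image, min_area=1000):
            """Extract contours by edge detection and select a roughly
            quadrangular region of at least a predetermined size."""
            gray = cv2.cvtColor(captured_image, cv2.COLOR_BGR2GRAY)
            edges = cv2.Canny(gray, 50, 150)  # edge detection
            contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            for contour in contours:
                if cv2.contourArea(contour) < min_area:
                    continue  # too small to be the specific region
                perimeter = cv2.arcLength(contour, True)
                approx = cv2.approxPolyDP(contour, 0.02 * perimeter, True)
                if len(approx) == 4:  # quadrangular outline
                    x, y, w, h = cv2.boundingRect(approx)
                    return captured_image[y:y + h, x:x + w]
            return None  # no specific region found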
  • FIG. 1 is a diagram illustrating an example of an observation method of luminance of a light emitting unit in the first embodiment.
  • FIG. 2 is a diagram illustrating an example of a method of observing the luminance of the light emitting unit in the first embodiment.
  • FIG. 3 is a diagram illustrating an example of a method of observing the luminance of the light emitting unit in the first embodiment.
  • FIG. 4 is a diagram illustrating an example of a method of observing the luminance of the light emitting unit in the first embodiment.
  • FIG. 5A is a diagram illustrating an example of an observation method of luminance of a light emitting unit in Embodiment 1.
  • FIG. 5B is a diagram illustrating an example of an observation method of luminance of a light emitting unit in Embodiment 1.
  • FIG. 5C is a diagram illustrating an example of an observation method of luminance of a light emitting unit in Embodiment 1.
  • FIG. 5D is a diagram illustrating an example of an observation method of luminance of a light emitting unit in Embodiment 1.
  • FIG. 5E is a diagram illustrating an example of an observation method of luminance of a light emitting unit in Embodiment 1.
  • FIG. 5F is a diagram illustrating an example of an observation method of luminance of a light emitting unit in Embodiment 1.
  • FIG. 5G is a diagram illustrating an example of an observation method of luminance of a light emitting unit in Embodiment 1.
  • FIG. 5H is a diagram illustrating an example of an observation method of luminance of a light emitting unit in Embodiment 1.
  • FIG. 6A is a flowchart of the information communication method in Embodiment 1.
  • FIG. 6B is a block diagram of the information communication apparatus according to Embodiment 1.
  • FIG. 7 is a diagram illustrating an example of a photographing operation of the receiver in the second embodiment.
  • FIG. 8 is a diagram illustrating another example of the photographing operation of the receiver in the second embodiment.
  • FIG. 9 is a diagram illustrating another example of the photographing operation of the receiver in the second embodiment.
  • FIG. 10 is a diagram illustrating an example of display operation of the receiver in Embodiment 2.
  • FIG. 11 is a diagram illustrating an example of display operation of the receiver in Embodiment 2.
  • FIG. 12 is a diagram illustrating an example of operation of a receiver in Embodiment 2.
  • FIG. 13 is a diagram illustrating another example of operation of a receiver in Embodiment 2.
  • FIG. 14 is a diagram illustrating another example of operation of a receiver in Embodiment 2.
  • FIG. 15 is a diagram illustrating another example of operation of a receiver in Embodiment 2.
  • FIG. 16 is a diagram illustrating another example of operation of a receiver in Embodiment 2.
  • FIG. 17 is a diagram illustrating another example of operation of a receiver in Embodiment 2.
  • FIG. 18A is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 2.
  • FIG. 18B is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 2.
  • FIG. 18C is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 2.
  • FIG. 19 is a diagram for explaining an application example to the route guidance in the second embodiment.
  • FIG. 20 is a diagram for explaining an application example to usage log accumulation and analysis in the second embodiment.
  • FIG. 21 is a diagram illustrating an example of application of the transmitter and the receiver in the second embodiment.
  • FIG. 22 is a diagram illustrating an example of application of a transmitter and a receiver in Embodiment 2.
  • FIG. 23 is a diagram illustrating an example of an application according to the third embodiment.
  • FIG. 24 is a diagram illustrating an example of an application according to the third embodiment.
  • FIG. 25 is a diagram illustrating an example of a transmission signal and an example of a voice synchronization method in the third embodiment.
  • FIG. 26 is a diagram illustrating an example of a transmission signal in the third embodiment.
  • FIG. 27 is a diagram illustrating an example of a process flow of the receiver in Embodiment 3.
  • FIG. 28 is a diagram illustrating an example of a user interface of the receiver in the third embodiment.
  • FIG. 29 is a diagram illustrating an example of a process flow of the receiver in Embodiment 3.
  • FIG. 30 is a diagram illustrating another example of the processing flow of the receiver in Embodiment 3.
  • FIG. 31A is a diagram for explaining a specific method of synchronized playback in the third embodiment.
  • FIG. 31B is a block diagram illustrating a configuration of a playback device (receiver) that performs synchronized playback in the third embodiment.
  • FIG. 31C is a flowchart illustrating a processing operation of a playback device (receiver) that performs synchronized playback in the third embodiment.
  • FIG. 32 is a diagram for explaining preparations for synchronized playback in the third embodiment.
  • FIG. 33 is a diagram illustrating an example of application of a receiver in Embodiment 3.
  • FIG. 34A is a front view of a receiver held by a holder in the third embodiment.
  • FIG. 34B is a rear view of the receiver held by the holder in the third embodiment.
  • FIG. 35 is a diagram for describing a use case of a receiver held by a holder in the third embodiment.
  • FIG. 36 is a flowchart showing the processing operation of the receiver held by the holder in the third embodiment.
  • FIG. 37 is a diagram illustrating an example of an image displayed by the receiver in Embodiment 3.
  • FIG. 38 is a diagram showing another example of the holder in the third embodiment.
  • FIG. 39A is a diagram illustrating an example of a visible light signal in Embodiment 3.
  • FIG. 39B is a diagram illustrating an example of a visible light signal in Embodiment 3.
  • FIG. 39C is a diagram illustrating an example of a visible light signal in Embodiment 3.
  • FIG. 39D is a diagram illustrating an example of a visible light signal in Embodiment 3.
  • FIG. 40 is a diagram illustrating a configuration of a visible light signal in the third embodiment.
  • FIG. 41 is a diagram illustrating an example in which the receiver in Embodiment 4 displays an AR image.
  • FIG. 42 is a diagram illustrating an example of a display system in Embodiment 4.
  • FIG. 43 is a diagram illustrating another example of the display system according to Embodiment 4.
  • FIG. 44 is a diagram illustrating another example of the display system according to Embodiment 4.
  • FIG. 45 is a flowchart illustrating an example of processing operations of a receiver in Embodiment 4.
  • FIG. 46 is a diagram illustrating another example in which the receiver in Embodiment 4 displays an AR image.
  • FIG. 47 is a diagram illustrating another example in which the receiver in Embodiment 4 displays an AR image.
  • FIG. 48 is a diagram illustrating another example in which the receiver in Embodiment 4 displays an AR image.
  • FIG. 49 is a diagram illustrating another example in which the receiver in Embodiment 4 displays an AR image.
  • FIG. 50 is a diagram illustrating another example in which the receiver in Embodiment 4 displays an AR image.
  • FIG. 51 is a diagram illustrating another example in which the receiver in Embodiment 4 displays an AR image.
  • FIG. 52 is a flowchart illustrating another example of the processing operation of the receiver in the fourth embodiment.
  • FIG. 53 is a diagram illustrating another example in which the receiver in Embodiment 4 displays an AR image.
  • FIG. 54 is a diagram illustrating a captured display image Ppre and a decoding image Pdec acquired by capturing by the receiver in the fourth embodiment.
  • FIG. 55 is a diagram illustrating an example of a captured display image Ppre displayed on the receiver in the fourth embodiment.
  • FIG. 56 is a flowchart illustrating another example of processing operations of a receiver in Embodiment 4.
  • FIG. 57 is a diagram illustrating another example in which the receiver in Embodiment 4 displays an AR image.
  • FIG. 58 is a diagram illustrating another example in which the receiver in Embodiment 4 displays an AR image.
  • FIG. 59 is a diagram illustrating another example in which the receiver in Embodiment 4 displays an AR image.
  • FIG. 60 is a diagram illustrating another example in which the receiver in Embodiment 4 displays an AR image.
  • FIG. 61 is a diagram showing an example of recognition information in the fourth embodiment.
  • FIG. 62 is a flowchart illustrating another example of processing operations of a receiver in Embodiment 4.
  • FIG. 63 is a diagram illustrating an example in which the receiver in Embodiment 4 identifies bright line pattern regions.
  • FIG. 64 is a diagram illustrating another example of the receiver in Embodiment 4.
  • FIG. 65 is a flowchart illustrating another example of the processing operation of the receiver in the fourth embodiment.
  • FIG. 66 is a diagram illustrating an example of a transmission system including a plurality of transmitters in Embodiment 4.
  • FIG. 67 is a diagram illustrating an example of a transmission system including a plurality of transmitters and receivers in Embodiment 4.
  • FIG. 68A is a flowchart illustrating an example of processing operations of a receiver in Embodiment 4.
  • FIG. 68B is a flowchart illustrating an example of processing operations of a receiver in Embodiment 4.
  • FIG. 69A is a flowchart illustrating a display method according to Embodiment 4.
  • FIG. 69B is a block diagram illustrating a structure of a display device in Embodiment 4.
  • FIG. 70 is a diagram illustrating an example in which the receiver in the first modification of the fourth embodiment displays an AR image.
  • FIG. 71 is a diagram illustrating another example in which the receiver in the first modification of the fourth embodiment displays an AR image.
  • FIG. 72 is a diagram illustrating another example in which the receiver in the first modification of the fourth embodiment displays an AR image.
  • FIG. 73 is a diagram illustrating another example in which the receiver in the first modification of the fourth embodiment displays an AR image.
  • FIG. 74 is a diagram illustrating another example of a receiver in Modification 1 of Embodiment 4.
  • FIG. 75 is a diagram illustrating another example in which the receiver in the first modification of the fourth embodiment displays an AR image.
  • FIG. 76 is a diagram illustrating another example in which the receiver in the first modification of the fourth embodiment displays an AR image.
  • FIG. 77 is a flowchart illustrating an example of processing operations of the receiver in the first modification of the fourth embodiment.
  • FIG. 78 is a diagram illustrating an example of a problem when an AR image assumed in the receiver in Embodiment 4 or the modification 1 thereof is displayed.
  • FIG. 79 is a diagram illustrating an example in which the receiver in the second modification of the fourth embodiment displays an AR image.
  • FIG. 80 is a flowchart illustrating an example of processing operations of the receiver in the second modification of the fourth embodiment.
  • FIG. 81 is a diagram illustrating another example in which the receiver in the second modification of the fourth embodiment displays an AR image.
  • FIG. 82 is a flowchart illustrating another example of the processing operation of the receiver in the second modification of the fourth embodiment.
  • FIG. 83 is a diagram illustrating another example in which the receiver in the second modification of the fourth embodiment displays an AR image.
  • FIG. 84 is a diagram illustrating another example in which the receiver in the second modification of the fourth embodiment displays an AR image.
  • FIG. 85 is a diagram illustrating another example in which the receiver in the second modification of the fourth embodiment displays an AR image.
  • FIG. 86 is a diagram illustrating another example in which the receiver in the second modification of the fourth embodiment displays an AR image.
  • FIG. 87A is a flowchart illustrating a display method according to one embodiment of the present invention.
  • FIG. 87B is a block diagram illustrating a structure of a display device according to one embodiment of the present invention.
  • FIG. 88 is a diagram illustrating an example of enlargement and movement of an AR image in the third modification of the fourth embodiment.
  • FIG. 89 is a diagram illustrating an example of expansion of an AR image in the third modification of the fourth embodiment.
  • FIG. 90 is a flowchart illustrating an example of a processing operation related to enlargement and movement of an AR image by a receiver according to the third modification of the fourth embodiment.
  • FIG. 91 is a diagram illustrating an example of superimposition of AR images in the third modification of the fourth embodiment.
  • FIG. 92 is a diagram illustrating an example of superimposition of AR images in the third modification of the fourth embodiment.
  • FIG. 93 is a diagram illustrating an example of superimposition of AR images in the third modification of the fourth embodiment.
  • FIG. 95A is a diagram illustrating an example of a captured display image obtained by imaging by the receiver in the third modification of the fourth embodiment.
  • FIG. 95B is a diagram illustrating an example of a menu screen displayed on the display of the receiver in the third modification of the fourth embodiment.
  • FIG. 96 is a flowchart illustrating an example of processing operations of the receiver and the server in the third modification of the fourth embodiment.
  • FIG. 97 is a diagram for explaining sound volume reproduced by the receiver in the third modification of the fourth embodiment.
  • FIG. 98 is a diagram illustrating a relationship between the distance from the receiver to the transmitter and the sound volume in the third modification of the fourth embodiment.
  • FIG. 99 is a diagram illustrating an example of superimposition of AR images by the receiver in the third modification of the fourth embodiment.
  • FIG. 100 is a diagram illustrating an example of superimposition of AR images by a receiver in the third modification of the fourth embodiment.
  • FIG. 101 is a diagram for describing an example of how to obtain a line scan time by a receiver in the third modification of the fourth embodiment.
  • FIG. 102 is a diagram for describing an example of how to obtain the line scan time by the receiver in the third modification of the fourth embodiment.
  • FIG. 103 is a flowchart illustrating an example of how to obtain the line scan time by the receiver in the third modification of the fourth embodiment.
  • FIG. 104 is a diagram illustrating an example of superimposition of AR images by the receiver in the third modification of the fourth embodiment.
  • FIG. 105 is a diagram illustrating an example of superimposition of AR images by the receiver in the third modification of the fourth embodiment.
  • FIG. 106 is a diagram illustrating an example of superimposition of AR images by the receiver in the third modification of the fourth embodiment.
  • FIG. 107 is a diagram illustrating an example of a decoding image acquired in accordance with the attitude of the receiver in the third modification of the fourth embodiment.
  • FIG. 108 is a diagram illustrating another example of the decoding image acquired in accordance with the attitude of the receiver in the third modification of the fourth embodiment.
  • FIG. 109 is a flowchart illustrating an example of processing operations of the receiver in Modification 3 of Embodiment 4.
  • FIG. 110 is a diagram illustrating an example of a camera lens switching process performed by a receiver according to the third modification of the fourth embodiment.
  • FIG. 111 is a diagram illustrating an example of camera switching processing by the receiver in the third modification of the fourth embodiment.
  • FIG. 112 is a flowchart illustrating an example of processing operations of the receiver and the server in the third modification of the fourth embodiment.
  • FIG. 113 is a diagram illustrating an example of superimposition of AR images by the receiver in the third modification of the fourth embodiment.
  • FIG. 114 is a sequence diagram illustrating processing operations of a system including a receiver, a microwave oven, a relay server, and an electronic settlement server in Modification 3 of Embodiment 4.
  • FIG. 115 is a sequence diagram illustrating processing operations of a system including a POS terminal, a server, a receiver, and a microwave oven according to the third modification of the fourth embodiment.
  • FIG. 116 is a diagram illustrating an example of indoor use in Modification 3 of Embodiment 4.
  • FIG. 117 is a diagram illustrating an example of an augmented reality object display according to the third modification of the fourth embodiment.
  • FIG. 118 is a diagram illustrating a configuration of a display system according to the fourth modification of the fourth embodiment.
  • FIG. 119 is a flowchart illustrating the processing operation of the display system in the fourth modification of the fourth embodiment.
  • FIG. 120 is a flowchart illustrating a recognition method according to an aspect of the present invention.
  • FIG. 121 is a diagram illustrating an example of an operation mode of a visible light signal according to Embodiment 5.
  • FIG. 122A is a flowchart illustrating a visible light signal generation method according to Embodiment 5.
  • FIG. 122B is a block diagram illustrating a configuration of the signal generation device according to Embodiment 5.
  • FIG. 123 is a diagram illustrating a format of an MPM MAC frame according to the sixth embodiment.
  • FIG. 124 is a flowchart illustrating a processing operation of the encoding device that generates an MPM MAC frame according to the sixth embodiment.
  • FIG. 125 is a flowchart showing a processing operation of the decoding apparatus for decoding the MPM MAC frame in the sixth embodiment.
  • FIG. 126 shows MAC PIB attributes in the sixth embodiment.
  • FIG. 127 is a diagram for explaining an MPM light control method according to the sixth embodiment.
  • FIG. 128 is a diagram showing attributes of the PHY PIB in the sixth embodiment.
  • FIG. 129 is a diagram for explaining MPM in the sixth embodiment.
  • FIG. 130 is a diagram illustrating a PLCP header subfield according to the sixth embodiment.
  • FIG. 131 is a diagram illustrating a PLCP center subfield according to the sixth embodiment.
  • FIG. 132 is a diagram illustrating a PLCP footer subfield according to the sixth embodiment.
  • FIG. 133 is a diagram illustrating a waveform of the PHY PWM mode in the MPM according to the sixth embodiment.
  • FIG. 134 is a diagram illustrating a PHY PPM mode waveform in the MPM according to the sixth embodiment.
  • FIG. 135 is a flowchart illustrating an example of the decoding method according to the sixth embodiment.
  • FIG. 136 is a flowchart illustrating an example of the encoding method according to the sixth embodiment.
  • FIG. 137 is a diagram illustrating an example in which the receiver in Embodiment 7 displays an AR image.
  • FIG. 138 is a diagram illustrating an example of a captured display image on which an AR image is superimposed, according to the seventh embodiment.
  • FIG. 139 is a diagram illustrating another example in which the receiver in Embodiment 7 displays an AR image.
  • FIG. 140 is a flowchart illustrating the operation of the receiver in the seventh embodiment.
  • FIG. 141 is a diagram for explaining operation of a transmitter in Embodiment 7.
  • FIG. 142 is a diagram for explaining another operation of the transmitter in Embodiment 7.
  • FIG. 143 is a diagram for describing another operation of the transmitter in the seventh embodiment.
  • FIG. 144 is a diagram illustrating a comparative example for describing easiness of receiving an optical ID in the seventh embodiment.
  • FIG. 145A is a flowchart illustrating an operation of the transmitter in the seventh embodiment.
  • FIG. 145B is a block diagram illustrating a configuration of a transmitter in Embodiment 7.
  • FIG. 146 is a diagram illustrating another example in which the receiver in Embodiment 7 displays an AR image.
  • FIG. 147 is a diagram for explaining an operation of the transmitter in the eighth embodiment.
  • FIG. 148A is a flowchart illustrating a transmission method according to the eighth embodiment.
  • FIG. 148B is a block diagram illustrating a configuration of a transmitter in Embodiment 8.
  • FIG. 149 is a diagram illustrating an example of a detailed configuration of a visible light signal in Embodiment 8.
  • FIG. 150 is a diagram illustrating another example of a detailed configuration of a visible light signal according to Embodiment 8.
  • FIG. 151 is a diagram illustrating another example of a detailed configuration of a visible light signal according to the eighth embodiment.
  • FIG. 152 is a diagram illustrating another example of a detailed configuration of a visible light signal according to the eighth embodiment.
  • FIG. 153 is a diagram illustrating a relationship between the sum of the variables y 0 to y 3 , the total time length, and the effective time length in the eighth embodiment.
  • FIG. 154A is a flowchart illustrating a transmission method according to Embodiment 8.
  • FIG. 154B is a block diagram illustrating a configuration of a transmitter in Embodiment 8.
  • FIG. 155 is a diagram illustrating the structure of the display system according to the ninth embodiment.
  • FIG. 156 is a sequence diagram illustrating processing operations of the receiver and the server according to Embodiment 9.
  • FIG. 157 is a flowchart showing the processing operation of the server in the ninth embodiment.
  • FIG. 158 is a diagram illustrating an example of communication in the case where the transmitter and the receiver in Embodiment 9 are mounted on a vehicle, respectively.
  • FIG. 159 is a flowchart showing the processing operation of the vehicle in the ninth embodiment.
  • FIG. 160 is a diagram illustrating an example in which the receiver in Embodiment 9 displays an AR image.
  • FIG. 161 is a diagram illustrating another example in which the receiver in Embodiment 9 displays an AR image.
  • FIG. 162 is a diagram illustrating processing operation of a receiver in Embodiment 9.
  • FIG. 163 is a diagram illustrating an example of operation on a receiver in Embodiment 9.
  • FIG. 164 is a diagram illustrating an example of AR image displayed on the receiver in Embodiment 9.
  • FIG. 165 is a diagram illustrating an example of the AR image superimposed on the captured display image in the ninth embodiment.
  • FIG. 166 is a diagram illustrating an example of the AR image superimposed on the captured display image in Embodiment 9.
  • FIG. 167 is a diagram illustrating an example of a transmitter in Embodiment 9.
  • FIG. 168 is a diagram illustrating another example of a transmitter in Embodiment 9.
  • FIG. 169 is a diagram illustrating another example of a transmitter in Embodiment 9.
  • FIG. 170 is a diagram illustrating an example of a system using a receiver compatible with optical communication and a receiver not compatible with optical communication in Embodiment 9.
  • FIG. 171 is a flowchart illustrating processing operations of the receiver in Embodiment 9.
  • FIG. 173A is a flowchart illustrating a display method according to one embodiment of the present invention.
  • FIG. 173B is a block diagram illustrating a structure of a display device according to one embodiment of the present invention.
  • FIG. 174 is a diagram illustrating an example of an image drawn on a transmitter in Embodiment 10.
  • FIG. 175 is a diagram illustrating another example of an image drawn on a transmitter in Embodiment 10.
  • FIG. 176 is a diagram illustrating an example of a transmitter and a receiver in Embodiment 10.
  • FIG. 177 is a diagram for describing the fundamental frequency of a line pattern in the tenth embodiment.
  • FIG. 178A is a flowchart showing a processing operation of the encoding apparatus according to the tenth embodiment.
  • FIG. 178B is a diagram for describing a processing operation of the encoding device according to the tenth embodiment.
  • FIG. 179 is a flowchart illustrating processing operations of a receiver which is a decoding device according to Embodiment 10.
  • FIG. 180 is a flowchart illustrating processing operations of a receiver in Embodiment 10.
  • FIG. 181A is a diagram illustrating an example of a system configuration in Embodiment 10.
  • FIG. 181B is a diagram illustrating processing of the camera according to Embodiment 10.
  • FIG. 182 is a diagram illustrating another example of the configuration of the system according to the tenth embodiment.
  • FIG. 183 is a diagram illustrating another example of an image drawn on the transmitter in Embodiment 10.
  • FIG. 184 is a diagram illustrating an example of a format of a MAC frame constituting the frame ID in the tenth embodiment.
  • FIG. 185 is a diagram illustrating an example of a MAC header configuration in the tenth embodiment.
  • FIG. 186 is a diagram illustrating an example of a table for deriving the number of packet divisions according to the tenth embodiment.
  • FIG. 187 is a diagram illustrating PHY coding according to the tenth embodiment.
  • FIG. 188 is a diagram illustrating an example of a transmission image Im3 having a PHY symbol in Embodiment 10.
  • FIG. 189 is a diagram for explaining two PHY versions in the tenth embodiment.
  • FIG. 190 is a diagram for explaining the Gray code in the tenth embodiment.
  • FIG. 191 is a diagram illustrating an example of decoding processing by the receiver in Embodiment 10.
  • FIG. 192 is a diagram for describing a transmission image fraud detection method by the receiver in the tenth embodiment.
  • FIG. 193 is a flowchart illustrating an example of a decoding process including fraud detection of a transmission image by a receiver in the tenth embodiment.
  • FIG. 194A is a flowchart showing a display method according to a modification of the tenth embodiment.
  • FIG. 194B is a block diagram illustrating a structure of the display device according to the modification of the tenth embodiment.
  • FIG. 194C is a flowchart illustrating a communication method according to a modification of the tenth embodiment.
  • FIG. 194D is a block diagram showing a configuration of a communication apparatus according to this variation of the tenth embodiment.
  • FIG. 194E is a block diagram illustrating a configuration of the transmitter according to Embodiment 10 and its modifications.
  • FIG. 195 is a diagram illustrating an example of a configuration of a communication system including a server in the eleventh embodiment.
  • FIG. 196 is a flowchart illustrating a management method by the first server in the eleventh embodiment.
  • FIG. 197 is a diagram illustrating an illumination system in Embodiment 12.
  • FIG. 198 is a diagram illustrating an example of arrangement of illumination devices and a decoding image in Embodiment 12.
  • FIG. 199 is a diagram illustrating another example of arrangement of illumination devices and a decoding image in Embodiment 12.
  • FIG. 200 is a diagram for describing position estimation using the illumination device in Embodiment 12.
  • FIG. 201 is a flowchart illustrating processing operation of a receiver in Embodiment 12.
  • FIG. 202 is a diagram illustrating an example of a communication system in Embodiment 12.
  • FIG. 203 is a diagram for describing self-position estimation processing by a receiver in Embodiment 12.
  • FIG. 204 is a flowchart illustrating self-position estimation processing by a receiver in Embodiment 12.
  • FIG. 205 is a flowchart illustrating an outline of receiver self-position estimation processing according to the twelfth embodiment.
  • FIG. 206 is a diagram illustrating the relationship between radio wave IDs and optical IDs in the twelfth embodiment.
  • FIG. 207 is a diagram for describing an example of imaging by a receiver in Embodiment 12.
  • FIG. 208 is a diagram for describing another example of imaging by a receiver in Embodiment 12.
  • FIG. 209 is a diagram for describing a camera used by a receiver in Embodiment 12.
  • FIG. 210 is a flowchart illustrating an example of processing in which the receiver in Embodiment 12 changes the visible light signal of the transmitter.
  • FIG. 211 is a flowchart illustrating another example of processing in which the receiver in Embodiment 12 changes the visible light signal of the transmitter.
  • FIG. 212 is a diagram for describing navigation by a receiver in Embodiment 13.
  • FIG. 213 is a flowchart illustrating an example of self-position estimation by the receiver in Embodiment 13.
  • FIG. 214 is a diagram for describing a visible light signal received by the receiver in Embodiment 13.
  • FIG. 215 is a flowchart illustrating another example of self-position estimation by the receiver in the thirteenth embodiment.
  • FIG. 216 is a flowchart illustrating an example of determination of reflected light by the receiver in Embodiment 13.
  • FIG. 217 is a flowchart illustrating an example of navigation by a receiver in Embodiment 13.
  • FIG. 218 is a diagram illustrating an example of the transmitter 100 configured as a projector in Embodiment 13.
  • FIG. 219 is a flowchart illustrating another example of self-position estimation by the receiver in Embodiment 13.
  • FIG. 220 is a flowchart illustrating an example of processing performed by a transmitter in the thirteenth embodiment.
  • FIG. 221 is a flowchart illustrating another example of navigation by a receiver in Embodiment 13.
  • FIG. 222 is a flowchart illustrating an example of processing by a receiver in Embodiment 13.
  • FIG. 223 is a diagram illustrating an example of a screen displayed on the display of the receiver in Embodiment 13.
  • FIG. 224 is a diagram illustrating an example of character display by the receiver in Embodiment 13.
  • FIG. 225 is a diagram illustrating another example of a screen displayed on the display of the receiver in Embodiment 13.
  • FIG. 226 is a diagram showing a system configuration for performing navigation to a meeting place in the thirteenth embodiment.
  • FIG. 227 is a diagram illustrating another example of a screen displayed on the display of the receiver in Embodiment 13.
  • FIG. 228 is a diagram showing the inside of the concert hall.
  • FIG. 229 is a flowchart illustrating an example of a communication method according to the first aspect of the present invention.
  • A communication method according to one aspect of the present invention is a communication method using a terminal including an image sensor, in which it is determined whether or not the terminal can perform visible light communication. When it is determined that the terminal can perform visible light communication, the image sensor captures a subject whose luminance changes to obtain a decoding image, and the first identification information transmitted by the subject is acquired from the striped pattern appearing in the decoding image.
  • When it is determined that the terminal cannot perform visible light communication, a captured image is acquired by capturing the subject with the image sensor. Then, edge detection is performed on the captured image to extract at least one contour, a predetermined specific region is identified from the at least one contour, and the second identification information transmitted by the subject is acquired from the line pattern of that specific region.
  • Thereby, a terminal such as a receiver can acquire the first identification information or the second identification information from a subject such as a transmitter, regardless of whether or not visible light communication is possible. That is, when the terminal can perform visible light communication, the terminal acquires, for example, a light ID from the subject as the first identification information. Even when the terminal cannot perform visible light communication, the terminal can still acquire, for example, an image ID or a frame ID from the subject as the second identification information.
  • For example, the transmission image illustrated in FIGS. 183 and 188 is captured as the subject, the area of the transmission image is selected as the specific region (that is, a selection region), and the second identification information is acquired from the line pattern of the transmission image. Therefore, even when visible light communication is impossible, the second identification information can be appropriately acquired.
  • Note that the striped pattern is also called a bright line pattern or a bright line pattern region.
  • Further, an area having a quadrangular contour of at least a predetermined size, or an area having a rounded quadrangular contour of at least a predetermined size, may be specified as the specific region.
  • Thereby, a quadrangular or rounded-quadrangular region can be appropriately identified as the specific region.
  • Further, when the terminal is identified as a terminal whose exposure time can be changed to a predetermined value or less, it may be determined that the terminal can perform visible light communication.
  • Conversely, when the terminal is identified as a terminal whose exposure time cannot be changed to the predetermined value or less, it may be determined that the terminal cannot perform visible light communication.
  • Further, when it is determined that the terminal can perform visible light communication, the exposure time of the image sensor may be set to a first exposure time, and the decoding image may be obtained by capturing the subject with the first exposure time. When it is determined that the terminal cannot perform visible light communication, the exposure time of the image sensor may be set to a second exposure time, and the captured image may be acquired by capturing the subject with the second exposure time. The first exposure time may be shorter than the second exposure time.
  • Thereby, by using the first exposure time and the second exposure time appropriately, the terminal can acquire whichever of the first identification information and the second identification information suits the terminal, as sketched below.
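  • A minimal sketch of this branching logic, under assumed names: the camera object, its methods, decode_stripes(), and decode_line_pattern() are hypothetical, and the exposure values are illustrative.

        def acquire_identification(camera):
            """Branch on visible light communication capability."""
            if camera.supports_short_exposure():   # exposure can go to the predetermined value or less
                camera.set_exposure(1 / 480)       # first (short) exposure time
                decoding_image = camera.capture()  # subject whose luminance changes
                return decode_stripes(decoding_image)          # first identification information
            else:
                camera.set_exposure(1 / 30)        # second (longer) exposure time
                captured_image = camera.capture()
                region = find_specific_region(captured_image)  # see the earlier sketch
                return decode_line_pattern(region)             # second identification information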
  • Further, the subject may have a rectangular shape as viewed from the image sensor; its central region changes in luminance, whereby the first identification information is transmitted, and a barcode-like line pattern is arranged around the central region. When it is determined that the terminal can perform visible light communication, the decoding image, which includes a bright line pattern composed of a plurality of bright lines corresponding to a plurality of exposure lines of the image sensor, is acquired, and the first identification information is acquired by decoding the bright line pattern.
  • When it is determined that the terminal cannot perform visible light communication, the second identification information is acquired from the line pattern of the captured image.
  • Thereby, the first identification information and the second identification information can each be appropriately acquired from the subject whose central region changes in luminance.
  • Further, the first identification information obtained from the decoding image and the second identification information obtained from the line pattern may be the same information.
  • Thereby, the same information can be acquired from the subject both by a terminal capable of visible light communication and by a terminal incapable of visible light communication.
  • Further, a first moving image associated with the first identification information may be displayed, and when an operation of sliding the first moving image is received, a second moving image associated with the first identification information may be displayed next to the first moving image.
  • Here, the first moving image and the second moving image are, for example, the first AR image P46 and the second AR image P46c shown in FIG.
  • The first identification information is, for example, a light ID as described above.
  • Thereby, when the first moving image is slid, the second moving image associated with the first identification information is displayed next to it, so that an image useful to the user can be easily displayed.
  • Further, as shown in FIG. 194A, since it is determined in advance whether or not visible light communication is possible, useless processing that keeps trying to acquire a visible light signal until it proves impossible can be omitted, and the processing burden can be reduced.
  • Further, when an operation of sliding the first moving image in the horizontal direction is accepted, the second moving image may be displayed, and when an operation of sliding the first moving image in the vertical direction is accepted, a still image associated with the first identification information may be displayed.
  • That is, the second moving image is displayed by sliding the first moving image horizontally, i.e., by swiping, and the still image associated with the first identification information is displayed by sliding the first moving image vertically.
  • The still image is, for example, the AR image P47 shown in FIG. Therefore, a wide variety of images useful to the user can be displayed easily, as in the sketch below.
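  • The gesture mapping just described can be pictured with a small dispatcher; the names below are hypothetical, and how content is associated with the light ID is described elsewhere in this disclosure.

        def on_slide(direction, light_id, store):
            """Map slide gestures to the next content for this light ID."""
            if direction in ("left", "right"):        # horizontal swipe
                return store.next_moving_image(light_id)  # e.g. the second moving image
            if direction in ("up", "down"):           # vertical swipe
                return store.still_image(light_id)        # e.g. the AR image P47
            return None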
  • In the first moving image and the second moving image, the object in the picture displayed first may be at the same position.
  • Since the first displayed object is at the same position, the user can easily grasp that the first moving image and the second moving image are related to each other.
  • Further, each time a sliding operation is accepted, the next moving image associated with the first identification information may be displayed next to the currently displayed moving image, and in each such moving image the object in the picture displayed first may likewise be at the same position.
  • At least one of the first moving image and the second moving image may be formed such that a position in the moving image has higher transparency as the position is closer to the edge of the moving image.
  • Further, an image may be displayed outside the area where at least one of the first moving image and the second moving image is displayed.
  • Further, a normal captured image may be acquired by imaging with a first exposure time by the image sensor, and the decoding image, which includes a bright line pattern region, that is, a region containing a pattern of a plurality of bright lines, may be acquired by imaging with a second exposure time shorter than the first exposure time.
  • The first identification information is acquired by decoding the decoding image.
  • In displaying at least one of the first moving image and the second moving image, a reference region at the same position as the bright line pattern region in the decoding image may be identified in the normal captured image, a target region on which the moving image is to be superimposed may be recognized in the normal captured image based on that reference region, and the moving image may be superimposed on the target region. For example, in displaying at least one of the first moving image and the second moving image, the region above, below, to the left of, or to the right of the reference region in the normal captured image may be recognized as the target region.
  • Thereby, since the target region is recognized based on the reference region and the moving image is superimposed on that target region, the range of positions at which the moving image can be displayed can be easily expanded.
  • Further, the size of the moving image may be increased as the size of the bright line pattern region increases.
  • Thereby, since the size of the moving image changes according to the size of the bright line pattern region, the moving image can be displayed so that the object it depicts appears to actually exist, more convincingly than when the size of the moving image is fixed.
  • A transmitter according to one aspect of the present invention includes an illumination plate, a light source that emits light from the back side of the illumination plate, and a microcontroller that changes the luminance of the light source. By the microcontroller changing the luminance of the light source, the first identification information is transmitted from the light source through the illumination plate, and a barcode-like line pattern in which second identification information is encoded is arranged around the front side of the illumination plate. The first identification information and the second identification information are the same information.
  • For example, the illumination plate has a rectangular shape.
  • Thereby, the same information can be transmitted both to a terminal that can perform visible light communication and to a terminal that cannot, as sketched below.
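  • On the transmitter side, the microcontroller only needs to toggle the luminance of the light source according to the bits of the identification information. The following is a minimal MicroPython-style sketch under that assumption; the on-off keying and symbol duration are illustrative only and are not the modulation actually specified by the PHY modes in this disclosure, and the led object with on()/off() methods is assumed.

        import time

        SYMBOL_S = 1 / 9600  # illustrative symbol duration, not normative

        def transmit_id(led, identification_bits):
            """Repeatedly emit the ID as luminance changes (on-off keying).
            The barcode-like line pattern printed around the illumination
            plate carries the same ID for terminals that cannot receive
            visible light signals."""
            while True:
                for bit in identification_bits:
                    led.on() if bit else led.off()
                    time.sleep(SYMBOL_S)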
  • These general or specific aspects may be implemented as an apparatus, a system, a method, an integrated circuit, a computer program, or a computer-readable recording medium such as a CD-ROM, or as any combination thereof.
  • FIG. 1 shows an example in which imaging elements arranged in one row are exposed simultaneously, and imaging is performed while shifting the exposure start time in order from the nearest row.
  • Here, a line of imaging elements that are exposed simultaneously is called an exposure line, and the line of pixels in the image corresponding to those imaging elements is called a bright line.
  • When this image capturing method is used to capture a blinking light source over the entire surface of the image sensor, bright lines (lines of light and dark pixel values) along the exposure lines appear in the captured image, as shown in FIG.
  • By recognizing this bright line pattern, the change in the light source luminance can be estimated at a speed exceeding the imaging frame rate. Thereby, by transmitting a signal as a change in light source luminance, communication can be performed at a speed higher than the imaging frame rate, as sketched below.
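  • The principle can be sketched briefly: with a short exposure, each exposure line samples the light source at a slightly different time, so the per-row brightness of one frame is a time series far denser than the frame rate. A minimal sketch assuming a grayscale frame held in a NumPy array; the threshold rule is illustrative.

        import numpy as np

        def rows_to_bits(frame, threshold=None):
            """Demodulate a bright line pattern: each image row
            (exposure line) is one time sample of the light source."""
            row_brightness = frame.mean(axis=1)    # average over each exposure line
            if threshold is None:
                threshold = row_brightness.mean()  # split high / low around the mean
            return (row_brightness > threshold).astype(int)  # 1 = high, 0 = low

  • With l exposure lines per frame and f frames per second, this yields up to f×l luminance samples per second, which is why the achievable communication speed can exceed the frame rate.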
  • When a signal is expressed by two values of light source luminance, the lower luminance value is called low (LO) and the higher luminance value is called high (HI).
  • Low may be a state in which the light source emits no light, or a state in which it emits light more weakly than high.
  • For example, when the imaging frame rate is 30 fps, a change in luminance with a period of 1.67 milliseconds can be recognized.
  • In order to do so, the exposure time is set shorter than 10 milliseconds, for example.
  • FIG. 2 shows a case where the exposure of the next exposure line is started after the exposure of one exposure line is completed.
  • In this case, the transmission speed is at most flm bits per second.
  • If the light emission time of the light emitting unit is controlled in units of time shorter than the exposure time of each exposure line, more information can be transmitted.
  • In that case, information can be transmitted at a maximum rate of flElv bits per second.
  • Further, the basic period of transmission can be recognized by causing the light emitting unit to emit light at a timing slightly shifted from the exposure timing of each exposure line.
  • FIG. 4 shows a case where the exposure of the next exposure line is started before the exposure of the previous exposure line is completed. That is, the exposure times of adjacent exposure lines partially overlap in time.
  • With this configuration, the S/N ratio can be improved.
  • Alternatively, some adjacent exposure lines may partially overlap in time while others do not. By configuring some exposure lines so as not to overlap in time, the generation of intermediate colors caused by overlapping exposure times on the imaging screen can be suppressed, and bright lines can be detected more appropriately.
  • In this configuration, the light emission state of the light emitting unit is recognized from the brightness of each exposure line.
  • When the brightness of each exposure line is determined as a binary value indicating whether or not the luminance is equal to or higher than a threshold, in order for the non-emitting state to be recognized, the state in which the light emitting unit does not emit light must last longer than the exposure time of each line.
  • FIG. 5A shows the influence of a difference in exposure time when the exposure start times of the exposure lines are equal.
  • 7500a is the case where the exposure end time of one exposure line is equal to the exposure start time of the next exposure line; 7500b is the case where the exposure time is longer than that.
  • By partially overlapping the exposure times of adjacent exposure lines in time, the exposure time can be lengthened. That is, more light is incident on the image sensor, and a brighter image can be obtained.
  • In addition, since the imaging sensitivity required to capture an image of the same brightness can be kept low, an image with less noise is obtained, and communication errors are suppressed.
  • FIG. 5B shows the influence of the difference in the exposure start time of each exposure line when the exposure times are equal.
  • 7501a is the case where the exposure end time of the previous exposure line is equal to the exposure start time of the next exposure line
  • 7501b is the case where the exposure of the next exposure line is started earlier than the end of exposure of the previous exposure line.
  • By overlapping exposures so that the sample interval (the difference between exposure start times) becomes dense, the change in the light source luminance can be estimated more accurately, the error rate can be reduced, and changes in the light source luminance over a shorter time can be recognized.
  • Further, because the exposure times overlap, blinking of the light source that is shorter than the exposure time can be recognized by using the difference in exposure amount between adjacent exposure lines.
  • In this case, the exposure time must satisfy: exposure time > (sample interval − pulse width).
  • Here, the pulse width is the width of a light pulse, that is, the period during which the luminance of the light source is high. Under this condition, every high pulse overlaps at least one exposure window, so the high luminance can be detected appropriately.
  • In this case, the exposure time is set longer than in the normal shooting mode, whereby the communication speed can be dramatically improved.
  • Here, the exposure time needs to be set so that exposure time ≤ 1/(8×f), where f is the frame rate. Blanking that occurs during shooting is at most half the size of one frame. That is, since the blanking time is at most half of the shooting time, the effective shooting time is 1/(2f) at the shortest.
  • Furthermore, since quaternary (4-level) information must be received within the time 1/(2f), the exposure time needs to be shorter than 1/(2f×4). Since the normal frame rate is 60 frames per second or less, setting the exposure time to 1/480 second or less makes it possible to generate an appropriate bright line pattern in the image data and perform high-speed signal transmission. A worked check follows.
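  • As a worked check of the 1/480-second figure, under the assumptions stated above (blanking up to half the frame, quaternary information, frame rate up to 60 fps):

        f = 60                     # upper bound of the normal frame rate (frames/s)
        usable_time = 1 / (2 * f)  # blanking can consume up to half of each frame
        levels = 4                 # quaternary information within that window
        t_e_max = usable_time / levels
        print(t_e_max)             # 1/480 s, approximately 2.08 milliseconds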
  • FIG. 5C shows the advantage of a short exposure time in the case where the exposure times of the exposure lines do not overlap.
  • When the exposure time is long, even if the light source changes luminance in a binary fashion as in 7502a, the captured image contains intermediate-color portions as in 7502e, and it becomes difficult to recognize the luminance change of the light source.
  • However, by providing an idle time (predetermined waiting time) tD2 during which no exposure is performed, after the exposure of one exposure line ends and before the exposure of the next exposure line starts, the luminance change of the light source can be recognized easily. That is, a more appropriate bright line pattern such as 7502f can be detected.
  • That is, an exposure time tE smaller than the time difference tD between the exposure start times of the exposure lines can be realized.
  • When the normal shooting mode is configured so that the exposure times of adjacent exposure lines partially overlap, this can be realized by setting the exposure time shorter than in the normal shooting mode until a predetermined idle time occurs. Even when the normal shooting mode is configured so that the exposure end time of one exposure line is equal to the exposure start time of the next, this can be realized by setting the exposure time short enough that a predetermined non-exposure time occurs.
  • Note that, in this configuration as well, some adjacent exposure lines may partially overlap in time while others do not. Likewise, it is not necessary to provide, for all exposure lines, the idle time (predetermined non-exposure waiting time) after the exposure of one exposure line ends and before the exposure of the next exposure line starts; some exposure lines may partially overlap in time. With such configurations, the advantages of each configuration can be obtained.
  • Further, the signal may be read out by the same readout circuit or method in both the normal shooting mode, in which shooting is performed at a normal frame rate (30 fps, 60 fps), and the visible light communication mode, in which shooting is performed with an exposure time of 1/480 second or less for visible light communication.
  • FIG. 5D shows the relationship between the minimum change time tS of the light source luminance, the exposure time tE, the time difference tD between the exposure start times of the exposure lines, and the captured image.
  • FIG. 5E shows the relationship between the transition time tT of the light source luminance and the time difference tD between the exposure start times of the exposure lines.
  • The larger tD is relative to tT, the fewer exposure lines take intermediate colors, and the easier it is to estimate the light source luminance.
  • When tD is larger than tT, the number of consecutive intermediate-color exposure lines is 2 or less, which is desirable.
  • Since tT is less than 1 microsecond when the light source is an LED and approximately 5 microseconds when the light source is an organic EL device, setting tD to 5 microseconds or more makes it easy to estimate the light source luminance.
  • FIG. 5F shows the relationship between the period tHT of high-frequency noise in the light source luminance and the exposure time tE.
  • When tE is larger than tHT, the captured image is less affected by the high-frequency noise, and the light source luminance is easier to estimate.
  • When tE is an integer multiple of tHT, the influence of the high-frequency noise is eliminated, and the light source luminance is easiest to estimate.
  • For estimating the light source luminance, it is desirable that tE > tHT.
  • The main cause of high-frequency noise is the switching power supply circuit; since tHT is less than 20 microseconds in many switching power supplies for lamps, setting tE to 20 microseconds or more makes it easy to estimate the light source luminance.
  • Figure 5G is the case t HT is 20 microseconds, which is a graph showing the relationship between the size of the exposure time t E and the high frequency noise.
  • Given that the amount of noise peaks when t_E equals certain values, it can be confirmed that efficiency is good when t_E is set to 15 microseconds or more, 35 microseconds or more, 54 microseconds or more, or 74 microseconds or more. From the viewpoint of reducing high-frequency noise, a large t_E is desirable. However, as described above, a smaller t_E also eases estimation of the light source luminance in that intermediate-color portions are less likely to occur.
  • Accordingly, t_E may be set to 15 microseconds or more when the period of the light source luminance change is 15 to 35 microseconds, to 35 microseconds or more when the period is 35 to 54 microseconds, to 54 microseconds or more when the period is 54 to 74 microseconds, and to 74 microseconds or more when the period is 74 microseconds or more, as in the helper sketched below.
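  • The threshold rule above can be written as a small helper. This is a minimal sketch, not code from the patent; only the 15/35/54/74 microsecond thresholds come from the text:
```python
def recommended_exposure_us(period_us: float) -> float:
    """Smallest recommended exposure time t_E for a given period of
    light source luminance change, both in microseconds."""
    for threshold in (74.0, 54.0, 35.0, 15.0):  # descending thresholds
        if period_us >= threshold:
            return threshold
    return 15.0  # assumption: below 15 us, fall back to the smallest value

print(recommended_exposure_us(40.0))  # -> 35.0
```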
  • FIG. 5H shows the relationship between the exposure time t_E and the recognition success rate. Since the exposure time t_E is meaningful only relative to the time during which the light source luminance is constant, the horizontal axis is the value (relative exposure time) obtained by dividing the period t_S of the light source luminance change by the exposure time t_E. From the graph, it can be seen that the relative exposure time should be 1.2 or less if a recognition success rate of almost 100% is desired. For example, when the transmission signal is 1 kHz, the exposure time may be set to about 0.83 milliseconds or less.
  • Similarly, the relative exposure time may be set to 1.25 or less, and it may be set to 1.4 or less when a recognition success rate of 80% or more suffices. The recognition success rate drops sharply when the relative exposure time is around 1.5 and becomes almost 0% at 1.6, so the relative exposure time should not be set to exceed 1.5. It can also be seen that after the recognition rate becomes 0 at 7507c, it rises again at 7507d, 7507e, and 7507f.
  • Therefore, when a long exposure time must be used, the relative exposure time may be set to 1.9 to 2.2, 2.4 to 2.6, or 2.8 to 3.0. For example, these exposure times may be used as an intermediate mode; the bands are summarized in the sketch below.
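  • The recognition-rate bands above can be summarized in a small check. This is a minimal sketch; the band boundaries come from the graph discussion above, and the exact rates between them are not modeled:
```python
def recognition_band(relative_exposure: float) -> str:
    """Classify a relative exposure time (the horizontal axis of FIG. 5H)."""
    if relative_exposure <= 1.2:
        return "recognition success rate almost 100%"
    if relative_exposure <= 1.25:
        return "high recognition success rate"
    if relative_exposure <= 1.4:
        return "recognition success rate 80% or more"
    if (1.9 <= relative_exposure <= 2.2 or 2.4 <= relative_exposure <= 2.6
            or 2.8 <= relative_exposure <= 3.0):
        return "usable band for a long exposure time (intermediate mode)"
    return "unreliable (drops sharply around 1.5, almost 0% at 1.6)"

print(recognition_band(2.5))  # -> usable band for a long exposure time
```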
  • FIG. 6A is a flowchart of the information communication method in the present embodiment.
  • the information communication method in the present embodiment is an information communication method for acquiring information from a subject, and includes steps SK91 to SK93.
  • In this method, in the first exposure time setting step SK91, a first exposure time of the image sensor is set so that, in an image obtained by photographing the subject with an image sensor, a plurality of bright lines corresponding to a plurality of exposure lines included in the image sensor are generated according to a change in luminance of the subject.
  • In the first image acquisition step SK92, the image sensor photographs the subject, whose luminance changes, with the set first exposure time, thereby acquiring a bright line image including the plurality of bright lines. In the information acquisition step SK93, information is acquired by demodulating data specified by the pattern of the plurality of bright lines included in the acquired bright line image.
  • In the first image acquisition step SK92, each of the plurality of exposure lines starts exposure at a sequentially different time, and starts exposure after a predetermined idle time has elapsed since the end of exposure of the adjacent exposure line, as in the timing sketch below.
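  • The timing in step SK92 can be sketched as follows; the concrete durations are assumptions, and only the ordering (sequential starts, idle time between the end of one line and the start of the next) comes from the text:
```python
def line_schedule(num_lines: int, exposure_us: float, idle_us: float):
    """(start, end) exposure times per exposure line, in microseconds."""
    schedule, start = [], 0.0
    for _ in range(num_lines):
        end = start + exposure_us
        schedule.append((start, end))
        start = end + idle_us  # next line starts only after the idle time
    return schedule

for s, e in line_schedule(4, exposure_us=10.0, idle_us=5.0):
    print(f"exposure from {s:5.1f} us to {e:5.1f} us")
```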
  • FIG. 6B is a block diagram of the information communication apparatus according to the present embodiment.
  • the information communication device K90 in the present embodiment is an information communication device that acquires information from a subject, and includes constituent elements K91 to K93.
  • The information communication apparatus K90 includes an exposure time setting unit K91 that sets an exposure time of the image sensor so that, in an image obtained by photographing the subject with an image sensor, a plurality of bright lines corresponding to a plurality of exposure lines included in the image sensor are generated according to a change in luminance of the subject.
  • The apparatus further includes an image acquisition unit K92, in which the image sensor photographs the subject whose luminance changes with the set exposure time to acquire a bright line image including the plurality of bright lines, and an information acquisition unit K93 that acquires information by demodulating data specified by the pattern of the plurality of bright lines included in the acquired bright line image.
  • Each of the plurality of exposure lines starts exposure at a sequentially different time, and starts exposure after a predetermined idle time has elapsed since the end of exposure of the adjacent exposure line.
  • With this configuration, each of the plurality of exposure lines starts exposure after a predetermined idle time has elapsed since the end of exposure of the adjacent exposure line, so a change in luminance of the subject can be easily recognized. As a result, information can be appropriately acquired from the subject.
  • each component may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
  • Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
  • the program causes the computer to execute the information communication method shown by the flowchart of FIG. 6A.
  • Hereinafter, shooting in the normal shooting mode is referred to as normal shooting, and shooting in the visible light communication mode is referred to as visible light shooting (visible light communication). Shooting in an intermediate mode may be used instead of these, and an intermediate image may be used instead of the composite image described later.
  • FIG. 7 is a diagram illustrating an example of the photographing operation of the receiver in this embodiment.
  • The receiver 8000 switches the shooting mode alternately, in the order of normal shooting, visible light communication, normal shooting, and so on. The receiver 8000 then combines the normal captured image and the visible light communication image to generate a composite image in which the bright line pattern, the subject, and its surroundings are clearly shown, and displays the composite image on the display.
  • This composite image is generated by superimposing the bright line pattern of the visible light communication image on the portion of the normal captured image from which the signal is transmitted. The bright line pattern, the subject, and the surroundings shown in the composite image are clear and sharp enough to be recognized by the user. By displaying such a composite image, the user can know more clearly from where the signal is being transmitted. A compositing sketch follows.
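  • The compositing step can be sketched as a masked overlay; the array names and shapes are assumptions, not the patent's implementation:
```python
import numpy as np

def composite(normal_img: np.ndarray, vlc_img: np.ndarray,
              signal_mask: np.ndarray) -> np.ndarray:
    """normal_img, vlc_img: HxWx3 uint8 images; signal_mask: HxW bool mask
    marking where a signal (bright line pattern) was detected."""
    out = normal_img.copy()
    out[signal_mask] = vlc_img[signal_mask]  # overlay the bright line pattern
    return out
```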
  • FIG. 8 is a diagram illustrating another example of the photographing operation of the receiver in this embodiment.
  • the receiver 8000 includes a camera Ca1 and a camera Ca2.
  • the camera Ca1 performs normal photographing
  • the camera Ca2 performs visible light photographing.
  • the camera Ca1 acquires the normal captured image as described above
  • the camera Ca2 acquires the visible light communication image as described above.
  • the receiver 8000 generates the above-described combined image by combining the normal captured image and the visible light communication image, and displays the combined image on the display.
  • FIG. 9 is a diagram illustrating another example of the photographing operation of the receiver in this embodiment.
  • the camera Ca1 switches the shooting mode to normal shooting, visible light communication, normal shooting, and so on.
  • the camera Ca2 continuously performs normal shooting.
  • Using stereo vision (the principle of triangulation) on the normal captured images acquired by these cameras, the receiver 8000 estimates the distance from the receiver 8000 to the subject (hereinafter referred to as the subject distance), as in the sketch below.
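  • For a rectified stereo pair, the triangulation mentioned above reduces to Z = f * B / d; this is a minimal sketch with assumed values, not the receiver's actual calibration:
```python
def subject_distance(focal_px: float, baseline_m: float,
                     disparity_px: float) -> float:
    """Distance Z = f * B / d between the cameras and the subject."""
    if disparity_px <= 0:
        raise ValueError("subject must shift between the two camera images")
    return focal_px * baseline_m / disparity_px

print(subject_distance(focal_px=1400.0, baseline_m=0.012,
                       disparity_px=8.0))  # -> 2.1 (metres)
```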
  • FIG. 10 is a diagram illustrating an example of the display operation of the receiver in this embodiment.
  • the receiver 8000 switches the photographing mode to visible light communication, normal photographing, visible light communication, and so on.
  • the receiver 8000 activates an application program when performing visible light communication for the first time.
  • the receiver 8000 estimates its own position based on the signal received by visible light communication.
  • the receiver 8000 displays AR (Augmented Reality) information on the normal shot image acquired by the normal shooting.
  • This AR information is acquired based on the position estimated as described above.
  • The receiver 8000 estimates its own movement and change of direction based on the detection result of the 9-axis sensor and on motion detected in the normal captured images, and moves the display position of the AR information in accordance with the estimated movement and direction change.
  • the AR information can be made to follow the subject image of the normal captured image.
  • When the receiver 8000 switches the shooting mode from normal shooting to visible light communication, the AR information is superimposed on the latest normal captured image acquired during the normal shooting immediately before the visible light communication.
  • the receiver 8000 displays a normal captured image on which the AR information is superimposed.
  • During visible light communication, the receiver 8000 estimates its own movement and change of direction based on the detection result of the 9-axis sensor, and moves the AR information and the normal captured image in accordance with the estimated movement and direction change. Thereby, the AR information can be made to follow the subject image in the normal captured image, in accordance with the movement of the receiver 8000, during visible light communication just as in normal shooting. Further, the normal captured image can be enlarged and reduced in accordance with the movement of the receiver 8000. A rotation-based sketch follows.
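  • Following the scene with the AR information can be approximated, for small rotations, by shifting the overlay by f times the rotation angle in pixels; the sign conventions and values here are assumptions:
```python
def updated_overlay_pos(x: float, y: float, d_yaw_rad: float,
                        d_pitch_rad: float, focal_px: float):
    """Shift an AR overlay opposite to the camera rotation measured by
    the 9-axis sensor (small-angle approximation)."""
    return (x - focal_px * d_yaw_rad, y + focal_px * d_pitch_rad)

print(updated_overlay_pos(640.0, 360.0, 0.01, 0.0, 1500.0))  # -> (625.0, 360.0)
```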
  • FIG. 11 is a diagram showing an example of the display operation of the receiver in this embodiment.
  • The receiver 8000 may display the composite image on which the bright line pattern is shown, as in FIG. 11.
  • Alternatively, the receiver 8000 may generate a composite image by superimposing on the normal captured image, instead of the bright line pattern, a signal explicit object, which is an image having a predetermined color for notifying that a signal is being transmitted, and display that composite image.
  • The receiver 8000 may also display, as the composite image, a normal captured image in which each location from which a signal is transmitted is indicated by a dotted frame and an identifier (for example, ID: 101, ID: 102, and so on).
  • Alternatively, the receiver 8000 may generate a composite image by superimposing on the normal captured image, instead of the bright line pattern, a signal identification object, which is an image having a predetermined color for notifying that a specific type of signal is being transmitted, and display that composite image.
  • In this case, the color of the signal identification object differs depending on the type of signal output from the transmitter, as in the mapping sketched below. For example, when the signal output from the transmitter is position information, a red signal identification object is superimposed, and when the signal output from the transmitter is a coupon, a green signal identification object is superimposed.
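  • The color rule can be expressed as a simple mapping; only "position information -> red" and "coupon -> green" come from the text, the rest is an assumed default:
```python
SIGNAL_OBJECT_COLOR = {
    "position": (255, 0, 0),  # red object for position information
    "coupon": (0, 255, 0),    # green object for coupons
}

def object_color(signal_type: str) -> tuple:
    # assumed fallback color for signal types not named in the text
    return SIGNAL_OBJECT_COLOR.get(signal_type, (255, 255, 0))
```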
  • FIG. 12 is a diagram illustrating an example of the operation of the receiver in this embodiment.
  • the receiver 8000 may display a normal captured image and output a sound for notifying the user that the transmitter has been found.
  • The receiver 8000 may vary the type of sound output, the number of outputs, or the output duration depending on the number of transmitters found, the type of received signal, or the type of information specified by the signal.
  • FIG. 13 is a diagram illustrating another example of the operation of the receiver in this embodiment.
  • When the user touches the bright line pattern displayed in the composite image, the receiver 8000 generates an information notification image based on the signal transmitted from the subject corresponding to the touched bright line pattern, and displays the information notification image.
  • This information notification image indicates, for example, a store coupon or a place.
  • The bright line pattern may instead be the signal explicit object, the signal identification object, or the dotted frame shown in FIG. 11. The same applies to the bright line patterns described below.
  • FIG. 14 is a diagram illustrating another example of the operation of the receiver in this embodiment.
  • When the user touches the bright line pattern displayed in the composite image, the receiver 8000 generates an information notification image based on the signal transmitted from the subject corresponding to the touched bright line pattern, and displays the information notification image.
  • the information notification image indicates the current location of the receiver 8000 by a map or the like.
  • FIG. 15 is a diagram illustrating another example of the operation of the receiver in this embodiment.
  • When the user swipes on the receiver 8000 on which the composite image is displayed, the receiver 8000 displays a normal captured image having dotted frames and identifiers, similar to the normal captured image illustrated in FIG. 11, and displays a list of information so as to follow the swipe operation. This list shows the information specified by the signal transmitted from the location (transmitter) indicated by each identifier.
  • the swipe may be, for example, an operation of moving a finger from outside the right side of the display in the receiver 8000.
  • the swipe may be an operation of moving a finger from the upper side, the lower side, or the left side of the display.
  • When any piece of information in the list is selected, the receiver 8000 may display an information notification image (for example, an image showing a coupon) showing that information in more detail.
  • FIG. 16 is a diagram illustrating another example of the operation of the receiver in this embodiment.
  • the receiver 8000 displays the information notification image superimposed on the composite image so as to follow the swipe operation.
  • This information notification image shows the subject distance with an arrow in an easy-to-understand manner for the user.
  • the swipe may be, for example, an operation of moving a finger from outside the lower side of the display in the receiver 8000.
  • the swipe may be an operation of moving a finger from the left side of the display, from the upper side, or from the right side.
  • FIG. 17 is a diagram illustrating another example of the operation of the receiver in this embodiment.
  • The receiver 8000 photographs, as a subject, a transmitter configured as signage indicating a plurality of stores, and displays the normal captured image acquired by that photographing.
  • When the user taps the signage image of one store included in the subject displayed in the normal captured image, the receiver 8000 generates an information notification image 8001 based on the signal transmitted from the signage of that store, and displays the information notification image 8001.
  • This information notification image 8001 is an image showing, for example, the availability of vacant seats in the store.
  • As described above, the information communication method in this embodiment is a method for acquiring information from a subject, and includes: a first exposure time setting step of setting an exposure time of an image sensor so that, in an image obtained by photographing the subject with the image sensor, a bright line corresponding to an exposure line included in the image sensor is generated according to a change in luminance of the subject; a bright line image acquisition step in which the image sensor photographs the subject whose luminance changes with the set exposure time, thereby acquiring a bright line image, that is, an image including the bright line; an image display step of displaying, based on the bright line image, a display image in which the subject and its surroundings appear and in which the spatial position of the part where the bright line appears can be identified; and an information acquisition step of acquiring transmission information by demodulating data specified by the pattern of the bright line included in the acquired bright line image.
  • Thereby, for example, a composite image or an intermediate image as shown in FIGS. 7, 8, and 11 is displayed as the display image.
  • In the display image, the spatial position of the part where the bright line appears is identified by a bright line pattern, a signal explicit object, a signal identification object, a dotted frame, or the like. Therefore, by viewing such a display image, the user can easily find a subject that is transmitting a signal through a change in luminance.
  • The information communication method may further include: a second exposure time setting step of setting an exposure time longer than the first exposure time; a normal image acquisition step in which the image sensor photographs the subject and its surroundings with the longer exposure time to acquire a normal captured image; and a composite step of generating a composite image in which a signal object is superimposed, at the spatial position of the part where the bright line appears, on the normal captured image. In the image display step, the composite image may be displayed as the display image.
  • Thereby, the signal object, which is a bright line pattern, a signal explicit object, a signal identification object, a dotted frame, or the like, is superimposed, and the composite image is displayed as the display image, as shown in FIGS. 7, 8, and 11. The user can thus find the subject that is transmitting a signal through a change in luminance even more easily.
  • Alternatively, the exposure time may be set to 1/3000 second, for example.
  • In this case, in the bright line image acquisition step, a bright line image in which the surroundings of the subject also appear is acquired, and in the image display step, this bright line image may be displayed as the display image.
  • Thereby, the bright line image is acquired and displayed as an intermediate image, so there is no need to acquire and combine a normal captured image and a visible light communication image, and the processing can be simplified.
  • The image sensor may include a first image sensor and a second image sensor. In this case, the first image sensor acquires the normal captured image, and the second image sensor acquires the bright line image by photographing simultaneously with the photographing by the first image sensor.
  • Thereby, the normal captured image and the visible light communication image, which is a bright line image, are each acquired by a separate camera. Compared with acquiring both images with one camera, the images can therefore be acquired earlier, and the processing can be sped up.
  • The information communication method may further include an information presentation step of presenting, when the part of the display image where the bright line appears is designated by a user operation, presentation information based on the transmission information acquired from the bright line pattern of the designated part.
  • Examples of the user operation include a tap, a swipe, an operation of continuously holding a fingertip on the part for a predetermined time, and an operation of directing the line of sight to the part for a predetermined time or more.
  • the presentation information is displayed as an information notification image. Thereby, desired information can be presented to the user.
  • Also, an information communication method in this embodiment includes: an exposure time setting step of setting an exposure time of an image sensor so that, in an image obtained by photographing a subject with the image sensor, a bright line corresponding to an exposure line included in the image sensor is generated according to a change in luminance of the subject; a bright line image acquisition step of acquiring, by photographing a plurality of such subjects with the set exposure time while the image sensor is being moved, a bright line image including a plurality of parts in which bright lines appear; and an information acquisition step of acquiring, for each part, the position of the corresponding subject by demodulating the data specified by the bright line pattern of that part. The method may further include a position estimation step of estimating the position of the image sensor based on the acquired positions of the plurality of subjects and the movement state of the image sensor, for example by trilateration as sketched below.
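  • If the distance to each decoded light source is also measured, the position estimation step reduces to trilateration; this 2-D least-squares sketch is an illustration under that assumption, not the patent's algorithm:
```python
import numpy as np

def estimate_position(landmarks: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """landmarks: Nx2 known subject positions; distances: N measured ranges."""
    A = 2.0 * (landmarks[1:] - landmarks[0])  # linearize against landmark 0
    b = (distances[0] ** 2 - distances[1:] ** 2
         + np.sum(landmarks[1:] ** 2, axis=1) - np.sum(landmarks[0] ** 2))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

lights = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])
print(estimate_position(lights, np.array([2.5, 2.5, 2.5])))  # -> [2.0 1.5]
```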
  • Also, an information communication method in this embodiment includes an exposure time setting step of setting an exposure time of an image sensor so that, in an image obtained by photographing a subject with the image sensor, a bright line corresponding to an exposure line included in the image sensor is generated according to a change in luminance of the subject, and a bright line image acquisition step in which the image sensor photographs the subject whose luminance changes with the set exposure time to acquire an image including the bright line. Thereby, for example, the user can be authenticated, and convenience can be improved.
  • Also, an information communication method in this embodiment includes: an exposure time setting step of setting an exposure time of an image sensor so that, in an image obtained by photographing a subject with the image sensor, a bright line corresponding to an exposure line included in the image sensor is generated according to a change in luminance of the subject; an image acquisition step of acquiring a bright line image by photographing, with the set exposure time, a plurality of subjects whose luminance changes as they are reflected on a reflection surface; and an information acquisition step in which the bright lines included in the bright line image are separated, according to their intensity, into the bright lines corresponding to each of the plurality of subjects, and, for each subject, information is acquired by demodulating the data specified by the bright line pattern corresponding to that subject.
  • Also, an information communication method in this embodiment includes: an exposure time setting step of setting an exposure time of an image sensor so that, in an image obtained by photographing a subject with the image sensor, a bright line corresponding to an exposure line included in the image sensor is generated according to a change in luminance of the subject; and an image acquisition step of acquiring a bright line image including the bright line by photographing, with the set exposure time, the subject whose luminance changes as it is reflected on a reflection surface. The information communication method may further include a position estimation step of estimating the position of the subject based on the luminance distribution in the bright line image.
  • When the luminance change is switched between a luminance change according to a first pattern and a luminance change according to a second pattern, the switch may be made with a buffer time in between.
  • An information communication method for transmitting a signal by a luminance change may include a determination step of determining a luminance change pattern by modulating the signal to be transmitted, and a transmission step in which a light emitter changes in luminance according to the determined pattern to transmit the signal. The signal is composed of a plurality of large blocks, each of which includes first data, a preamble for the first data, and a check signal for the first data. The first data is composed of a plurality of small blocks, and each small block may include second data, a preamble for the second data, and a check signal for the second data. A framing sketch follows.
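  • The hierarchical framing can be sketched as follows; the preamble bytes and the use of CRC-32 as the check signal are assumptions, only the large-block/small-block layout comes from the text:
```python
import zlib

SMALL_PREAMBLE = b"\x55"       # assumed marker values
LARGE_PREAMBLE = b"\xaa\xaa"

def small_block(second_data: bytes) -> bytes:
    check = zlib.crc32(second_data).to_bytes(4, "big")
    return SMALL_PREAMBLE + second_data + check

def large_block(payload: bytes, chunk: int = 4) -> bytes:
    first_data = b"".join(small_block(payload[i:i + chunk])
                          for i in range(0, len(payload), chunk))
    check = zlib.crc32(first_data).to_bytes(4, "big")
    return LARGE_PREAMBLE + first_data + check

frame = large_block(b"HELLO VLC")  # one large block ready for modulation
```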
  • An information communication method for transmitting signals by luminance changes may include a determination step in which each of a plurality of transmitters modulates its signal to determine a luminance change pattern, and a transmission step in which, for each transmitter, the light emitter provided in that transmitter changes in luminance according to the determined pattern to transmit the signal. In the transmission step, signals having different frequencies or protocols may be transmitted.
  • In such a method, in the transmission step, one of the plurality of transmitters may receive a signal transmitted from another transmitter and transmit its own signal in a manner that does not interfere with the received signal.
  • FIG. 18A shows an example of a usage form of the present invention on a train platform.
  • The user holds a portable terminal over an electronic bulletin board or a light, and acquires, by visible light communication, the information displayed on the electronic bulletin board, train information for the station where the electronic bulletin board is installed, information about the station premises, or the like.
  • The information displayed on the electronic bulletin board may itself be transmitted to the portable terminal by visible light communication, or ID information corresponding to the electronic bulletin board may be transmitted to the portable terminal, which then acquires the displayed information by sending the acquired ID information to a server and querying it.
  • the server transmits the content displayed on the electronic bulletin board to the mobile terminal based on the ID information.
  • The train ticket information stored in the memory of the mobile terminal is compared with the information displayed on the electronic bulletin board, and if information corresponding to the user's ticket is displayed on the electronic bulletin board, an arrow guiding the user to the platform at which the user's scheduled train arrives is displayed on the display.
  • The route to the user's reserved seat may also be displayed.
  • When the arrow is displayed, it can be made easier to understand by drawing it in the same color as the color used for the train route on the map or in the train guide information.
  • The user's reservation information (platform number, car number, departure time, seat number) can also be displayed. Displaying the reservation information together with the arrow helps prevent misrecognition.
  • When the ticket information is stored in a server, the mobile terminal may query the server to acquire and compare the ticket information, or the comparison between the ticket information and the information displayed on the electronic bulletin board may be performed on the server side, so that information related to the ticket information can be acquired.
  • The target route may also be estimated from the user's transfer search history, and that route may be displayed.
  • the contents displayed on the electronic bulletin board may be acquired and compared.
  • Information related to the user may be highlighted with respect to the display of the electronic bulletin board on the display, or may be rewritten and displayed.
  • an arrow for guiding to the boarding place on each route may be displayed.
  • an arrow for guiding to a store or restroom may be displayed on the display.
  • The user's behavior characteristics may be managed in advance by a server, and an arrow guiding the user to a store or restroom may be displayed on the display when the user often stops at stores or restrooms in stations.
  • Although FIG. 18A shows a train example, a similar configuration can provide such displays for airplanes and buses.
  • In FIG. 18A, a portable terminal such as a smartphone (that is, a receiver such as the receiver 200 described later) receives a visible light signal from the electronic bulletin board as an optical ID or optical data by imaging the electronic bulletin board, as illustrated in (1) of FIG. 18A.
  • Next, the mobile terminal performs self-position estimation. That is, the mobile terminal acquires the position on the map of the electronic bulletin board, indicated directly or indirectly by the optical data. The mobile terminal then calculates its position relative to the electronic bulletin board based on, for example, its own posture obtained by the 9-axis sensor and the position, shape, and size of the electronic bulletin board within the captured image.
  • The portable terminal then estimates its self-position, that is, the position of the portable terminal on the map, from the position of the electronic bulletin board on the map and the calculated relative position, as in the sketch below.
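  • With a pinhole camera model, the relative-position calculation can be sketched in one horizontal plane; the bearing would come from the 9-axis posture, and all numbers are assumptions:
```python
import math

def self_position(board_xy, board_width_m, width_px, focal_px, bearing_rad):
    """Terminal position from the board's map position, its known physical
    width, its apparent width in pixels, and the direction from the board
    toward the terminal (bearing_rad)."""
    distance = focal_px * board_width_m / width_px  # pinhole projection
    return (board_xy[0] + distance * math.cos(bearing_rad),
            board_xy[1] + distance * math.sin(bearing_rad))

print(self_position((10.0, 5.0), 1.2, 300.0, 1500.0, math.radians(180)))
# -> (4.0, 5.0): the terminal stands 6 m in front of the board
```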
  • the mobile terminal searches for a route from the starting point that is the self-location to a destination indicated by, for example, ticket information, and starts navigation for guiding the user to the destination along the route.
  • the mobile terminal may transmit information indicating the starting point and the destination to the server, and obtain the above-described route searched by the server from the server. At this time, the mobile terminal may acquire a map including the route from the server.
  • During navigation, as shown in (2) to (4) of FIG. 18A, the mobile terminal repeatedly captures images with the camera and sequentially displays the resulting normal captured images in real time, while superimposing a direction instruction image, such as an arrow indicating the user's way to the destination, on the normal captured image. The user moves according to the displayed direction instruction image while carrying the portable terminal. The mobile terminal then updates its self-position based on the movement of objects or feature points appearing in these normal captured images. For example, the portable terminal detects the movement of objects or feature points in each normal captured image and estimates its own moving direction and moving distance based on that movement.
  • The portable terminal then updates its current self-position based on the estimated moving direction and moving distance and on the self-position estimated in (1) of FIG. 18A.
  • This self-position update may be performed every frame period of the normal captured image, or at intervals longer than the frame period. When the mobile terminal is on an underground floor or route, it cannot acquire GPS data; in such a case, the mobile terminal therefore estimates or updates its self-position based on the movement of the feature points of the normal captured images, without using GPS data, as in the dead-reckoning sketch below.
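  • The GPS-free update can be sketched as dead reckoning on the mean optical flow of tracked feature points; the metres-per-pixel scale is an assumption (in practice it depends on the scene depth):
```python
import numpy as np

def update_self_position(pos_xy: np.ndarray, pts_prev: np.ndarray,
                         pts_curr: np.ndarray, metres_per_px: float) -> np.ndarray:
    """pts_prev/pts_curr: Nx2 pixel positions of the same tracked features
    in consecutive normal captured images."""
    flow = np.mean(pts_curr - pts_prev, axis=0)  # average image motion
    return pos_xy - flow * metres_per_px         # camera moves opposite to flow
```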
  • The mobile terminal may also guide the user to an elevator along the route to the destination.
  • The mobile terminal also estimates its self-position by receiving optical data inside an elevator. For example, even when the user is riding an elevator, the mobile terminal receives the optical data transmitted from a transmitter (that is, a transmitter such as the transmitter 100 described later) installed as a lighting device or the like inside the elevator car.
  • The optical data directly or indirectly indicates the floor on which the elevator car is currently located.
  • The mobile terminal can therefore identify the floor on which it is currently located by receiving the optical data. If the current floor of the car is not directly indicated by the optical data, the mobile terminal transmits the information indicated by the optical data to the server and acquires from the server the floor information associated with that information. The mobile terminal then identifies the floor indicated by the floor information as the floor on which it is currently located, and treats the floor identified in this way as its self-position.
  • At this time, the terminal resets its self-position by replacing the self-position derived from the movement of the feature points of the normal captured images with the self-position derived using the optical data.
  • After the user gets off the elevator, if the user has not yet reached the destination, the mobile terminal continues navigating while performing the same processing as in (2) to (4) of FIG. 18A. The portable terminal also repeatedly checks whether GPS data can be acquired during navigation. When it comes up from an underground floor or route, the portable terminal determines that GPS data can be acquired, and switches its self-position estimation method from the estimation method based on the motion of feature points or the like to the estimation method based on GPS data. Then, as shown in (9) of FIG. 18A, the mobile terminal continues navigation while estimating its self-position based on the GPS data until the user arrives at the destination. Note that when the user enters the basement again, for example, the mobile terminal can no longer acquire GPS data, so it switches the self-position estimation method back from the GPS-based method to the method based on the movement of feature points or the like.
  • Next, FIG. 18A will be described in more detail.
  • In (1) of FIG. 18A, a receiver implemented as a smartphone or as a wearable device such as smart glasses receives the visible light signal (optical data) transmitted from the transmitter.
  • The transmitter is implemented, for example, as an illuminated signboard, a poster, or lighting that illuminates an image.
  • the receiver starts navigation to the destination according to the received optical data, information preset in the receiver, and a user instruction.
  • the receiver transmits optical data to the server and obtains navigation information associated with the data.
  • the navigation information includes the following first information to sixth information.
  • the first information is information indicating the position and shape of the transmitter.
  • the second information is information indicating a route to the destination.
  • the third information is information on another transmitter on and near the route to the destination.
  • the information of another transmitter indicates the optical data transmitted by the transmitter, the position and shape of the transmitter, and the position and shape of the reflected light.
  • the fourth information is position specifying information regarding the route and its vicinity.
  • The position specifying information is, for example, an image feature amount, radio wave information, or sound wave information for specifying the position.
  • the fifth information is information indicating the distance to the destination and the estimated arrival time.
  • the sixth information is part or all of the content information for performing the AR display.
  • the navigation information may be stored in advance in the receiver. Note that the above-described shape may include a size.
  • The receiver estimates its self-position using the position information of the transmitter together with the relative position between the transmitter and the receiver, which is calculated from how the transmitter appears in the captured image and from the sensor value of the acceleration sensor, and sets this self-position as the starting point of navigation.
  • the receiver may start navigation by estimating the receiver's own position not by optical data but by image feature values, barcodes or two-dimensional codes, radio waves, or sound waves.
  • the receiver displays the navigation to the destination as shown in (2) of FIG. 18A.
  • The navigation display may be an AR display in which another image is superimposed on the normal captured image obtained by the camera, a map display, or instructions by voice or vibration, or a combination of these.
  • The display method may be selected by the receiver, by the optical data, or by settings on the server, and any of these settings may be given priority. If the destination is a boarding place for transportation, the receiver may acquire the timetable and display the reservation time, or the departure time or boarding time nearest the expected arrival time. If the destination is a theater or the like, the receiver may display the start time or the admission deadline.
  • the receiver advances navigation according to the movement of the receiver as shown in (3) and (4) of FIG. 18A.
  • The receiver may estimate the moving distance and direction of the receiver during the imaging of a plurality of images from the movement of the feature points between those images.
  • the receiver may estimate the moving distance and direction of the receiver from the transition of the acceleration sensor, radio waves, or sound waves. Further, the receiver may estimate the moving distance and direction of the receiver by SLAM (Simultaneous Localization and Mapping) or PTAM (Parallel Tracking and Mapping).
  • When the receiver receives optical data different from the optical data received in (1) of FIG. 18A, for example outside an elevator, the receiver may transmit that optical data to the server and obtain the shape and position of the transmitter associated with the data.
  • In this case, the receiver may estimate its self-position by the same method as in (1) of FIG. 18A. Thereby, the receiver corrects the current navigation position, eliminating the self-position estimation error that accumulated in the processes of (3) and (4) of FIG. 18A. If the receiver receives only part of the visible light signal and complete optical data cannot be obtained, the receiver regards the nearest transmitter in the navigation information as the transmitter transmitting the visible light signal, and estimates its self-position in the same manner as described above. This allows transmitters with poor reception conditions, such as small, distant, or dark transmitters, to be used for receiver self-position estimation.
  • In (6) of FIG. 18A, the receiver receives optical data via reflected light.
  • the receiver identifies that the medium of the received optical data is reflected light based on the imaging direction, the light intensity, or the clarity of the outline.
  • In this case, the receiver identifies the position of the reflected light (that is, its position on the map) from the navigation information, and estimates the center of the imaged reflected light area as the position of the reflected light.
  • the receiver estimates the receiver's own position and corrects the current navigation position.
  • When the receiver receives a positioning signal from GPS, GLONASS, Galileo, the BeiDou satellite positioning system, IRNSS, or the like, it determines its position based on that signal and corrects the current navigation position (that is, the self-position). If the strength of the signal is sufficient, that is, higher than a predetermined strength, the receiver estimates the self-position based only on the signal; if the strength is equal to or lower than the predetermined strength, the receiver may combine it with the methods used in (3) and (4) of FIG. 18A.
  • When the receiver receives a visible light signal, it transmits to the server, together with the information indicated by the visible light signal, [1] a radio signal with a predetermined ID received simultaneously with the visible light signal, [2] the radio signal with a predetermined ID received last, or [3] information indicating the last estimated position of the receiver. Thereby, the transmitter that transmitted the visible light signal is identified.
  • The receiver may also receive the visible light signal with an algorithm specified by the above-described radio signal or by the information indicating the position of the receiver, and transmit the information indicated by the visible light signal to a server specified in the same manner as described above.
  • The receiver may estimate its self-position and display information on products near that position. The receiver may also navigate to the position of a product designated by the user. In addition, the receiver may present an optimal route for going around all the locations of a plurality of products designated by the user, where the optimal route is the route with the shortest distance, the shortest required time, or the least movement effort (see the route sketch below). Further, the receiver may navigate so as to pass through a predetermined place in addition to the products or places designated by the user; this makes it possible to advertise the predetermined place, or the goods or stores there.
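  • For a handful of designated products, the shortest-distance variant of the optimal route can be found by brute force; Euclidean distances are an assumption (a real store would use its walkway graph):
```python
from itertools import permutations
import math

def best_route(start, stops):
    """Visiting order of `stops` minimizing total walking distance."""
    best = None
    for order in permutations(stops):
        length = math.dist(start, order[0]) + sum(
            math.dist(order[i], order[i + 1]) for i in range(len(order) - 1))
        if best is None or length < best[0]:
            best = (length, order)
    return best

print(best_route((0, 0), [(5, 0), (5, 4), (0, 4)]))
```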
  • FIG. 18B is a diagram for describing navigation by the receiver 200 in the elevator according to the present embodiment.
  • AR navigation is a navigation function that guides a user to a destination by superimposing a direction indicating image such as an arrow on a normal captured image.
  • Below, such AR navigation is also referred to simply as navigation.
  • As shown in (1) of FIG. 18B, the receiver 200 receives an optical signal (that is, a visible light signal, optical data, or an optical ID) from a transmitter disposed in the elevator car, and acquires an elevator ID and floor information based on the optical signal.
  • The elevator ID is identification information for identifying the elevator in which the transmitter is arranged, or its car.
  • The floor information is information indicating the floor (or floor number) on which the car is currently located.
  • For example, the receiver transmits the optical signal (or the information indicated by the optical signal) to a server, and acquires from the server the elevator ID and the floor information associated with that optical signal at the server.
  • Note that the transmitter may always transmit the same optical signal regardless of the floor of the elevator, or may transmit different optical signals depending on the floor on which the car is located.
  • the transmitter is configured as a lighting device, for example.
  • The light from this transmitter brightly illuminates the inside of the elevator car. Therefore, the receiver can receive the optical signal superimposed on this light either directly from the transmitter or indirectly via reflection from the inner walls or floor of the car.
  • Even while the car containing the receiver is ascending, the receiver sequentially identifies the floor on which it is currently located from the elevator ID and floor information acquired based on the optical signal transmitted from the transmitter. Then, as shown in (3) of FIG. 18B, when the floor on which the receiver is currently located is the target floor, the receiver displays on its display a message or image prompting the user to get off the elevator. The receiver may also output a sound prompting the user to get off the elevator.
  • After the user gets off the elevator, if GPS data cannot be acquired, the receiver resumes the above-described AR navigation while estimating its self-position by the estimation method using the movement of the feature points of the normal captured images described above, as shown in (4) of FIG. 18B.
  • On the other hand, if GPS data can be acquired after the user gets off the elevator, the receiver resumes the above-described AR navigation while estimating its self-position by the estimation method based on that GPS data, as shown in (4) of FIG. 18B.
  • FIG. 18C is a diagram illustrating an example of a system configuration provided in the elevator according to the present embodiment.
  • The transmitter 100, which is the above-mentioned transmitter, is installed in the elevator car 420. Specifically, the transmitter 100 is disposed on the ceiling of the car 420 as a lighting device for the car 420.
  • the transmitter 100 also includes a built-in camera 404 and a microphone 411.
  • The built-in camera 404 images the inside of the car 420, and the microphone 411 collects the sound inside the car 420.
  • A monitoring camera system 401 is a system having at least one camera that images the interior of the car 420.
  • The floor display unit 414 displays the floor on which the car 420 is currently located.
  • the sensor 403 includes, for example, at least one of an atmospheric pressure sensor and an acceleration sensor.
  • the elevator also includes an image recognition unit 402, a current floor detection unit 405, a light modulation unit 406, a light emitting circuit 407, a wireless unit 409, and a voice recognition unit 410.
  • The image recognition unit 402 recognizes the characters (that is, the floor) displayed on the floor display unit 414 from an image captured by the monitoring camera system 401 or the built-in camera 404, and outputs the current floor data obtained by the recognition.
  • The current floor data indicates the floor number displayed on the floor display unit 414.
  • The voice recognition unit 410 recognizes the floor on which the car 420 is currently located based on the audio data output from the microphone 411, and outputs floor data indicating that floor.
  • The current floor detection unit 405 detects the floor on which the car 420 is currently located based on data output from at least one of the sensor 403, the image recognition unit 402, and the voice recognition unit 410, and outputs information indicating the detected floor to the light modulation unit 406.
  • the light modulation unit 406 modulates the signal indicating the floor output from the current floor detection unit 405 and the signal indicating the elevator ID, and outputs the modulated signal to the light emitting circuit 407.
  • The light emitting circuit 407 changes the luminance of the transmitter 100 in accordance with the modulated signal. As a result, a visible light signal (that is, an optical signal, optical data, or an optical ID) indicating the floor on which the car 420 is currently located and the elevator ID is transmitted from the transmitter 100.
  • the radio unit 409 modulates information indicating the floor output from the current floor detection unit 405 and a signal indicating the elevator ID, and transmits the modulated signal by radio.
  • the wireless unit 409 transmits a signal by Wi-Fi or Bluetooth.
  • Thereby, the receiver 200 can identify the floor on which it is currently located and the elevator ID by receiving at least one of the radio signal and the optical signal.
  • The elevator may include a current floor detection unit 412 having the above-described floor display unit 414.
  • The current floor detection unit 412 includes an elevator control unit 413 and the floor display unit 414.
  • The elevator control unit 413 controls the raising, lowering, and stopping of the car 420. The elevator control unit 413 therefore keeps track of the floor on which the car 420 is currently located.
  • The elevator control unit 413 may then output data indicating the floor it has been tracking to the light modulation unit 406 and the radio unit 409 as the current floor data.
  • With either configuration, the receiver 200 can realize the AR navigation shown in FIGS. 18A and 18B. A transmitter-side encoding sketch follows.
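  • The transmitter-side path can be sketched end to end; the field widths and the Manchester-style on/off coding are assumptions, only "pack the current floor and elevator ID, then modulate the luminance" comes from the text:
```python
def encode_floor_signal(elevator_id: int, floor: int) -> list:
    """Luminance pattern (1 = bright, 0 = dark) for one transmission."""
    payload = ((elevator_id & 0xFF) << 8) | (floor & 0xFF)  # assumed 8+8 bits
    bits = [(payload >> i) & 1 for i in range(15, -1, -1)]
    luminance = []
    for b in bits:  # Manchester-style: 1 -> high-low, 0 -> low-high
        luminance += [1, 0] if b else [0, 1]
    return luminance

pattern = encode_floor_signal(elevator_id=3, floor=7)  # drives the LED driver
```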
  • FIG. 19 is a diagram illustrating an example of application of the transmission / reception system in the second embodiment.
  • the receiver 8955a receives, for example, the transmission ID of the transmitter 8955b configured as a guide plate, acquires the map data displayed on the guide plate from the server, and displays the map data.
  • the server may transmit an advertisement suitable for the user of the receiver 8955a, and the receiver 8955a may also display this advertisement information.
  • the receiver 8955a displays a route from the current location to a location designated by the user.
  • FIG. 20 is a diagram illustrating an application example of the transmission and reception system in the second embodiment.
  • the receiver 8957a receives the ID transmitted from the transmitter 8957b configured as a signboard, for example, acquires coupon information from the server, and displays the coupon information.
  • The receiver 8957a stores subsequent user actions, such as saving the coupon, moving to the store shown on the coupon, shopping at that store, or leaving without saving the coupon, in a server 8957c.
  • Thereby, the subsequent behavior of users who acquired information from the signboard 8957b can be analyzed, and the advertising value of the signboard 8957b can be estimated.
  • The information communication method in the present embodiment is an information communication method for acquiring information from a subject, and includes: a first exposure time setting step of setting a first exposure time of an image sensor so that, in an image obtained by photographing a first subject with the image sensor, a plurality of bright lines corresponding to the exposure lines included in the image sensor are generated according to a change in luminance of the first subject; a first bright line image acquisition step of acquiring a first bright line image, that is, an image including the plurality of bright lines, by photographing the first subject, whose luminance changes, with the set first exposure time; a first information acquisition step of acquiring first transmission information by demodulating data specified by the pattern of the plurality of bright lines included in the acquired first bright line image; and a door control step of opening a door by transmitting a control signal to a door opening and closing apparatus after the first transmission information is acquired.
  • The information communication method may further include: a second bright line image acquisition step in which the image sensor photographs a second subject whose luminance changes with the set first exposure time, thereby acquiring a second bright line image including a plurality of bright lines; a second information acquisition step of acquiring second transmission information by demodulating data specified by the pattern of the plurality of bright lines included in the acquired second bright line image; and an approach determination step of determining, based on the acquired first and second transmission information, whether or not the receiving device including the image sensor is approaching the door. In the door control step, the control signal may be transmitted when it is determined that the receiving device is approaching the door.
  • Thereby, the door can be opened only when the receiving device (receiver) approaches the door, that is, only at an appropriate timing.
  • The information communication method may further include a second exposure time setting step of setting a second exposure time longer than the first exposure time, and a normal image acquisition step in which the image sensor photographs a third subject with the second exposure time. In the normal image acquisition step, for each exposure line of the image sensor, the charge is read out after a predetermined time has elapsed from the time when the charge was read out for the adjacent exposure line. In the first bright line image acquisition step, the optical black is not used for charge readout; instead, for each exposure line in the area of the image sensor other than the optical black, the charge may be read out after a time longer than the predetermined time has elapsed from the time when the charge was read out for the adjacent exposure line.
  • Thereby, charge readout (exposure) from the optical black is not performed, so the time for charge readout (exposure) from the effective pixel area, which is the area other than the optical black in the image sensor, can be lengthened. As a result, the time for receiving a signal in the effective pixel area can be increased, and many signals can be acquired.
  • When the length, in the direction perpendicular to the bright lines, of the pattern of the plurality of bright lines included in the first bright line image is less than a predetermined length, the frame rate may be reduced and a new bright line image may be acquired as a third bright line image.
  • Thereby, the length of the bright line pattern included in the third bright line image can be increased, and the transmitted signal can be acquired for one full block.
  • The information communication method may further include a ratio setting step of setting the ratio between the vertical width and the horizontal width of the image obtained by the image sensor, and a clipping determination step of determining whether or not, at the set ratio, the ends of the image in the direction perpendicular to the exposure lines are clipped. When it is determined that the ends are clipped, the ratio set in the ratio setting step is changed to a non-clipping ratio, that is, a ratio at which the ends are not clipped, and the image sensor acquires the first bright line image at the non-clipping ratio by photographing the first subject whose luminance changes.
  • For example, suppose that the ratio of the horizontal width to the vertical width of the effective pixel area of the image sensor is 4:3, that the ratio of the horizontal width to the vertical width of the image is set to 16:9, and that the bright lines appear along the horizontal direction, that is, the exposure lines are along the horizontal direction. In this case, it is determined that the upper end and the lower end of the image are clipped, that is, that the ends of the first bright line image are missing, and the ratio of the image is changed to 4:3, a ratio at which the ends are not clipped.
  • The information communication method may further include a compression step of generating a compressed image by compressing the first bright line image in the direction parallel to the bright lines included in the first bright line image, and a compressed image transmission step of transmitting the compressed image.
  • The information communication method may further include a gesture determination step of determining whether or not the receiving device including the image sensor has been moved in a predetermined manner, and an activation step of activating the image sensor when it is determined that the receiving device has been moved in the predetermined manner.
  • FIG. 21 is a diagram illustrating an application example of the transmitter and the receiver in the second embodiment.
  • the robot 8970 has, for example, a function as a self-propelled cleaner and a function as a receiver in each of the above embodiments.
  • the lighting devices 8971a and 8971b each have a function as a transmitter in each of the above embodiments.
  • the robot 8970 performs cleaning while moving in the room and photographs the lighting device 8971a that illuminates the room.
  • the lighting device 8971a transmits the ID of the lighting device 8971a by changing the luminance.
  • The robot 8970 receives the ID from the lighting device 8971a and estimates its own position (self-position) based on the ID, as in the above embodiments. That is, the robot 8970 estimates its own position based on the detection result of the 9-axis sensor, the relative position of the lighting device 8971a appearing in the captured image, and the absolute position of the lighting device 8971a specified by the ID.
• When the robot 8970 moves away from the lighting device 8971a, it transmits a signal instructing the lighting device 8971a to turn off (a turn-off command). For example, the robot 8970 transmits the turn-off command when it has moved a predetermined distance away from the lighting device 8971a, when the lighting device 8971a no longer appears in the captured image, or when another lighting device appears in the image. Upon receiving the turn-off command from the robot 8970, the lighting device 8971a turns off in accordance with the command.
• The robot 8970 also detects, based on the estimated self-position, that it is approaching the lighting device 8971b while moving and cleaning. That is, the robot 8970 holds information indicating the position of the lighting device 8971b, and detects that it is approaching the lighting device 8971b when the distance between its own position and the position of the lighting device 8971b falls to or below a predetermined distance. The robot 8970 then transmits a signal instructing lighting (a lighting command) to the lighting device 8971b, and upon receiving the lighting command, the lighting device 8971b lights up in accordance with it.
• In this way, the robot 8970 can brighten only its surroundings while moving, making cleaning easier; a sketch of this command logic follows.
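• The following is a minimal sketch of the turn-on/turn-off behavior described above. The distance thresholds and the command interface are assumptions; the patent does not specify a concrete API.

```python
import math

TURN_ON_DISTANCE = 2.0   # metres; assumed threshold for the lighting command
TURN_OFF_DISTANCE = 3.0  # metres; assumed threshold for the turn-off command

def update_lighting(robot_pos, lights, send_command):
    """lights maps a lighting-device ID to its absolute (x, y) position;
    send_command(light_id, 'on'|'off') stands in for the transmitted command."""
    for light_id, pos in lights.items():
        d = math.dist(robot_pos, pos)
        if d <= TURN_ON_DISTANCE:
            send_command(light_id, "on")    # lighting command
        elif d >= TURN_OFF_DISTANCE:
            send_command(light_id, "off")   # turn-off command

update_lighting((0.0, 0.0), {"8971a": (5.0, 0.0), "8971b": (1.0, 1.0)},
                lambda i, c: print(i, c))
```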
  • FIG. 22 is a diagram illustrating an application example of the transmitter and the receiver in the second embodiment.
  • the lighting device 8974 has a function as a transmitter in each of the above embodiments.
  • the lighting device 8974 illuminates a route bulletin board 8975 at a railway station, for example, while changing in luminance.
• The receiver 8973, pointed at the route bulletin board 8975 by the user, photographs the route bulletin board 8975.
• The receiver 8973 thereby acquires the ID of the route bulletin board 8975, and acquires detailed information about each route described on the route bulletin board 8975 as the information associated with that ID.
  • the receiver 8973 displays a guide image 8973a indicating the detailed information.
• The guide image 8973a indicates the distance to each route described on the route bulletin board 8975, the direction toward the route, and the time at which the next train arrives on that route.
  • the receiver 8973 displays a supplementary guide image 8973b.
• The supplementary guide image 8973b is an image for displaying, in accordance with the user's selection operation, any one of, for example, a railway timetable, information on routes other than the route indicated by the guide image 8973a, and detailed information about the station.
  • FIG. 23 is a diagram illustrating an example of an application according to the third embodiment.
  • a receiver 1800a configured as a smartphone receives a signal (visible light signal) transmitted from a transmitter 1800b configured as, for example, a street digital signage. That is, the receiver 1800a receives the timing of image reproduction by the transmitter 1800b. The receiver 1800a reproduces sound at the same timing as the image reproduction. In other words, the receiver 1800a performs synchronized reproduction of the sound so that the image and sound reproduced by the transmitter 1800b are synchronized. Note that the receiver 1800a may reproduce the same image as the image (reproduced image) reproduced by the transmitter 1800b or a related image related to the reproduced image together with the sound. Further, the receiver 1800a may cause a device connected to the receiver 1800a to reproduce sound and the like. Further, after receiving the visible light signal, the receiver 1800a may download content such as sound or related images associated with the visible light signal from the server. The receiver 1800a performs synchronous reproduction after the download.
• In this way, the user can hear sound that matches the display of the transmitter 1800b. Further, even at a distance from which the sound itself would take time to arrive, the user can listen to sound that matches the display.
  • FIG. 24 is a diagram illustrating an example of an application according to the third embodiment.
  • Each of the receiver 1800a and the receiver 1800c obtains and reproduces audio corresponding to a video such as a movie displayed on the transmitter 1800d from the server, in the language set in the receiver.
  • the transmitter 1800d transmits a visible light signal indicating an ID for identifying the displayed video to the receiver.
  • the receiver transmits a request signal including the ID indicated in the visible light signal and the language set in the receiver to the server.
  • the receiver acquires the audio corresponding to the request signal from the server and reproduces it. Thereby, the user can enjoy the work displayed on the transmitter 1800d in the language set by the user.
• FIGS. 25 and 26 are diagrams showing an example of a transmission signal and an example of an audio synchronization method in the third embodiment.
• Different data are associated with the time at every fixed interval (N seconds).
• These data may be, for example, IDs for identifying times, may be the times themselves, or may be audio data (for example, 64 kbps data).
• The following description assumes that the data are IDs. Different IDs may differ in the additional information parts attached to them.
• When IDs differ, the packets that make up the IDs differ; for this reason, it is desirable that successive IDs not be consecutive values.
  • the transmitter 1800d transmits the ID in accordance with the reproduction time of the displayed image, for example.
  • the receiver can recognize the reproduction time (synchronization time) of the image of the transmitter 1800d by detecting the timing when the ID is changed.
  • the synchronization time can be recognized by the following method.
• (B1) The midpoint of the reception section in which the ID changed is assumed to be the ID change point. Times that are an integer multiple of the interval N after ID change points estimated in the past are also treated as ID change points, and the midpoint of the plurality of estimated ID change points is taken as a more accurate ID change point. With such an estimation algorithm, an accurate ID change point can be estimated gradually.
• By setting N to 0.5 seconds or less, accurate synchronization can be achieved. A rough sketch of the estimation in (B1) follows.
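• The estimation in (B1) might look as follows. This is a rough sketch: names are illustrative, and a real implementation would also handle phase estimates that wrap around near a multiple of N.

```python
N = 0.5  # interval between ID changes, in seconds

def estimate_change_phase(sections, n=N):
    """sections: (t_start, t_end) reception intervals in which the ID was
    observed to change. Returns the estimated change-point phase modulo n."""
    midpoints = [(a + b) / 2.0 for a, b in sections]   # coarse estimates
    # Change points recur every n seconds, so fold each estimate modulo n
    # and average the folded values to refine the phase.
    phases = [m % n for m in midpoints]
    return sum(phases) / len(phases)

sections = [(0.98, 1.10), (1.47, 1.58), (2.01, 2.08)]
print(estimate_change_phase(sections))  # ≈ 0.037 s past each multiple of N
```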
  • FIG. 26 is a diagram illustrating an example of a transmission signal in the third embodiment.
  • a time packet is a packet that holds the time of transmission.
• The time packet is divided into time packet 1, which represents the fine part of the time, and time packet 2, which represents the coarse part.
• Time packet 2 indicates the hour and minute of the time, and time packet 1 indicates only the second of the time.
• A packet indicating the time may also be divided into three or more time packets. Since the coarse part of the time changes less often, transmitting more fine time packets than coarse time packets lets the receiver recognize the synchronization time quickly and accurately.
• That is, the visible light signal includes second information (time packet 2) indicating the hour and minute of the time and first information (time packet 1) indicating the second of the time, and thereby indicates the time at which the visible light signal is transmitted from the transmitter 1800d.
• The receiver 1800a receives the second information, and receives the first information more times than it receives the second information. A minimal sketch of this coarse/fine packet scheme follows.
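• A minimal sketch of the coarse/fine packet scheme, assuming one transmission per second and an illustrative packet layout (the field structure is not given in the patent):

```python
from datetime import datetime, timedelta, timezone

def packets_for(t, last_hm):
    """Emit time packet 1 (the second) on every transmission, and time
    packet 2 (hour and minute) only when the hour or minute has changed."""
    out = [("time_packet_1", t.second)]
    hm = (t.hour, t.minute)
    if hm != last_hm:
        out.append(("time_packet_2", hm))
    return out, hm

t0 = datetime(2019, 3, 29, 5, 42, 58, tzinfo=timezone.utc)
last_hm = None
for i in range(4):  # four seconds spanning a minute boundary
    t = t0 + timedelta(seconds=i)
    pkts, last_hm = packets_for(t, last_hm)
    print(t.time(), pkts)
```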
  • FIG. 27 is a diagram illustrating an example of a processing flow of the receiver 1800a according to the third embodiment.
• First, a processing delay time is designated for the receiver 1800a (step S1801). The processing delay time may be stored in the processing program or specified by the user; when the user adjusts it, more accurate synchronization can be achieved for the individual receiver. Synchronization can be made still more accurate by varying this processing delay time with the receiver model, the receiver temperature, and the CPU usage rate.
• Next, the receiver 1800a determines whether a time packet has been received, or whether an ID associated for audio synchronization has been received (step S1802).
• If it is determined that one has been received (Y in step S1802), the receiver 1800a further determines whether there are images waiting to be processed (step S1804). If it is determined that there are images waiting to be processed (Y in step S1804), the receiver 1800a discards those images, or delays their processing, and performs reception from the most recently acquired image (step S1805). This avoids unexpected delays caused by the backlog of images waiting to be processed.
• Next, the receiver 1800a measures the position in the image at which the visible light signal (specifically, the bright lines) appears (step S1806). That is, by measuring the position in the direction perpendicular to the exposure lines, relative to the first exposure line of the image sensor, the time difference from the image acquisition start time to the signal reception time (the in-image delay time) can be calculated.
• The receiver 1800a can then perform synchronized playback accurately by playing the audio or moving image at the time obtained by adding the processing delay time and the in-image delay time to the recognized synchronization time (step S1807); a sketch of this computation follows the flow below.
• If it is determined in step S1802 that neither a time packet nor an audio synchronization ID has been received, the receiver 1800a performs signal reception from the image obtained by imaging (step S1803).
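• The delay compensation of steps S1806 and S1807 might be computed as follows. The row-to-time conversion and all names are assumptions for illustration.

```python
def in_image_delay(bright_line_row, total_rows, frame_time_s):
    # Exposure lines are exposed sequentially over one frame, so a signal
    # appearing at row k was received roughly k/total_rows into the frame.
    return (bright_line_row / total_rows) * frame_time_s

def playback_time(sync_time_s, processing_delay_s, bright_line_row,
                  total_rows=1080, frame_time_s=1 / 30):
    return (sync_time_s + processing_delay_s
            + in_image_delay(bright_line_row, total_rows, frame_time_s))

# Signal found at row 540 of a 1080-row frame captured at 30 fps:
print(playback_time(12.0, 0.050, 540))  # ≈ 12.0667 s
```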
  • FIG. 28 is a diagram illustrating an example of a user interface of the receiver 1800a according to the third embodiment.
  • the user can adjust the processing delay time described above by pressing one of the buttons Bt1 to Bt4 displayed on the receiver 1800a.
• Alternatively, the processing delay time may be set by a swipe operation, as also shown in FIG. 28. This allows synchronized playback to be performed more accurately based on the user's own perception.
  • FIG. 29 is a diagram illustrating an example of a processing flow of the receiver 1800a according to the third embodiment.
  • the earphone-only playback shown by this processing flow enables audio playback without disturbing the surroundings.
  • the receiver 1800a checks whether or not the setting limited to the earphone is performed (step S1811).
• The setting limited to earphones is made, for example, in the receiver 1800a itself, or is included in the received signal (visible light signal), or is recorded in the server or in the receiver 1800a in association with the received signal.
• In step S1813, the receiver 1800a determines whether an earphone is connected.
• When the receiver 1800a confirms that playback is not limited to earphones (N in step S1811), or determines that an earphone is connected (Y in step S1813), it plays the audio (step S1812). When playing the audio, the receiver 1800a adjusts the volume so that it falls within a set range. This range is set in the same manner as the earphone-only setting.
• When the receiver 1800a determines that no earphone is connected (N in step S1813), it notifies the user to connect an earphone (step S1814).
  • This notification is performed by, for example, screen display, audio output, or vibration.
• The receiver 1800a also prepares an interface for forced playback and determines whether the user has performed a forced playback operation (step S1815). If it is determined that the forced playback operation has been performed (Y in step S1815), the receiver 1800a plays the audio even though no earphone is connected (step S1812).
• When no earphone is connected, the receiver 1800a retains the audio data received in advance and the analyzed synchronization time, so that audio playback can be synchronized quickly as soon as an earphone is connected.
  • FIG. 30 is a diagram illustrating another example of the processing flow of the receiver 1800a according to the third embodiment.
  • the receiver 1800a receives an ID from the transmitter 1800d (step S1821). That is, the receiver 1800a receives a visible light signal indicating the ID of the transmitter 1800d or the ID of the content displayed on the transmitter 1800d.
  • the receiver 1800a downloads information (content) associated with the received ID from the server (step S1822). Alternatively, the receiver 1800a reads out the information from the data holding unit in the receiver 1800a. Hereinafter, this information is referred to as related information.
  • the receiver 1800a determines whether or not the synchronous reproduction flag included in the related information indicates ON (step S1823). If it is determined that the synchronous reproduction flag does not indicate ON (N in step S1823), the receiver 1800a outputs the content indicated by the related information (step S1824). That is, when the content is an image, the receiver 1800a displays an image, and when the content is audio, the receiver 1800a outputs audio.
• When the receiver 1800a determines that the synchronous playback flag indicates ON (Y in step S1823), it determines whether the time adjustment mode included in the related information is set to the transmitter reference mode or to the absolute time mode (step S1825). If it is determined that the absolute time mode is set, the receiver 1800a determines whether the last time adjustment was performed within a certain time before the current time (step S1826). The time adjustment here is a process of obtaining time information by a predetermined method and using that time information to set the clock provided in the receiver 1800a to the absolute time of the reference clock.
  • the predetermined method is, for example, a method using a GPS (Global Positioning System) radio wave or an NTP (Network Time Protocol) radio wave. Note that the current time described above may be a time when the receiver 1800a, which is a terminal device, receives a visible light signal.
• If the receiver 1800a determines that the last time adjustment was performed within the certain time (Y in step S1826), it outputs the related information based on the time of its own clock, thereby synchronizing the content displayed on the transmitter 1800d with the related information (step S1827).
• When the content indicated by the related information is, for example, a moving image, the receiver 1800a displays the moving image so as to be synchronized with the content displayed on the transmitter 1800d.
• When the content indicated by the related information is, for example, audio, the receiver 1800a outputs the audio so as to be synchronized with the content displayed on the transmitter 1800d.
• For example, when the related information indicates audio, the related information includes the frames constituting the audio, and each of these frames carries a time stamp.
• The receiver 1800a outputs audio synchronized with the content of the transmitter 1800d by playing the frame whose time stamp corresponds to the time of its own clock.
• If the receiver 1800a determines that the last time adjustment was not performed within the certain time (N in step S1826), it attempts to obtain time information by the predetermined method and determines whether the time information was obtained (step S1828). If it is determined that the time information was obtained (Y in step S1828), the receiver 1800a updates the time of its own clock using that time information (step S1829), and then executes the process of step S1827 described above.
• If it is determined in step S1825 that the time adjustment mode is the transmitter reference mode, or if it is determined in step S1828 that time information could not be obtained (N in step S1828), the receiver 1800a acquires time information from the transmitter 1800d (step S1830). That is, the receiver 1800a acquires time information serving as a synchronization signal from the transmitter 1800d through visible light communication.
• The synchronization signals are, for example, time packet 1 and time packet 2 shown in FIG. 26. Alternatively, the receiver 1800a may acquire the time information from the transmitter 1800d by radio waves such as Bluetooth (registered trademark) or Wi-Fi. The receiver 1800a then executes the processes of steps S1829 and S1827 described above.
• In other words, when GPS radio waves or NTP radio waves cannot be used for the process of synchronizing the clock of the terminal device, which is the receiver 1800a, with the reference clock, the clock of the terminal device and the clock of the transmitter are synchronized according to the time indicated by the visible light signal transmitted from the transmitter 1800d.
• Accordingly, the terminal device can play the content (moving image or audio) at a timing synchronized with the transmitter-side content played by the transmitter 1800d. A condensed sketch of this decision flow follows.
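• A condensed sketch of the decision flow of steps S1823 to S1830 (FIG. 30). The enum names, the clock representation, and the value of the "certain time" are assumptions.

```python
import time
from enum import Enum

class TimeMode(Enum):
    ABSOLUTE = "absolute"
    TRANSMITTER = "transmitter"

MAX_CLOCK_AGE_S = 3600.0  # the "certain time" of step S1826; value assumed

def synchronize_clock(related_info, clock, get_reference_time, get_transmitter_time):
    """clock: dict with 'offset' (seconds added to time.time()) and 'last_adjusted'."""
    if related_info["time_mode"] is TimeMode.ABSOLUTE:
        if time.time() - clock["last_adjusted"] <= MAX_CLOCK_AGE_S:
            return  # S1826: recent enough, go straight to S1827
        ref = get_reference_time()  # S1828: try GPS/NTP
        if ref is not None:
            clock["offset"] = ref - time.time()        # S1829
            clock["last_adjusted"] = time.time()
            return
    # Transmitter reference mode, or GPS/NTP unavailable: use the time
    # packets of the visible light signal as the synchronization signal (S1830).
    clock["offset"] = get_transmitter_time() - time.time()
    clock["last_adjusted"] = time.time()
```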
• FIG. 31A is a diagram for explaining a specific method of synchronized playback in the third embodiment. There are five such methods, methods a to e, shown in FIG. 31A.
• In method a, the transmitter 1800d outputs a visible light signal indicating the content ID and the content playback time by changing the luminance of the display, as in the above embodiments.
  • the content playback time is the playback time of data that is part of the content that is being played back by the transmitter 1800d when the content ID is transmitted from the transmitter 1800d.
  • the data is a picture or a sequence constituting the moving image if the content is a moving image, or a frame constituting the sound if the content is sound.
• The playback time indicates, for example, the elapsed time from the beginning of the content. If the content is a moving image, the playback time is included in the content as a PTS (Presentation Time Stamp). That is, the content includes, for each piece of data constituting it, the playback time (display time) of that data.
  • the receiver 1800a receives the visible light signal by photographing the transmitter 1800d as in the above embodiments. Then, the receiver 1800a transmits a request signal including the content ID indicated by the visible light signal to the server 1800f. The server 1800f receives the request signal, and transmits the content associated with the content ID included in the request signal to the receiver 1800a.
  • the receiver 1800a When the receiver 1800a receives the content, the receiver 1800a plays the content from the time of (content playback time + elapsed time since ID reception).
• Here, the elapsed time since ID reception is the time elapsed from when the receiver 1800a received the content ID; a small sketch of this computation follows.
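• A small sketch of the playback start position in method a; the class and method names are illustrative assumptions.

```python
import time

class MethodAPlayback:
    def on_visible_light_id(self, content_playback_time_s):
        # Remember when the ID was received and the transmitter's position then.
        self.id_received_at = time.monotonic()
        self.content_playback_time = content_playback_time_s

    def start_position(self):
        # (content playback time + elapsed time since ID reception)
        elapsed = time.monotonic() - self.id_received_at
        return self.content_playback_time + elapsed

p = MethodAPlayback()
p.on_visible_light_id(73.2)  # the transmitter was 73.2 s into the content
time.sleep(0.1)              # stands in for download latency
print(p.start_position())    # ≈ 73.3 s
```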
• In method b, the transmitter 1800d outputs a visible light signal indicating the content ID and the content playback time by changing the luminance of the display, as in the above embodiments.
  • the receiver 1800a receives the visible light signal by photographing the transmitter 1800d as in the above embodiments. Then, the receiver 1800a transmits a request signal including the content ID indicated by the visible light signal and the content playback time to the server 1800f.
  • the server 1800f receives the request signal, and transmits only a part of the content after the content playback time to the receiver 1800a among the content associated with the content ID included in the request signal.
• Upon receiving that part of the content, the receiver 1800a plays it from the position of (elapsed time since ID reception).
• In method c, the transmitter 1800d outputs a visible light signal indicating the transmitter ID and the content playback time by changing the luminance of the display, as in the above embodiments.
  • the transmitter ID is information for identifying the transmitter.
  • the receiver 1800a receives the visible light signal by photographing the transmitter 1800d as in the above embodiments. Then, the receiver 1800a transmits a request signal including the transmitter ID indicated by the visible light signal to the server 1800f.
• The server 1800f holds, for each transmitter ID, a playback schedule, which is a timetable of the content played by the transmitter having that transmitter ID; the server 1800f further includes a clock. When such a server 1800f receives the request signal, it identifies, from the playback schedule, the content associated with the transmitter ID included in the request signal and the time of the server 1800f's clock (server time), that is, the content currently being played, and transmits that content to the receiver 1800a.
  • the receiver 1800a When the receiver 1800a receives the content, the receiver 1800a plays the content from the time of (content playback time + elapsed time since ID reception).
• In method d, the transmitter 1800d outputs a visible light signal indicating the transmitter ID and the transmitter time by changing the luminance of the display, as in the above embodiments.
  • the transmitter time is a time indicated by a clock provided in the transmitter 1800d.
  • the receiver 1800a receives the visible light signal by photographing the transmitter 1800d as in the above embodiments. Then, the receiver 1800a transmits a request signal including the transmitter ID indicated by the visible light signal and the transmitter time to the server 1800f.
• In method d, the server 1800f holds the above playback schedule.
• When the server 1800f receives the request signal, it identifies, from the playback schedule, the content associated with the transmitter ID and the transmitter time included in the request signal as the content being played.
• Furthermore, the server 1800f specifies the content playback time from the transmitter time. That is, the server 1800f finds the playback start time of the specified content from the playback schedule and specifies the difference between the transmitter time and that playback start time as the content playback time. Then, the server 1800f transmits the content and the content playback time to the receiver 1800a.
  • the receiver 1800a Upon receiving the content and the content playback time, the receiver 1800a plays the content from the time of (content playback time + elapsed time since reception of ID).
  • the visible light signal indicates the time when the visible light signal is transmitted from the transmitter 1800d. Therefore, the receiver 1800a, which is a terminal device, can receive content associated with the time (transmitter time) at which the visible light signal is transmitted from the transmitter 1800d. For example, if the transmitter time is 5:43, content played back at 5:43 can be received.
  • the server 1800f has a plurality of contents each associated with a time.
  • the content associated with the time indicated by the visible light signal may not exist in the server 1800f.
• In that case, the receiver 1800a as the terminal device may receive, from among the plurality of contents, the content associated with the time that is closest to, and later than, the time indicated by the visible light signal. Thereby, even if content associated with the exact time indicated by the visible light signal does not exist in the server 1800f, appropriate content can be received from among the plurality of contents in the server 1800f.
• That is, the playback method in this case includes a signal reception step in which a sensor of the receiver 1800a (terminal device) receives a visible light signal from the transmitter 1800d that transmits the visible light signal by a luminance change of the light source, a transmission step in which the receiver 1800a transmits to the server 1800f a request signal requesting the content associated with the visible light signal, a content reception step in which the receiver 1800a receives the content from the server 1800f, and a playback step of playing the content.
• Here, the visible light signal indicates a transmitter ID and a transmitter time.
• The transmitter ID is ID information, and the transmitter time is the time indicated by the clock of the transmitter 1800d, that is, the time at which the visible light signal was transmitted from the transmitter 1800d.
  • the receiver 1800a receives the content associated with the transmitter ID and the transmitter time indicated by the visible light signal. As a result, the receiver 1800a can reproduce appropriate content with respect to the transmitter ID and the transmitter time.
• In method e, the transmitter 1800d outputs a visible light signal indicating the transmitter ID by changing the luminance of the display, as in the above embodiments.
  • the receiver 1800a receives the visible light signal by photographing the transmitter 1800d as in the above embodiments. Then, the receiver 1800a transmits a request signal including the transmitter ID indicated by the visible light signal to the server 1800f.
• The server 1800f holds the above-described playback schedule and further includes a clock.
• When the server 1800f receives the request signal, it identifies, from the playback schedule, the content associated with the transmitter ID included in the request signal and the server time, that is, the time indicated by the server 1800f's clock, as the content being played.
• Furthermore, the server 1800f finds the playback start time of the identified content from the playback schedule, and transmits the content together with the content playback start time to the receiver 1800a.
• Upon receiving the content and the content playback start time, the receiver 1800a plays the content from the position of (receiver time − content playback start time).
  • the receiver time is a time indicated by a clock provided in the receiver 1800a.
  • the reproduction method includes a signal receiving step of receiving a visible light signal by a sensor of the receiver 1800a (terminal device) from a transmitter 1800d that transmits a visible light signal due to a luminance change of the light source;
  • the transmitting step of transmitting a request signal for requesting the content associated with the visible light signal from the receiver 1800a to the server 1800f, and the receiver 1800a include each time and data reproduced at each time
  • the transmitter 1800d if content related to the content (transmitter-side content) is reproduced, the receiver 1800a can reproduce the content in synchronization with the transmitter-side content appropriately. .
• In method e as well, the server 1800f may transmit to the receiver 1800a only the part of the content after the content playback time.
• Also, although in methods a to e the receiver 1800a transmits a request signal to the server 1800f and receives the necessary data from the server 1800f, the receiver 1800a may instead hold the data of the server 1800f in advance, without performing such transmission and reception.
  • FIG. 31B is a block diagram showing the configuration of a playback apparatus that performs synchronized playback by the method e described above.
• The playback device B10 is the receiver 1800a, that is, a terminal device that performs synchronized playback by method e described above, and includes a sensor B11, a request signal transmission unit B12, a content reception unit B13, a clock B14, and a playback unit B15.
  • Sensor B11 is, for example, an image sensor, and receives the visible light signal from a transmitter 1800d that transmits a visible light signal according to a change in luminance of the light source.
  • the request signal transmission unit B12 transmits a request signal for requesting content associated with the visible light signal to the server 1800f.
  • the content receiving unit B13 receives content including each time and data reproduced at each time from the server 1800f.
  • the reproduction unit B15 reproduces data corresponding to the time of the clock B14 in the content.
  • FIG. 31C is a flowchart showing the processing operation of the terminal device that performs synchronous reproduction by the method e described above.
• The playback device B10, that is, the receiver 1800a or terminal device that performs synchronized playback by method e described above, executes the processes of steps SB11 to SB15.
• In step SB11, the visible light signal is received from the transmitter 1800d that transmits the visible light signal by a luminance change of the light source.
• In step SB12, a request signal requesting the content associated with the visible light signal is transmitted to the server 1800f.
• In step SB13, content including each time and the data to be played at each time is received from the server 1800f.
• In step SB15, the data corresponding to the time of the clock B14 is played from the content.
  • the data in the content can be appropriately played back at the correct time indicated by the content without being played back at the wrong time.
  • each component may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
  • Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
• The software that realizes the playback device B10 and the like of the present embodiment is a program that causes a computer to execute each step included in the flowchart shown in FIG. 31C; a minimal sketch of such a device follows.
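• A minimal sketch of such a playback device, under assumed sensor and server interfaces; only the division into units B11 to B15 comes from the figure.

```python
import time

class PlaybackDeviceB10:
    """Sensor B11, request signal transmission unit B12, content reception
    unit B13, clock B14, playback unit B15. Interfaces are assumptions."""

    def __init__(self, sensor, server, clock=time.time):
        self.sensor = sensor  # B11
        self.server = server  # reached through B12 and B13
        self.clock = clock    # B14, assumed synchronized with the reference clock

    def run_once(self):
        signal = self.sensor.receive_visible_light()   # SB11
        content = self.server.request_content(signal)  # SB12 + SB13
        data = self.current_data(content)              # SB15
        if data is not None:
            self.play(data)

    def current_data(self, content):
        # The data corresponding to the time of clock B14: the content maps
        # each time (seconds from its start) to the data to play at that time.
        position = self.clock() - content["playback_start_time"]
        return content["frames"].get(round(position))

    def play(self, data):
        print("playing", data)  # stands in for playback unit B15
```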
  • FIG. 32 is a diagram for explaining preparations for synchronized playback in the third embodiment.
  • the receiver 1800a adjusts the time of the clock provided in the receiver 1800a to the time of the reference clock in order to perform synchronized playback. For this time adjustment, the receiver 1800a performs the following processes (1) to (5).
  • the receiver 1800a receives a signal.
  • This signal may be a visible light signal transmitted by a change in luminance of the display of the transmitter 1800d, or a radio wave signal based on Wi-Fi or Bluetooth (registered trademark) from a wireless device.
  • the receiver 1800a acquires position information indicating the position of the receiver 1800a by, for example, GPS instead of receiving such a signal. Then, the receiver 1800a recognizes that the receiver 1800a has entered a predetermined place or building based on the position information.
• When the receiver 1800a receives the above signal or recognizes that it has entered a predetermined place, it transmits a request signal requesting the data (related information) associated with that signal or place to the server (visible light ID resolution server) 1800f.
  • the server 1800f transmits the above-described data and a time adjustment request for causing the receiver 1800a to adjust the time to the receiver 1800a.
  • the receiver 1800a When receiving the data and the time adjustment request, the receiver 1800a transmits the time adjustment request to the GPS time server, the NTP server, or the base station of the telecommunications carrier (carrier).
• Upon receiving the time adjustment request, the server or the base station transmits time data (time information) indicating the current time (the time of the reference clock, that is, the absolute time) to the receiver 1800a.
  • the receiver 1800a adjusts the time by adjusting the time of the clock provided to the receiver 1800a to the current time indicated by the time data.
• In this way, the clock provided in the receiver 1800a (terminal device) and the reference clock are synchronized with each other using GPS (Global Positioning System) radio waves or NTP (Network Time Protocol) radio waves. Therefore, the receiver 1800a can play the data corresponding to a given time at the appropriate time according to the reference clock. A sketch of obtaining the reference-clock offset follows.
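• The time adjustment in steps (3) to (5) could be sketched as follows, assuming the third-party ntplib package as the NTP client; the server name is only an example.

```python
import time
import ntplib  # third-party NTP client, assumed available (pip install ntplib)

def reference_clock_offset(server="pool.ntp.org"):
    """Offset (seconds) to add to the local clock to obtain the reference time."""
    response = ntplib.NTPClient().request(server, version=3)
    return response.offset

offset = reference_clock_offset()
print("absolute time:", time.time() + offset)
```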
  • FIG. 33 is a diagram illustrating an example of application of the receiver 1800a in the third embodiment.
  • the receiver 1800a is configured as a smartphone as described above, and is used by being held by a holder 1810 formed of, for example, a translucent resin or glass member.
  • the holder 1810 includes a back plate portion 1810a and a locking portion 1810b provided upright on the back plate portion 1810a.
  • the receiver 1800a is inserted between the back plate portion 1810a and the locking portion 1810b so as to be along the back plate portion 1810a.
  • FIG. 34A is a front view of receiver 1800a held by holder 1810 in the third embodiment.
  • the receiver 1800a is held by the holder 1810 in the inserted state as described above.
  • the locking portion 1810b locks with the lower portion of the receiver 1800a and sandwiches the lower portion with the back plate portion 1810a.
  • the back surface of the receiver 1800a faces the back plate portion 1810a, and the display 1801 of the receiver 1800a is exposed.
  • FIG. 34B is a rear view of receiver 1800a held by holder 1810 in the third embodiment.
  • a through hole 1811 is formed in the back plate portion 1810a, and a variable filter 1812 is attached in the vicinity of the through hole 1811.
• The camera 1802 of the receiver 1800a is exposed from the back plate portion 1810a through the through hole 1811.
  • the flashlight 1803 of the receiver 1800a faces the variable filter 1812.
  • the variable filter 1812 is formed in a disk shape, for example, and has three color filters (a red filter, a yellow filter, and a green filter) each having a fan shape and the same size.
  • the variable filter 1812 is attached to the back plate portion 1810a so as to be rotatable about the center of the variable filter 1812.
  • the red filter is a filter having red translucency
  • the yellow filter is a filter having yellow translucency
  • the green filter is a filter having green translucency.
• When the variable filter 1812 is rotated so that, for example, the red filter faces the flashlight 1803, the light emitted from the flashlight 1803 passes through the red filter and is diffused inside the holder 1810 as red light.
• As a result, substantially the entire holder 1810 emits red light.
• When the variable filter 1812 is rotated so that, for example, the yellow filter faces the flashlight 1803, the light emitted from the flashlight 1803 passes through the yellow filter and is diffused inside the holder 1810 as yellow light.
• As a result, substantially the entire holder 1810 emits yellow light.
• When the variable filter 1812 is rotated so that, for example, the green filter faces the flashlight 1803, the light emitted from the flashlight 1803 passes through the green filter and is diffused inside the holder 1810 as green light.
• As a result, substantially the entire holder 1810 emits green light.
• In this way, the holder 1810 lights up in red, yellow, or green, like a penlight.
  • FIG. 35 is a diagram for describing a use case of the receiver 1800a held by the holder 1810 in the third embodiment.
• A receiver with a holder, that is, a receiver 1800a held by a holder 1810, is used in an amusement park or the like. A plurality of such holder-equipped receivers, pointed toward a float moving through the amusement park, blink in synchronization with the music playing from the float.
  • the float is configured as a transmitter in each of the above embodiments, and transmits a visible light signal by a change in luminance of a light source attached to the float.
  • the float transmits a visible light signal indicating the ID of the float.
• The receiver with a holder receives the visible light signal, that is, the ID, by imaging the float.
  • the receiver 1800a that has received the ID acquires a program associated with the ID from, for example, a server.
• This program includes instructions for turning on the flashlight 1803 of the receiver 1800a at predetermined times. The predetermined times are set to match (that is, to be synchronized with) the music playing from the float. The receiver 1800a then blinks the flashlight 1803 according to the program.
• As a result, each receiver 1800a that has received the ID repeatedly lights up at the same timing, in time with the music playing from the float identified by that ID.
  • each receiver 1800a blinks the flashlight 1803 in accordance with a set color filter (hereinafter referred to as a setting filter).
  • the setting filter is a color filter that faces the flashlight 1803 of the receiver 1800a.
  • Each receiver 1800a recognizes the current setting filter based on an operation by the user. Alternatively, each receiver 1800a recognizes the current setting filter based on the color of an image obtained by photographing with the camera 1802.
• In this way, the receiver 1800a held in the holder 1810 blinks the flashlight 1803, and hence the holder 1810, in synchronization with the float's music and with the receivers 1800a held in other holders 1810, in the same manner as the synchronized playback described above.
  • FIG. 36 is a flowchart showing the processing operation of the receiver 1800a held by the holder 1810 in the third embodiment.
• First, the receiver 1800a receives the float ID indicated by the visible light signal from the float (step S1831). Next, the receiver 1800a acquires the program associated with the ID from the server (step S1832). Next, the receiver 1800a executes the program and turns on the flashlight 1803 at each predetermined time in accordance with the setting filter (step S1833); a sketch of such a program follows.
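• Such a program might look like the following sketch. The schedule format and the flashlight interface are assumptions; only the idea of lighting at predetermined times synchronized with the music comes from the description above.

```python
import time

def run_light_program(schedule, flash_on, flash_off, t0=None):
    """schedule: (start_s, duration_s) pairs relative to the music start t0."""
    t0 = time.monotonic() if t0 is None else t0
    for start, duration in schedule:
        delay = t0 + start - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        flash_on()              # turn on the flashlight (holder lights up)
        time.sleep(duration)
        flash_off()

# Blink on every beat of a 120 bpm tune for two bars:
beats = [(i * 0.5, 0.1) for i in range(8)]
run_light_program(beats, lambda: print("ON"), lambda: print("OFF"))
```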
  • the receiver 1800a may cause the display 1801 to display an image corresponding to the received ID or the acquired program.
  • FIG. 37 is a diagram illustrating an example of an image displayed by the receiver 1800a according to the third embodiment.
• For example, when the receiver 1800a receives an ID from a Santa Claus float, the receiver 1800a displays a Santa Claus image as shown in FIG. 37A. Further, as shown in FIG. 37B, the receiver 1800a may change the background color of the Santa Claus image to the color of the setting filter at the same time as the flashlight 1803 is turned on. For example, when the color of the setting filter is red, turning on the flashlight 1803 lights the holder 1810 red, and at the same time a Santa Claus image with a red background is displayed on the display 1801. That is, the blinking of the holder 1810 and the display on the display 1801 are synchronized.
  • FIG. 38 is a diagram showing another example of the holder in the third embodiment.
  • the holder 1820 is configured in the same manner as the holder 1810 described above, but does not include the through hole 1811 and the variable filter 1812.
  • a holder 1820 holds the receiver 1800a in a state where the display 1801 of the receiver 1800a is directed to the back plate portion 1820a.
  • the receiver 1800a causes the display 1801 to emit light instead of the flashlight 1803.
  • light from the display 1801 is diffused over substantially the entire holder 1820. Therefore, when the receiver 1800a causes the display 1801 to emit light with red light according to the above-described program, the holder 1820 is lit red. Similarly, when the receiver 1800a causes the display 1801 to emit light with yellow light according to the above-described program, the holder 1820 is lit in yellow.
• Similarly, when the receiver 1800a causes the display 1801 to emit green light according to the above-described program, the holder 1820 lights up in green. If such a holder 1820 is used, the setting of the variable filter 1812 can be omitted.
  • FIG. 39A to FIG. 39D are diagrams illustrating examples of visible light signals in the third embodiment.
  • the transmitter generates a 4PPM visible light signal and changes the luminance according to the visible light signal, for example, as shown in FIG. 39A.
  • the transmitter allocates 4 slots to one signal unit, and generates a visible light signal composed of a plurality of signal units.
  • the signal unit indicates High (H) or Low (L) for each slot.
  • the transmitter emits light brightly in the H slot and emits light darkly or extinguishes in the L slot.
• Here, one slot is a period corresponding to a time of 1/9600 seconds; a sketch of this encoding follows.
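• The following is a minimal sketch of generating a 4PPM slot sequence. The patent text does not spell out the symbol-to-slot mapping, so the sketch assumes the common convention in which each 2-bit symbol selects the position of the single dark (L) slot within a 4-slot unit, keeping three slots bright; the variable-slot variant described next would instead vary the run of H slots before the single L slot.

```python
SLOT_S = 1 / 9600  # one slot = 1/9600 seconds

def ppm4_encode(bits):
    """bits: sequence of 0/1 with even length. Each 2-bit symbol selects the
    position of the single dark (L) slot in a 4-slot unit; the other three
    slots are bright (H), which keeps the average brightness high."""
    slots = []
    for i in range(0, len(bits), 2):
        position = bits[i] * 2 + bits[i + 1]  # 0..3
        unit = ["H"] * 4
        unit[position] = "L"
        slots.extend(unit)
    return slots

print(ppm4_encode([0, 1, 1, 0]))
# ['H', 'L', 'H', 'H', 'H', 'H', 'L', 'H'] -- each unit lasts 4 * SLOT_S
```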
  • the transmitter may generate a visible light signal in which the number of slots allocated to one signal unit is variable.
  • the signal unit includes a signal indicating H in one or more consecutive slots and a signal indicating L in one slot following the H signal. Since the number of slots of H is variable, the total number of slots in the signal unit is variable.
• For example, as shown in FIG. 39B, the transmitter generates a visible light signal containing, in this order, a signal unit of 3 slots, a signal unit of 4 slots, and a signal unit of 6 slots. In this case as well, the transmitter emits light brightly in the H slots and emits light dimly, or turns off, in the L slot.
  • the transmitter may allocate an arbitrary period (signal unit period) to one signal unit without allocating a plurality of slots to one signal unit.
  • the signal unit period includes an H period and an L period following the H period.
  • the period of H is adjusted according to the signal before modulation.
  • the period L may be fixed and may be a period corresponding to the slot.
• The H period and the L period are each, for example, 100 μs or longer.
• For example, as shown in FIG. 39C, the transmitter transmits a visible light signal including, in this order, a signal unit whose signal unit period is 210 μs, a signal unit whose signal unit period is 220 μs, and a signal unit whose signal unit period is 230 μs.
  • the transmitter emits light brightly during the H period and emits light darkly or extinguishes during the L period.
  • the transmitter may generate a signal indicating L and H alternately as a visible light signal.
  • the L period and the H period in the visible light signal are adjusted according to the signals before modulation.
• For example, as shown in FIG. 39D, the transmitter transmits a visible light signal that indicates H for a period of 100 μs, then L for a period of 120 μs, then H for a period of 110 μs, and then L for a period of 200 μs.
  • the transmitter emits light brightly during the H period and emits light darkly or extinguishes during the L period.
  • FIG. 40 is a diagram showing a configuration of a visible light signal in the third embodiment.
  • the visible light signal includes, for example, a signal 1, a brightness adjustment signal corresponding to the signal 1, a signal 2, and a brightness adjustment signal corresponding to the signal 2.
  • the transmitter When the transmitter generates the signal 1 and the signal 2 by modulating the signals before modulation, the transmitter generates a brightness adjustment signal for the signals and generates the above-described visible light signal.
  • the brightness adjustment signal corresponding to signal 1 is a signal that compensates for increase / decrease in brightness due to a luminance change according to signal 1.
  • the brightness adjustment signal corresponding to the signal 2 is a signal that compensates for increase / decrease in brightness due to a luminance change according to the signal 2.
• That is, brightness B1 is expressed by the luminance change according to signal 1 together with the brightness adjustment signal of signal 1, and brightness B2 is expressed by the luminance change according to signal 2 together with the brightness adjustment signal of signal 2.
• The transmitter in the present embodiment generates the brightness adjustment signals of signal 1 and signal 2 as part of the visible light signal so that brightness B1 and brightness B2 are equal. The brightness is thereby kept constant, and flicker can be suppressed; a rough sketch of this compensation follows.
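• One way to realize such compensation is sketched below: each signal is followed by an adjustment period whose bright fraction tops the combined sequence up to a fixed target, so that B1 and B2 come out equal. The fixed-length adjustment window and the target duty are assumptions for illustration, not values from the patent.

```python
def brightness_adjustment(signal_slots, target_duty=0.75, adjust_len=8):
    """signal_slots: list of 'H'/'L'. Returns adjustment slots such that the
    combined (signal + adjustment) sequence has a bright fraction of
    target_duty, so every signal ends up at the same overall brightness."""
    total = len(signal_slots) + adjust_len
    needed = round(target_duty * total) - signal_slots.count("H")
    needed = max(0, min(adjust_len, needed))  # clamp to what is achievable
    return ["H"] * needed + ["L"] * (adjust_len - needed)

signal_1 = ["H", "L", "H", "H", "L", "L", "H", "L"]  # 4 of 8 slots bright
signal_2 = ["H", "H", "H", "L", "H", "H", "L", "H"]  # 6 of 8 slots bright
for s in (signal_1, signal_2):
    adj = brightness_adjustment(s)
    combined = s + adj
    print(adj, combined.count("H") / len(combined))  # both end at 0.75
```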
• For example, when generating signal 1, the transmitter generates signal 1 consisting of data 1, a preamble (header) following that data 1, and data 1 again following the preamble.
• The preamble is a signal corresponding to the data 1 arranged before and after it; for example, this preamble serves as an identifier for reading out data 1.
• As described above, the playback method according to the present embodiment includes a signal reception step in which a sensor of the terminal device receives a visible light signal from a transmitter that transmits the visible light signal by a luminance change of a light source, a transmission step in which the terminal device transmits to a server a request signal requesting the content associated with the visible light signal, a content reception step in which the terminal device receives the content, and a playback step of playing the content.
• Thereby, the terminal device can appropriately play the data in the content at the correct time indicated by the content, without playing the data at a wrong time.
• Specifically, the receiver as the terminal device plays the content from the position of (receiver time − content playback start time).
• That is, the data corresponding to the time of the terminal device's clock is the data at the position of (receiver time − content playback start time) in the content.
  • the terminal device can play back the content appropriately synchronized with the transmitter-side content.
  • the content is sound or image.
  • the clock provided in the terminal device and the reference clock may be synchronized with each other by GPS (Global Positioning System) radio waves or NTP (Network Time Protocol) radio waves.
  • the visible light signal may indicate a time when the visible light signal is transmitted from the transmitter.
  • the terminal device can receive the content associated with the time (transmitter time) at which the visible light signal is transmitted from the transmitter. For example, if the transmitter time is 5:43, content played back at 5:43 can be received.
• Further, in the content reception step, if the time at which the process for synchronizing the clock of the terminal device with the reference clock by GPS radio waves or NTP radio waves was last performed is more than a predetermined time before the time at which the terminal device receives the visible light signal, the clock of the terminal device and the clock of the transmitter may be synchronized according to the time indicated by the visible light signal transmitted from the transmitter.
• When the process for synchronization with the reference clock was performed long ago, the synchronization may no longer be properly maintained, and the terminal device might not be able to play the content at a time synchronized with the transmitter-side content played by the transmitter. Therefore, in the playback method according to one aspect of the present invention, as shown in steps S1829 and S1830 of FIG. 30, when the predetermined time has elapsed, the clock of the terminal device (receiver) and the clock of the transmitter are synchronized, and the terminal device can play the content at a time synchronized with the transmitter-side content played by the transmitter.
  • the server has a plurality of contents each associated with a time, and in the contents receiving step, when the contents associated with the time indicated by the visible light signal does not exist in the server, Among the plurality of contents, content that is closest to the time indicated by the visible light signal and that is associated with a time after the time indicated by the visible light signal may be received.
• Further, the playback method according to the present embodiment includes a signal reception step in which a sensor of the terminal device receives a visible light signal from a transmitter that transmits the visible light signal by a luminance change of a light source, a transmission step of transmitting from the terminal device to a server a request signal requesting the content associated with the visible light signal, a content reception step in which the terminal device receives the content from the server, and a playback step of playing the content, wherein the visible light signal indicates ID information and the time at which the visible light signal is transmitted from the transmitter.
• In the content reception step, the content associated with the ID information and the time indicated by the visible light signal may be received.
• That is, of the plurality of contents associated with the ID information (transmitter ID), the content associated with the time (transmitter time) at which the visible light signal was transmitted from the transmitter is received and played. Therefore, content appropriate for the transmitter ID and the transmitter time can be played.
• Further, since the visible light signal includes second information indicating the hour and minute of the time and first information indicating the second of the time at which the visible light signal is transmitted from the transmitter, in the signal reception step, the second information may be received, and the first information may be received more times than the second information is received.
• In this way, a packet expressing the current time using all of hour, minute, and second need not be transmitted every second. That is, as shown in FIG. 26, if the hour and minute of the time at which a packet is transmitted have not changed from the hour and minute indicated in the previously transmitted packet, it suffices to transmit only the packet indicating the second (time packet 1), that is, the first information. Therefore, by transmitting fewer of the second information packets indicating the hour and minute (time packet 2) than of the first information packets indicating the second (time packet 1), the transmitter can suppress the transmission of packets with redundant content.
  • FIG. 41 is a diagram illustrating an example in which the receiver according to the present embodiment displays an AR image.
  • the receiver 200 is a receiver including the image sensor and the display 201 according to any one of the first to third embodiments, and is configured as a smartphone, for example.
  • Such a receiver 200 acquires the above-described captured display image Pa, which is the normal captured image, and the above-described decoding image, which is the visible light communication image or the bright line image, by capturing the subject with the image sensor.
  • the image sensor of the receiver 200 images the transmitter 100 configured as a station name sign.
  • the transmitter 100 is the transmitter according to any one of the first to third embodiments, and includes one or a plurality of light emitting elements (for example, LEDs).
  • the transmitter 100 changes in luminance by blinking one or more light emitting elements, and transmits an optical ID (light identification information) by the change in luminance.
  • This light ID is the above-mentioned visible light signal.
• The receiver 200 acquires the captured display image Pa, in which the transmitter 100 appears, by imaging the transmitter 100 with the normal exposure time, and acquires a decoding image by imaging the transmitter 100 with a communication exposure time shorter than the normal exposure time.
• The normal exposure time is the exposure time in the normal shooting mode described above, and the communication exposure time is the exposure time in the visible light communication mode described above.
  • the receiver 200 acquires the optical ID by decoding the decoding image. That is, the receiver 200 receives the optical ID from the transmitter 100. The receiver 200 transmits the optical ID to the server. Then, the receiver 200 acquires the AR image P1 corresponding to the optical ID and the recognition information from the server. The receiver 200 recognizes an area corresponding to the recognition information in the captured display image Pa as a target area. For example, the receiver 200 recognizes an area in which a station name sign that is the transmitter 100 is displayed as a target area. Then, the receiver 200 superimposes the AR image P1 on the target area, and displays the captured display image Pa on which the AR image P1 is superimposed on the display 201.
• For example, when "Kyoto Station" is written in Japanese as the station name on the station name sign that is the transmitter 100, the receiver 200 acquires an AR image P1 in which the station name is written in English, that is, "Kyoto Station".
• Since the AR image P1 is superimposed on the target area of the captured display image Pa, the captured display image Pa can be displayed as if a station name sign with the station name written in English actually existed.
• As a result, a user who understands English can easily understand the station name written on the station name sign that is the transmitter 100 by looking at the captured display image Pa, even if the user cannot read Japanese.
  • the recognition information may be an image to be recognized (for example, an image of the above-described station name sign), or may be a feature point and a feature amount of the image.
• The feature points and feature amounts are obtained by image processing such as SIFT (Scale-Invariant Feature Transform), SURF (Speeded-Up Robust Features), ORB (Oriented FAST and Rotated BRIEF), or AKAZE (Accelerated KAZE).
• Alternatively, the recognition information may be a white square image similar to the image to be recognized, and may further indicate the ratio of the square's height to its width (its aspect ratio).
• Alternatively, the recognition information may be random dots appearing in the image to be recognized.
  • the recognition information may indicate a direction based on a predetermined direction, such as the above-described white square or random dot.
  • the predetermined direction is, for example, the direction of gravity.
• The receiver 200 recognizes the area corresponding to such recognition information as the target area from the captured display image Pa. Specifically, if the recognition information is an image, the receiver 200 recognizes an area similar to that image as the target area. If the recognition information consists of feature points and feature amounts obtained by image processing, the receiver 200 performs feature point detection and feature amount extraction on the captured display image Pa, and recognizes as the target area an area whose feature points and feature amounts are similar to those of the recognition information. If the recognition information indicates a white square and its direction, the receiver 200 first detects the direction of gravity using an acceleration sensor provided in the receiver 200, and then recognizes as the target area an area similar to a white square oriented in the indicated direction in the captured display image Pa arranged with reference to the direction of gravity; a sketch of the feature-based recognition follows.
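• A sketch of recognizing the target area using ORB, one of the methods named above, via OpenCV. Localizing the target area with a RANSAC homography is a standard approach and an assumption here, not the patent's prescribed procedure.

```python
import cv2
import numpy as np

def find_target_region(reference_img, captured_img, min_matches=10):
    """Returns the four corners of the recognized target area in the
    captured display image, or None if recognition fails."""
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(reference_img, None)
    kp2, des2 = orb.detectAndCompute(captured_img, None)
    if des1 is None or des2 is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None
    h, w = reference_img.shape[:2]
    corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(corners, H)  # target area in Pa
```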
  • the recognition information may include reference information for specifying a reference area in the captured display image Pa and target information indicating a relative position of the target area with respect to the reference area.
  • the reference information is an image to be recognized, a feature point and a feature amount, a white square image, or a random dot as described above.
• When recognizing the target area, the receiver 200 first specifies the reference area from the captured display image Pa based on the reference information. Then, the receiver 200 recognizes, as the target area, the area at the relative position indicated by the target information with respect to the reference area.
  • the target information may indicate that the target area is in the same position as the reference area.
• Since the recognition information includes the reference information and the target information, the target area can be recognized over a wide range, and the server can freely set the place where the AR image is superimposed and instruct the receiver 200 accordingly.
• The reference information may also indicate that the reference area in the captured display image Pa is the area in which the display of the transmitter appears.
• For example, when the transmitter 100 is configured as a display such as a television, the target area can be recognized with reference to the area in which that display appears.
  • the receiver 200 in the present embodiment specifies the reference image and the image recognition method based on the light ID.
  • the image recognition method is a method for recognizing the target in the captured display image Pa, for example, geometric feature amount extraction, spectral feature amount extraction, texture feature amount extraction, or the like.
  • the reference image is data indicating a reference feature amount.
  • the feature amount is, for example, the feature amount of the white outer frame of the image, and specifically, may be data expressing the feature of the image as a vector.
  • the receiver 200 extracts a feature amount from the captured display image Pa in accordance with the image recognition method and compares it with the feature amount of the reference image, thereby finding the above-described reference region or target region in the captured display image Pa.
  • the image recognition method may include, for example, a location use method, a marker use method, and a markerless method.
  • the location utilization method is a method utilizing GPS position information (that is, the position of the receiver 200), and the target area is recognized from the captured display image Pa based on the position information.
  • the marker utilization method is a method of using a marker composed of white and black graphics such as a two-dimensional barcode as a target specifying mark. That is, in this marker usage method, the target region is recognized based on the marker displayed in the captured display image Pa.
  • the markerless method is a method in which feature points and feature amounts are extracted from the captured display image Pa by image analysis, and the position and region of the target are specified based on the extracted feature points and feature amounts. That is, when the image recognition method is a markerless method, it is the above-described geometric feature amount extraction, spectral feature amount extraction, texture feature amount extraction, or the like.
  • such a receiver 200 may specify the reference image and the image recognition method by receiving an optical ID from the transmitter 100 and acquiring from the server the reference image and the image recognition method associated with that optical ID (hereinafter referred to as the received optical ID). That is, the server stores a plurality of sets each including a reference image and an image recognition method, and each of the plurality of sets is associated with a different light ID. Thus, the one set associated with the received light ID can be identified from among the plurality of sets stored in the server, and the speed of the image processing for superimposing the AR image can be improved.
  • the receiver 200 may acquire the reference image associated with the received light ID by inquiring of the server, or may select the reference image associated with the received light ID from among a plurality of reference images held in the receiver 200 in advance.
  • the server may hold the relative position information associated with each light ID together with the reference image, the image recognition method, and the AR image for each light ID.
  • the relative position information is information indicating the relative positional relationship between the reference area and the target area, for example.
  • the receiver 200 may recognize the above-described reference area itself as the target area and superimpose an AR image on the reference area. That is, instead of acquiring the relative position information, the receiver 200 may store in advance a program for displaying an AR image based on the reference image, and display the AR image, for example, in a white frame serving as the reference region. In this case, the relative position information is unnecessary.
  • (1) the server holds a plurality of sets each including a reference image, relative position information, an AR image, and an image recognition method, and the receiver 200 acquires from the server the one set associated with the received light ID.
  • (2) the server holds a plurality of sets each including a reference image and an AR image; the receiver 200 acquires the one set associated with the received light ID and uses predetermined relative position information and a predetermined image recognition method.
  • in the case of (2), the receiver 200 may hold a plurality of sets each including relative position information and an image recognition method in advance, and select the one set associated with the received light ID from among them. Alternatively, the receiver 200 may inquire of the server by transmitting the received light ID, acquire from the server the relative position information corresponding to the received light ID and information specifying the image recognition method, and select one set based on the acquired information from among the plurality of held sets. The receiver 200 may also select the one set associated with the received light ID from the plurality of sets stored in advance without inquiring of the server.
  • (3) the receiver 200 holds a plurality of sets each including a reference image, relative position information, an AR image, and an image recognition method, and selects one set from among these sets. As in (2) above, the receiver 200 may select the one set by making an inquiry to the server, or may select the one set associated with the received optical ID without such an inquiry.
  • (4) the receiver 200 holds a plurality of sets each including a reference image and an AR image, selects the one set associated with the received light ID, and uses a predetermined image recognition method and predetermined relative position information.
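  • a minimal sketch of configuration (1), assuming the server simply keys each set by its light ID (all names and values below are illustrative), is:

```python
# Sketch: selecting the set (reference image, relative position information,
# AR image, image recognition method) associated with a received light ID by
# a single dictionary lookup, instead of searching over recognition images.
from dataclasses import dataclass

@dataclass
class RecognitionSet:
    reference_image: bytes    # or its feature points and feature amounts
    relative_position: tuple  # offset of the target area from the reference
    ar_image: bytes
    recognition_method: str   # "location", "marker", or "markerless"

SETS_BY_LIGHT_ID = {
    0x1234: RecognitionSet(b"...", (0, 0), b"...", "markerless"),
    0x5678: RecognitionSet(b"...", (0, -120), b"...", "marker"),
}

def select_set(received_light_id: int) -> RecognitionSet:
    # O(1) selection by light ID, associated in advance on the server
    return SETS_BY_LIGHT_ID[received_light_id]
```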
  • FIG. 42 is a diagram showing an example of the display system in the present embodiment.
  • the display system in the present embodiment includes, for example, the transmitter 100, which is the above-described station name sign, the receiver 200, and the server 300.
  • the receiver 200 first receives an optical ID from the transmitter 100 in order to display the captured display image on which the AR image is superimposed as described above. Next, the receiver 200 transmits the optical ID to the server 300.
  • the server 300 holds, for each light ID, an AR image and recognition information associated with that light ID. Therefore, when the server 300 receives the optical ID from the receiver 200, it selects the AR image and the recognition information associated with the received optical ID and transmits them to the receiver 200. Thereby, the receiver 200 receives the AR image and the recognition information transmitted from the server 300, and displays the captured display image on which the AR image is superimposed.
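  • the exchange of FIG. 42 could be sketched as follows; the transport and the endpoint name are assumptions made purely for illustration:

```python
# Sketch: the receiver sends the received light ID to the server and obtains
# the AR image and recognition information associated with that ID.
import json
import urllib.request

def fetch_ar_and_recognition(server_url: str, light_id: int) -> dict:
    req = urllib.request.Request(f"{server_url}/ar?light_id={light_id}")
    with urllib.request.urlopen(req) as resp:
        # e.g. {"ar_image": ..., "recognition": ...}
        return json.loads(resp.read())
```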
  • FIG. 43 is a diagram showing another example of the display system in the present embodiment.
  • the display system according to the present embodiment includes, for example, the transmitter 100 that is the above-described station name sign, the receiver 200, the first server 301, and the second server 302.
  • the receiver 200 first receives an optical ID from the transmitter 100 in order to display the captured display image on which the AR image is superimposed as described above. Next, the receiver 200 transmits the optical ID to the first server 301.
  • when receiving the optical ID from the receiver 200, the first server 301 notifies the receiver 200 of a URL (Uniform Resource Locator) and a Key associated with the received optical ID. Receiving this notification, the receiver 200 accesses the second server 302 based on the URL and passes the Key to the second server 302.
  • the second server 302 holds, for each key, an AR image and recognition information associated with that key. Therefore, when receiving a Key from the receiver 200, the second server 302 selects the AR image and the recognition information associated with that Key and transmits them to the receiver 200. Thereby, the receiver 200 receives the AR image and the recognition information transmitted from the second server 302, and displays the captured display image on which the AR image is superimposed.
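  • the two-step resolution of FIG. 43 could be sketched as below, reusing the assumptions of the previous sketch (hypothetical endpoints, JSON payloads):

```python
# Sketch: the first server resolves the light ID to a URL and a Key; the
# receiver then presents the Key to the second server at that URL to obtain
# the AR image and recognition information.
import json
import urllib.request

def fetch_via_key(first_server: str, light_id: int) -> dict:
    with urllib.request.urlopen(
            f"{first_server}/resolve?light_id={light_id}") as r:
        hint = json.loads(r.read())  # e.g. {"url": ..., "key": ...}
    with urllib.request.urlopen(f"{hint['url']}/ar?key={hint['key']}") as r:
        return json.loads(r.read())  # {"ar_image": ..., "recognition": ...}
```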
  • FIG. 44 is a diagram showing another example of the display system in the present embodiment.
  • the display system according to the present embodiment includes, for example, the transmitter 100 that is the above-described station name sign, the receiver 200, the first server 301, and the second server 302.
  • the receiver 200 first receives an optical ID from the transmitter 100 in order to display the captured display image on which the AR image is superimposed as described above. Next, the receiver 200 transmits the optical ID to the first server 301.
  • when receiving the optical ID from the receiver 200, the first server 301 notifies the second server 302 of the Key associated with the received optical ID.
  • the second server 302 holds, for each key, an AR image and recognition information associated with that key. Therefore, when the second server 302 receives the Key from the first server 301, it selects the AR image and the recognition information associated with that Key and transmits them to the first server 301. When receiving the AR image and the recognition information from the second server 302, the first server 301 transmits them to the receiver 200. Thereby, the receiver 200 receives the AR image and the recognition information transmitted from the first server 301, and displays the captured display image on which the AR image is superimposed.
  • in the example of FIG. 44, the second server 302 transmits the AR image and the recognition information to the first server 301; however, the second server 302 may instead transmit the AR image and the recognition information directly to the receiver 200 without going through the first server 301.
  • FIG. 45 is a flowchart showing an example of the processing operation of the receiver 200 in the present embodiment.
  • the receiver 200 starts imaging with the above-described normal exposure time and communication exposure time (step S101). Then, the receiver 200 acquires an optical ID by decoding the decoding image obtained by imaging with the communication exposure time (step S102). Next, the receiver 200 transmits the optical ID to the server (step S103).
  • the receiver 200 acquires the AR image corresponding to the transmitted optical ID and the recognition information from the server (step S104). Next, the receiver 200 recognizes, as a target area, an area corresponding to the recognition information in the captured display image obtained by imaging with the normal exposure time (step S105). Then, the receiver 200 superimposes the AR image on the target area, and displays the captured display image on which the AR image is superimposed (step S106).
  • the receiver 200 determines whether or not the imaging and the display of the captured display image should be terminated (step S107).
  • if the receiver 200 determines that they should not be terminated (N in step S107), it further determines whether or not the acceleration of the receiver 200 is equal to or greater than a threshold value (step S108). This acceleration is measured by an acceleration sensor provided in the receiver 200.
  • when the receiver 200 determines that the acceleration is less than the threshold (N in step S108), the receiver 200 executes the processing from step S105. Thereby, even when the captured display image displayed on the display 201 of the receiver 200 shifts, the AR image can follow the target area of the captured display image.
  • when the receiver 200 determines that the acceleration is equal to or greater than the threshold (Y in step S108), the receiver 200 executes the processing from step S102. Thereby, when the transmitter 100 is no longer displayed in the captured display image, it is possible to suppress erroneously recognizing, as the target area, an area in which a subject different from the transmitter 100 appears.
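  • the loop of FIG. 45 could be sketched as follows; the camera, server, display, and sensor objects, the helper functions, and the threshold value are placeholders standing in for the receiver's actual components:

```python
# Sketch of the FIG. 45 processing loop (steps S101-S108).
def receiver_loop(camera, server, display, accel_sensor,
                  recognize_target_area, overlay, threshold=2.0):
    camera.start(normal_exposure=True, communication_exposure=True)  # S101
    while True:
        light_id = camera.decode_decoding_image()                    # S102
        ar_image, recognition = server.query(light_id)               # S103/S104
        while True:
            frame = camera.captured_display_image()
            target = recognize_target_area(frame, recognition)       # S105
            display.show(overlay(frame, ar_image, target))           # S106
            if display.should_stop():                                # S107 (Y)
                return
            if accel_sensor.magnitude() >= threshold:                # S108 (Y)
                break   # re-acquire the light ID from step S102
            # S108 (N): keep following the target area from step S105
```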
  • since the AR image is thus displayed superimposed on the captured display image, an image useful for the user can be displayed. Furthermore, the AR image can be superimposed on an appropriate target area while the processing load is suppressed.
  • in general augmented reality (that is, AR), a huge number of recognition target images stored in advance are compared with the captured display image, and it is determined whether or not any of those recognition target images is included in the captured display image. If it is determined that a recognition target image is included, the AR image corresponding to that recognition target image is superimposed on the captured display image, aligned with reference to the recognition target image.
  • in such general AR, a huge number of recognition target images must be compared with the captured display image, and the position of the recognition target image within the captured display image must also be detected for alignment; there is therefore a problem of a large amount of calculation and a high processing load.
  • in the present embodiment, on the other hand, the light ID is acquired by decoding the decoding image obtained by imaging the subject. That is, the optical ID transmitted from the transmitter that is the subject is received, and the AR image corresponding to this optical ID is acquired from the server. Therefore, the server does not need to compare a huge number of recognition target images with the captured display image, and can simply select and transmit the AR image associated in advance with the optical ID. Thereby, the amount of calculation can be reduced, the processing load can be significantly suppressed, and the AR image display process can be speeded up.
  • recognition information corresponding to this optical ID is acquired from the server.
  • the recognition information is information for recognizing a target area that is an area in which an AR image is superimposed in a captured display image.
  • This recognition information may be information indicating that a white square is the target area, for example.
  • in this case, the target area can be easily recognized, and the processing load can be further suppressed. That is, the processing load can be suppressed according to the content of the recognition information.
  • since the server can arbitrarily set the content of the recognition information according to the optical ID, the balance between the processing load and the recognition accuracy can be appropriately maintained.
  • in the above example, the receiver 200 acquires the AR image and the recognition information corresponding to the optical ID from the server; however, at least one of the AR image and the recognition information may be acquired in advance. That is, the receiver 200 acquires from the server, in one batch, a plurality of AR images and a plurality of pieces of recognition information corresponding to a plurality of optical IDs that may be received, and stores them. Thereafter, when receiving an optical ID, the receiver 200 selects the AR image and the recognition information corresponding to that optical ID from among those stored in the receiver 200. Thereby, the display processing of the AR image can be further accelerated.
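  • this prefetching could be sketched as below, reusing the hypothetical fetch_ar_and_recognition helper from the FIG. 42 sketch:

```python
# Sketch: prefetching AR images and recognition information for light IDs
# that may be received, so that receiving a light ID later needs no server
# round-trip at display time.
class ArCache:
    def __init__(self, server_url: str, fetch):
        self.server_url = server_url
        self.fetch = fetch            # e.g. fetch_ar_and_recognition
        self.cache: dict = {}

    def prefetch(self, candidate_light_ids) -> None:
        for lid in candidate_light_ids:
            self.cache[lid] = self.fetch(self.server_url, lid)

    def lookup(self, received_light_id: int):
        # local selection of the AR image and recognition information
        return self.cache.get(received_light_id)
```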
  • FIG. 46 is a diagram illustrating another example in which the receiver 200 according to the present embodiment displays an AR image.
  • the transmitter 100 is configured as a lighting device, and transmits a light ID by changing the luminance while illuminating the facility guide plate 101. Since the guide plate 101 is illuminated by the light from the transmitter 100, the luminance changes in the same manner as the transmitter 100, and the light ID is transmitted.
  • the receiver 200 acquires the captured display image Pb and the decoding image in the same manner as described above by imaging the guide plate 101 illuminated by the transmitter 100.
  • the receiver 200 acquires the optical ID by decoding the decoding image. That is, the receiver 200 receives the optical ID from the guide plate 101.
  • the receiver 200 transmits the optical ID to the server.
  • the receiver 200 acquires the AR image P2 and the recognition information corresponding to the optical ID from the server.
  • the receiver 200 recognizes an area corresponding to the recognition information in the captured display image Pb as a target area. For example, the receiver 200 recognizes the area where the frame 102 on the guide plate 101 is projected as the target area. This frame 102 is a frame for indicating the waiting time of the facility.
  • the receiver 200 superimposes the AR image P2 on the target area, and displays the captured display image Pb on which the AR image P2 is superimposed on the display 201.
  • the AR image P2 is an image including the character string “30 minutes”.
  • since the AR image P2 is superimposed on the target area of the captured display image Pb, the receiver 200 can display the captured display image Pb as if the waiting time “30 minutes” were actually written on the guide plate 101. Thereby, the waiting time can be conveyed to the user of the receiver 200 simply and clearly without providing a special display device on the guide plate 101.
  • FIG. 47 is a diagram illustrating another example in which the receiver 200 according to the present embodiment displays an AR image.
  • the transmitter 100 includes two illumination devices as shown in FIG. 47, for example.
  • the transmitter 100 transmits the light ID by changing the luminance while illuminating the facility guide plate 104. Since the guide plate 104 is illuminated by the light from the transmitter 100, the luminance changes in the same manner as the transmitter 100, and the light ID is transmitted.
  • the guide plate 104 indicates names of a plurality of facilities such as “ABC land” and “adventure land”.
  • the receiver 200 acquires the captured display image Pc and the decoding image in the same manner as described above by imaging the guide plate 104 illuminated by the transmitter 100.
  • the receiver 200 acquires the optical ID by decoding the decoding image. That is, the receiver 200 receives the optical ID from the guide plate 104.
  • the receiver 200 transmits the optical ID to the server.
  • the receiver 200 acquires the AR image P3 and the recognition information corresponding to the optical ID from the server.
  • the receiver 200 recognizes an area corresponding to the recognition information in the captured display image Pc as a target area. For example, the receiver 200 recognizes an area where the guide plate 104 is projected as a target area.
  • the receiver 200 superimposes the AR image P3 on the target area, and displays the captured display image Pc on which the AR image P3 is superimposed on the display 201.
  • the AR image P3 is an image indicating names of a plurality of facilities.
  • the longer the waiting time of a facility, the smaller its name is displayed; conversely, the shorter the waiting time, the larger its name is displayed.
  • since the AR image P3 is superimposed on the target area of the captured display image Pc, the receiver 200 can display the captured display image Pc as if a guide plate 104 actually existed on which each facility name is written at a size corresponding to its waiting time. Thereby, the waiting time of each facility can be conveyed to the user of the receiver 200 simply and clearly without providing a special display device on the guide plate 104.
  • FIG. 48 is a diagram illustrating another example in which the receiver 200 according to the present embodiment displays an AR image.
  • the transmitter 100 includes two illumination devices as shown in FIG. 48, for example.
  • the transmitter 100 transmits the light ID by changing the luminance while illuminating the castle wall 105. Since the castle wall 105 is illuminated by the light from the transmitter 100, the luminance is changed in the same manner as the transmitter 100, and the light ID is transmitted. Further, on the castle wall 105, for example, a small mark imitating the character's face is engraved as a hidden character 106.
  • the receiver 200 acquires the captured display image Pd and the decoding image in the same manner as described above by capturing an image of the castle wall 105 illuminated by the transmitter 100.
  • the receiver 200 acquires the optical ID by decoding the decoding image. That is, the receiver 200 receives the optical ID from the castle wall 105.
  • the receiver 200 transmits the optical ID to the server.
  • the receiver 200 acquires the AR image P4 corresponding to the optical ID and the recognition information from the server.
  • the receiver 200 recognizes an area corresponding to the recognition information in the captured display image Pd as a target area. For example, the receiver 200 recognizes, as the target area, an area in which a range including the hidden character 106 in the castle wall 105 is projected.
  • the receiver 200 superimposes the AR image P4 on the target area, and displays the captured display image Pd on which the AR image P4 is superimposed on the display 201.
  • the AR image P4 is an image imitating a character's face.
  • the AR image P4 is an image that is sufficiently larger than the hidden character 106 displayed in the captured display image Pd.
  • since the AR image P4 is superimposed on the target area of the captured display image Pd, the receiver 200 can display the captured display image Pd as if a large mark imitating the character's face were actually engraved on the castle wall 105. Thereby, the position of the hidden character 106 can be conveyed to the user of the receiver 200 in an easy-to-understand manner.
  • FIG. 49 is a diagram illustrating another example in which the receiver 200 according to the present embodiment displays an AR image.
  • the transmitter 100 includes two illumination devices as shown in FIG. 49, for example.
  • the transmitter 100 transmits the light ID by changing the luminance while illuminating the facility guide plate 107. Since the guide plate 107 is illuminated by the light from the transmitter 100, the luminance changes in the same manner as the transmitter 100, and the light ID is transmitted.
  • an infrared shielding paint 108 is applied to a plurality of corners of the guide plate 107.
  • the receiver 200 acquires the captured display image Pe and the decoding image in the same manner as described above by imaging the guide plate 107 illuminated by the transmitter 100.
  • the receiver 200 acquires the optical ID by decoding the decoding image. That is, the receiver 200 receives the optical ID from the guide plate 107.
  • the receiver 200 transmits the optical ID to the server.
  • the receiver 200 acquires the AR image P5 and the recognition information corresponding to the optical ID from the server.
  • the receiver 200 recognizes an area corresponding to the recognition information in the captured display image Pe as a target area. For example, the receiver 200 recognizes an area where the guide plate 107 is projected as a target area.
  • the recognition information indicates that a rectangle circumscribing the plurality of infrared shielding paints 108 is the target region.
  • the infrared shielding paint 108 blocks infrared rays included in the light emitted from the transmitter 100, so the image sensor of the receiver 200 captures the infrared shielding paint 108 as an image darker than its surroundings. The receiver 200 therefore recognizes, as the target region, the rectangle circumscribing the plurality of infrared shielding paints 108 that appear as dark images.
  • the receiver 200 superimposes the AR image P5 on the target area, and displays the captured display image Pe on which the AR image P5 is superimposed on the display 201.
  • the AR image P5 shows a schedule of events to be performed in the facility of the guide board 107.
  • since the AR image P5 is superimposed on the target area, the receiver 200 can display the captured display image Pe as if the event schedule were actually written on the guide plate 107. Thereby, the schedule of the facility's events can be conveyed to the user of the receiver 200 in an easy-to-understand manner without providing a special display device on the guide plate 107.
  • an infrared reflecting paint may be applied to the guide plate 107 instead of the infrared shielding paint 108.
  • the infrared reflecting paint reflects infrared rays included in the light emitted from the transmitter 100, so the image sensor of the receiver 200 captures the infrared reflecting paint as an image brighter than its surroundings. In this case, the receiver 200 recognizes, as the target region, the rectangle circumscribing the plurality of infrared reflecting paints that appear as bright images.
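  • finding such a circumscribing rectangle could be sketched as follows, again assuming OpenCV; the threshold values are illustrative:

```python
# Sketch: recognizing the rectangle circumscribing the paint marks, which the
# image sensor captures as spots darker (shielding paint) or brighter
# (reflecting paint) than their surroundings.
import cv2
import numpy as np

def circumscribing_rectangle(gray: np.ndarray, dark_marks: bool = True):
    if dark_marks:   # infrared shielding paint
        _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    else:            # infrared reflecting paint
        _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    points = cv2.findNonZero(mask)
    if points is None:
        return None
    # bounding box of all marks = rectangle recognized as the target region
    return cv2.boundingRect(points)  # (x, y, w, h)
```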
  • FIG. 50 is a diagram illustrating another example in which the receiver 200 according to the present embodiment displays an AR image.
  • the transmitter 100 is configured as a station name sign and is disposed near the station exit guide plate 110.
  • the station exit guide plate 110 includes a light source and emits light, but, unlike the transmitter 100, does not transmit an optical ID.
  • in the decoding image Pdec, a bright line pattern region Pdec1 corresponding to the transmitter 100 and a bright region Pdec2 corresponding to the station exit guide plate 110 appear.
  • the bright line pattern region Pdec1 is a region composed of the bright line pattern that appears when the plurality of exposure lines of the image sensor of the receiver 200 are exposed with the communication exposure time.
  • the recognition information includes reference information for specifying the reference region Pbas in the captured display image Ppre and target information indicating the relative position of the target region Ptar with respect to the reference region Pbas.
  • the reference information indicates that the position of the reference area Pbas in the captured display image Ppre is the same as the position of the bright line pattern area Pdec1 in the decoding image Pdec.
  • the target information indicates that the position of the target area is the position of the reference area.
  • the receiver 200 specifies the reference region Pbas from the captured display image Ppre based on the reference information. That is, the receiver 200 specifies, in the captured display image Ppre, an area that is at the same position as the bright line pattern area Pdec1 in the decoding image Pdec as the reference area Pbas. Further, the receiver 200 recognizes, as the target region Ptar, a region in the relative position indicated by the target information with reference to the position of the reference region Pbas in the captured display image Ppre. In the above example, since the target information indicates that the position of the target area Ptar is the position of the reference area Pbas, the receiver 200 recognizes the reference area Pbas in the captured display image Ppre as the target area Ptar.
  • the receiver 200 superimposes the AR image P1 on the target area Ptar in the captured display image Ppre.
  • the bright line pattern region Pdec1 is used to recognize the target region Ptar.
  • if the region in which the transmitter 100 appears were to be recognized as the target region Ptar from the captured display image Ppre alone, without using the bright line pattern region Pdec1, erroneous recognition could occur.
  • that is, in the captured display image Ppre, not the area in which the transmitter 100 appears but the area in which the station exit guide plate 110 appears might be erroneously recognized as the target area Ptar, because the image of the transmitter 100 and the image of the station exit guide plate 110 in the captured display image Ppre are similar.
  • by using the bright line pattern region Pdec1, it is possible to accurately recognize the target region Ptar while suppressing the occurrence of such erroneous recognition.
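  • a minimal sketch of this mapping, with regions expressed as (x, y, w, h) tuples in image coordinates (names and values are illustrative), is:

```python
# Sketch: the bright line pattern region Pdec1 found in the decoding image
# gives the reference region Pbas at the same position in the captured
# display image Ppre; the target region Ptar is then the region at the
# relative position given by the target information.
def recognize_target_from_pattern(pdec1_region, target_offset=(0, 0)):
    x, y, w, h = pdec1_region      # reference region Pbas = same position
    dx, dy = target_offset         # relative position from the target info
    return (x + dx, y + dy, w, h)  # target region Ptar

# Target information saying Ptar coincides with Pbas (FIG. 50):
ptar = recognize_target_from_pattern((40, 200, 320, 60))
# Target information saying Ptar is directly above Pbas (FIG. 51):
ptar_above = recognize_target_from_pattern((40, 200, 320, 60),
                                           target_offset=(0, -60))
```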
  • FIG. 51 is a diagram illustrating another example in which the receiver 200 according to the present embodiment displays an AR image.
  • the transmitter 100 transmits a light ID by changing the luminance of the entire station name sign, and the target information indicates that the position of the target area is the position of the reference area.
  • the transmitter 100 transmits the light ID by changing the luminance of the light emitting elements arranged in a part of the outer frame of the station name sign without changing the brightness of the entire station name sign.
  • the target information only needs to indicate the relative position of the target area Ptar with respect to the reference area Pbas; for example, it may indicate that the target area Ptar is above the reference area Pbas (specifically, vertically above it).
  • the transmitter 100 transmits the light ID by changing the luminance of a plurality of light emitting elements arranged along the horizontal direction below the outer frame of the station name sign.
  • the target information indicates that the position of the target area Ptar is above the reference area Pbas.
  • the receiver 200 specifies the reference region Pbas from the captured display image Ppre based on the reference information. That is, the receiver 200 specifies, in the captured display image Ppre, an area that is at the same position as the bright line pattern area Pdec1 in the decoding image Pdec as the reference area Pbas. Specifically, the receiver 200 specifies a rectangular reference region Pbas that is long in the horizontal direction and short in the vertical direction. Further, the receiver 200 recognizes, as the target region Ptar, a region in the relative position indicated by the target information with reference to the position of the reference region Pbas in the captured display image Ppre. That is, the receiver 200 recognizes an area above the reference area Pbas in the captured display image Ppre as the target area Ptar. At this time, the receiver 200 specifies the direction above the reference region Pbas based on the direction of gravity measured by the acceleration sensor provided in the receiver 200.
  • the target information may indicate not only the relative position of the target area Ptar but also the size, shape, and aspect ratio of the target area Ptar.
  • the receiver 200 recognizes the target area Ptar having the size, shape, and aspect ratio indicated by the target information.
  • the receiver 200 may determine the size of the target area Ptar based on the size of the reference area Pbas.
  • FIG. 52 is a flowchart showing another example of the processing operation of the receiver 200 in the present embodiment.
  • first, the receiver 200 executes the processing of steps S101 to S104, as in the example shown in FIG. 45.
  • the receiver 200 identifies the bright line pattern region Pdec1 from the decoding image Pdec (step S111).
  • the receiver 200 specifies a reference area Pbas corresponding to the bright line pattern area Pdec1 from the captured display image Ppre (step S112).
  • the receiver 200 recognizes the target area Ptar from the captured display image Ppre based on the recognition information (specifically, target information) and the reference area Pbas (step S113).
  • next, the receiver 200 superimposes the AR image on the target area Ptar of the captured display image Ppre, and displays the captured display image Ppre on which the AR image is superimposed (step S106). Then, the receiver 200 determines whether or not the imaging and the display of the captured display image Ppre should be terminated (step S107). If the receiver 200 determines that they should not be terminated (N in step S107), it further determines whether or not the acceleration of the receiver 200 is equal to or greater than a threshold value (step S114). This acceleration is measured by an acceleration sensor provided in the receiver 200. When the receiver 200 determines that the acceleration is less than the threshold (N in step S114), the receiver 200 executes the processing from step S113.
  • when the receiver 200 determines that the acceleration is equal to or greater than the threshold (Y in step S114), the receiver 200 executes the processing from step S111 or step S102. Thereby, it is possible to suppress erroneously recognizing, as the target area Ptar, an area in which a subject different from the transmitter 100 appears in the captured display image Ppre.
  • FIG. 53 is a diagram illustrating another example in which the receiver 200 according to the present embodiment displays an AR image.
  • when the AR image P1 in the displayed captured display image Ppre is tapped, the receiver 200 enlarges and displays the AR image P1. Alternatively, when tapped, the receiver 200 may display, instead of the AR image P1, a new AR image showing more detailed content than the AR image P1. When the AR image P1 shows one page of an information magazine consisting of a plurality of pages, the receiver 200 may display, instead of the AR image P1, a new AR image showing the next page. Alternatively, when tapped, the receiver 200 may display, instead of the AR image P1, a moving image related to the AR image P1 as a new AR image. At this time, the receiver 200 may display, as the AR image, a moving image in which an object (autumn leaves in the example of FIG. 53) comes out of the target area Ptar.
  • FIG. 54 is a diagram showing a captured display image Ppre and a decoding image Pdec acquired by imaging of the receiver 200 in the present embodiment.
  • the receiver 200 acquires captured images such as the captured display image Ppre and the decoding image Pdec at a frame rate of 30 fps, as shown in (a1) of FIG. 54. Specifically, the receiver 200 acquires the captured display image Ppre “A” at time t1, the decoding image Pdec at time t2, and the captured display image Ppre “B” at time t3, obtaining the captured display image Ppre and the decoding image Pdec alternately.
  • the receiver 200 displays only the captured display image Ppre among the captured images, and does not display the decoding image Pdec. That is, as shown in (a2) of FIG. 54, when acquiring the decoding image Pdec, the receiver 200 displays the captured display image Ppre acquired immediately before, instead of the decoding image Pdec. Specifically, the receiver 200 displays the captured display image Ppre “A” acquired at time t1, and at time t2 again displays that captured display image Ppre “A”. Thereby, the receiver 200 displays the captured display image Ppre at a frame rate of 15 fps.
  • in the example described above, the receiver 200 alternately acquires the captured display image Ppre and the decoding image Pdec; however, the acquisition form of these images in the present embodiment is not restricted to this. That is, the receiver 200 may repeat continuously acquiring N (N is an integer of 1 or more) decoding images Pdec and then continuously acquiring M (M is an integer of 1 or more) captured display images Ppre.
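  • this acquisition and display pattern could be sketched as follows (the labels are illustrative; with N = M = 1 the display updates at half the 30 fps capture rate, that is, 15 fps):

```python
# Sketch: alternating acquisition of display frames and decoding frames,
# re-displaying the most recent display frame whenever a decoding frame is
# captured, generalized to N decoding frames followed by M display frames.
def frame_schedule(n_decode: int = 1, m_display: int = 1, total: int = 8):
    last_display = None
    for i in range(total):
        if i % (n_decode + m_display) < n_decode:
            captured = "decoding image Pdec"   # used only for the light ID
            shown = last_display               # Pdec itself is never shown
        else:
            captured = f"captured display image Ppre #{i}"
            last_display = shown = captured
        print(f"t{i}: captured = {captured}, shown = {shown}")

frame_schedule()
```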
  • the receiver 200 needs to switch between acquiring the captured display image Ppre and acquiring the decoding image Pdec, and this switching may take time. Therefore, as illustrated in (b1) of FIG. 54, the receiver 200 may provide a switching period when switching between the two. Specifically, when the receiver 200 acquires the decoding image Pdec at time t3, it executes the processing for switching the captured image during the switching period from time t3 to t5, and acquires the captured display image Ppre “A” at time t5. Thereafter, the receiver 200 executes the processing for switching the captured image during the switching period from time t5 to time t7, and acquires the decoding image Pdec at time t7.
  • the receiver 200 displays the captured display image Ppre acquired immediately before in the switching period, as shown in (b2) of FIG. Therefore, in this case, the display frame rate of the captured display image Ppre in the receiver 200 is low, for example, 3 fps.
  • in this case, the displayed captured display image Ppre may not move in accordance with the movement of the receiver 200; that is, the captured display image Ppre is not displayed as a live view. Therefore, the receiver 200 may move the captured display image Ppre in accordance with its own movement.
  • FIG. 55 is a diagram showing an example of the captured display image Ppre displayed on the receiver 200 in the present embodiment.
  • the receiver 200 displays, on the display 201, the captured display image Ppre obtained by imaging, for example, as illustrated in (a) of FIG. 55.
  • the user moves the receiver 200 to the left side.
  • the receiver 200 moves the displayed captured display image Ppre to the right as shown in FIG. 55 (b). That is, the receiver 200 includes an acceleration sensor, and moves the displayed captured display image Ppre to match the movement of the receiver 200 according to the acceleration measured by the acceleration sensor. Thereby, the receiver 200 can display the captured display image Ppre as a live view in a pseudo manner.
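  • such a pseudo live view could be sketched as follows; the displacement is assumed to have been estimated from the acceleration sensor, and the pixel offset is an illustrative calibration:

```python
# Sketch: shifting the displayed captured display image Ppre opposite to the
# receiver's movement, so the display appears to track the scene even while
# only decoding images are being captured (FIG. 55 (b)).
import numpy as np

def shift_displayed_image(img: np.ndarray, offset_px) -> np.ndarray:
    dx, dy = offset_px  # receiver movement, in pixels, from the sensor
    # moving the receiver to the left shifts the displayed image to the right
    return np.roll(np.roll(img, -dy, axis=0), -dx, axis=1)
```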
  • FIG. 56 is a flowchart showing another example of the processing operation of the receiver 200 in the present embodiment.
  • first, the receiver 200 superimposes the AR image on the target area Ptar of the captured display image Ppre so that the AR image follows the target area Ptar (step S121); that is, an AR image that moves together with the target area Ptar in the captured display image Ppre is displayed. Then, the receiver 200 determines whether or not to maintain the display of that AR image (step S122). If the receiver 200 determines that the display of the AR image is not to be maintained (N in step S122), then when it acquires a new light ID by imaging, it displays the new AR image corresponding to that light ID superimposed on the captured display image Ppre (step S123).
  • if the receiver 200 determines that the display of the AR image is to be maintained (Y in step S122), it repeatedly executes the processing from step S121. At this time, even if the receiver 200 acquires another AR image, it does not display that other AR image. Alternatively, even when the receiver 200 has acquired a new decoding image Pdec, it does not decode the decoding image Pdec to acquire an optical ID; in this case, the power consumption for decoding can be suppressed.
  • by maintaining the display of the AR image in this way, it is possible to prevent the displayed AR image from being erased, or from becoming difficult to see, due to the display of another AR image. That is, the displayed AR image can be kept easy for the user to see.
  • for example, the receiver 200 determines to maintain the display of the AR image until a predetermined period (a certain period) elapses after the AR image is displayed. That is, when displaying the captured display image Ppre, the receiver 200 displays the first AR image, which is the AR image superimposed in step S121, only during the predetermined display period while suppressing the display of a second AR image different from the first AR image. The receiver 200 may prohibit the decoding of a newly acquired decoding image Pdec during this display period.
  • the receiver 200 may include a face camera, and may determine to maintain the display of the AR image when detecting, based on the imaging result of the face camera, that the user's face is approaching. That is, when displaying the captured display image Ppre, the receiver 200 further determines whether or not the user's face is approaching the receiver 200 by imaging with the face camera provided in the receiver 200. When the receiver 200 determines that the face is approaching, it displays the first AR image while suppressing the display of the second AR image different from the first AR image superimposed in step S121.
  • the receiver 200 may include an acceleration sensor, and may determine to maintain the display of the AR image when detecting, based on the measurement result of the acceleration sensor, that the user's face is approaching. That is, when displaying the captured display image Ppre, the receiver 200 further determines whether or not the user's face is approaching the receiver 200 based on the acceleration of the receiver 200 measured by the acceleration sensor. For example, when the measured acceleration shows a positive value in the direction perpendicular to the display 201 and pointing out of it, the receiver 200 determines that it is being brought closer to the user's face. When the receiver 200 determines that the face is approaching, it displays the first AR image while suppressing the display of the second AR image different from the first AR image superimposed in step S121.
  • thereby, the first AR image being viewed by the user can be prevented from being replaced with a different second AR image.
  • the receiver 200 may determine that the display of the AR image is maintained when a lock button provided in the receiver 200 is pressed.
  • in step S122, the receiver 200 determines that the display of the AR image is not to be maintained when the above-described certain period (that is, the display period) has elapsed. The receiver 200 also determines that the display is not to be maintained when acceleration equal to or greater than the threshold is measured by the acceleration sensor, even if that period has not yet elapsed. That is, when displaying the captured display image Ppre, the receiver 200 further measures its acceleration with the acceleration sensor during the display period and determines whether or not the measured acceleration is equal to or greater than a threshold. When it determines that the acceleration is equal to or greater than the threshold, the receiver 200 cancels the suppression of the display of the second AR image, thereby displaying the second AR image instead of the first AR image in step S123.
  • since the suppression of the display of the second AR image is thus released, the second AR image can be displayed immediately when, for example, the user moves the receiver 200 greatly to point the image sensor at another subject.
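  • the maintenance decision of step S122 could be sketched as follows; the period and threshold values are illustrative:

```python
# Sketch: deciding whether to keep the first AR image (step S122). A large
# acceleration releases the suppression of the second AR image; an
# approaching face or an unexpired display period keeps the first AR image.
import time

def maintain_display(shown_at: float, accel: float, face_near: bool,
                     period_s: float = 10.0,
                     accel_threshold: float = 2.0) -> bool:
    if accel >= accel_threshold:
        return False   # N: show the second AR image instead (step S123)
    if face_near:
        return True    # Y: the user is looking closely at the first AR image
    return (time.time() - shown_at) < period_s  # Y while the period lasts
```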
  • FIG. 57 is a diagram illustrating another example in which the receiver 200 according to the present embodiment displays an AR image.
  • the transmitter 100 is configured as a lighting device, and transmits a light ID by changing the luminance while illuminating a stage 111 for a small doll. Since the stage 111 is illuminated by the light from the transmitter 100, the luminance changes similarly to the transmitter 100, and the optical ID is transmitted.
  • the two receivers 200 image the stage 111 illuminated by the transmitter 100 from the left and right.
  • the left receiver 200 of the two receivers 200 acquires the captured display image Pf and the decoding image in the same manner as described above by imaging, from the left, the stage 111 illuminated by the transmitter 100.
  • the receiver 200 on the left side acquires the optical ID by decoding the decoding image. That is, the left receiver 200 receives the optical ID from the stage 111.
  • the left receiver 200 transmits the optical ID to the server.
  • the left-side receiver 200 acquires a three-dimensional AR image and recognition information corresponding to the optical ID from the server.
  • This three-dimensional AR image is an image for displaying a doll three-dimensionally, for example.
  • the left receiver 200 recognizes an area corresponding to the recognition information in the captured display image Pf as a target area. For example, the left receiver 200 recognizes the upper area at the center of the stage 111 as the target area.
  • the left-side receiver 200 generates a two-dimensional AR image P6a corresponding to the orientation from the three-dimensional AR image based on the orientation of the stage 111 displayed in the captured display image Pf. Then, the receiver 200 on the left side superimposes the two-dimensional AR image P6a on the target area, and displays the captured display image Pf on which the AR image P6a is superimposed on the display 201. In this case, since the two-dimensional AR image P6a is superimposed on the target area of the captured display image Pf, the left receiver 200 displays the captured display image Pf so that the doll actually exists on the stage 111. can do.
  • the right receiver 200 of the two receivers 200 acquires the captured display image Pg and the decoding image in the same manner as described above by imaging, from the right, the stage 111 illuminated by the transmitter 100.
  • the right receiver 200 acquires the optical ID by decoding the decoding image. That is, the right receiver 200 receives the optical ID from the stage 111.
  • the right receiver 200 transmits the optical ID to the server.
  • the right-side receiver 200 acquires a three-dimensional AR image and recognition information corresponding to the optical ID from the server.
  • the right receiver 200 recognizes an area corresponding to the recognition information in the captured display image Pg as a target area. For example, the right receiver 200 recognizes the upper area at the center of the stage 111 as the target area.
  • the right-side receiver 200 generates a two-dimensional AR image P6b corresponding to the orientation from the three-dimensional AR image based on the orientation of the stage 111 displayed in the captured display image Pg. Then, the receiver 200 on the right side superimposes the two-dimensional AR image P6b on the target region, and displays the captured display image Pg on which the AR image P6b is superimposed on the display 201. In this case, since the two-dimensional AR image P6b is superimposed on the target region of the captured display image Pg, the right-side receiver 200 displays the captured display image Pg so that the doll actually exists on the stage 111. can do.
  • the two receivers 200 display the AR images P6a and P6b at the same position on the stage 111.
  • the AR images P6a and P6b are generated according to the orientation of each receiver 200 so that the virtual doll appears to face a predetermined direction. Therefore, from whatever direction the stage 111 is imaged, the captured display image can be displayed as if the doll actually existed on the stage 111.
  • in the above example, the receiver 200 generates, from the three-dimensional AR image, a two-dimensional AR image corresponding to the positional relationship between the receiver 200 and the stage 111; alternatively, the two-dimensional AR image may be obtained from the server. That is, the receiver 200 transmits information indicating that positional relationship together with the optical ID to the server, and acquires the two-dimensional AR image, instead of the three-dimensional AR image, from the server. Thereby, the burden on the receiver 200 can be reduced.
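  • generating a viewpoint-dependent two-dimensional image from the three-dimensional AR image could be sketched, in a much reduced form, as a pinhole projection of the model's 3-D points; a real implementation would render textured geometry, and the pose and focal length below are illustrative:

```python
# Sketch: projecting the 3-D model of the doll into a 2-D AR image for a
# given viewpoint. Two receivers imaging the stage from the left and right
# use different yaw angles and so obtain different 2-D AR images (P6a, P6b).
import numpy as np

def project_points(points_3d: np.ndarray, yaw_rad: float,
                   cam_t=(0.0, 0.0, 2.0), f: float = 800.0) -> np.ndarray:
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    R = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])  # rotation about "up"
    p = points_3d @ R.T + np.asarray(cam_t)           # camera coordinates
    return f * p[:, :2] / p[:, 2:3]                   # perspective division

doll = np.array([[0.0, 0.0, 0.0], [0.1, 0.2, 0.0], [-0.1, 0.2, 0.0]])
p6a = project_points(doll, yaw_rad=+0.4)   # left receiver's 2-D AR image
p6b = project_points(doll, yaw_rad=-0.4)   # right receiver's 2-D AR image
```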
  • FIG. 58 is a diagram illustrating another example in which the receiver 200 according to the present embodiment displays an AR image.
  • the transmitter 100 is configured as an illumination device, and transmits a light ID by changing the luminance while illuminating a cylindrical structure 112. Since the structure 112 is illuminated by the light from the transmitter 100, the luminance is changed similarly to the transmitter 100, and the light ID is transmitted.
  • the receiver 200 acquires the captured display image Ph and the decoding image in the same manner as described above by imaging the structure 112 illuminated by the transmitter 100.
  • the receiver 200 acquires the optical ID by decoding the decoding image. That is, the receiver 200 receives the optical ID from the structure 112.
  • the receiver 200 transmits the optical ID to the server.
  • the receiver 200 acquires the AR image P7 corresponding to the optical ID and the recognition information from the server.
  • the receiver 200 recognizes an area corresponding to the recognition information in the captured display image Ph as a target area. For example, the receiver 200 recognizes an area where the central portion of the structure 112 is projected as a target area.
  • the receiver 200 superimposes the AR image P7 on the target area, and displays the captured display image Ph on which the AR image P7 is superimposed on the display 201.
  • the AR image P7 is an image including the character string “ABCD”, and the character string is distorted in accordance with the curved surface of the central portion of the structure 112.
  • since the AR image P7 is superimposed on the target area, the receiver 200 can display the captured display image Ph as if the character string were actually drawn on the structure 112.
  • FIG. 59 is a diagram illustrating another example in which the receiver 200 according to the present embodiment displays an AR image.
  • the transmitter 100 transmits the light ID by changing the luminance while illuminating the menu 113 of the restaurant. Since the menu 113 is illuminated by the light from the transmitter 100, the luminance changes in the same manner as the transmitter 100, and the light ID is transmitted.
  • the menu 113 indicates names of a plurality of dishes such as “ABC soup”, “XYZ salad”, and “KLM lunch”.
  • the receiver 200 acquires the captured display image Pi and the decoding image in the same manner as described above by capturing an image of the menu 113 illuminated by the transmitter 100.
  • the receiver 200 acquires the optical ID by decoding the decoding image. That is, the receiver 200 receives the optical ID from the menu 113.
  • the receiver 200 transmits the optical ID to the server.
  • the receiver 200 acquires the AR image P8 corresponding to the optical ID and the recognition information from the server.
  • the receiver 200 recognizes an area corresponding to the recognition information in the captured display image Pi as a target area. For example, the receiver 200 recognizes an area where the menu 113 is displayed as the target area.
  • the receiver 200 superimposes the AR image P8 on the target region, and displays the captured display image Pi on which the AR image P8 is superimposed on the display 201.
  • the AR image P8 is an image that shows the ingredients used for each of a plurality of dishes by marks.
  • for example, the AR image P8 shows a mark imitating an egg for the dish “XYZ salad”, which uses eggs, and a mark imitating a pig for the dish “KLM lunch”, which uses pork.
  • since the AR image P8 is superimposed on the target area, the receiver 200 can display the captured display image Pi as if a menu 113 bearing ingredient marks actually existed. Thereby, the ingredients of each dish can be conveyed to the user of the receiver 200 simply and clearly without providing a special display device in the menu 113.
  • the receiver 200 may acquire a plurality of AR images, select from among them an AR image suitable for the user based on user information set by the user, and superimpose the selected AR image. For example, if the user information indicates that the user has an allergic reaction to eggs, the receiver 200 selects an AR image in which an egg mark is attached to each dish that uses eggs. If the user information indicates that the intake of pork is prohibited, the receiver 200 selects an AR image in which a pork mark is attached to each dish that uses pork. Alternatively, the receiver 200 may transmit the user information together with the optical ID to the server and acquire from the server an AR image corresponding to the optical ID and the user information. Thereby, a menu that calls the relevant dishes to the user's attention can be displayed for each user.
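  • this selection could be sketched as follows; the dish data and the user-information fields are illustrative:

```python
# Sketch: choosing which ingredient marks to show based on the user
# information registered in the receiver.
def select_menu_marks(dishes, user):
    marks = []
    for dish in dishes:
        if user.get("egg_allergy") and "egg" in dish["ingredients"]:
            marks.append((dish["name"], "egg mark"))
        if user.get("no_pork") and "pork" in dish["ingredients"]:
            marks.append((dish["name"], "pork mark"))
    return marks

dishes = [{"name": "XYZ salad", "ingredients": ["egg", "lettuce"]},
          {"name": "KLM lunch", "ingredients": ["pork", "rice"]}]
print(select_menu_marks(dishes, {"egg_allergy": True, "no_pork": True}))
# [('XYZ salad', 'egg mark'), ('KLM lunch', 'pork mark')]
```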
  • FIG. 60 is a diagram illustrating another example in which the receiver 200 according to the present embodiment displays an AR image.
  • the transmitter 100 is configured as a television, and transmits an optical ID by changing luminance while displaying an image on a display.
  • a normal television 114 is disposed in the vicinity of the transmitter 100. The television 114 displays an image on the display, but does not transmit an optical ID.
  • the receiver 200 acquires the captured display image Pj and the decoding image in the same manner as described above, for example, by imaging the television 114 together with the transmitter 100.
  • the receiver 200 acquires the optical ID by decoding the decoding image. That is, the receiver 200 receives the optical ID from the transmitter 100.
  • the receiver 200 transmits the optical ID to the server.
  • the receiver 200 acquires the AR image P9 and the recognition information corresponding to the optical ID from the server.
  • the receiver 200 recognizes an area corresponding to the recognition information in the captured display image Pj as a target area.
  • specifically, the receiver 200 uses the bright line pattern area of the decoding image to recognize, as the first target area, the area below the area in which the transmitter 100 transmitting the optical ID appears in the captured display image Pj. At this time, the reference information included in the recognition information indicates that the position of the reference area in the captured display image Pj is the same as the position of the bright line pattern area in the decoding image, and the target information included in the recognition information indicates that the target area is below the reference area. The receiver 200 recognizes the first target area described above using such recognition information.
  • the receiver 200 recognizes an area whose position is fixed in advance below the captured display image Pj as the second target area.
  • the second target area is larger than the first target area.
  • the target information included in the recognition information further indicates not only the position of the first target area but also the position and size of the second target area as described above.
  • the receiver 200 recognizes the second target area described above using such recognition information.
  • the receiver 200 superimposes the AR image P9 on each of the first target area and the second target area, and displays the captured display image Pj on which the AR images P9 are superimposed on the display 201.
  • the receiver 200 matches the size of the AR image P9 with the size of the first target area, and superimposes the AR image P9 whose size has been adjusted on the first target area.
  • the receiver 200 matches the size of the AR image P9 with the size of the second target area, and superimposes the AR image P9 whose size has been adjusted on the second target area.
  • the AR image P9 indicates a caption for the video of the transmitter 100.
  • the language of the caption of the AR image P9 is a language according to the user information set and registered in the receiver 200. That is, when transmitting the optical ID to the server, the receiver 200 also transmits the user information (for example, information indicating the user's nationality or language used) to the server, and acquires an AR image P9 showing a caption in the language corresponding to that user information.
  • alternatively, the receiver 200 may acquire a plurality of AR images P9 showing subtitles in different languages, and select from among them, according to the set and registered user information, the AR image P9 to be used for superimposition.
  • the receiver 200 acquires a captured display image Pj and a decoding image by capturing a plurality of displays each displaying an image as a subject. Then, when the receiver 200 recognizes the target area, an area in which the transmission display (that is, the transmitter 100) that is the display that transmits the light ID among the plurality of displays appears in the captured display image Pj. Is recognized as a target area. Next, the receiver 200 superimposes the first subtitle corresponding to the image displayed on the transmission display as an AR image on the target area. Furthermore, the receiver 200 superimposes a second subtitle, which is a subtitle obtained by enlarging the first subtitle, on a region larger than the target region in the captured display image Pj.
  • thereby, the receiver 200 can display the captured display image Pj as if captions actually existed in the video of the transmitter 100. Furthermore, since the receiver 200 also superimposes a large caption on the lower part of the captured display image Pj, the caption can be viewed easily even if the caption attached to the video of the transmitter 100 is small. If no caption were attached to the video of the transmitter 100 and only a large caption were superimposed on the lower part of the captured display image Pj, it would be difficult to determine whether the superimposed caption is for the video of the transmitter 100 or for the video of the television 114. In the present embodiment, however, since a caption is attached to the video of the transmitter 100 that transmits the optical ID, the user can easily determine which video the superimposed caption belongs to.
  • the receiver 200 may further determine whether or not audio information is included in the information acquired from the server. When the receiver 200 determines that the audio information is included, the receiver 200 outputs the audio indicated by the audio information with priority over the first and second subtitles. Thereby, since sound is preferentially output, it is possible to reduce the burden of the user reading subtitles.
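To make the caption selection and audio-priority behavior above concrete, here is a minimal Python sketch; it is an illustration only, and the ServerResponse structure, field names, and fallback to English are assumptions of this sketch, not the embodiment's API:

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class ServerResponse:
    subtitles_by_language: Dict[str, str]  # language code -> subtitle text (hypothetical)
    audio_info: Optional[str] = None       # audio clip id, if the server attached one

def present_content(resp: ServerResponse, user_language: str) -> str:
    """Return a description of what the receiver would output."""
    if resp.audio_info is not None:
        # Audio is output with priority over the first and second subtitles.
        return f"play audio {resp.audio_info}"
    # Pick the subtitle matching the user's registered language; fall back to English.
    subtitle = resp.subtitles_by_language.get(
        user_language, resp.subtitles_by_language.get("en", ""))
    return f"superimpose subtitle '{subtitle}' on target area and enlarged below"

print(present_content(ServerResponse({"ja": "こんにちは", "en": "Hello"}), "ja"))
print(present_content(ServerResponse({"en": "Hello"}, audio_info="clip42"), "en"))
```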
  • in the above example, the subtitle language is changed according to the user information (that is, the user attribute); however, the video (that is, the content) itself displayed on the transmitter 100 may also be changed.
  • for example, when the video displayed on the transmitter 100 is a news video and the user information indicates that the user is Japanese, the receiver 200 acquires a news video broadcast in Japan as an AR image, and superimposes that news video on the area in which the display of the transmitter 100 appears (that is, the target area).
  • conversely, when the user information indicates that the user is American, the receiver 200 acquires a news video broadcast in the United States as an AR image and superimposes that news video on the area in which the display of the transmitter 100 appears (that is, the target area). Thereby, an image suitable for the user can be displayed.
  • the user information indicates, for example, nationality or language used as the user attribute, and the receiver 200 acquires the above-described AR image based on the attribute.
  • FIG. 61 is a diagram showing an example of recognition information in the present embodiment.
  • for example, the transmitters 100a and 100b are each configured as a station name sign, like the transmitter 100. Even though the transmitters 100a and 100b are station name signs different from each other, they may be misrecognized for one another if they look similar and are installed close to each other.
  • therefore, the recognition information of each of the transmitters 100a and 100b may indicate the feature points and feature amounts of only a characteristic part of the image, instead of the feature points and feature amounts of the entire image of the transmitter 100a or 100b.
  • for example, the part a1 of the transmitter 100a and the part b1 of the transmitter 100b differ greatly from each other, and the part a2 of the transmitter 100a and the part b2 of the transmitter 100b differ greatly from each other. Therefore, if the transmitters 100a and 100b are installed within a predetermined range (that is, at a short distance from each other), the server holds the feature points and feature amounts of the images of the parts a1 and a2 as the recognition information corresponding to the transmitter 100a. Similarly, the server holds the feature points and feature amounts of the images of the parts b1 and b2 as the recognition information corresponding to the transmitter 100b.
  • thereby, the receiver 200 can appropriately recognize the target area using the recognition information, even when similar transmitters are installed near each other.
  • FIG. 62 is a flowchart showing another example of the processing operation of the receiver 200 in the present embodiment.
  • the receiver 200 first determines whether or not the user has a visual impairment based on the user information set and registered in the receiver 200 (step S131). If the receiver 200 determines that there is a visual impairment (Y in step S131), the receiver 200 outputs the characters of the AR image displayed in a superimposed manner by voice (step S132). On the other hand, when the receiver 200 determines that there is no visual impairment (N in step S131), the receiver 200 further determines whether the user has a hearing impairment based on the user information (step S133). Here, if the receiver 200 determines that there is a hearing impairment (Y in step S133), the receiver 200 stops the sound output (step S134). At this time, the receiver 200 stops outputting sound by all functions.
  • the receiver 200 may perform the process of step S133 when it is determined in step S131 that there is a visual impairment (Y in step S131). That is, when it is determined that there is a visual impairment and there is no hearing impairment, the receiver 200 may output the AR image characters displayed in a superimposed manner by voice.
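The branch of FIG. 62 described above (steps S131 to S134) reduces to a small decision function. The following Python sketch is illustrative only; the user-information field names are assumptions:

```python
def present_ar_text(user_info: dict, ar_text: str) -> str:
    """Mirror of steps S131-S134: decide between text-to-speech output and
    muting, based on the user's registered impairments (field names assumed)."""
    if user_info.get("visually_impaired"):
        # Y in step S131: speak the characters of the superimposed AR image (S132).
        return f"speak: {ar_text}"
    if user_info.get("hearing_impaired"):
        # Y in step S133: stop all audio output (S134).
        return "mute all audio"
    return f"display: {ar_text}"

print(present_ar_text({"visually_impaired": True}, "Track 3: Trains to Kyoto"))
print(present_ar_text({"hearing_impaired": True}, "Track 3: Trains to Kyoto"))
```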
  • FIG. 63 is a diagram illustrating an example in which the receiver 200 according to the present embodiment identifies bright line pattern regions.
  • for example, as shown in FIG. 63, the receiver 200 obtains a decoding image by imaging two transmitters each transmitting an optical ID, and acquires the optical IDs by decoding the decoding image.
  • that is, the receiver 200 acquires the optical ID of the transmitter corresponding to the bright line pattern region X and the optical ID of the transmitter corresponding to the bright line pattern region Y.
  • the light ID of the transmitter corresponding to the bright line pattern region X is made up of numerical values (that is, data) corresponding to addresses 0 to 9, for example, “5, 2, 8, 4, 3, 6, 1, 9, 4, 3”.
  • similarly, the optical ID of the transmitter corresponding to the bright line pattern region Y is composed of numerical values corresponding to addresses 0 to 9, for example, “5, 2, 7, 7, 1, 5, 3, 2, 7, 4”.
  • even if the receiver 200 has acquired these light IDs once, that is, even if these light IDs are known, the receiver 200 may not know, when imaging again, from which bright line pattern region each light ID was obtained. In such a case, by performing the processes shown in FIG. 63, the receiver 200 can easily and quickly determine from which bright line pattern region each known light ID is obtained.
  • specifically, the receiver 200 first acquires the decoding image Pdec11 as shown in (a) of FIG. 63 and, by decoding the decoding image Pdec11, acquires the numerical value at address 0 of the light ID of each of the bright line pattern regions X and Y.
  • for example, the numerical value at address 0 of the light ID of the bright line pattern region X is “5”, and the numerical value at address 0 of the light ID of the bright line pattern region Y is also “5”. Since the numerical value at address 0 of each light ID is “5”, at this point it cannot be determined from which bright line pattern region each known light ID was obtained.
  • next, the receiver 200 acquires the decoding image Pdec12 and, by decoding it, acquires the numerical value at address 1 of the light ID of each of the bright line pattern regions X and Y. For example, the numerical value at address 1 of the light ID of the bright line pattern region X is “2”, and the numerical value at address 1 of the light ID of the bright line pattern region Y is also “2”. Since the numerical value at address 1 of each light ID is “2”, it still cannot be determined from which bright line pattern region each known light ID was obtained.
  • next, the receiver 200 acquires the decoding image Pdec13 and, by decoding it, acquires the numerical value at address 2 of the light ID of each of the bright line pattern regions X and Y. For example, the numerical value at address 2 of the light ID of the bright line pattern region X is “8”, while the numerical value at address 2 of the light ID of the bright line pattern region Y is “7”. Since these values differ, it can now be determined that the known light ID “5, 2, 8, 4, 3, 6, 1, 9, 4, 3” was obtained from the bright line pattern region X and that the known light ID “5, 2, 7, 7, 1, 5, 3, 2, 7, 4” was obtained from the bright line pattern region Y.
  • to increase the reliability of this determination, the receiver 200 may further acquire the numerical value at address 3 of each optical ID, as shown in FIG. 63. That is, the receiver 200 acquires the decoding image Pdec14 and, by decoding it, acquires the numerical value at address 3 of the light ID of each of the bright line pattern regions X and Y.
  • for example, the numerical value at address 3 of the light ID of the bright line pattern region X is “4”, and the numerical value at address 3 of the light ID of the bright line pattern region Y is “7”. These values agree with address 3 of the respective known light IDs, confirming that the known light ID “5, 2, 8, 4, 3, 6, 1, 9, 4, 3” was obtained from the bright line pattern region X. In this way, the receiver 200 reacquires the numerical value of at least one address rather than acquiring the numerical values (that is, the data) of all addresses of each optical ID again, which makes it possible to determine easily and quickly from which bright line pattern region each known light ID was obtained.
  • in the above example, the numerical value acquired for a given address matches the numerical value of the known optical ID, but an exact match is not required.
  • for example, suppose the receiver 200 acquires “6” as the numerical value at address 3 of the light ID of the bright line pattern region Y. This numerical value “6” differs from the numerical value “7” at address 3 of the known light ID “5, 2, 7, 7, 1, 5, 3, 2, 7, 4”. Even so, the receiver 200 may associate that known light ID with the bright line pattern region Y by determining whether “6” is close to “7”, that is, whether “6” falls within the range “7” ± n (where n is, for example, a number of 1 or more).
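The address-by-address disambiguation described above can be sketched as follows; this is a minimal illustration, and the read_address callback, the candidate table, and the ± n tolerance handling are assumptions of this sketch:

```python
KNOWN_IDS = {
    "A": [5, 2, 8, 4, 3, 6, 1, 9, 4, 3],
    "B": [5, 2, 7, 7, 1, 5, 3, 2, 7, 4],
}

def assign_ids(read_address, n=1):
    """Re-read one address at a time from regions X and Y (as in FIG. 63)
    until the two readings differ, then match each reading against the
    known IDs, tolerating a deviation of +/- n per digit."""
    for addr in range(10):
        vx, vy = read_address("X", addr), read_address("Y", addr)
        if vx == vy:
            continue  # addresses 0 and 1 in the example: still ambiguous
        def closest(value):
            return min(KNOWN_IDS, key=lambda k: abs(KNOWN_IDS[k][addr] - value))
        kx, ky = closest(vx), closest(vy)
        if kx != ky and abs(KNOWN_IDS[kx][addr] - vx) <= n:
            return {"X": kx, "Y": ky}
    return None  # could not disambiguate

# Simulated reads reproducing the example: region X carries ID "A", Y carries "B".
reads = {"X": KNOWN_IDS["A"], "Y": KNOWN_IDS["B"]}
print(assign_ids(lambda region, addr: reads[region][addr]))  # {'X': 'A', 'Y': 'B'}
```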
  • FIG. 64 is a diagram illustrating another example of the receiver 200 in the present embodiment.
  • the receiver 200 is configured as a smartphone in the above example, but may be configured as a head-mounted display (also referred to as glass) including an image sensor.
  • because power consumption increases if the processing circuit for displaying an AR image as described above (hereinafter referred to as the AR processing circuit) is always active, such a receiver 200 may activate the AR processing circuit only when it detects a predetermined signal.
  • the receiver 200 includes a touch sensor 202.
  • the touch sensor 202 outputs a touch signal when touched by a user's finger or the like.
  • the receiver 200 activates the AR processing circuit when detecting the touch signal.
  • the receiver 200 may activate the AR processing circuit when detecting a radio signal such as Bluetooth (registered trademark) or Wi-Fi (registered trademark).
  • the receiver 200 may include an acceleration sensor, and may activate the AR processing circuit when the acceleration sensor measures an acceleration equal to or greater than a threshold value in a direction opposite to the direction of gravity. That is, the receiver 200 activates the AR processing circuit when detecting the signal indicating the acceleration. For example, when the user pushes up the nose pad portion of the receiver 200 configured as a glass upward with a fingertip from below, the receiver 200 detects a signal indicating the acceleration and activates the AR processing circuit.
  • alternatively, the receiver 200 may activate the AR processing circuit when GPS and the 9-axis sensor detect that the image sensor is directed toward the transmitter 100. That is, the receiver 200 activates the AR processing circuit when it detects a signal indicating that the receiver 200 is directed in a predetermined direction. In this case, if the transmitter 100 is the above-mentioned Japanese station name sign, the receiver 200 displays an AR image indicating the English station name superimposed on the station name sign.
  • FIG. 65 is a flowchart showing another example of the processing operation of the receiver 200 in the present embodiment.
  • when the receiver 200 acquires the optical ID from the transmitter 100 (step S141), it switches the noise cancellation mode by receiving mode designation information corresponding to the optical ID (step S142). Then, the receiver 200 determines whether or not the mode switching process should be terminated (step S143), and if it determines that the process should not be terminated (N in step S143), it repeats the processing from step S141.
  • the noise cancellation mode is switched, for example, between a mode (ON) in which noise such as airplane engine noise is cancelled and a mode (OFF) in which it is not.
  • suppose, for example, that a user carrying the receiver 200 is listening to sound such as music output from the receiver 200 through an earphone connected to the receiver 200 and placed in the ear. When such a user boards an airplane, the receiver 200 acquires an optical ID and, as a result, switches the noise cancellation mode from OFF to ON. The user can therefore hear audio free of noise such as engine noise even in the cabin. When the user leaves the airplane, the receiver 200 again acquires a light ID and, having acquired it, switches the noise cancellation mode from ON to OFF.
  • the noise subject to cancellation is not limited to engine noise and may be any sound, such as a human voice.
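A minimal sketch of this light-ID-driven mode switching might look as follows; the light ID values and the NoiseCanceller API are assumptions of this illustration:

```python
# Hypothetical mapping from received light IDs to noise-cancellation modes.
MODE_BY_LIGHT_ID = {
    "CABIN_ENTRY": True,   # boarding: cancel engine noise
    "CABIN_EXIT": False,   # leaving the airplane: stop cancelling
}

class NoiseCanceller:
    def __init__(self):
        self.enabled = False
    def set_mode(self, on: bool):
        self.enabled = on
        print("noise cancellation", "ON" if on else "OFF")

def on_light_id(canceller: NoiseCanceller, light_id: str):
    """Step S142: switch the noise-cancellation mode according to the
    mode designation information associated with the received light ID."""
    if light_id in MODE_BY_LIGHT_ID:
        canceller.set_mode(MODE_BY_LIGHT_ID[light_id])

nc = NoiseCanceller()
on_light_id(nc, "CABIN_ENTRY")  # user boards: ON
on_light_id(nc, "CABIN_EXIT")   # user leaves: OFF
```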
  • FIG. 66 is a diagram illustrating an example of a transmission system including a plurality of transmitters in the present embodiment.
  • This transmission system includes a plurality of transmitters 120 arranged in a predetermined order. Like the transmitter 100, these transmitters 120 are transmitters in any one of the first to third embodiments, and include one or a plurality of light emitting elements (for example, LEDs).
  • the first transmitter 120 transmits the optical ID by changing the luminance of its one or more light emitting elements according to a predetermined frequency (carrier frequency). Further, the first transmitter 120 outputs a signal indicating that change in luminance to the next transmitter 120 as a synchronization signal.
  • when the next transmitter 120 receives the synchronization signal, it transmits the optical ID by changing the luminance of its one or more light emitting elements in accordance with the synchronization signal, and outputs a signal indicating that change in luminance to the transmitter 120 after it as a synchronization signal. Thereby, all the transmitters 120 included in the transmission system transmit the optical ID in synchronization.
  • the synchronization signal is thus transferred from the first transmitter 120 to the next, and from each transmitter 120 to the one after it, until it reaches the last transmitter 120. Transferring the synchronization signal between adjacent transmitters takes, for example, about 1 μs. Therefore, if the transmission system includes N (N is an integer of 2 or more) transmitters 120, it takes N μs for the synchronization signal to reach the last transmitter 120 from the first transmitter 120, and the transmission timing of the optical ID is shifted by a maximum of N μs.
  • therefore, even if the N transmitters 120 transmit optical IDs according to a frequency of 9.6 kHz and the receiver 200 attempts to receive an optical ID at a frequency of 9.6 kHz, the receiver 200 receives optical IDs shifted by up to N μs and may fail to receive the optical ID correctly.
  • to prevent this, the first transmitter 120 transmits the optical ID at a correspondingly higher frequency according to the number of transmitters 120 included in the transmission system.
  • the first transmitter 120 transmits an optical ID according to a frequency of 9.605 kHz.
  • the receiver 200 receives the optical ID at a frequency of 9.6 kHz. In this case, even if the receiver 200 receives an optical ID shifted by N μs, the frequency of the first transmitter 120 is 0.005 kHz higher than the reception frequency of the receiver 200, so reception errors caused by the shift can be suppressed.
  • the first transmitter 120 may also control the amount of frequency adjustment by having the synchronization signal fed back from the last transmitter 120. For example, the first transmitter 120 measures the time from when it outputs the synchronization signal until it receives the synchronization signal fed back from the last transmitter 120; the longer this time, the higher above the reference frequency (for example, 9.6 kHz) the first transmitter 120 sets the frequency at which it transmits the optical ID.
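As a rough sketch of this compensation: the embodiment states only that a longer relay delay calls for a higher frequency above the 9.6 kHz reference, so the linear relation and its constant below are assumptions chosen to reproduce the 9.605 kHz example:

```python
REFERENCE_HZ = 9600.0      # reference modulation frequency (9.6 kHz)
DELAY_PER_HOP_S = 1e-6     # approx. 1 microsecond to relay the sync signal

def adjusted_frequency(num_transmitters: int, hz_per_second_of_lag: float = 5e3) -> float:
    """Raise the first transmitter's frequency in proportion to the total
    relay delay of N * 1 us; the linear constant is an assumption."""
    total_lag_s = num_transmitters * DELAY_PER_HOP_S
    return REFERENCE_HZ + hz_per_second_of_lag * total_lag_s

def adjusted_frequency_from_feedback(measured_delay_s: float,
                                     hz_per_second_of_lag: float = 5e3) -> float:
    """Variant using the delay measured from the synchronization signal
    fed back by the last transmitter."""
    return REFERENCE_HZ + hz_per_second_of_lag * measured_delay_s

# With this constant, 1000 transmitters (1 ms of total lag) yield 9.605 kHz,
# matching the example frequency in the text.
print(adjusted_frequency(1000))                 # 9605.0
print(adjusted_frequency_from_feedback(0.001))  # 9605.0
```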
  • FIG. 67 is a diagram illustrating an example of a transmission system including a plurality of transmitters and receivers in the present embodiment.
  • This transmission system includes, for example, two transmitters 120 and a receiver 200.
  • One of the two transmitters 120 transmits an optical ID according to a frequency of 9.599 kHz.
  • the other transmitter 120 transmits an optical ID according to a frequency of 9.601 kHz.
  • each of the two transmitters 120 notifies the receiver 200 of the frequency of its own optical ID with a radio wave signal.
  • when the receiver 200 receives the notification of those frequencies, it attempts decoding according to each of the notified frequencies. That is, the receiver 200 first attempts to decode the decoding image according to the frequency of 9.599 kHz; if the optical ID cannot be received, it attempts to decode the decoding image according to the frequency of 9.601 kHz. In this way, the receiver 200 tries each of the notified frequencies in turn, that is, performs a brute-force search over the notified frequencies. Alternatively, the receiver 200 may attempt decoding according to the average of all the notified frequencies; that is, it attempts decoding according to 9.6 kHz, the average of 9.599 kHz and 9.601 kHz.
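Both reception strategies fit in a few lines. The following Python sketch is illustrative only; the decode callback and the toy success condition are assumptions:

```python
from typing import Callable, Iterable, Optional

def try_notified_frequencies(decode: Callable[[float], Optional[str]],
                             frequencies: Iterable[float]) -> Optional[str]:
    """Brute-force strategy: attempt to decode the decoding image at each
    notified frequency in turn and return the first light ID obtained."""
    for hz in frequencies:
        light_id = decode(hz)
        if light_id is not None:
            return light_id
    return None

def try_average_frequency(decode, frequencies):
    """Alternative strategy: decode once at the average of all notified
    frequencies (9.6 kHz for 9.599 kHz and 9.601 kHz)."""
    freqs = list(frequencies)
    return decode(sum(freqs) / len(freqs))

# Toy decoder that only succeeds near 9.601 kHz (assumption for illustration).
toy = lambda hz: "ID-42" if abs(hz - 9601.0) < 0.5 else None
print(try_notified_frequencies(toy, [9599.0, 9601.0]))              # ID-42
print(try_average_frequency(lambda hz: f"decoded@{hz}", [9599.0, 9601.0]))
```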
  • FIG. 68A is a flowchart illustrating an example of the processing operation of the receiver 200 in the present embodiment.
  • the receiver 200 starts imaging (step S151) and initializes the parameter N to 1 (step S152).
  • the receiver 200 decodes the decoding image obtained by the imaging according to the frequency corresponding to the parameter N, and calculates an evaluation value for the decoding result (step S153).
  • the evaluation value indicates a higher numerical value as the decoding result is more similar to the correct optical ID.
  • then, the receiver 200 determines whether or not the parameter N is equal to Nmax, a predetermined integer of 1 or more (step S154). If the receiver 200 determines that N is not equal to Nmax (N in step S154), it increments the parameter N (step S155) and repeats the processing from step S153. If the receiver 200 determines that N is equal to Nmax (Y in step S154), it registers the frequency for which the maximum evaluation value was calculated as the optimum frequency, in association with location information indicating the location of the receiver 200.
  • the optimum frequency and location information registered in this way are used for receiving the optical ID by the receiver 200 that has moved to the location indicated by the location information after registration.
  • the location information may be information indicating a position measured by GPS, for example, or may be identification information (for example, SSID: Service Set Identifier) of an access point in a wireless LAN (Local Area Network).
  • the receiver 200 that has registered with the server displays, for example, the AR image as described above according to the optical ID obtained by decoding at the optimum frequency.
  • FIG. 68B is a flowchart illustrating an example of the processing operation of the receiver 200 in the present embodiment.
  • after the registration with the server shown in FIG. 68A has been performed, the receiver 200 transmits location information indicating its current location to the server (step S161). Next, the receiver 200 acquires from the server the optimum frequency registered in association with that location information (step S162).
  • the receiver 200 starts imaging (step S163), and decodes the decoding image obtained by the imaging according to the optimum frequency acquired in step S162 (step S164).
  • the receiver 200 displays an AR image as described above, for example, according to the optical ID obtained by this decoding.
  • the receiver 200 can acquire the optimum frequency and receive the optical ID without executing the processing shown in FIG. 68A.
  • the receiver 200 may acquire the optimum frequency by executing the process shown in FIG. 68A when the optimum frequency cannot be obtained in step S162.
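The register-then-reuse flow of FIGS. 68A and 68B can be sketched as follows, with an in-memory dict standing in for the server; the evaluate and decode callbacks are assumptions of this illustration:

```python
server_registry = {}

def register_optimum_frequency(location: str, candidates, evaluate):
    """Steps S152-S156: score every candidate frequency and register the
    best-scoring one for this location."""
    best = max(candidates, key=evaluate)
    server_registry[location] = best
    return best

def receive_with_registered_frequency(location: str, decode):
    """Steps S161-S164: fetch the optimum frequency for this location and
    decode with it; fall back to None if nothing was registered."""
    hz = server_registry.get(location)
    return decode(hz) if hz is not None else None

register_optimum_frequency("SSID:cafe-ap", [9599.0, 9600.0, 9601.0],
                           evaluate=lambda hz: -abs(hz - 9601.0))
print(receive_with_registered_frequency("SSID:cafe-ap",
                                        decode=lambda hz: f"light ID @ {hz} Hz"))
```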
  • FIG. 69A is a flowchart showing a display method in the present embodiment.
  • the display method in the present embodiment is a display method in which the display device that is the above-described receiver 200 displays an image, and includes steps SL11 to SL16.
  • in step SL11, the image sensor captures the subject to acquire a captured display image and a decoding image.
  • in step SL12, the optical ID is acquired by decoding the decoding image.
  • in step SL13, the optical ID is transmitted to the server.
  • in step SL14, the AR image corresponding to the optical ID and the recognition information are acquired from the server.
  • in step SL15, an area corresponding to the recognition information in the captured display image is recognized as a target area.
  • in step SL16, the captured display image in which the AR image is superimposed on the target area is displayed.
  • since the AR image is displayed superimposed on the captured display image, an image useful to the user can be displayed. Furthermore, the AR image can be superimposed on an appropriate target area while the processing load is suppressed.
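For orientation, steps SL11 to SL16 chain together as in the following minimal Python sketch; every class and function here is a stand-in of this illustration, not the patent's API:

```python
def decode_visible_light(decoding_image):
    # Stand-in for SL12: decode the bright line pattern into a light ID.
    return decoding_image["light_id"]

def recognize_target_area(captured, recognition):
    # Stand-in for SL15: here the recognition information directly names a rectangle.
    return recognition["rect"]

def superimpose(captured, ar_image, rect):
    return f"{captured['name']} with {ar_image} at {rect}"

class Server:
    def lookup(self, light_id):                            # SL13-SL14
        return "AR-image-P1", {"rect": (10, 20, 100, 50)}

def display_method(captured, decoding, server):
    light_id = decode_visible_light(decoding)              # SL12
    ar_image, recognition = server.lookup(light_id)        # SL13-SL14
    rect = recognize_target_area(captured, recognition)    # SL15
    return superimpose(captured, ar_image, rect)           # SL16

print(display_method({"name": "captured-display-image"},
                     {"light_id": "0xA5"}, Server()))
```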
  • in general augmented reality (that is, AR), a huge number of recognition target images stored in advance are compared with the captured display image, and it is determined whether or not any of those recognition target images is included in the captured display image. If it is determined that a recognition target image is included, the AR image corresponding to that recognition target image is superimposed on the captured display image, aligned based on the recognition target image.
  • in such general AR, a huge number of recognition target images must be compared with the captured display image, and the position of the recognition target image within the captured display image must be detected for alignment, so there is a problem of a large amount of calculation and a high processing load.
  • in the display method in the present embodiment, by contrast, the light ID is acquired by decoding the decoding image obtained by imaging the subject. That is, the optical ID transmitted from the transmitter that is the subject is received, and the AR image and the recognition information corresponding to this optical ID are acquired from the server. Therefore, the server does not need to compare a huge number of recognition target images against the captured display image, and can select and transmit to the display device the AR image associated in advance with the optical ID. Thereby, the amount of calculation can be reduced and the processing load can be significantly suppressed.
  • the recognition information corresponding to this light ID is acquired from the server.
  • the recognition information is information for recognizing a target area that is an area in which an AR image is superimposed in a captured display image.
  • This recognition information may be information indicating that a white square is the target area, for example.
  • in this case, the target area can be easily recognized, and the processing load can be further suppressed. That is, the processing load can be further suppressed according to the content of the recognition information.
  • since the server can arbitrarily set the content of the recognition information according to the optical ID, the balance between the processing load and the recognition accuracy can be maintained appropriately.
  • for example, the recognition information may be reference information for specifying a reference area in the captured display image. In this case, in recognizing the target area, the reference area is specified from the captured display image based on the reference information, and the target area may then be recognized based on the position of the reference area in the captured display image.
  • the recognition information may include reference information for specifying a reference area in the captured display image and target information indicating a relative position of the target area with respect to the reference area.
  • in this case, in recognizing the target area, the reference area is specified from the captured display image based on the reference information, and the area located at the relative position indicated by the target information, taking the position of the reference area as a reference, is recognized as the target area in the captured display image.
  • for example, the reference information may indicate that the position of the reference area in the captured display image is the same as the position, in the decoding image, of the bright line pattern area composed of a plurality of bright lines appearing through the exposure of the multiple exposure lines of the image sensor.
  • the target area can be recognized with reference to the area corresponding to the bright line pattern area in the captured display image.
  • alternatively, the reference information may indicate that the reference area in the captured display image is the area of the captured display image in which a display appears.
  • in this case, the target area can be recognized with reference to the area in which the display appears.
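A minimal sketch of this reference-plus-relative-position recognition, assuming a simple rectangle representation; the offset and scale parameterization is this sketch's own:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

def recognize_target_area(reference: Rect, offset=(0.0, 1.0), scale=(1.0, 0.25)) -> Rect:
    """Place the target area relative to the reference area, e.g. 'directly
    below the reference area' as in the station-sign example. Offsets are
    expressed in units of the reference area's size (an assumption)."""
    return Rect(x=int(reference.x + offset[0] * reference.w),
                y=int(reference.y + offset[1] * reference.h),
                w=int(reference.w * scale[0]),
                h=int(reference.h * scale[1]))

# Reference area found where the bright line pattern area was in the
# decoding image; the target area lands just below it.
print(recognize_target_area(Rect(40, 30, 200, 80)))
```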
  • further, in displaying the captured display image, the first AR image, which is the above-described AR image, may be displayed for a predetermined display period while the display of a second AR image different from the first AR image is suppressed.
  • decoding of a newly acquired decoding image may be prohibited during the display period.
  • since decoding a newly acquired decoding image is a wasteful process while the display of the second AR image is suppressed, prohibiting that decoding can reduce power consumption.
  • further, the acceleration of the display device may be measured by an acceleration sensor during the display period, and it may be determined whether or not the measured acceleration is equal to or greater than a threshold value. When the acceleration is determined to be equal to or greater than the threshold value, the suppression of the display of the second AR image may be cancelled and the second AR image displayed instead of the first AR image.
  • that is, when the user moves the display device greatly, for example to direct the image sensor toward another subject, the suppression of the display of the second AR image is released, and the second AR image can be displayed immediately.
  • further, in displaying the captured display image, it may be determined whether or not the user's face is approaching the display device, based on imaging by a face camera provided in the display device. When it is determined that the face is approaching, the first AR image may be displayed while the display of a second AR image different from the first AR image is suppressed.
  • alternatively, whether or not the user's face is approaching the display device may be determined based on the acceleration of the display device measured by the acceleration sensor; when it is determined that the face is approaching, the first AR image may be displayed while the display of a second AR image different from the first AR image is suppressed.
  • thereby, while the user is looking closely at the first AR image, replacement of the first AR image by a different second AR image can be suppressed.
  • further, in the imaging, the captured display image and the decoding image may be acquired by capturing, as subjects, a plurality of displays each displaying an image.
  • in this case, in recognizing the target area, the area in which the transmission display, that is, the display transmitting the light ID among the plurality of displays, appears in the captured display image is recognized as the target area.
  • then, in displaying the captured display image, the first subtitle corresponding to the image displayed on the transmission display is superimposed on the target area as the AR image, and the second subtitle, an enlarged version of the first subtitle, is superimposed on an area of the captured display image larger than the target area.
  • since the first subtitle is superimposed on the image of the transmission display, the user can easily grasp to which display's image, among the plurality of displays, the subtitle belongs.
  • furthermore, since the second subtitle, an enlarged version of the first subtitle, is also displayed, the subtitle can be read easily even when the first subtitle is small and hard to read.
  • further, in displaying the captured display image, it may be determined whether or not audio information is included in the information acquired from the server; when it is determined that audio information is included, the audio indicated by that audio information may be output with priority over the first and second subtitles.
  • FIG. 69B is a block diagram illustrating a configuration of the display device in the present embodiment.
  • the display device 10 is a display device that displays an image, and includes an image sensor 11, a decoding unit 12, a transmission unit 13, an acquisition unit 14, a recognition unit 15, and a display unit 16.
  • the display device 10 corresponds to the receiver 200 described above.
  • the image sensor 11 acquires a captured display image and a decoding image by imaging a subject.
  • the decoding unit 12 acquires the optical ID by decoding the decoding image.
  • the transmission unit 13 transmits the optical ID to the server.
  • the acquisition unit 14 acquires the AR image corresponding to the optical ID and the recognition information from the server.
  • the recognition unit 15 recognizes a region corresponding to the recognition information in the captured display image as a target region.
  • the display unit 16 displays a captured display image in which an AR image is superimposed on the target area.
  • since the AR image is displayed superimposed on the captured display image, an image useful to the user can be displayed. Furthermore, the AR image can be superimposed on an appropriate target area while the processing load is suppressed.
  • each component may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
  • Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
  • that is, software that realizes the receiver 200 or the display device 10 according to the present embodiment is a program that causes a computer to execute the steps included in the flowcharts shown in FIGS. 45, 52, 56, 62, 65, 68A, 68B, and 69A.
  • FIG. 70 is a diagram illustrating an example in which the receiver in the first modification of the fourth embodiment displays an AR image.
  • the receiver 200 acquires the above-described captured display image Pk, which is the normal captured image, and the above-described decoding image, which is the visible light communication image or the bright line image, by capturing the subject with the image sensor.
  • the image sensor of the receiver 200 images the transmitter 100c configured as a robot and the person 21 adjacent to the transmitter 100c.
  • the transmitter 100c is the transmitter according to any one of the first to third embodiments, and includes one or a plurality of light emitting elements (for example, LEDs) 131.
  • the transmitter 100c changes its luminance by blinking the one or more light emitting elements 131, and transmits an optical ID (optical identification information) by the luminance change.
  • This light ID is the above-mentioned visible light signal.
  • the receiver 200 acquires the captured display image Pk on which the transmitter 100c and the person 21 are imaged by the normal exposure time. Furthermore, the receiver 200 acquires the decoding image by capturing the transmitter 100c and the person 21 with the communication exposure time shorter than the normal exposure time.
  • the receiver 200 acquires the optical ID by decoding the decoding image. That is, the receiver 200 receives the optical ID from the transmitter 100c. The receiver 200 transmits the optical ID to the server. Then, the receiver 200 acquires the AR image P10 corresponding to the optical ID and the recognition information from the server. The receiver 200 recognizes an area corresponding to the recognition information in the captured display image Pk as a target area. For example, the receiver 200 recognizes an area on the right side of the area where the robot that is the transmitter 100c is projected as the target area. Specifically, the receiver 200 specifies the distance between the two markers 132a and 132b of the transmitter 100c displayed in the captured display image Pk.
  • the receiver 200 recognizes an area having a width and a height corresponding to the distance as a target area. That is, the recognition information indicates the shape of the markers 132a and 132b and the position and size of the target region with reference to the markers 132a and 132b.
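The marker-based recognition just described, in which the target area's width and height follow the distance between the markers 132a and 132b, can be sketched as follows; the placement and scale factors are assumptions of this illustration:

```python
import math

def target_area_from_markers(m1, m2, rel_pos=(1.2, 0.0), rel_size=(1.0, 1.5)):
    """Derive the target area from the two markers: its width and height are
    proportional to the marker distance, and it is placed to the right of
    the robot, as in FIG. 70. The proportionality factors are assumptions."""
    d = math.dist(m1, m2)                  # distance between the markers
    x = m2[0] + rel_pos[0] * d             # offset to the right
    y = m2[1] + rel_pos[1] * d
    return (x, y, rel_size[0] * d, rel_size[1] * d)   # (x, y, width, height)

# Markers detected 100 px apart -> a 100 x 150 px target area to their right.
print(target_area_from_markers((400, 300), (500, 300)))
```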
  • the receiver 200 superimposes the AR image P10 on the target area, and displays the captured display image Pk on which the AR image P10 is superimposed on the display 201.
  • the receiver 200 acquires an AR image P10 indicating another robot different from the transmitter 100c.
  • thereby, the captured display image Pk can be displayed as if another robot actually existed next to the transmitter 100c, and the person 21 can be photographed together with that robot and the transmitter 100c even though no such robot actually exists.
  • FIG. 71 is a diagram illustrating another example in which the receiver 200 according to the first modification of the fourth embodiment displays an AR image.
  • the transmitter 100 is configured as an image display device having a display panel, and transmits a light ID by changing the luminance while displaying a still image PS on the display panel.
  • the display panel is, for example, a liquid crystal display or an organic EL (electroluminescence) display.
  • the receiver 200 acquires the captured display image Pm and the decoding image by imaging the transmitter 100 in the same manner as described above.
  • the receiver 200 acquires the optical ID by decoding the decoding image. That is, the receiver 200 receives the optical ID from the transmitter 100.
  • the receiver 200 transmits the optical ID to the server.
  • the receiver 200 acquires the AR image P11 and the recognition information corresponding to the optical ID from the server.
  • the receiver 200 recognizes an area corresponding to the recognition information in the captured display image Pm as a target area. For example, the receiver 200 recognizes the area where the display panel of the transmitter 100 is displayed as the target area.
  • the receiver 200 superimposes the AR image P11 on the target area, and displays the captured display image Pm on which the AR image P11 is superimposed on the display 201.
  • the AR image P11 is a moving image having the same or substantially the same picture as the still image PS displayed on the display panel of the transmitter 100 as the first picture in the display order. That is, the AR image P11 is a moving image that starts to move from the still image PS.
  • thereby, the receiver 200 can display the captured display image Pm as if an image display device that displays a moving image actually existed.
  • FIG. 72 is a diagram illustrating another example in which the receiver 200 according to the first modification of the fourth embodiment displays an AR image.
  • the transmitter 100 is configured as a station name sign, and transmits a light ID by changing the luminance.
  • the receiver 200 first images the transmitter 100 from a position away from the transmitter 100, as shown in (a) of FIG. 72. Thereby, the receiver 200 acquires the captured display image Pn and the decoding image in the same manner as described above.
  • the receiver 200 acquires the optical ID by decoding the decoding image. That is, the receiver 200 receives the optical ID from the transmitter 100.
  • the receiver 200 transmits the optical ID to the server.
  • the receiver 200 acquires the AR images P12 to P14 corresponding to the optical ID and the recognition information from the server.
  • the receiver 200 recognizes two regions corresponding to the recognition information in the captured display image Pn as first and second target regions. For example, the receiver 200 recognizes the area around the transmitter 100 as the first target area.
  • the receiver 200 superimposes the AR image P12 on the first target area, and displays the captured display image Pn on which the AR image P12 is superimposed on the display 201.
  • the AR image P12 is an arrow that prompts the user of the receiver 200 to approach the transmitter 100.
  • while the AR image P12 is displayed superimposed on the first target area of the captured display image Pn, the user approaches the transmitter 100 with the receiver 200 facing the transmitter 100. As the receiver 200 approaches, the area of the transmitter 100 displayed in the captured display image Pn (corresponding to the reference area described above) becomes larger.
  • when that area becomes larger, the receiver 200 further superimposes the AR image P13 on a second target area, which is the area in which the transmitter 100 appears, for example as shown in (b) of FIG. 72. That is, the receiver 200 displays on the display 201 the captured display image Pn on which the AR images P12 and P13 are superimposed.
  • the AR image P13 is a message that informs the user of the outline of the vicinity of the station indicated by the station name sign.
  • the size of the AR image P13 is equal to the size of the area of the transmitter 100 displayed in the captured display image Pn.
  • the user then approaches the transmitter 100 further with the receiver 200 facing the transmitter 100. As the receiver 200 approaches the transmitter 100, the area of the transmitter 100 displayed in the captured display image Pn (corresponding to the reference area described above) becomes larger still.
  • when that area becomes larger, the receiver 200 changes the AR image superimposed on the second target area from the AR image P13 to the AR image P14, for example as illustrated in (c) of FIG. 72. Furthermore, the receiver 200 deletes the AR image P12 superimposed on the first target area.
  • the receiver 200 displays the captured display image Pn on which the AR image P14 is superimposed on the display 201.
  • the AR image P14 is a message that informs the user of the details around the station indicated by the station name sign.
  • the size of the AR image P14 is equal to the size of the area of the transmitter 100 displayed in the captured display image Pn.
  • the area of the transmitter 100 is larger as the receiver 200 is closer to the transmitter 100. Therefore, the AR image P14 is larger than the AR image P13.
  • in this way, the receiver 200 enlarges the AR image and displays more information as it approaches the transmitter 100.
  • in addition, since an arrow prompting the user to approach, such as the AR image P12, is displayed, the user can easily grasp that more information will be displayed upon approaching the transmitter 100.
  • FIG. 73 is a diagram illustrating another example in which the receiver 200 in the first modification of the fourth embodiment displays an AR image.
  • in the example shown in FIG. 72, the receiver 200 displays more information the closer it is to the transmitter 100; however, the receiver 200 may instead display a large amount of information in the form of, for example, a balloon, regardless of the distance to the transmitter 100.
  • that is, the receiver 200 acquires the captured display image Po and the decoding image by imaging the transmitter 100 as described above.
  • the receiver 200 acquires the optical ID by decoding the decoding image. That is, the receiver 200 receives the optical ID from the transmitter 100.
  • the receiver 200 transmits the optical ID to the server.
  • the receiver 200 acquires the AR image P15 and the recognition information corresponding to the optical ID from the server.
  • the receiver 200 recognizes an area corresponding to the recognition information in the captured display image Po as a target area. For example, the receiver 200 recognizes an area around the transmitter 100 as a target area.
  • the receiver 200 superimposes the AR image P15 on the target area, and displays the captured display image Po on which the AR image P15 is superimposed on the display 201.
  • the AR image P15 is a message that informs the user, in balloon form, of the details of the vicinity of the station indicated by the station name sign.
  • the user of the receiver 200 can display a lot of information on the receiver 200 without approaching the transmitter 100.
  • FIG. 74 is a diagram illustrating another example of the receiver 200 in the first modification of the fourth embodiment.
  • the receiver 200 is configured as a smartphone in the above-described example, but may be configured as a head-mounted display (also referred to as glass) including an image sensor, as in the example illustrated in FIG.
  • such a receiver 200 acquires the optical ID by performing decoding only on the decoding target area, which is a part of the decoding image.
  • specifically, the receiver 200 includes a line-of-sight detection camera 203, as shown in FIG. 74.
  • the line-of-sight detection camera 203 images the eyes of the user wearing the head-mounted display that is the receiver 200.
  • the receiver 200 detects the user's line of sight based on the eye image obtained by the imaging by the line-of-sight detection camera 203.
  • the receiver 200 displays the line-of-sight frame 204 so that the line-of-sight frame 204 appears in the area of the user's field of view toward which the detected line of sight is directed. Accordingly, the line-of-sight frame 204 moves in accordance with the movement of the user's line of sight.
  • the receiver 200 treats the area corresponding to the line-of-sight frame 204 in the decoding image as the decoding target area. That is, even if a bright line pattern area exists outside the decoding target area, the receiver 200 does not decode it, and performs decoding only on the bright line pattern area within the decoding target area.
  • since decoding is not performed on every bright line pattern area, the processing load can be reduced and the display of extraneous AR images can be suppressed.
  • likewise, the receiver 200 may decode only the bright line pattern area within the decoding target area and output only the sound corresponding to that bright line pattern area.
  • alternatively, the receiver 200 may decode each of the plurality of bright line pattern areas included in the decoding image, output at a high volume the sound corresponding to the bright line pattern area within the decoding target area, and output at a low volume the sound corresponding to any bright line pattern area outside the decoding target area.
  • further, the receiver 200 may output the sound corresponding to a bright line pattern area at a higher volume the closer that bright line pattern area is to the decoding target area.
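As an illustrative sketch of this gaze-weighted audio output; the clamping-based distance measure and the falloff constant are assumptions of this sketch:

```python
def volume_for_region(region_center, gaze_rect, max_volume=1.0):
    """Output the sound for a bright line pattern region louder the closer
    the region is to the decoding target area (the line-of-sight frame 204).
    Inside the frame the volume is maximal; outside it falls off with the
    distance to the frame's nearest edge (falloff shape is an assumption)."""
    gx = min(max(region_center[0], gaze_rect[0]), gaze_rect[0] + gaze_rect[2])
    gy = min(max(region_center[1], gaze_rect[1]), gaze_rect[1] + gaze_rect[3])
    distance = ((region_center[0] - gx) ** 2 + (region_center[1] - gy) ** 2) ** 0.5
    return max_volume / (1.0 + 0.01 * distance)

gaze = (100, 100, 200, 150)                 # line-of-sight frame (x, y, w, h)
print(volume_for_region((150, 150), gaze))  # inside the frame -> 1.0
print(volume_for_region((600, 400), gaze))  # far away -> much quieter
```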
  • FIG. 75 is a diagram illustrating another example in which the receiver 200 in the first modification of the fourth embodiment displays an AR image.
  • the transmitter 100 is configured as an image display device having a display panel, and transmits an optical ID by changing luminance while displaying an image on the display panel.
  • the receiver 200 acquires the captured display image Pp and the decoding image in the same manner as described above by imaging the transmitter 100.
  • the receiver 200 specifies, from the captured display image Pp, an area having the same position as the bright line pattern area in the decoding image and the same size as the bright line pattern area.
  • then, the receiver 200 may display, in that area, a scanning line P100 that moves repeatedly from one end of the area to the other.
  • the receiver 200 acquires the optical ID by decoding the decoding image and transmits the optical ID to the server. Then, the receiver 200 acquires the AR image corresponding to the optical ID and the recognition information from the server. The receiver 200 recognizes an area corresponding to the recognition information in the captured display image Pp as a target area.
  • when the receiver 200 recognizes such a target area, it ends the display of the scanning line P100, superimposes the AR image on the target area, and displays on the display 201 the captured display image Pp on which the AR image is superimposed.
  • since the moving scanning line P100 is displayed from when the transmitter 100 is imaged until the AR image is displayed, the user can be notified that processing such as reading the optical ID and acquiring the AR image is in progress.
  • FIG. 76 is a diagram illustrating another example in which the receiver 200 in the first modification of the fourth embodiment displays an AR image.
  • each of the two transmitters 100 is configured as an image display device having a display panel, for example as shown in FIG. 76, and transmits an optical ID by changing its luminance while displaying the same still image PS on the display panel.
  • the two transmitters 100 transmit different optical IDs (for example, optical IDs “01” and “02”) by changing the luminance in different manners.
  • the receiver 200 acquires the captured display image Pq and the decoding image by imaging the two transmitters 100 in the same manner as described above.
  • the receiver 200 acquires the optical IDs “01” and “02” by decoding the decoding image. That is, the receiver 200 receives the optical ID “01” from one of the two transmitters 100 and receives the optical ID “02” from the other.
  • the receiver 200 transmits those optical IDs to the server.
  • the receiver 200 acquires the AR image P16 corresponding to the optical ID “01” and the recognition information from the server.
  • the receiver 200 acquires the AR image P17 corresponding to the optical ID “02” and the recognition information from the server.
  • the receiver 200 recognizes an area corresponding to the recognition information in the captured display image Pq as a target area. For example, the receiver 200 recognizes an area where the display panels of the two transmitters 100 are displayed as the target area. Then, the receiver 200 superimposes the AR image P16 on the target area corresponding to the light ID “01”, and superimposes the AR image P17 on the target area corresponding to the light ID “02”. Then, the receiver 200 displays the captured display image Pq on which the AR images P16 and P17 are superimposed on the display 201.
  • the AR image P16 is a moving image whose first picture in display order is the same or substantially the same as the still image PS displayed on the display panel of the transmitter 100 corresponding to the optical ID “01”. Similarly, the AR image P17 is a moving image whose first picture in display order is the same or substantially the same as the still image PS displayed on the display panel of the transmitter 100 corresponding to the optical ID “02”. That is, the first pictures of the moving images P16 and P17 are the same, but the AR image P16 and the AR image P17 are different moving images whose pictures other than the first differ.
  • thereby, the receiver 200 can display the captured display image Pq as if image display devices that reproduce different moving images starting from the same picture actually existed.
  • FIG. 77 is a flowchart illustrating an example of a processing operation of the receiver 200 in the first modification of the fourth embodiment.
  • the processing operation shown in the flowchart of FIG. 77 is an example of the processing operation of the receiver 200 when it individually images each of the two transmitters 100 shown in FIG. 76.
  • the receiver 200 acquires the first light ID by imaging the first transmitter 100 as the first subject (step S201).
  • the receiver 200 recognizes the first subject from the captured display image (step S202). That is, the receiver 200 acquires the first AR image and the first recognition information corresponding to the first light ID from the server, and recognizes the first subject based on the first recognition information.
  • the receiver 200 starts reproduction of the first moving image that is the first AR image from the beginning (step S203). That is, the receiver 200 starts reproduction from the first picture of the first moving image.
  • the receiver 200 determines whether or not the first subject is out of the captured display image (step S204). That is, the receiver 200 determines whether or not the first subject cannot be recognized from the captured display image. Here, if it is determined that the first subject has deviated from the captured display image (Y in step S204), the receiver 200 interrupts the reproduction of the first moving image that is the first AR image (step S205).
  • next, the receiver 200 determines whether or not a second light ID, different from the first light ID acquired in step S201, has been acquired by imaging, as a second subject, a second transmitter 100 different from the first transmitter 100 (step S206).
  • if the receiver 200 determines that the second light ID has been acquired (Y in step S206), it performs for the second light ID the same processing as in steps S202 to S203. That is, the receiver 200 recognizes the second subject from the captured display image (step S207).
  • the receiver 200 starts reproduction of the second moving image that is the second AR image corresponding to the second optical ID from the beginning (step S208). That is, the receiver 200 starts playback from the first picture of the second moving image.
  • on the other hand, if the receiver 200 determines in step S206 that the second light ID has not been acquired (N in step S206), it determines whether or not the first subject has entered the captured display image again (step S209). That is, the receiver 200 determines whether or not the first subject is recognized again from the captured display image.
  • if the receiver 200 determines that the first subject has entered the captured display image again (Y in step S209), it further determines whether or not a predetermined time has elapsed (step S210). That is, the receiver 200 determines whether or not the predetermined time has elapsed from when the first subject left the captured display image until it entered again.
  • if the receiver 200 determines that the predetermined time has not elapsed (N in step S210), it resumes reproduction of the interrupted first moving image from the middle (step S211).
  • here, the reproduction-resume top picture, that is, the picture of the first moving image displayed first when reproduction resumes from the middle, may be the picture that, in display order, immediately follows the picture displayed last when the reproduction of the first moving image was interrupted. Alternatively, the reproduction-resume top picture may be a picture that precedes the last displayed picture by n pictures (n is an integer of 1 or more) in display order.
  • on the other hand, if the receiver 200 determines that the predetermined time has elapsed (Y in step S210), it starts reproduction of the interrupted first moving image from the beginning (step S212).
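A minimal sketch of the resume decision in steps S210 to S212; the function and parameter names are this sketch's own:

```python
def resume_frame(last_shown: int, elapsed_s: float, timeout_s: float, n: int = 0) -> int:
    """If the first subject stayed out of the frame for the predetermined
    time or longer, restart the first moving image from the beginning (S212);
    otherwise resume from the middle (S211). With n = 0 playback resumes at
    the picture after the last one shown; with n >= 1 it resumes n pictures
    before the last one shown, as the text allows."""
    if elapsed_s >= timeout_s:
        return 0                       # restart from the first picture
    if n >= 1:
        return max(0, last_shown - n)  # back up n pictures
    return last_shown + 1              # the picture following the last shown

print(resume_frame(last_shown=120, elapsed_s=2.0, timeout_s=5.0))   # 121
print(resume_frame(last_shown=120, elapsed_s=10.0, timeout_s=5.0))  # 0
```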
  • when the receiver 200 superimposes the AR image on the target area of the captured display image, it may adjust the brightness of the AR image. That is, the receiver 200 determines whether or not the brightness of the AR image acquired from the server matches the brightness of the target area of the captured display image. If it determines that they do not match, the receiver 200 adjusts the brightness of the AR image so that it matches the brightness of the target area, and superimposes the brightness-adjusted AR image on the target area of the captured display image. Thereby, the superimposed AR image can be brought closer to the image of a real object, and the user's sense of incongruity toward the AR image can be suppressed.
  • here, the brightness of the AR image is the spatial average luminance of the AR image, and the brightness of the target area is likewise the spatial average luminance of the target area.
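Matching the spatial average luminances can be sketched as a simple gain adjustment; the embodiment specifies only that the averages are made to match, so the gain approach below is an assumption:

```python
def match_brightness(ar_pixels, target_pixels):
    """Scale the AR image so that its spatial average luminance equals that
    of the target area (a simple per-pixel gain; method is an assumption)."""
    avg = lambda px: sum(px) / len(px)
    ar_avg, target_avg = avg(ar_pixels), avg(target_pixels)
    gain = target_avg / ar_avg if ar_avg else 1.0
    return [min(255, round(p * gain)) for p in ar_pixels]

ar = [200, 220, 240]     # bright AR image (average 220)
target = [90, 110, 130]  # darker target area (average 110)
print(match_brightness(ar, target))  # averages now match: [100, 110, 120]
```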
  • the receiver 200 may also enlarge the AR image and display it on the entire display 201, for example when the AR image is tapped.
  • in the above example, the receiver 200 switches a tapped AR image to another AR image, but it may also switch the AR image automatically, without any tap. For example, when the AR image has been displayed for a predetermined time, the receiver 200 switches it to another AR image and displays that image. Likewise, when the current time reaches a predetermined time, the receiver 200 switches the AR image displayed so far to another AR image and displays it. Thereby, the user can easily see a new AR image without performing any operation.
  • FIG. 78 is a diagram illustrating an example of a problem when displaying an AR image assumed in the receiver 200 according to the fourth embodiment or the modification 1 thereof.
  • the receiver 200 in the fourth embodiment or its modification example 1 captures the subject at time t1.
  • the above-described subject is a transmitter such as a television that transmits a light ID according to a change in luminance, or a poster, a guide board, or a signboard illuminated by light from the transmitter.
  • at this time, the receiver 200 displays on the display 201, as the captured display image, the entire image obtained from the effective pixel area of the image sensor (hereinafter referred to as the entire captured image).
  • the receiver 200 recognizes, in the captured display image, an area corresponding to the recognition information acquired based on the light ID as a target area on which the AR image is superimposed.
  • the target area is an area indicating an image of a transmitter such as a television or an image of a poster, for example. Then, the receiver 200 superimposes the AR image on the target area of the captured display image, and displays the captured display image on which the AR image is superimposed on the display 201.
  • the AR image may be a still image or a moving image, or a character string including one or more characters or symbols.
  • however, at time t2, the region in the image sensor corresponding to the target area (hereinafter referred to as the recognition area) protrudes from the effective pixel area.
  • the recognition area is an area in which an image of the target area in the captured display image is projected in the effective pixel area of the image sensor. That is, the effective pixel area and the recognition area in the image sensor correspond to the captured display image and the target area on the display 201, respectively.
  • when the recognition area protrudes from the effective pixel area, the receiver 200 cannot recognize the target area from the captured display image and therefore cannot display the AR image.
  • the receiver 200 in the present modification acquires an image having a wider angle of view than the captured display image displayed on the entire display 201 as the entire captured image.
  • FIG. 79 is a diagram illustrating an example in which the receiver 200 in the second modification of the fourth embodiment displays an AR image.
  • that is, the angle of view of the entire captured image of the receiver 200 according to this modification, in other words the angle of view of the effective pixel area of the image sensor, is wider than the angle of view of the captured display image displayed on the entire display 201.
  • an area corresponding to an image range displayed on the display 201 is hereinafter referred to as a display area.
  • the receiver 200 images the subject at time t1.
  • At this time, the receiver 200 displays on the display 201, as the captured display image, only the image obtained by the display area, which is narrower than the effective pixel area, out of the entire captured image obtained by the effective pixel area of the image sensor. The receiver 200 then recognizes, in the entire captured image, the area corresponding to the recognition information acquired based on the light ID as the target area on which the AR image is to be superimposed. Then, the receiver 200 superimposes the AR image on the target area of the captured display image and displays the captured display image with the AR image superimposed on the display 201.
  • When the receiver 200 approaches the subject at time t2, the recognition area in the image sensor expands and protrudes from the display area. That is, the image of the target region (for example, a poster image) protrudes from the captured display image displayed on the display 201. However, the recognition area does not protrude from the effective pixel area; that is, the receiver 200 still acquires an entire captured image including the target area at time t2. As a result, the receiver 200 can recognize the target area from the entire captured image, superimposes the corresponding part of the AR image only on the part of the target area remaining within the captured display image, and displays it on the display 201. Thereby, the display of the AR image can be continued.
  • FIG. 80 is a flowchart illustrating an example of a processing operation of the receiver 200 in the second modification of the fourth embodiment.
  • First, the receiver 200 acquires the entire captured image and the decoding image by capturing the subject with the image sensor (step S301). Next, the receiver 200 acquires a light ID by decoding the decoding image (step S302), transmits the light ID to the server (step S303), and acquires the AR image and the recognition information corresponding to the light ID from the server (step S304). Next, the receiver 200 recognizes the area corresponding to the recognition information in the entire captured image as the target area (step S305).
  • Next, the receiver 200 determines whether or not the recognition area, that is, the area of the effective pixel area of the image sensor corresponding to the image of the target area, protrudes from the display area (step S306). If it is determined to protrude (Yes in step S306), the receiver 200 superimposes only the part of the AR image corresponding to the part of the target area remaining within the captured display image, and displays it (step S307). On the other hand, if it is determined not to protrude (No in step S306), the receiver 200 superimposes the AR image on the target area of the captured display image and displays the captured display image with the AR image superimposed (step S308). Then, the receiver 200 determines whether or not the AR image display process should be terminated (step S309), and if it determines that the process should not be terminated (No in step S309), repeats the processing from step S305.
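The flow of steps S301 to S309 might be sketched as follows. Every helper on `receiver` and `server` is hypothetical, standing in for behavior the flowchart leaves unspecified.

```python
def display_ar_loop(receiver, server):
    """Sketch of steps S301 to S309 of FIG. 80 (all helpers are hypothetical)."""
    entire_image, decode_image = receiver.capture()       # S301: entire captured image + decoding image
    light_id = receiver.decode_light_id(decode_image)     # S302: decode the light ID
    ar_image, recognition_info = server.lookup(light_id)  # S303/S304: query the server
    while not receiver.display_should_end():              # S309: end condition
        target = receiver.find_target_area(entire_image, recognition_info)  # S305
        if receiver.recognition_area_protrudes(target):   # S306: recognition area beyond display area?
            receiver.show_partial_ar(ar_image, target)    # S307: only the part still inside the display
        else:
            receiver.show_full_ar(ar_image, target)       # S308: full AR image on the target area
        entire_image = receiver.capture_entire()          # next frame
```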
  • FIG. 81 is a diagram illustrating another example in which the receiver 200 in the second modification of the fourth embodiment displays an AR image.
  • the receiver 200 may switch the screen display of the AR image according to the ratio of the size of the recognition area to the display area.
  • Specifically, where the horizontal and vertical widths of the display area are w1 and h1 and the horizontal and vertical widths of the recognition area are w2 and h2, the receiver 200 compares the larger of the ratios (h2/h1) and (w2/w1) with a threshold. While displaying the captured display image with the AR image superimposed on the target region, as shown in (screen display 1) of FIG. 81, the receiver 200 compares the larger ratio with a first threshold (for example, 0.9). When the larger ratio exceeds the first threshold, the receiver 200 enlarges the AR image and displays it on the entire display 201, as shown in (screen display 2) of FIG. 81. Note that even when the recognition area becomes larger than the display area, and further when it becomes larger than the effective pixel area, the receiver 200 continues to display the enlarged AR image on the entire display 201. While the AR image is displayed enlarged on the entire display 201, the receiver 200 compares the larger ratio with a second threshold (for example, 0.7). The second threshold is smaller than the first threshold. When the larger ratio falls below the second threshold, the receiver 200 again displays the captured display image with the AR image superimposed on the target area, as shown in (screen display 1) of FIG. 81.
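This two-threshold scheme is a hysteresis. A minimal sketch, assuming the example threshold values 0.9 and 0.7 given above:

```python
FIRST_THRESHOLD = 0.9   # example value from the text
SECOND_THRESHOLD = 0.7  # example value from the text; smaller than the first

def next_screen_mode(current_mode: str, w1: float, h1: float, w2: float, h2: float) -> str:
    """Switch between 'superimposed' (screen display 1) and 'full_screen'
    (screen display 2) based on the larger of the size ratios, as in FIG. 81."""
    ratio = max(h2 / h1, w2 / w1)  # larger ratio of recognition area to display area
    if current_mode == "superimposed" and ratio > FIRST_THRESHOLD:
        return "full_screen"       # recognition area nearly fills the display area
    if current_mode == "full_screen" and ratio < SECOND_THRESHOLD:
        return "superimposed"      # recognition area has shrunk well below the display area
    return current_mode            # between the thresholds, keep the current mode (hysteresis)
```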
  • FIG. 82 is a flowchart illustrating another example of the processing operation of the receiver 200 in the second modification of the fourth embodiment.
  • First, the receiver 200 performs light ID processing (step S301a). This light ID processing includes steps S301 to S304 shown in FIG. 80.
  • the receiver 200 recognizes an area corresponding to the recognition information in the captured display image as a target area (step S311). Then, the receiver 200 superimposes the AR image on the target area of the captured display image, and displays the captured display image on which the AR image is superimposed (step S312).
  • the receiver 200 repeatedly executes the processing from step S314.
  • The receiver 200 then superimposes the AR image on the target area of the captured display image and displays the captured display image with the AR image superimposed (step S316).
  • the receiver 200 determines whether or not to end the AR image display process (step S317). If the receiver 200 determines that the AR image display process should not be ended (No in step S317), the receiver 200 repeatedly executes the processes from step S313.
  • Since the second threshold is smaller than the first threshold, the screen display of the receiver 200 can be prevented from switching frequently between (screen display 1) and (screen display 2), and the state of the screen display can be stabilized.
  • the display area and the effective pixel area may be the same or different.
  • When they are different, the ratio of the size of the recognition area to the display area is used as described above; however, the ratio of the size of the recognition area to the effective pixel area may be used instead of the display area.
  • FIG. 83 is a diagram illustrating another example in which the receiver 200 according to the second modification of the fourth embodiment displays an AR image.
  • the image sensor of the receiver 200 has an effective pixel area wider than the display area.
  • the receiver 200 images the subject at time t1.
  • At this time, the receiver 200 displays on the display 201, as the captured display image, only the image obtained by the display area, which is narrower than the effective pixel area, out of the entire captured image obtained by the effective pixel area of the image sensor. The receiver 200 then recognizes, in the entire captured image, the area corresponding to the recognition information acquired based on the light ID as the target area on which the AR image is to be superimposed. Then, the receiver 200 superimposes the AR image on the target area of the captured display image and displays the captured display image with the AR image superimposed on the display 201.
  • the recognition area in the image sensor moves, for example, in the upper left direction in FIG. 83, and protrudes from the display area at time t2. That is, an image of the target region (for example, a poster image) protrudes from the captured display image displayed on the display 201.
  • However, the recognition area does not protrude from the effective pixel area; that is, the receiver 200 still acquires an entire captured image including the target area at time t2. As a result, the receiver 200 can recognize the target area from the entire captured image, superimposes the corresponding part of the AR image only on the part of the target area remaining within the captured display image, and displays it on the display 201.
  • At this time, the receiver 200 changes the size and position of the displayed part of the AR image in accordance with the movement of the recognition area in the image sensor, that is, the movement of the target area in the entire captured image.
  • When the recognition area protrudes from the display area as described above, the receiver 200 compares the number of pixels corresponding to the distance between the edge of the effective pixel area and the edge of the recognition area (hereinafter referred to as the inter-area distance) with a threshold. Specifically, let dh be the number of pixels corresponding to the shorter (hereinafter referred to as the first distance) of the distance between the upper side of the effective pixel area and the upper side of the recognition area and the distance between the lower side of the effective pixel area and the lower side of the recognition area. Likewise, let dw be the number of pixels corresponding to the shorter (hereinafter referred to as the second distance) of the distance between the left side of the effective pixel area and the left side of the recognition area and the distance between the right side of the effective pixel area and the right side of the recognition area. The above-mentioned inter-area distance is the shorter of the first distance and the second distance.
  • That is, the receiver 200 compares the smaller of the pixel counts dw and dh with the threshold N. Then, for example, when the smaller pixel count becomes equal to or smaller than the threshold N at time t2, the receiver 200 fixes the size and position of the displayed part of the AR image without changing them according to the position of the recognition area in the image sensor. That is, the receiver 200 switches the screen display of the AR image. For example, the receiver 200 fixes the size and position of the displayed part of the AR image to the size and position of the part of the AR image displayed on the display 201 at the moment the smaller pixel count reached the threshold N.
  • Even when the recognition area moves further at time t3 and protrudes from the effective pixel area, the receiver 200 continues to display the part of the AR image in the same manner as at time t2. That is, as long as the smaller of the pixel counts dw and dh is equal to or smaller than the threshold N, the receiver 200 continues to superimpose on the captured display image the part of the AR image whose size and position are fixed, as at time t2.
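A minimal sketch of this freezing behavior, with rectangles given as (left, top, right, bottom) in sensor coordinates; the threshold value and the `state` object with its helpers are assumptions:

```python
THRESHOLD_N = 16  # pixel-count threshold (illustrative value)

def update_partial_ar(state, eff_rect, rec_rect):
    """Freeze the displayed part of the AR image when the recognition area comes
    within THRESHOLD_N pixels of the effective pixel area edge, as in FIG. 83."""
    # dh: shorter of the top-to-top and bottom-to-bottom distances between the areas
    dh = min(rec_rect[1] - eff_rect[1], eff_rect[3] - rec_rect[3])
    # dw: shorter of the left-to-left and right-to-right distances between the areas
    dw = min(rec_rect[0] - eff_rect[0], eff_rect[2] - rec_rect[2])
    if min(dw, dh) <= THRESHOLD_N:
        if not state.frozen:
            state.freeze_current_size_and_position()  # keep size/position from this moment on
            state.frozen = True
    else:
        state.frozen = False
        state.follow_recognition_area(rec_rect)       # track the target area normally
```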
  • In the example shown in FIG. 83, the receiver 200 changes the size and position of the displayed part of the AR image in accordance with the movement of the recognition area in the image sensor; however, the receiver 200 may instead change the display magnification and position of the entire AR image.
  • FIG. 84 is a diagram illustrating another example in which the receiver 200 according to the second modification of the fourth embodiment displays an AR image. Specifically, FIG. 84 shows an example in which the display magnification of the AR image is changed.
  • In the example shown in FIG. 84, the recognition area in the image sensor moves, for example, in the upper left direction in FIG. 84 and protrudes from the display area at time t2. That is, the image of the target region (for example, a poster image) protrudes from the captured display image displayed on the display 201. However, the recognition area does not protrude from the effective pixel area; that is, the receiver 200 still acquires an entire captured image including the target area at time t2. As a result, the receiver 200 can recognize the target area from the entire captured image.
  • In this case, the receiver 200 changes the display magnification of the AR image so that the size of the entire AR image matches the size of the part of the target region remaining within the captured display image. That is, the receiver 200 reduces the AR image. Then, the receiver 200 superimposes the AR image whose display magnification has been changed (that is, reduced) on that part and displays it on the display 201.
  • At this time, the receiver 200 changes the display magnification and position of the displayed AR image in accordance with the movement of the recognition area in the image sensor, that is, the movement of the target area in the entire captured image.
  • Specifically, the receiver 200 compares the smaller of the pixel counts dw and dh with the threshold N. Then, for example, if the smaller pixel count becomes equal to or smaller than the threshold N at time t2, the receiver 200 fixes the display magnification and position of the AR image without changing them according to the position of the recognition area in the image sensor. That is, the receiver 200 switches the screen display of the AR image. For example, the receiver 200 fixes the display magnification and position of the displayed AR image to the display magnification and position of the AR image displayed on the display 201 at the moment the smaller pixel count reached the threshold N.
  • Further, even when the recognition area moves further at time t3 and protrudes from the effective pixel area, the receiver 200 continues to display the AR image in the same manner as at time t2. That is, as long as the smaller of the pixel counts dw and dh is equal to or smaller than the threshold N, the receiver 200 continues to superimpose on the captured display image the AR image whose display magnification and position are fixed, as at time t2.
  • In the examples above, the smaller of the pixel counts dw and dh is compared with the threshold; alternatively, the ratio based on the smaller pixel count may be compared with a threshold.
  • the ratio of the number of pixels dw is, for example, the ratio of the number of pixels dw to the number of pixels w0 in the horizontal direction of the effective pixel region (dw / w0).
  • the ratio of the number of pixels dh is, for example, the ratio of the number of pixels dh to the number of pixels h0 in the vertical direction of the effective pixel region (dh / h0).
  • Alternatively, the ratios of the pixel counts dw and dh may be expressed using the number of pixels in the horizontal or vertical direction of the display area instead of the number of pixels in the horizontal or vertical direction of the effective pixel area.
  • the threshold value compared with the ratio of the number of pixels dw and dh is, for example, 0.05.
  • Alternatively, the angle of view corresponding to the smaller of the pixel counts dw and dh may be compared with a threshold. For example, when the angle of view corresponding to m pixels is θ, the angle of view corresponding to the dw pixels is θ × dw / m, and the angle of view corresponding to the dh pixels is θ × dh / m.
  • In the examples shown in FIGS. 83 and 84, the receiver 200 switches the screen display of the AR image based on the inter-area distance between the effective pixel area and the recognition area; however, the screen display of the AR image may instead be switched based on the relationship between the display area and the recognition area.
  • FIG. 85 is a diagram illustrating another example in which the receiver 200 in the second modification of the fourth embodiment displays an AR image. Specifically, FIG. 85 shows an example in which the screen display of the AR image is switched based on the relationship between the display area and the recognition area. In the example shown in FIG. 85, as in the example shown in FIG. 79, the image sensor of the receiver 200 has an effective pixel area wider than the display area.
  • the receiver 200 images the subject at time t1.
  • At this time, the receiver 200 displays on the display 201, as the captured display image, only the image obtained by the display area, which is narrower than the effective pixel area, out of the entire captured image obtained by the effective pixel area of the image sensor. The receiver 200 then recognizes, in the entire captured image, the area corresponding to the recognition information acquired based on the light ID as the target area on which the AR image is to be superimposed. Then, the receiver 200 superimposes the AR image on the target area of the captured display image and displays the captured display image with the AR image superimposed on the display 201.
  • the receiver 200 changes the position of the displayed AR image according to the movement of the recognition area in the image sensor.
  • When the receiver 200 moves, the recognition area in the image sensor moves, for example, in the upper left direction in FIG. 85, and at time t2 a part of the edge of the recognition area coincides with a part of the edge of the display area. That is, the image of the target area (for example, a poster image) reaches the edge of the captured display image displayed on the display 201. At this time, the receiver 200 superimposes the AR image on the target area at the corner of the captured display image and displays it on the display 201.
  • the receiver 200 fixes the AR image displayed at time t2 without changing the size and position. That is, the receiver 200 switches the screen display of the AR image.
  • the receiver 200 continues to display the AR image similarly to the time t2 even when the recognition area further moves at the time t3 and protrudes from the effective pixel area. In other words, as long as the recognition area extends beyond the display area, the receiver 200 superimposes the AR image having the same size as that at time t2 on the same position as at time t2 in the captured display image. Continue to display.
  • the receiver 200 switches the screen display of the AR image depending on whether or not the recognition area protrudes from the display area.
  • Here, the receiver 200 may use, instead of the display area, a determination area that includes the display area and is larger than the display area but smaller than the effective pixel area. In this case, the receiver 200 switches the screen display of the AR image depending on whether or not the recognition area protrudes from the determination area.
  • the screen display of the AR image has been described with reference to FIGS. 79 to 85.
  • When the receiver 200 can no longer recognize the target area from the entire captured image, it may superimpose on the captured display image, and display, the AR image with the size of the target area that was recognized immediately before recognition became impossible.
  • FIG. 86 is a diagram illustrating another example in which the receiver 200 according to the second modification of the fourth embodiment displays an AR image.
  • the receiver 200 acquires the captured display image Pe and the decoding image in the same manner as described above by imaging the guide plate 107 illuminated by the transmitter 100.
  • Then, the receiver 200 acquires the light ID by decoding the decoding image. That is, the receiver 200 receives the light ID from the guide plate 107.
  • Here, if the surface of the guide plate 107 is dark, the receiver 200 may not be able to receive the light ID correctly even while the guide plate 107 is illuminated by the transmitter 100. In such a case, a reflecting plate 109 may be arranged near the guide plate 107.
  • Thereby, the receiver 200 can receive the light from the transmitter 100 reflected by the reflecting plate 109, that is, the visible light (specifically, the light ID) transmitted from the transmitter 100.
  • As a result, the receiver 200 can appropriately receive the light ID and display the AR image P5.
  • FIG. 87A is a flowchart illustrating a display method according to one embodiment of the present invention.
  • the display method according to one aspect of the present invention includes steps S41 to S43.
  • In step S41, a captured image is acquired by an imaging sensor capturing, as a subject, an object illuminated by a transmitter that transmits a signal according to a change in luminance of light. In step S42, the signal is decoded from the captured image. In step S43, a moving image corresponding to the decoded signal is read from a memory, and the moving image is superimposed on a target area corresponding to the subject in the captured image and displayed on a display.
  • In the display in step S43, the moving image may be displayed starting from any image that is, among the plurality of images included in the moving image, within a predetermined number of frames, in display order, of the image identical to the image showing the object.
  • the predetermined number is 10 frames.
  • For example, when the object is a still image, the display of the moving image is started from the image identical to that still image.
  • Note that the image from which the display of the moving image is started is not limited to the image identical to the still image; it may be any image within the predetermined number of frames, in display order, of the image identical to the still image.
  • the object is not limited to a still image, and may be a doll or the like.
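The choice of the frame at which playback starts might be sketched as follows; matching frames by mean absolute difference is an assumption, while the limit of 10 frames is the example given above.

```python
import numpy as np

def start_frame_index(frames, still_image, max_offset=10):
    """Return the index of the frame, among the first max_offset + 1 frames of
    the moving image, that is most similar to the captured still image.
    'frames' is a list of H x W x 3 arrays (hypothetical interface)."""
    diffs = [np.abs(f.astype(np.int16) - still_image.astype(np.int16)).mean()
             for f in frames[:max_offset + 1]]
    return int(np.argmin(diffs))  # playback then starts at frames[returned index]
```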
  • the imaging sensor and the captured image are, for example, the image sensor and the entire captured image in the fourth embodiment.
  • The illuminated still image may be a still image displayed on the display panel of an image display device, or may be a poster, a guide board, a signboard, or the like illuminated by light from the transmitter.
  • Such a display method may further include a transmission step of transmitting a signal to the server and a reception step of receiving a moving image corresponding to the signal from the server.
  • Thereby, the moving image can be displayed so that the still image appears to start moving, and an image useful to the user can be displayed.
  • The still image may have an outer frame of a predetermined color, and the display method according to one aspect of the present invention may further include a recognition step of recognizing the target area from the captured image based on the predetermined color.
  • In this case, in step S43, the moving image is resized to match the size of the recognized target region, and the resized moving image is superimposed on the target region in the captured image and displayed on the display.
  • The outer frame of the predetermined color is, for example, a white or black rectangular frame surrounding the still image, and is indicated by the recognition information in the fourth embodiment. That is, the AR image in the fourth embodiment is resized and superimposed as the moving image. Thereby, the moving image can be displayed more realistically, as if it actually existed as the subject.
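A sketch of recognizing the target area by its colored outer frame and resizing the moving image onto it, using OpenCV; the color bounds and the largest-contour heuristic are assumptions, not the disclosed algorithm.

```python
import cv2
import numpy as np

def find_target_by_frame(captured_bgr: np.ndarray, lower, upper):
    """Return the bounding rectangle (x, y, w, h) of the outer frame of a
    predetermined color, or None if no frame-colored region is found."""
    mask = cv2.inRange(captured_bgr, lower, upper)  # pixels within the frame color range
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)    # assume the frame is the largest region
    return cv2.boundingRect(largest)

def overlay_resized(captured_bgr: np.ndarray, video_frame: np.ndarray, rect) -> np.ndarray:
    """Resize one picture of the moving image to the target area and paste it in."""
    x, y, w, h = rect
    captured_bgr[y:y + h, x:x + w] = cv2.resize(video_frame, (w, h))
    return captured_bgr
```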
  • In step S43, if the projection area onto which the subject is projected within the imaging area is larger than the display area, the image obtained by the portion of the projection area that exceeds the display area need not be displayed on the display.
  • Here, the imaging area and the projection area are, for example, the effective pixel area and the recognition area of the image sensor.
  • Even when the imaging sensor approaches the still image that is the subject, so that part of the image obtained by the projection area (the recognition area in FIG. 79) is not displayed on the display, the entire still image that is the subject may still be projected onto the imaging area. Therefore, in this case, the still image that is the subject can be appropriately recognized, and the moving image can be appropriately superimposed on the target area corresponding to the subject in the captured image.
  • Further, where the horizontal and vertical widths of the display area are w1 and h1 and the horizontal and vertical widths of the projection area are w2 and h2, in step S43, when the larger of h2/h1 and w2/w1 is greater than a predetermined value, the moving image may be displayed on the entire screen of the display, and when the larger of h2/h1 and w2/w1 is smaller than the predetermined value, the moving image may be superimposed on the target area in the captured image and displayed on the display.
  • Thereby, when the imaging sensor is brought close to the still image, the moving image is displayed on the full screen, so the user does not need to bring the imaging sensor even closer to the still image to enlarge the moving image. Therefore, it is possible to prevent a situation in which the imaging sensor is so close to the still image that the projection area (the recognition area in FIG. 81) protrudes from the imaging area (the effective pixel area) and the signal can no longer be decoded.
  • the display method according to one aspect of the present invention may further include a control step of turning off the operation of the imaging sensor when a moving image is displayed on the entire screen of the display.
  • For example, as in step S314 of FIG. 82, power consumption can be suppressed by turning off the operation of the imaging sensor.
  • In step S43, if the target area can no longer be recognized from the captured image due to movement of the imaging sensor, the moving image may be displayed in the same size as the target area recognized immediately before recognition became impossible. The situation in which the target area cannot be recognized from the captured image is, for example, one in which at least part of the target area corresponding to the still image that is the subject is not included in the captured image. In this case, a moving image of the same size as the target area recognized immediately before is displayed, as at time t3 described above. Accordingly, it is possible to prevent at least part of the moving image from no longer being displayed due to movement of the imaging sensor.
  • In step S43, when, due to movement of the imaging sensor, only part of the target area is included in the area of the captured image displayed on the display, the part of the spatial region of the moving image corresponding to that part of the target area may be superimposed on it and displayed on the display. Here, a part of the spatial region of the moving image is a part of each picture constituting the moving image.
  • In step S43, if the target area can no longer be recognized from the captured image due to movement of the imaging sensor, the part of the spatial region of the moving image corresponding to the part of the target area displayed immediately before recognition became impossible may continue to be displayed. For example, at time t3 in FIG. 83, the part of the spatial region of the moving image (the AR image in FIG. 83) displayed at time t2 continues to be displayed.
  • In step S43, where the horizontal and vertical widths of the imaging area of the imaging sensor are w0 and h0, and dw and dh are the numbers of pixels corresponding to the horizontal and vertical distances between the projection area onto which the subject is projected and the imaging area, it may be determined that the target area cannot be recognized when dw/w0 or dh/h0 is equal to or less than a predetermined value. Here, the projection area is, for example, the recognition area shown in FIG. 83. Alternatively, when the angle of view corresponding to the shorter of the horizontal and vertical distances between the projection area onto which the subject is projected in the imaging area of the imaging sensor and the imaging area is equal to or less than a predetermined value, it may be determined that the target area cannot be recognized.
  • FIG. 87B is a block diagram illustrating a structure of a display device according to one embodiment of the present invention.
  • the display device A10 includes an imaging sensor A11, a decoding unit A12, and a display control unit A13.
  • the imaging sensor A11 acquires a captured image by capturing a still image illuminated by a transmitter that transmits a signal according to a change in the luminance of light as a subject.
  • the decoding unit A12 is a decoding unit that decodes a signal from the captured image.
  • the display control unit A13 reads out the moving image corresponding to the decoded signal from the memory, and displays the moving image on the display by superimposing the moving image on the target area corresponding to the subject in the captured image.
  • the display control unit A13 displays the plurality of images in order from the first image that is the same image as the still image among the plurality of images included in the moving image.
  • the imaging sensor A11 may include a plurality of micromirrors and a photosensor
  • the display device A10 may further include an imaging control unit that controls the imaging sensor.
  • the imaging control unit identifies a region including a signal as a signal region in the captured image, and controls the angle of the micromirror corresponding to the identified signal region among the plurality of micromirrors.
  • Then, the imaging control unit causes the photosensor to receive only the light reflected by those of the plurality of micromirrors whose angles have been controlled.
  • each component may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
  • Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
  • the program causes the computer to execute the display method shown by the flowcharts of FIGS. 77, 80, 82, and 87A.
  • The display method according to one or more aspects has been described above based on the embodiments and modifications; however, the present invention is not limited to these embodiments. Forms obtained by applying various modifications conceived by those skilled in the art to the present embodiments, and forms constructed by combining components of different embodiments and modifications, may also be included within the scope of the present invention, as long as they do not depart from the gist of the present invention.
  • FIG. 88 is a diagram showing an example of expansion and movement of the AR image.
  • the receiver 200 superimposes the AR image P21 on the target area of the captured display image Ppre, as in the fourth embodiment or the first or second modification thereof. Then, the receiver 200 displays the captured display image Ppre on which the AR image P21 is superimposed on the display 201.
  • the AR image P21 is a moving image.
  • When receiving an instruction to change the size, the receiver 200 changes the size of the AR image P21 in accordance with the instruction. For example, when receiving an enlargement instruction, the receiver 200 enlarges the AR image P21 according to the instruction. The size change instruction is given by, for example, a pinch operation, a double tap, or a long press on the AR image P21 by the user. Specifically, when receiving an enlargement instruction given by a pinch-out, the receiver 200 enlarges the AR image P21 according to the instruction; conversely, when receiving a reduction instruction given by a pinch-in, the receiver 200 reduces the AR image P21 in accordance with the instruction.
  • Further, when receiving an instruction to change the position, the receiver 200 changes the position of the AR image P21 in accordance with the instruction. The position change instruction is given by, for example, a swipe on the AR image by the user. When receiving a position change instruction given by a swipe, the receiver 200 changes the position of the AR image P21 according to the instruction; that is, the AR image P21 moves.
  • FIG. 89 is a diagram illustrating an example of the enlargement of the AR image.
  • the receiver 200 superimposes the AR image P22 on the target area of the captured display image Ppre, as in the fourth embodiment or the first or second modification thereof. Then, the receiver 200 displays the captured display image Ppre on which the AR image P22 is superimposed on the display 201.
  • the AR image P22 is a still image in which a character string is described.
  • When receiving an instruction to change the size, the receiver 200 changes the size of the AR image P22 in accordance with the instruction. For example, when receiving an enlargement instruction, the receiver 200 enlarges the AR image P22 according to the instruction. As described above, the size change instruction is given by, for example, a pinch operation, a double tap, or a long press on the AR image P22 by the user. Specifically, when receiving an enlargement instruction given by a pinch-out, the receiver 200 enlarges the AR image P22 according to the instruction. When the receiver 200 then receives a further size change instruction, it changes the size of the AR image P22 according to that instruction; for example, when receiving a further enlargement instruction, the receiver 200 enlarges the AR image P22 further. By enlarging the AR image P22, the character string described in the AR image P22 can be made easier for the user to read.
  • Here, when receiving an enlargement instruction, the receiver 200 may acquire a high-resolution AR image if the enlargement ratio of the AR image corresponding to the instruction is equal to or greater than a threshold. In this case, the receiver 200 may enlarge and display the high-resolution AR image, up to the above-described enlargement ratio, instead of the original AR image already displayed. For example, the receiver 200 displays an AR image of 1920 × 1080 pixels instead of an AR image of 640 × 480 pixels.
  • Thereby, the AR image can be enlarged, and a high-resolution image that could not be obtained even by optical zoom can be displayed as if it were actually being captured as the subject.
  • FIG. 90 is a flowchart illustrating an example of processing operations related to enlargement and movement of an AR image by the receiver 200.
  • the receiver 200 starts imaging based on the normal exposure time and the communication exposure time as in step S101 shown in the flowchart of FIG. 45 (step S401).
  • a captured display image Ppre based on the normal exposure time and a decoding image (that is, a bright line image) Pdec based on the communication exposure time are periodically obtained.
  • Then, the receiver 200 acquires the light ID by decoding the decoding image Pdec.
  • the receiver 200 performs an AR image superimposition process including the processes of steps S102 to S106 shown in the flowchart of FIG. 45 (step S402).
  • the AR image is displayed superimposed on the captured display image Ppre.
  • the receiver 200 decreases the optical ID acquisition rate (step S403).
  • the light ID acquisition rate is a ratio of the number of decoding images (that is, bright line images) Pdec out of the number of captured images per unit time obtained by imaging started in step S401. For example, as the optical ID acquisition rate decreases, the number of decoding images Pdec obtained per unit time becomes smaller than the number of captured display images Ppre obtained per unit time.
  • the receiver 200 determines whether or not a size change instruction has been received (step S404). If it is determined that the size change instruction has been received (Yes in step S404), the receiver 200 further determines whether the size change instruction is an enlargement instruction (step S405). If it is determined that the size change instruction is an enlargement instruction (Yes in step S405), the receiver 200 further determines whether it is necessary to reacquire the AR image (step S406). For example, when the receiver 200 determines that the AR image enlargement rate according to the enlargement instruction is equal to or greater than a threshold, the receiver 200 determines that the AR image needs to be reacquired.
  • If the receiver 200 determines that reacquisition is necessary (Yes in step S406), it acquires a high-resolution AR image from, for example, the server, and replaces the superimposed AR image with the high-resolution AR image (step S407).
  • Then, the receiver 200 changes the size of the AR image in accordance with the received size change instruction (step S408). That is, when a high-resolution AR image has been acquired in step S407, the receiver 200 enlarges the high-resolution AR image; when it is determined in step S406 that reacquisition of the AR image is unnecessary (No in step S406), the receiver 200 enlarges the AR image already superimposed. If it is determined in step S405 that the size change instruction is a reduction instruction (No in step S405), the receiver 200 reduces the superimposed AR image in accordance with the received size change instruction, that is, the reduction instruction.
  • If it is determined in step S404 that a size change instruction has not been received (No in step S404), the receiver 200 determines whether or not a position change instruction has been received (step S409). If it is determined that a position change instruction has been received (Yes in step S409), the receiver 200 changes the position of the superimposed AR image in accordance with the instruction (step S410); that is, the receiver 200 moves the AR image. If it is determined that a position change instruction has not been received (No in step S409), the receiver 200 repeats the processing from step S404.
  • Then, the receiver 200 determines whether or not the light ID, which has been acquired periodically since step S401, can no longer be acquired (step S411). If it determines that the light ID is no longer acquired (Yes in step S411), the receiver 200 ends the processing operation related to enlargement and movement of the AR image. On the other hand, if it determines that the light ID is still being acquired (No in step S411), the receiver 200 repeats the processing from step S404.
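Steps S405 to S408 might look as follows; the reacquisition threshold and the helpers on `state` and `server` are hypothetical stand-ins.

```python
REACQUIRE_THRESHOLD = 2.0  # enlargement ratio above which a high-resolution AR image is fetched (illustrative)

def on_resize_instruction(state, server, factor: float):
    """Sketch of steps S405 to S408 of FIG. 90 for a size change by 'factor'."""
    if factor > 1.0:                           # S405: enlargement instruction
        if factor >= REACQUIRE_THRESHOLD:      # S406: is reacquisition necessary?
            state.ar_image = server.fetch_high_res(state.light_id)  # S407: replace with high-res image
        state.scale_ar(factor)                 # S408: enlarge the (possibly replaced) AR image
    else:
        state.scale_ar(factor)                 # S408: reduction needs no reacquisition
```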
  • FIG. 91 is a diagram illustrating an example of superimposition of AR images by the receiver 200.
  • the receiver 200 superimposes the AR image P23 on the target area in the captured display image Ppre.
  • Here, the AR image P23 is configured such that the transmittance of each part increases as the part is closer to the edge of the AR image P23. The transmittance is the degree to which the image underneath the superimposed image shows through. For example, when the transmittance of the entire AR image is 100%, even if the AR image is superimposed on the target area of the captured display image, the AR image is not displayed on the display 201 and only the target area is displayed. Conversely, when the transmittance of the entire AR image is 0%, the target area of the captured display image is not displayed on the display 201 and only the AR image superimposed on the target area is displayed.
  • the transmittance of each part in the AR image P23 is higher as the part is closer to the upper end, lower end, left end, or right end of the rectangle. More specifically, the transmittance at those ends is 100%.
  • At the central portion of the AR image P23, there is a rectangular area, smaller than the AR image P23, whose transmittance is 0%. In the rectangular area, for example, "Kyoto Station" is written in English. That is, at the peripheral portion of the AR image P23, the transmittance changes stepwise from 0% to 100%, like a gradation.
  • the receiver 200 superimposes such an AR image P23 on the target area in the captured display image Ppre as shown in FIG. At this time, the receiver 200 matches the size of the AR image P23 with the size of the target area, and superimposes the resized AR image P23 on the target area.
  • On the station name sign that is the subject, "Kyoto" is written in Japanese.
  • As described above, the transmittance of each part of the AR image P23 is higher the closer the part is to the edge of the AR image P23. Therefore, when the AR image P23 is superimposed on the target area, the rectangular area at the center of the AR image P23 is displayed, but the edge of the AR image P23 is not; instead, the edge of the target area, that is, the edge of the station name sign image, is displayed.
  • Thereby, the shift between the AR image P23 and the target area can be made inconspicuous. That is, even when the AR image P23 is superimposed on the target area, a shift may occur between the AR image P23 and the target area due to movement of the receiver 200 or the like. If the transmittance of the entire AR image P23 were 0%, both the edge of the AR image P23 and the edge of the target area would be displayed in such a case, and the shift would be conspicuous. However, in the AR image P23 of the present modification, the closer a part is to the edge, the higher its transmittance, so the edge of the AR image P23 is hardly displayed, and the shift between the AR image P23 and the target area can be made inconspicuous. Furthermore, since the transmittance changes like a gradation at the peripheral portion of the AR image P23, it is difficult to notice that the AR image P23 is superimposed on the target area.
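The gradation of transmittance toward the edges corresponds to an opacity mask that fades out at the borders. A minimal NumPy sketch, where the border fraction is an assumption:

```python
import numpy as np

def edge_fade_alpha(h: int, w: int, border: float = 0.15) -> np.ndarray:
    """Opacity mask: 1.0 (transmittance 0%) in the central rectangle, ramping
    linearly to 0.0 (transmittance 100%) at the edges, like AR image P23."""
    ramp_y = np.clip(np.minimum(np.arange(h), h - 1 - np.arange(h)) / (h * border), 0, 1)
    ramp_x = np.clip(np.minimum(np.arange(w), w - 1 - np.arange(w)) / (w * border), 0, 1)
    return np.minimum.outer(ramp_y, ramp_x)  # H x W opacity values in [0, 1]

def composite(target_region: np.ndarray, ar_image: np.ndarray) -> np.ndarray:
    """Alpha-blend the AR image over the target area of the captured display image."""
    alpha = edge_fade_alpha(*ar_image.shape[:2])[..., None]
    out = alpha * ar_image.astype(np.float32) + (1 - alpha) * target_region.astype(np.float32)
    return out.astype(np.uint8)
```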
  • FIG. 92 is a diagram illustrating an example of AR image superimposition performed by the receiver 200.
  • the receiver 200 superimposes the AR image P24 on the target area in the captured display image Ppre.
  • the imaged subject is, for example, a restaurant menu. This menu is surrounded by a white frame, and the white frame is surrounded by a black frame. That is, the subject includes a menu, a white frame surrounding the menu, and a black frame surrounding the white frame.
  • In this case, the receiver 200 recognizes, as the target area, an area in the captured display image Ppre larger than the white frame image and smaller than the black frame image. Then, the receiver 200 matches the size of the AR image P24 to the size of the target area and superimposes the resized AR image P24 on the target area.
  • Thereby, even if a shift occurs between the AR image P24 and the target area, the AR image P24 continues to be displayed surrounded by the black frame, so the shift can be made inconspicuous.
  • the color of the frame is black or white, but is not limited to these colors and may be any color.
  • FIG. 93 is a diagram illustrating an example of superimposition of AR images by the receiver 200.
  • the receiver 200 images a poster on which a castle illuminated in the night sky is drawn as a subject.
  • the poster is illuminated by the above-described transmitter 100 configured as a backlight, and a visible light signal (ie, a light ID) is transmitted by the backlight.
  • the receiver 200 acquires the captured display image Ppre including the image of the subject that is the poster and the AR image P25 corresponding to the light ID by the imaging.
  • The AR image P25 has the same shape as the poster image, with the region where the castle is drawn cut out. That is, in the AR image P25, the area corresponding to the castle in the poster image is masked.
  • the AR image P25 is configured so that the transmittance of each part of the AR image P25 is higher as the part of the AR image P25 is closer to the end of the AR image P25, similarly to the AR image P23 described above.
  • In the area of the AR image P25 whose transmittance is 0%, fireworks launched into the night sky are displayed as a moving image.
  • the receiver 200 matches the size of the AR image P25 with the size of the target area that is the image of the subject, and superimposes the resized AR image P25 on the target area.
  • the castle drawn on the poster is displayed as an image of the subject, not as an AR image, and a moving image of fireworks is displayed as an AR image.
  • the captured display image Ppre can be displayed as if fireworks are actually being launched in the poster.
  • the transmittance of each part of the AR image P25 is higher as the part is closer to the end of the AR image P25. Therefore, when the AR image P25 is superimposed on the target area, even if the center portion of the AR image P25 is displayed, the end of the AR image P25 is not displayed, but the end of the target area is displayed. As a result, the deviation between the AR image P25 and the target region can be made inconspicuous. Furthermore, since the transmittance changes like a gradation at the peripheral portion of the AR image P25, it is difficult to notice that the AR image P25 is superimposed on the target region.
  • FIG. 94 is a diagram illustrating an example of superposition of AR images by the receiver 200.
  • the receiver 200 images the transmitter 100 configured as a television as a subject. Specifically, the transmitter 100 displays a castle illuminated in the night sky on a display and transmits a visible light signal (that is, a light ID).
  • The receiver 200 acquires, by the imaging, the captured display image Ppre showing the image displayed by the transmitter 100, and the AR image P26 corresponding to the light ID.
  • the receiver 200 first displays the captured display image Ppre on the display 201.
  • At this time, the receiver 200 also displays on the display 201 a message m prompting the user to turn off the room lighting. The message m is, for example, "Turn off the room lighting and darken the room."
  • the receiver 200 displays the AR image P26 superimposed on the captured display image Ppre.
  • The AR image P26 has the same size as the captured display image Ppre, and the area of the AR image P26 corresponding to the castle in the captured display image Ppre is cut out; that is, that area is masked. Therefore, the castle in the captured display image Ppre can be shown to the user through that area.
  • At the edge of the cut-out area of the AR image P26, the transmittance may change stepwise from 0% to 100%, like a gradation. In this case, the shift between the captured display image Ppre and the AR image P26 can be made inconspicuous.
  • the AR image having a high peripheral edge transmittance is superimposed on the target area of the captured display image Ppre, so that the shift between the AR image and the target area is less noticeable.
  • Alternatively, by superimposing on the captured display image Ppre an AR image that has the same size as the captured display image Ppre and is translucent overall (that is, has a transmittance of, for example, 50%), the shift between the AR image and the target area can be made inconspicuous. For example, when the captured display image Ppre is bright overall, an AR image with a uniformly low transmittance may be superimposed on the captured display image Ppre; when the captured display image Ppre is dark overall, an AR image with a uniformly high transmittance may be superimposed on the captured display image Ppre.
  • the receiver 200 displays the message m that prompts the user to turn off the light, but the light may be automatically turned off without performing such display.
  • For example, the receiver 200 outputs a turn-off signal, via Bluetooth (registered trademark), ZigBee, a specified low-power radio, or the like, to the lighting device of the room in which the transmitter 100 that is the television is installed. Thereby, the lighting device is automatically turned off.
  • FIG. 95A is a diagram illustrating an example of a captured display image Ppre obtained by imaging by the receiver 200.
  • the transmitter 100 is configured as a large display installed in a stadium. Then, the transmitter 100 displays a message indicating that, for example, fast food and drinks can be ordered with the light ID, and transmits a visible light signal (that is, a light ID). When such a message is displayed, the user images the receiver 200 toward the transmitter 100. That is, the receiver 200 images the transmitter 100 configured as a large display installed in the stadium as a subject.
  • the receiver 200 acquires the captured display image Ppre and the decoding image Pdec by the imaging. Then, the receiver 200 acquires the optical ID by decoding the decoding image Pdec, and transmits the optical ID and the captured display image Ppre to the server.
  • Upon receiving the light ID, the server specifies, from among the installation information held for each light ID, the installation information of the imaged large display associated with the light ID transmitted from the receiver 200.
  • the installation information indicates the position and orientation where the large display is installed, the size of the large display, and the like.
  • Next, the server identifies the number of the seat in the stadium at which the captured display image Ppre was captured, based on the size and orientation of the large display shown in the captured display image Ppre and on the installation information. Then, the server causes the receiver 200 to display a menu screen including the seat number.
  • FIG. 95B is a diagram showing an example of a menu screen displayed on the display 201 of the receiver 200.
  • The menu screen m1 includes, for example, for each product, an input field ma1 into which the order quantity of the product is entered, a seat field mb1 in which the seat number of the stadium specified by the server is indicated, and an order button mc1.
  • the user operates the receiver 200 to input the order quantity of the product in the input field ma1 corresponding to the desired product, and selects the order button mc1. As a result, the order is confirmed, and the receiver 200 transmits the order contents corresponding to the input result to the server.
  • When the server receives the order contents, it instructs the stadium staff to deliver the quantity of products according to the order contents to the seat with the number specified as described above.
  • FIG. 96 is a flowchart showing an example of processing operation between the receiver 200 and the server.
  • the receiver 200 first images the transmitter 100 configured as a large stadium display (step S421).
  • the receiver 200 acquires the optical ID transmitted from the transmitter 100 by decoding the decoding image Pdec obtained by the imaging (step S422).
  • the receiver 200 transmits the optical ID acquired in step S422 and the captured display image Ppre obtained by imaging in step S421 to the server (step S423).
  • When the server receives the light ID and the captured display image Ppre (step S424), it specifies the installation information of the large display installed in the stadium based on the light ID (step S425). For example, the server holds, for each light ID, a table indicating the installation information of the large display associated with that light ID, and specifies the installation information by searching the table for the installation information associated with the light ID transmitted from the receiver 200.
  • Next, based on the specified installation information and on the size and orientation of the large display shown in the captured display image Ppre, the server identifies the number of the seat in the stadium at which the captured display image Ppre was captured (step S426).
  • the server transmits the URL (Uniform Resource Locator) of the menu screen m1 including the identified seat number to the receiver 200 (step S427).
  • When the receiver 200 receives the URL of the menu screen m1 transmitted from the server (step S428), it accesses the URL and displays the menu screen m1 (step S429).
  • the user operates the receiver 200 to input the order contents into the menu screen m1, and selects the order button mc1, thereby confirming the order.
  • the receiver 200 transmits the order details to the server (step S430).
  • Upon receiving the order contents transmitted from the receiver 200, the server performs order reception processing according to the order contents (step S431). At this time, for example, the server instructs the stadium staff to deliver the quantity of products corresponding to the order contents to the seat with the number specified in step S426.
  • Since the seat number is specified based on the captured display image Ppre obtained by imaging by the receiver 200, the user of the receiver 200 does not need to take the trouble of entering the seat number when ordering a product. The user can thus easily place an order without entering the seat number.
  • the server specifies the seat number, but the receiver 200 may specify the seat number.
  • In this case, the receiver 200 acquires the installation information from the server and specifies the seat number based on the installation information and on the size and orientation of the large display shown in the captured display image Ppre.
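On the server side, steps S425 to S427 amount to a table lookup keyed by the light ID followed by a geometric estimate of the imaging position. The toy sketch below reduces that estimate to a pinhole-camera distance and a row lookup; all field names, constants, and the URL are illustrative assumptions, not the disclosed method.

```python
def handle_order_request(light_id: str, display_px_width: float, db: dict) -> str:
    """Sketch of steps S425 to S427 of FIG. 96 (toy model, not the disclosed method)."""
    info = db[light_id]  # S425: installation info (real width, camera focal length, seat map)
    # Pinhole model: distance = real width x focal length / apparent width in pixels.
    distance_m = info["width_m"] * info["focal_px"] / display_px_width  # S426, greatly simplified
    row = min(int(distance_m // info["row_depth_m"]), len(info["seats"]) - 1)
    seat = info["seats"][row]
    return f"https://example.com/menu?seat={seat}"  # S427: menu URL including the seat number

db = {"ID123": {"width_m": 20.0, "focal_px": 1500.0, "row_depth_m": 0.9,
                "seats": ["A-1", "B-7", "C-12", "D-3"]}}
print(handle_order_request("ID123", display_px_width=600.0, db=db))
```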
  • FIG. 97 is a diagram for explaining the volume of sound reproduced by the receiver 1800a.
  • the receiver 1800a receives the light ID (visible light signal) transmitted from the transmitter 1800b configured as, for example, street digital signage. Then, the receiver 1800a reproduces sound at the same timing as the image reproduction by the transmitter 1800b. That is, the receiver 1800a reproduces sound so as to be synchronized with the image reproduced by the transmitter 1800b. Note that the receiver 1800a may reproduce the same image as the image reproduced by the transmitter 1800b (reproduced image) or an AR image (AR moving image) related to the reproduced image together with the sound.
  • the receiver 1800a adjusts the volume of the sound according to the distance to the transmitter 1800b. Specifically, the receiver 1800a adjusts the volume smaller as the distance to the transmitter 1800b is longer, and conversely adjusts the volume larger as the distance to the transmitter 1800b is shorter.
  • For example, the receiver 1800a may specify the distance to the transmitter 1800b using a GPS (Global Positioning System) or the like. Specifically, the receiver 1800a acquires the position information of the transmitter 1800b associated with the light ID from a server or the like, and further specifies its own position by GPS. Then, the receiver 1800a specifies, as the distance to the transmitter 1800b, the distance between the position of the transmitter 1800b indicated by the position information acquired from the server and the specified position of the receiver 1800a. Note that the receiver 1800a may specify the distance to the transmitter 1800b using Bluetooth (registered trademark) instead of GPS.
  • the receiver 1800a may specify the distance to the transmitter 1800b based on the size of the bright line pattern region of the above-described decoding image Pdec obtained by imaging.
  • the bright line pattern region is a region formed of a plurality of bright line patterns that appear by exposure at the exposure time for communication of a plurality of exposure lines included in the image sensor of the receiver 1800a.
  • This bright line pattern area corresponds to the display area of the transmitter 1800b displayed in the captured display image Ppre.
  • Specifically, the larger the bright line pattern region, the shorter the distance the receiver 1800a specifies as the distance to the transmitter 1800b; conversely, the smaller the bright line pattern region, the longer the specified distance.
  • That is, the receiver 1800a holds distance data indicating the relationship between the size of the bright line pattern region and distance, and may specify, as the distance to the transmitter 1800b, the distance associated in that distance data with the size of the bright line pattern region in the decoding image Pdec. Note that the receiver 1800a may transmit the light ID received as described above to the server and acquire the distance data associated with that light ID from the server.
  • By adjusting the volume in this way, the user of the receiver 1800a can hear the sound reproduced by the receiver 1800a as if it were actually being reproduced by the transmitter 1800b.
  • FIG. 98 is a diagram showing the relationship between the distance from the receiver 1800a to the transmitter 1800b and the sound volume.
  • the volume increases or decreases in proportion to the distance in the range from Vmin to Vmax [dB].
  • the receiver 1800a linearly decreases the volume from Vmax [dB] to Vmin [dB] when the distance to the transmitter 1800b increases from L1 [m] to L2 [m].
  • When the distance to the transmitter 1800b is shorter than L1 [m], the receiver 1800a maintains the volume at Vmax [dB]; when the distance to the transmitter 1800b is longer than L2 [m], the volume is maintained at Vmin [dB].
  • That is, the receiver 1800a stores the maximum volume Vmax, the longest distance L1 at which sound at the maximum volume Vmax is output, the minimum volume Vmin, and the shortest distance L2 at which sound at the minimum volume Vmin is output.
  • The receiver 1800a may change the maximum volume Vmax, the minimum volume Vmin, the longest distance L1, and the shortest distance L2 according to attributes set in the receiver 1800a. For example, when the attribute is the age of the user and indicates an advanced age, the receiver 1800a may set the maximum volume Vmax higher than a reference maximum volume and set the minimum volume Vmin higher than a reference minimum volume. The attribute may also be information indicating whether audio is output from a speaker or from an earphone.
  • since the minimum volume Vmin is set in the receiver 1800a, the reproduced sound can be prevented from becoming inaudible when the receiver 1800a is too far from the transmitter 1800b. Furthermore, since the maximum volume Vmax is set in the receiver 1800a, an excessively loud sound can be prevented from being output when the receiver 1800a is too close to the transmitter 1800b.
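  • The volume curve of FIG. 98 amounts to a clamped linear interpolation. Below is a minimal sketch, assuming Vmax, Vmin, L1, and L2 are stored in the receiver as described above; the numeric values in the example are hypothetical.

```python
def playback_volume(distance_m: float, l1: float, l2: float,
                    vmax: float, vmin: float) -> float:
    """Volume [dB] as a function of distance to the transmitter (cf. FIG. 98).

    Vmax is kept for distances up to L1, Vmin beyond L2, and the volume
    decreases linearly from Vmax to Vmin between L1 and L2.
    """
    if distance_m <= l1:
        return vmax
    if distance_m >= l2:
        return vmin
    t = (distance_m - l1) / (l2 - l1)
    return vmax + t * (vmin - vmax)

# Hypothetical settings: Vmax=80 dB up to 1 m, Vmin=40 dB from 5 m onward.
print(playback_volume(3.0, l1=1.0, l2=5.0, vmax=80.0, vmin=40.0))  # 60.0
```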
  • FIG. 99 is a diagram illustrating an example of AR image superimposition performed by the receiver 200.
  • the receiver 200 images the illuminated signboard.
  • the signboard is lit up by the illumination device which is the above-described transmitter 100 that transmits the optical ID. Therefore, the receiver 200 acquires the captured display image Ppre and the decoding image Pdec by the imaging. Then, the receiver 200 acquires the optical ID by decoding the decoding image Pdec, and acquires a plurality of AR images P27a to P27c associated with the optical ID and the recognition information from the server. Based on the recognition information, the receiver 200 recognizes the periphery of the area m2 in which the signboard is displayed in the captured display image Ppre as a target area.
  • the receiver 200 recognizes an area in contact with the left side of the area m2 as the first target area, and superimposes the AR image P27a on the first target area.
  • the receiver 200 recognizes an area including the lower side of the area m2 as the second target area, and superimposes the AR image P27b on the second target area.
  • the receiver 200 recognizes an area in contact with the upper side of the area m2 as the third target area, and superimposes the AR image P27c on the third target area.
  • each of the AR images P27a to P27c is, for example, an image of a snowman character and may be a moving image.
  • the receiver 200 may switch the recognized target area to any one of the first to third target areas in a predetermined order and at a predetermined timing while continuously acquiring the optical ID. That is, the receiver 200 may switch the recognized target area in the order of the first target area, the second target area, and the third target area. Alternatively, the receiver 200 may switch the recognized target area to any one of the first to third target areas in a predetermined order each time the above-described optical ID is acquired. That is, the receiver 200 first acquires the light ID, and while continuously acquiring the light ID, recognizes the first target area as shown in (a) of FIG. 99 and superimposes the AR image P27a on the first target area. Then, when the receiver 200 can no longer acquire the optical ID, the receiver 200 hides the AR image P27a.
  • the receiver 200 recognizes the second target area as shown in FIG. 99 (b) while continuously acquiring the light ID. Then, the AR image P27b is superimposed on the second target area. Then, when the receiver 200 cannot acquire the optical ID again, the receiver 200 hides the AR image P27b.
  • the receiver 200 recognizes the third target area as shown in (c) of FIG. 99 while continuously acquiring the light ID. Then, the AR image P27c is superimposed on the third target area.
  • the receiver 200 may display a special AR image once every N times (N is an integer of 2 or more) that an AR image is displayed. N may be, for example, 200. That is, while the AR images P27a to P27c are images of the same white character, an AR image of, for example, a pink character is displayed at a frequency of once every 200 times.
  • the receiver 200 may give points to the user when receiving an operation on the AR image by the user.
  • the user's interest can be directed to the imaging of the signboard lit up by the transmitter 100.
  • the user can repeatedly obtain the optical ID.
  • FIG. 100 is a diagram illustrating an example of superimposition of AR images by the receiver 200.
  • the receiver 200 functions as a so-called way finder (Way Finder) that presents a route to be followed by the user, for example, by imaging the mark M4 drawn on the floor surface at a position where a plurality of passages intersect in the building.
  • the mark M4 is lit up by the illumination device that is the above-described transmitter 100 that transmits the light ID by a change in luminance. Therefore, the receiver 200 acquires the captured display image Ppre and the decoding image Pdec by capturing the mark M4. Then, the receiver 200 acquires the optical ID by decoding the decoding image Pdec, and transmits the optical ID and the terminal information of the receiver 200 to the server.
  • the receiver 200 acquires a plurality of AR images P28 and recognition information associated with the optical ID and terminal information from the server.
  • the optical ID and terminal information are stored in the server in association with a plurality of AR images P28 and recognition information at the time of user check-in.
  • based on the recognition information, the receiver 200 recognizes a plurality of target areas around the area m4 in which the mark M4 is displayed in the captured display image Ppre. Then, as shown in FIG. 100, the receiver 200 superimposes and displays an AR image P28, such as an animal footprint, on each of the plurality of target areas.
  • the recognition information indicates the course of turning to the right at the position of the mark M4.
  • the receiver 200 identifies a route in the captured display image Ppre and recognizes a plurality of target regions arranged along the route.
  • This route is a route that goes from the lower side of the display 201 to the region m4 and turns right in the region m4.
  • the receiver 200 arranges the AR image P28 in each of the recognized plurality of target regions as if the animal walked along the route.
  • the receiver 200 may use the geomagnetism detected by the 9-axis sensor provided in the receiver 200.
  • the recognition information indicates the direction to proceed at the position of the mark M4 with reference to the direction of geomagnetism.
  • the recognition information indicates west as the direction to proceed at the position of the mark M4.
  • the receiver 200 specifies, in the captured display image Ppre, a route from the lower side of the display 201 toward the area m4 and heading west in the area m4. Then, the receiver 200 recognizes a plurality of target areas arranged along the route. Note that the receiver 200 identifies the lower side of the display 201 by detecting gravitational acceleration with the 9-axis sensor.
  • since the route to follow is presented by the receiver 200, the user can easily reach the destination by following the route.
  • since the course is displayed as an AR image in the captured display image Ppre, it can be presented to the user in an easy-to-understand manner.
  • the illumination device which is the transmitter 100 can appropriately transmit the light ID while suppressing brightness by illuminating the mark M4 with short pulses of light.
  • the receiver 200 images the mark M4.
  • the receiver 200 may image the illumination device using a camera (a so-called self-taking camera) disposed on the display 201 side. The receiver 200 may capture both the mark M4 and the illumination device.
  • FIG. 101 is a diagram for explaining an example of how the receiver 200 obtains the line scan time.
  • the receiver 200 performs decoding using the line scan time when decoding the decoding image Pdec.
  • This line scan time is the time from the start of exposure of one exposure line included in the image sensor to the start of exposure of the next exposure line. If the line scan time is known, the receiver 200 decodes the decoding image Pdec using the known line scan time. However, when the line scan time is not known, the receiver 200 obtains the line scan time from the decoding image Pdec.
  • the receiver 200 finds a line having the minimum width from among a plurality of bright lines and a plurality of dark lines constituting a bright line pattern in the decoding image Pdec.
  • the bright line is a line on the decoding image Pdec generated when each of one or a plurality of continuous exposure lines is exposed when the luminance of the transmitter 100 is high.
  • the dark line is a line on the decoding image Pdec generated by exposure of each of one or a plurality of continuous exposure lines when the luminance of the transmitter 100 is low.
  • when the receiver 200 finds the line with the minimum width, it specifies the number of exposure lines corresponding to that line, that is, the number of pixels.
  • when the carrier frequency at which the luminance changes for the transmitter 100 to transmit the optical ID is 9.6 kHz, the time during which the luminance of the transmitter 100 is high or low is 104 µs at the shortest. Therefore, the receiver 200 calculates the line scan time by dividing 104 µs by the specified number of pixels of the minimum width.
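  • The computation described above is a single division. A minimal sketch, assuming the minimum-width bright or dark line has already been found and measured in pixels:

```python
CARRIER_FREQ_HZ = 9_600                   # carrier frequency of the luminance change
MIN_PULSE_US = 1e6 / CARRIER_FREQ_HZ      # ~104 µs: shortest high/low duration

def line_scan_time_us(min_line_width_px: int) -> float:
    """Line scan time [µs] from the width in pixels (exposure lines) of the
    narrowest bright or dark line found in the decoding image Pdec."""
    return MIN_PULSE_US / min_line_width_px

print(line_scan_time_us(10))  # ~10.4 µs per exposure line
```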
  • FIG. 102 is a diagram for explaining an example of how the receiver 200 obtains the line scan time.
  • the receiver 200 may perform a Fourier transform on the bright line pattern of the decoding image Pdec and obtain the line scan time based on the spatial frequency obtained by the Fourier transform.
  • the receiver 200 derives a spectrum indicating the relationship between the spatial frequency and the intensity of the component of the spatial frequency in the decoding image Pdec by the Fourier transform described above.
  • the receiver 200 sequentially selects each of the plurality of peaks indicated in the spectrum.
  • the receiver 200 calculates a line scan time such that the spatial frequency of the selected peak (for example, the spatial frequency f2 in FIG. 102) corresponds to a temporal frequency of 9.6 kHz.
  • 9.6 kHz is the carrier frequency of the luminance change of the transmitter 100 as described above.
  • the receiver 200 selects the most likely candidate among the plurality of line scan time candidates as the line scan time.
  • the receiver 200 calculates the allowable range of the line scan time based on the frame rate in imaging and the number of exposure lines included in the image sensor. That is, the receiver 200 calculates the maximum value of the line scan time as 1 × 10^6 [µs] / {(frame rate) × (number of exposure lines)}. Then, the receiver 200 determines the range from (the maximum value × a constant K) (where K < 1) to the maximum value as the allowable range of the line scan time.
  • K is, for example, 0.9 or 0.8.
  • the receiver 200 selects a candidate within this allowable range from among a plurality of line scan time candidates as a maximum likelihood candidate, that is, a line scan time.
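  • The candidate selection can be sketched as follows. This assumes the FFT peak frequencies are expressed in cycles per exposure line, so dividing by the 9.6 kHz temporal carrier yields seconds per exposure line; that unit convention, and taking the largest in-range candidate when several qualify, are assumptions of this sketch.

```python
def candidate_line_scan_times_us(peak_spatial_freqs, carrier_hz=9_600):
    """Line scan time candidates [µs] from FFT peak spatial frequencies.

    peak_spatial_freqs are in cycles per exposure line; dividing by the
    temporal carrier frequency [cycles/s] gives seconds per exposure line.
    """
    return [f / carrier_hz * 1e6 for f in peak_spatial_freqs]

def select_line_scan_time_us(candidates, frame_rate, num_exposure_lines, k=0.8):
    """Pick a candidate inside the allowable range as the maximum likelihood.

    Maximum line scan time = 1e6 / (frame_rate * num_exposure_lines) [µs];
    the allowable range is [K * maximum, maximum] with K < 1.
    """
    t_max = 1e6 / (frame_rate * num_exposure_lines)
    allowed = [t for t in candidates if k * t_max <= t <= t_max]
    return max(allowed) if allowed else None  # None: no reliable candidate

# Example: 30 fps, 1080 exposure lines -> t_max ~30.9 µs, range ~[24.7, 30.9].
cands = candidate_line_scan_times_us([0.10, 0.28])  # ~10.4 µs and ~29.2 µs
print(select_line_scan_time_us(cands, frame_rate=30, num_exposure_lines=1080))
```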
  • the receiver 200 may evaluate the reliability of the calculated line scan time depending on whether or not the line scan time calculated according to the example illustrated in FIG. 101 is within the above-described allowable range.
  • FIG. 103 is a flowchart showing an example of how to obtain the line scan time by the receiver 200.
  • the receiver 200 may obtain the line scan time by trying to decode the decoding image Pdec. Specifically, first, the receiver 200 starts imaging (step S441). Next, the receiver 200 determines whether or not the line scan time is known (step S442). For example, the receiver 200 may determine whether or not the line scan time is known by notifying the server of its type and model and inquiring about the line scan time for that type and model. If it is determined that the line scan time is known (Yes in step S442), the receiver 200 sets the reference acquisition count of the optical ID to n (n is an integer equal to or larger than 2, for example, 4) (step S443).
  • the receiver 200 acquires the optical ID by decoding the decoding image Pdec using the known line scan time (step S444). At this time, the receiver 200 acquires a plurality of optical IDs by performing decoding on each of the plurality of decoding images Pdec obtained sequentially by the imaging started in step S441.
  • the receiver 200 determines whether or not the same optical ID has been acquired the reference acquisition times (that is, n times) (step S445). If it is determined that it has been acquired n times (Yes in step S445), the receiver 200 trusts the optical ID and starts processing using the optical ID (for example, superimposition of an AR image) (step S446). On the other hand, if it is determined that it has not been acquired n times (No in step S445), the receiver 200 does not trust the optical ID and ends the process.
  • if it is determined in step S442 that the line scan time is not known (No in step S442), the receiver 200 sets the optical ID reference acquisition count to n + k (k is an integer equal to or greater than 1) (step S447). That is, when the line scan time is not known, the receiver 200 sets a larger reference acquisition count than when the line scan time is known. Next, the receiver 200 determines a provisional line scan time (step S448). Then, the receiver 200 acquires the optical ID by decoding the decoding image Pdec using the provisional line scan time (step S449).
  • the receiver 200 acquires a plurality of optical IDs by performing decoding on each of the plurality of decoding images Pdec obtained sequentially by the imaging started in step S441, as described above.
  • the receiver 200 determines whether or not the same optical ID has been acquired the reference acquisition times (that is, (n + k) times) (step S450).
  • if it is determined that the same optical ID has been acquired (n + k) times (Yes in step S450), the receiver 200 determines that the provisional line scan time is the correct line scan time. Then, the receiver 200 notifies the server of its type and model and the line scan time (step S451). As a result, the server stores the type and model of the receiver in association with the line scan time suitable for that receiver. Therefore, when another receiver of the same type and model starts imaging, that receiver can specify its own line scan time by making an inquiry to the server. That is, the other receiver can determine in step S442 that the line scan time is known.
  • the receiver 200 trusts the optical ID acquired (n + k) times, and starts processing using the optical ID (for example, superimposition of an AR image) (step S446).
  • on the other hand, if it is determined in step S450 that the same optical ID has not been acquired (n + k) times (No in step S450), the receiver 200 determines whether or not an end condition is satisfied (step S452).
  • the end condition is, for example, that a predetermined time has elapsed since the start of imaging, or that the optical ID has been acquired more than the maximum number of acquisitions. If it is determined that such an end condition is satisfied (Yes in step S452), the receiver 200 ends the process. On the other hand, when determining that the termination condition is not satisfied (No in step S452), the receiver 200 changes the provisional line scan time (step S453). Then, the receiver 200 repeatedly executes the processing from step S449 using the changed provisional line scan time.
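  • The flow of FIG. 103 condenses into a search loop. A minimal sketch, assuming hypothetical helpers: decode_with(t) decodes one decoding image Pdec with line scan time t and returns a light ID or None, and candidates supplies the provisional line scan times to try.

```python
from collections import Counter

def acquire_light_id(decode_with, candidates, n=4, k=2, max_tries=100):
    """Trial-decoding search for the line scan time (cf. FIG. 103).

    decode_with(t): hypothetical helper; decodes one decoding image Pdec
        using line scan time t and returns a light ID string or None.
    candidates: provisional line scan times to try in turn.
    The same ID must be obtained n + k times before it is trusted, a
    stricter threshold than the n used when the line scan time is known.
    """
    for t in candidates:
        counts = Counter()
        for _ in range(max_tries):                 # bounded end condition
            light_id = decode_with(t)
            if light_id is None:
                break                              # try the next provisional time
            counts[light_id] += 1
            if counts[light_id] >= n + k:
                # t is taken as the correct line scan time; the receiver
                # would also report (model, t) to the server here (S451).
                return light_id, t
    return None, None                              # ended without a trusted ID

# Tiny demo with a fake decoder that only works near t = 10.4 µs.
demo = lambda t: "ID42" if abs(t - 10.4) < 0.1 else None
print(acquire_light_id(demo, [30.0, 10.4]))        # ('ID42', 10.4)
```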
  • the receiver 200 can obtain the line scan time as in the examples shown in FIGS. 101 to 103. Accordingly, regardless of its type and model, the receiver 200 can appropriately decode the decoding image Pdec and obtain the optical ID.
  • FIG. 104 is a diagram illustrating an example of AR image superimposition performed by the receiver 200.
  • the receiver 200 images the transmitter 100 configured as a television.
  • the transmitter 100 periodically transmits an optical ID and a time code by changing luminance while displaying a television program, for example.
  • the time code is information indicating the time at the time of transmission each time it is transmitted, and may be, for example, a time packet shown in FIG.
  • the receiver 200 periodically acquires the captured display image Ppre and the decoding image Pdec through the above-described imaging.
  • the receiver 200 acquires the above-described optical ID and time code by decoding the decoding image Pdec while displaying the captured display image Ppre acquired periodically on the display 201.
  • the receiver 200 transmits the optical ID to the server 300.
  • when the server 300 receives the optical ID, it transmits the audio data, AR start time information, AR image P29, and recognition information associated with the optical ID to the receiver 200.
  • when the receiver 200 acquires the audio data, it reproduces the audio data in synchronization with the video of the TV program displayed on the transmitter 100. That is, the audio data is composed of a plurality of audio unit data, each of which includes a time code.
  • the receiver 200 starts reproduction of the audio data from the audio unit data whose time code indicates the same time as the time code acquired from the transmitter 100 together with the optical ID. Thereby, the reproduction of the audio data is synchronized with the video of the television program. Note that such synchronization between audio and video may be performed by the same method as the audio synchronous reproduction shown in each of the drawings after FIG.
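  • Finding the unit at which to start playback can be sketched as follows. This assumes the audio unit data is given as (time code, samples) pairs in time order; matching on the first unit not earlier than the received code is an assumption for robustness when codes do not align exactly.

```python
def start_index_for_sync(audio_units, received_time_code):
    """Index of the audio unit from which playback should start so that
    the audio is synchronized with the video of the television program.

    audio_units: list of (time_code, audio_samples) pairs in time order.
    received_time_code: the time code received from the transmitter
    together with the light ID.
    """
    for i, (tc, _samples) in enumerate(audio_units):
        if tc >= received_time_code:   # first unit not earlier than the code
            return i
    return None  # received time is past the end of the audio data

units = [(0, b"..."), (100, b"..."), (200, b"...")]  # hypothetical units
print(start_index_for_sync(units, 100))  # 1
```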
  • when the receiver 200 acquires the AR image P29 and the recognition information, it recognizes an area corresponding to the recognition information in the captured display image Ppre as the target area, and superimposes the AR image P29 on the target area.
  • the AR image P29 is an image showing a crack in the display 201 of the receiver 200
  • the target area is an area that crosses the image of the transmitter 100 in the captured display image Ppre.
  • the receiver 200 displays the captured display image Ppre on which the AR image P29 as described above is superimposed at a timing according to the AR start time information.
  • the AR start time information is information indicating the time at which the AR image P29 is to be displayed. That is, among the time codes transmitted from the transmitter 100 as needed, the receiver 200 displays the captured display image Ppre on which the above-described AR image P29 is superimposed at the timing of receiving the time code indicating the same time as the AR start time information. For example, the time indicated by the AR start time information is the time at which a scene in which a magician girl casts ice magic appears in the television program. At this time, the receiver 200 may output, from its speaker, the sound of the crack shown in the AR image P29 by reproducing the audio data.
  • the receiver 200 may vibrate a vibrator provided in the receiver 200 at the time indicated by the AR start time information, may cause its light source to emit light like a flash, or may instantaneously brighten or flash the display 201.
  • the AR image P29 may include not only an image showing a crack but also an image showing condensation on the display 201 freezing.
  • FIG. 105 is a diagram illustrating an example of AR image superimposition performed by the receiver 200.
  • the receiver 200 images the transmitter 100 configured as a toy cane, for example.
  • the transmitter 100 includes a light source, and transmits an optical ID by changing the luminance of the light source.
  • the receiver 200 periodically acquires the captured display image Ppre and the decoding image Pdec through the above-described imaging.
  • the receiver 200 acquires the above-described optical ID by decoding the decoding image Pdec while displaying the captured display image Ppre acquired periodically on the display 201.
  • the receiver 200 transmits the optical ID to the server 300.
  • when the server 300 receives the optical ID, it transmits the AR image P30 and the recognition information associated with the optical ID to the receiver 200.
  • the recognition information further includes gesture information indicating a gesture (that is, an action) by a person holding the transmitter 100.
  • the gesture information indicates, for example, a gesture in which a person moves the transmitter 100 from right to left.
  • the receiver 200 compares the gesture by the person holding the transmitter 100, as displayed in each captured display image Ppre, with the gesture indicated by the gesture information. Then, when the gestures match, the receiver 200 superimposes AR images P30 on the captured display image Ppre such that, for example, many star-shaped AR images P30 are arranged along the trajectory of the transmitter 100 moved by the gesture.
  • FIG. 106 is a diagram illustrating an example of AR image superimposition performed by the receiver 200.
  • the receiver 200 images the transmitter 100 configured as, for example, a toy cane, as described above.
  • the receiver 200 periodically acquires the captured display image Ppre and the decoding image Pdec by the imaging.
  • the receiver 200 acquires the above-described optical ID by decoding the decoding image Pdec while displaying the captured display image Ppre acquired periodically on the display 201.
  • the receiver 200 transmits the optical ID to the server 300.
  • when the server 300 receives the optical ID, it transmits the AR image P31 and the recognition information associated with the optical ID to the receiver 200.
  • the recognition information includes gesture information indicating a gesture by a person holding the transmitter 100 as described above.
  • the gesture information indicates, for example, a gesture in which a person moves the transmitter 100 from right to left.
  • the receiver 200 compares the gesture by the person holding the transmitter 100, as displayed in each captured display image Ppre, with the gesture indicated by the gesture information. Then, when the gestures match, the receiver 200 superimposes, in the captured display image Ppre, the AR image P31 showing a dress costume on the target area, which is the area in which the person holding the transmitter 100 is projected.
  • in this way, gesture information corresponding to the light ID is acquired from the server. Next, it is determined whether or not the movement of the subject indicated by the periodically acquired captured display images matches the movement indicated by the gesture information acquired from the server. When they are determined to match, the captured display image Ppre on which the AR image is superimposed is displayed.
  • an AR image can be displayed according to the movement of a subject such as a person. That is, the AR image can be displayed at an appropriate timing.
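  • The gesture comparison can be sketched as a direction test over the tracked positions of the transmitter across captured display images. A minimal sketch; the pixel threshold and the set of supported gestures are assumptions of this example.

```python
def matches_gesture(track, gesture="right_to_left", min_px=50):
    """Very small sketch of the gesture comparison described above.

    track: (x, y) positions of the transmitter 100 in successive captured
    display images Ppre. For the "move right to left" gesture indicated by
    the gesture information, the x coordinate must decrease by at least
    min_px (a hypothetical threshold) over the observation window.
    """
    if len(track) < 2:
        return False
    dx = track[-1][0] - track[0][0]
    if gesture == "right_to_left":
        return dx <= -min_px
    if gesture == "left_to_right":
        return dx >= min_px
    return False

# Example: the transmitter moved from x=300 to x=120, so the gesture
# matches and the receiver superimposes the AR images along the trajectory.
print(matches_gesture([(300, 200), (240, 205), (120, 210)]))  # True
```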
  • FIG. 107 is a diagram illustrating an example of the decoding image Pdec acquired according to the attitude of the receiver 200.
  • the receiver 200, in landscape orientation, images the transmitter 100 that transmits the optical ID by a luminance change.
  • in landscape orientation, the longitudinal direction of the display 201 of the receiver 200 is along the horizontal direction.
  • each exposure line of the image sensor provided in the receiver 200 is orthogonal to the longitudinal direction of the display 201.
  • the user changes the attitude of the receiver 200 from landscape to portrait.
  • portrait orientation is a posture in which the longitudinal direction of the display 201 of the receiver 200 is along the vertical direction.
  • the receiver 200 in such an attitude can acquire a decoding image Pdec including a bright line pattern region Y with a large number of bright lines when imaging the transmitter 100 that transmits the light ID.
  • the optical ID may not be appropriately acquired depending on the attitude of the receiver 200. Therefore, when acquiring the optical ID, it is advisable to change the attitude of the imaging receiver 200 appropriately. While the attitude is being changed, the receiver 200 can appropriately acquire the light ID at a timing when its attitude makes the light ID easy to acquire.
  • FIG. 108 is a diagram illustrating another example of the decoding image Pdec acquired according to the attitude of the receiver 200.
  • the transmitter 100 is configured as a digital signage of a coffee shop, displays a video relating to a coffee shop advertisement during the video display period, and transmits a light ID by a change in luminance during the light ID transmission period. That is, the transmitter 100 alternately and repeatedly performs video display during the video display period and transmission of the optical ID during the optical ID transmission period.
  • the receiver 200 periodically acquires the captured display image Ppre and the decoding image Pdec by imaging of the transmitter 100.
  • when the repetition cycle of the video display period and the optical ID transmission period of the transmitter 100 is synchronized with the repetition cycle of acquisition of the captured display image Ppre and the decoding image Pdec by the receiver 200, the decoding image Pdec including the bright line pattern region may not be acquired.
  • the decoding image Pdec including the bright line pattern region may not be acquired depending on the attitude of the receiver 200.
  • the receiver 200 images the transmitter 100 in a posture as shown in FIG. That is, the receiver 200 approaches the transmitter 100 and images the transmitter 100 so that the image of the transmitter 100 is projected on the entire image sensor of the receiver 200.
  • if the timing at which the receiver 200 acquires the captured display image Ppre is within the video display period of the transmitter 100, the receiver 200 appropriately acquires the captured display image Ppre showing the video displayed by the transmitter 100.
  • even when the timing at which the receiver 200 acquires the decoding image Pdec spans the video display period and the optical ID transmission period of the transmitter 100, the receiver 200 can acquire a decoding image Pdec including the bright line pattern region Z1.
  • the exposure of each exposure line included in the image sensor is started sequentially from the exposure line at the upper end and proceeds downward in the vertical direction. Therefore, even if the receiver 200 starts exposing the image sensor to acquire the decoding image Pdec during the video display period, the bright line pattern region cannot be obtained from the exposure lines exposed during that period. However, when the video display period switches to the light ID transmission period, a bright line pattern region corresponding to the exposure lines exposed during the light ID transmission period can be obtained.
  • the receiver 200 images the transmitter 100 in a posture as shown in FIG. That is, the receiver 200 is separated from the transmitter 100 and images the transmitter 100 so that the image of the transmitter 100 is projected only on the upper region of the image sensor of the receiver 200.
  • if the timing at which the receiver 200 acquires the captured display image Ppre is within the video display period of the transmitter 100, the receiver 200 appropriately acquires the captured display image Ppre on which the transmitter 100 is projected.
  • however, the receiver 200 may not be able to acquire a decoding image Pdec including the bright line pattern region. That is, depending on the timing, a decoding image Pdec having the bright line pattern region cannot be acquired, because the exposure lines exposed during the light ID transmission period do not capture the image of the transmitter 100.
  • the receiver 200 may therefore image the transmitter 100 in a state of being separated from it such that the image of the transmitter 100 is projected only on the lower region of the image sensor of the receiver 200. At this time, as described above, if the timing at which the receiver 200 acquires the captured display image Ppre is within the video display period of the transmitter 100, the receiver 200 appropriately acquires the captured display image Ppre on which the transmitter 100 is projected. Furthermore, even when the timing at which the receiver 200 acquires the decoding image Pdec spans the video display period and the optical ID transmission period of the transmitter 100, the receiver 200 may still be able to acquire a decoding image Pdec including the bright line pattern region.
  • in this case, a decoding image Pdec having the bright line pattern region Z2 can be acquired.
  • the receiver 200 may encourage the user to change the attitude of the receiver 200 when acquiring the optical ID. That is, when imaging starts, the receiver 200 may, for example, display a message such as "Please move the receiver" or "Shake the receiver", or output a sound, so that the attitude of the receiver 200 changes. Thereby, since the receiver 200 performs imaging while its attitude changes, it can appropriately acquire the light ID.
  • FIG. 109 is a flowchart illustrating an example of processing operation of the receiver 200.
  • the receiver 200 determines whether or not the receiver 200 is shaken during imaging (step S461). Specifically, the receiver 200 determines whether or not it is shaken based on the output of the 9-axis sensor provided in the receiver 200.
  • when it is determined that the receiver 200 is being shaken (Yes in step S461), the receiver 200 increases the above-described optical ID acquisition rate (step S462). Specifically, the receiver 200 acquires all captured images obtained per unit time during imaging as decoding images (that is, bright line images) Pdec, and decodes all of the acquired decoding images.
  • note that if all captured images were being acquired as captured display images Ppre, that is, if acquisition and decoding of the decoding image Pdec had been stopped, the receiver 200 starts that acquisition and decoding.
  • when the receiver 200 determines that it is not being shaken during imaging (No in step S461), it acquires the decoding image Pdec at a low optical ID acquisition rate (step S463). Specifically, if the optical ID acquisition rate was increased in step S462 and is still high, the receiver 200 lowers it. Thereby, since the frequency of the decoding process for the decoding image Pdec by the receiver 200 is reduced, power consumption can be suppressed.
  • the receiver 200 determines whether or not an end condition for ending the adjustment process of the optical ID acquisition rate is satisfied (step S464).
  • when the receiver 200 determines that the end condition is not satisfied (No in step S464), it repeatedly executes the processing from step S461.
  • when the receiver 200 determines that the end condition is satisfied (Yes in step S464), it ends the optical ID acquisition rate adjustment process.
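  • The rate adjustment of FIG. 109 amounts to toggling between two acquisition modes. A minimal sketch, assuming a hypothetical is_shaken() predicate built on the 9-axis sensor output; the state representation is illustrative.

```python
def adjust_acquisition_rate(is_shaken, state):
    """One iteration of the light ID acquisition rate adjustment (FIG. 109).

    is_shaken: hypothetical predicate over the 9-axis sensor output.
    state["high_rate"]: True while every captured image per unit time is
    acquired and decoded as a decoding image Pdec (step S462); False in
    the low-rate mode that decodes less frequently to save power (S463).
    """
    state["high_rate"] = bool(is_shaken())
    return state

state = {"high_rate": False}
adjust_acquisition_rate(lambda: True, state)   # shaking detected
print(state)  # {'high_rate': True}
```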
  • FIG. 110 is a diagram illustrating an example of a camera lens switching process by the receiver 200.
  • the receiver 200 may include a wide-angle lens 211 and a telephoto lens 212 as camera lenses.
  • a captured image obtained using the wide-angle lens 211 has a wide angle of view, and the subject appears small in the image.
  • a captured image obtained using the telephoto lens 212 has a narrow angle of view, and the subject appears large in the image.
  • the receiver 200 as described above may switch the camera lens used for imaging by any one of the methods A to E shown in FIG. 110.
  • in method A, the receiver 200 always uses the telephoto lens 212 for imaging, whether in normal imaging or when receiving an optical ID.
  • the case of normal imaging is a case where all captured images are acquired as captured display images Ppre by imaging.
  • the case where the optical ID is received is a case where the captured display image Ppre and the decoding image Pdec are periodically acquired by imaging.
  • in method B, the receiver 200 uses the wide-angle lens 211 in the case of normal imaging.
  • when receiving an optical ID, the receiver 200 first uses the wide-angle lens 211.
  • the receiver 200 switches the camera lens from the wide-angle lens 211 to the telephoto lens 212 if the bright line pattern region is included in the decoding image Pdec acquired when the wide-angle lens 211 is used. After this switching, the receiver 200 can acquire a decoding image Pdec with a narrow angle of view, that is, a bright line pattern region appearing large.
  • in method C, the receiver 200 uses the wide-angle lens 211 in the case of normal imaging.
  • when receiving an optical ID, the receiver 200 alternately switches the camera lens between the wide-angle lens 211 and the telephoto lens 212. That is, the receiver 200 acquires the captured display image Ppre using the wide-angle lens 211 and acquires the decoding image Pdec using the telephoto lens 212.
  • in method D, the receiver 200 switches the camera lens between the wide-angle lens 211 and the telephoto lens 212 in accordance with an operation by the user, regardless of whether it is performing normal imaging or receiving an optical ID.
  • in method E, when receiving the optical ID, the receiver 200 decodes the decoding image Pdec acquired using the wide-angle lens 211, and switches the camera lens from the wide-angle lens 211 to the telephoto lens 212 if the image cannot be correctly decoded. Alternatively, the receiver 200 decodes the decoding image Pdec acquired using the telephoto lens 212, and switches the camera lens from the telephoto lens 212 to the wide-angle lens 211 if it cannot be decoded correctly. Note that when determining whether or not the decoding image Pdec has been correctly decoded, the receiver 200 first transmits the optical ID obtained by decoding the decoding image Pdec to the server.
  • if the optical ID matches an optical ID managed by the server, the server notifies the receiver 200 of matching information indicating the match; if it does not match, the server transmits mismatch information to the receiver 200.
  • the receiver 200 determines that the decoding image Pdec has been correctly decoded if the information notified from the server is the matching information, and determines that the decoding image Pdec could not be correctly decoded if the information is the mismatch information. Alternatively, the receiver 200 determines that the decoding image Pdec has been correctly decoded when the optical ID obtained by decoding it satisfies a predetermined condition, and determines that it has not been correctly decoded when the condition is not satisfied.
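  • Method E above is essentially a fallback loop. A minimal sketch, assuming hypothetical helpers capture(), decode(), and server_check() standing in for imaging, decoding, and the server-side match check described above.

```python
def receive_with_lens_fallback(capture, decode, server_check):
    """Sketch of method E: try the wide-angle lens first and switch to the
    telephoto lens if the decoding image Pdec cannot be decoded correctly.

    capture(lens): hypothetical helper returning a decoding image Pdec.
    decode(img): returns a light ID string or None.
    server_check(light_id): True if the server returns matching information.
    """
    for lens in ("wide", "tele"):                  # switch lenses on failure
        light_id = decode(capture(lens))
        if light_id is not None and server_check(light_id):
            return light_id, lens                  # decoded correctly
    return None, None                              # neither lens succeeded
```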
  • FIG. 111 is a diagram illustrating an example of camera switching processing by the receiver 200.
  • the receiver 200 includes an in camera 213 and an out camera (not shown in FIG. 111) as cameras.
  • the in-camera 213 is also referred to as a face camera or a self-portrait camera, and is a camera arranged on the same surface as the display 201 in the receiver 200.
  • the out camera is a camera arranged on the surface of the receiver 200 opposite to the surface of the display 201.
  • such a receiver 200 images the transmitter 100 configured as a lighting device with the in-camera 213 facing upward. By this imaging, the receiver 200 acquires the decoding image Pdec, and acquires the optical ID transmitted from the transmitter 100 by decoding the decoding image Pdec.
  • the receiver 200 acquires the AR image and the recognition information associated with the optical ID from the server by transmitting the acquired optical ID to the server.
  • the receiver 200 starts a process of recognizing a target area corresponding to the recognition information from the captured display images Ppre obtained by the out camera and the in camera 213, respectively.
  • the receiver 200 prompts the user to move the receiver 200 when the target area cannot be recognized from any of the captured display images Ppre obtained by the out camera and the in camera 213 respectively.
  • the user who is prompted by the receiver 200 moves the receiver 200. Specifically, the user moves the receiver 200 so that the in-camera 213 and the out-camera face the front-rear direction of the user.
  • the receiver 200 then recognizes the target area from the captured display image Ppre acquired by the out-camera. That is, the receiver 200 recognizes an area in which a person is projected as the target area, superimposes the AR image on the target area in the captured display image Ppre, and displays the captured display image Ppre on which the AR image is superimposed.
  • FIG. 112 is a flowchart showing an example of processing operation between the receiver 200 and the server.
  • the receiver 200 acquires an optical ID transmitted from the transmitter 100 by capturing an image of the transmitter 100, which is a lighting device, with the in-camera 213, and transmits the optical ID to the server (step S471).
  • the server receives the optical ID from the receiver 200 (step S472), and estimates the position of the receiver 200 based on the optical ID (step S473).
  • the server stores, for each light ID, a table indicating a room, a building, a space, or the like in which the transmitter 100 that transmits the light ID is arranged. Then, the server estimates the room associated with the optical ID transmitted from the receiver 200 as the position of the receiver 200 in the table. Further, the server transmits the AR image and recognition information associated with the estimated position to the receiver 200 (step S474).
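  • The server-side lookup in steps S473 and S474 can be sketched as two table accesses. The table contents below are hypothetical; the real server stores, for each light ID, the room in which the transmitter is installed and the AR data associated with that position.

```python
# Minimal sketch of the server-side lookup (steps S473/S474).
LOCATION_TABLE = {"id_001": "meeting_room_A", "id_002": "lobby"}
AR_TABLE = {"meeting_room_A": ("ar_image_a.png", "recognition_info_a"),
            "lobby": ("ar_image_b.png", "recognition_info_b")}

def handle_light_id(light_id: str):
    """Estimate the receiver position from the light ID and return the
    AR image and recognition information associated with that position."""
    position = LOCATION_TABLE.get(light_id)        # step S473
    if position is None:
        return None
    return AR_TABLE[position]                      # step S474

print(handle_light_id("id_001"))  # ('ar_image_a.png', 'recognition_info_a')
```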
  • the receiver 200 acquires the AR image and the recognition information transmitted from the server (Step S475).
  • the receiver 200 starts a process of recognizing a target area corresponding to the recognition information from each captured display image Ppre obtained by each of the out camera and the in camera 213.
  • the receiver 200 recognizes the target area from, for example, the captured display image Ppre acquired by the out-camera (step S476).
  • the receiver 200 superimposes the AR image on the target area in the captured display image Ppre, and displays the captured display image Ppre on which the AR image is superimposed (step S477).
  • in the above example, when the receiver 200 acquires the AR image and the recognition information transmitted from the server, in step S476 it starts the process of recognizing the target area from the captured display images Ppre obtained by each of the out-camera and the in-camera 213. However, the receiver 200 may start the process of recognizing the target area from only the captured display image Ppre obtained by the out-camera in step S476. That is, the camera for acquiring the light ID (the in-camera 213 in the above example) and the camera for acquiring the captured display image Ppre on which the AR image is superimposed (the out-camera in the above example) may always be different.
  • in the above example, the receiver 200 images the transmitter 100, which is an illumination device, with the in-camera 213; however, the floor illuminated by the transmitter 100 may instead be imaged with the out-camera. Even with such out-camera imaging, the receiver 200 can acquire the optical ID transmitted from the transmitter 100.
  • FIG. 113 is a diagram illustrating an example of AR image superimposition performed by the receiver 200.
  • the receiver 200 images the transmitter 100 configured as a microwave oven installed in a store such as a convenience store.
  • the transmitter 100 includes a camera for imaging the inside of the microwave oven and an illumination device that illuminates the oven chamber. The transmitter 100 recognizes, with the camera, the food or drink (that is, the object to be warmed) stored in the oven chamber.
  • the transmitter 100 transmits the light ID indicating the recognized food or drink by causing the lighting device to emit light and changing the luminance of the lighting device.
  • this illumination device illuminates the inside of the oven chamber.
  • the user purchases food and drink at a convenience store, and puts the food and drink into the transmitter 100, which is a microwave oven, in order to warm the food and drink.
  • the transmitter 100 recognizes the food and drink with the camera, and starts warming the food and drink while transmitting the optical ID indicating the recognized food and drink.
  • the receiver 200 acquires the optical ID transmitted from the transmitter 100 by capturing an image of the transmitter 100 that has started the warming, and transmits the optical ID to the server. Next, the receiver 200 acquires an AR image, audio data, and recognition information associated with the optical ID from the server.
  • the above-mentioned AR images include an AR image P32a that is a moving image showing a virtual state inside the transmitter 100, an AR image P32b that shows in detail the food or drink stored in the oven chamber, an AR image P32c that shows, as a moving image, steam rising from the transmitter 100 during heating, and an AR image P32d that shows, as a moving image, the remaining time until the warming of the food or drink is completed.
  • the AR image P32a is, for example, a moving image in which a turntable with a pizza on it is rotating and a plurality of dwarfs are dancing around the pizza.
  • the AR image P32b is, for example, an image showing the product name "pizza" and the ingredients of the pizza if the food or drink stored in the oven chamber is a pizza.
  • based on the recognition information, the receiver 200 recognizes the area in which the window of the transmitter 100 is projected in the captured display image Ppre as the target area of the AR image P32a, and superimposes the AR image P32a on that target area. Further, based on the recognition information, the receiver 200 recognizes the area above the area in which the transmitter 100 is projected in the captured display image Ppre as the target area of the AR image P32b, and superimposes the AR image P32b on that target area. Furthermore, based on the recognition information, the receiver 200 recognizes the area between the target area of the AR image P32a and the target area of the AR image P32b in the captured display image Ppre as the target area of the AR image P32c, and superimposes the AR image P32c on that target area. Finally, based on the recognition information, the receiver 200 recognizes the area below the area in which the transmitter 100 is projected in the captured display image Ppre as the target area of the AR image P32d, and superimposes the AR image P32d on that target area.
  • the receiver 200 outputs sound generated when food or drink is heated by reproducing audio data.
  • by displaying the AR images P32a to P32d as described above and further outputting sound, the receiver 200 can hold the user's interest until the warming of the food or drink is completed. As a result, the burden on the user waiting for completion of the warming can be reduced. Further, since the AR image P32c showing steam or the like is displayed and a sound generated when the food or drink is heated is output, a sizzling sensation can be given to the user. In addition, the display of the AR image P32d allows the user to easily know the remaining time until the heating of the food or drink is completed. Therefore, until the warming is completed, the user can, for example, read a book displayed in the store away from the transmitter 100, which is a microwave oven. The receiver 200 may notify the user that the warming is complete when the remaining time reaches zero.
  • in the above example, the AR image P32a is a moving image in which a turntable with a pizza on it is rotating and a plurality of dwarfs are dancing around the pizza, but it may instead be an image that virtually represents the actual state inside the oven chamber.
  • moreover, although the AR image P32b is an image showing the product name and ingredients of the food or drink stored in the oven chamber, it may instead be an image showing a discount coupon.
  • in this way, the subject is a microwave oven provided with an illumination device, and the illumination device transmits the light ID by illuminating the interior of the microwave oven while changing in luminance.
  • the captured display image Ppre and the decoding image Pdec are acquired by imaging the microwave oven that transmits the optical ID.
  • the window portion of the microwave oven displayed in the captured display image Ppre is recognized as the target area.
  • the captured display image Ppre on which the AR image indicating the state change in the oven chamber is superimposed is displayed.
  • thereby, the state inside the oven chamber can be easily communicated to the user of the microwave oven.
  • FIG. 114 is a sequence diagram showing the processing operation of the system including the receiver 200, the microwave oven, the relay server, and the electronic settlement server.
  • the microwave oven includes a camera and a lighting device as described above, and transmits the light ID by changing the luminance of the lighting device. That is, the microwave oven has a function as the transmitter 100.
  • the microwave oven recognizes food and drink stored in the cabinet with the camera (step S481).
  • the microwave oven transmits a light ID indicating the recognized food and drink to the receiver 200 by a luminance change of the lighting device.
  • the receiver 200 receives the optical ID transmitted from the microwave oven by imaging the microwave oven (step S483), and transmits the optical ID and the card information to the relay server.
  • the card information is information such as a credit card stored in advance in the receiver 200 and is information necessary for electronic payment.
  • the relay server holds a table indicating the AR image, recognition information, and product information corresponding to each optical ID. The product information indicates the price of the food or drink indicated by the light ID.
  • when the relay server receives the optical ID and the card information transmitted from the receiver 200 (step S485), it finds the product information associated with the optical ID in the above table. Then, the relay server transmits the product information and the card information to the electronic settlement server (step S486).
  • when the electronic settlement server receives the product information and the card information transmitted from the relay server (step S487), it performs electronic settlement processing based on the product information and the card information (step S488). Then, when the electronic settlement processing is completed, the electronic settlement server notifies the relay server of the completion (step S489).
  • when the relay server confirms the settlement completion notification from the electronic settlement server (step S490), it instructs the microwave oven to start warming the food or drink (step S491). Further, the relay server transmits the AR image and the recognition information associated, in the above table, with the optical ID received in step S485 to the receiver 200 (step S493).
  • when the microwave oven receives the warming start instruction from the relay server, it starts warming the food or drink stored in the oven chamber (step S492). Further, when the receiver 200 receives the AR image and the recognition information transmitted from the relay server, it recognizes the target area corresponding to the recognition information in the captured display images Ppre periodically acquired by the imaging started in step S483. Then, the receiver 200 superimposes the AR image on the target area (step S494).
  • thereby, the user of the receiver 200 can complete the payment and start warming the food or drink simply by placing the food or drink in the microwave oven and imaging it. Moreover, when payment cannot be made, warming of the food or drink by the user can be prohibited. Furthermore, when the warming is started, the AR image P32a shown in FIG. 113 can be displayed to inform the user of the state inside the oven chamber.
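  • The relay-server side of the sequence in FIG. 114 can be sketched as follows, with hypothetical helpers standing in for the electronic settlement server, the microwave oven, and the receiver; the table layout is illustrative.

```python
def relay_on_receive(light_id, card_info, product_table,
                     settle, start_heating, send_ar):
    """Relay server sketch for FIG. 114 (steps S485 to S493).

    product_table: light ID -> (product_info, ar_image, recognition_info).
    settle, start_heating, send_ar: hypothetical helpers for the electronic
    settlement server, the microwave oven, and the receiver respectively.
    """
    product_info, ar_image, recognition_info = product_table[light_id]
    if not settle(product_info, card_info):    # steps S486-S490
        return False                           # payment failed: no warming
    start_heating()                            # step S491
    send_ar(ar_image, recognition_info)        # step S493
    return True

table = {"id_pizza": ({"name": "pizza", "price": 500}, "p32.png", "rec")}
ok = relay_on_receive("id_pizza", "card123", table,
                      settle=lambda p, c: True,
                      start_heating=lambda: None,
                      send_ar=lambda a, r: None)
print(ok)  # True
```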
  • FIG. 115 is a sequence diagram showing processing operations of a system including a POS terminal, a server, a receiver 200, and a microwave oven.
  • the microwave oven includes a camera and a lighting device as described above, and transmits the light ID by changing the luminance of the lighting device. That is, the microwave oven has a function as the transmitter 100.
  • a POS (point-of-sale) terminal is a terminal installed in a store such as the same convenience store as the microwave oven.
  • the user of the receiver 200 selects a food or drink as a product at a store and heads to a place where a POS terminal is installed in order to purchase the food or drink.
  • the store clerk operates the POS terminal and receives the price of food and drink from the user.
  • the POS terminal acquires operation input data and sales information (step S501).
  • the sales information indicates, for example, the name, number, and price of the product, the sales location, and the sales date and time.
  • the operation input data indicates, for example, the sex and age of the user input by the store clerk.
  • the POS terminal transmits the operation input data and sales information to the server (step S502).
  • the server receives operation input data and sales information transmitted from the POS terminal (step S503).
  • the user of the receiver 200 pays the clerk for the food and drink
  • the user puts the food and drink in the microwave oven to warm the food and drink.
  • the microwave oven recognizes the food and drink stored in the cabinet with the camera (step S504).
  • the microwave oven transmits the light ID indicating the recognized food and drink to the receiver 200 by the luminance change of the lighting device (step S505).
  • the microwave oven starts warming the food and drink (step S507).
  • the receiver 200 receives the optical ID transmitted from the microwave oven by imaging the microwave oven (step S508), and transmits the optical ID and the terminal information to the server (step S509).
  • the terminal information is information stored in advance in the receiver 200 and indicates, for example, the language type (for example, English or Japanese) displayed on the display 201 of the receiver 200.
  • the server determines whether the access from the receiver 200 is the first access (step S510).
  • the first access is the first access performed within a predetermined time from the time when the process of step S503 is performed. If the server determines that the access from the receiver 200 is the first access (Yes in step S510), the server associates and stores the operation input data and the terminal information (step S511).
  • in the above example, the server determines whether or not the access from the receiver 200 is the first access, but it may instead determine whether or not the product indicated by the sales information matches the food or drink indicated by the light ID.
  • the server may store not only the operation input data and the terminal information but also the sales information in association with them.
  • FIG. 116 is a diagram showing a situation of indoor use such as an underground shopping mall.
  • the receiver 200 receives the light ID transmitted from the transmitter 100 configured as a lighting device, and estimates its current position. Further, the receiver 200 displays the current position on a map to provide route guidance, or displays information on nearby stores.
  • transmission of disaster information and evacuation information from the transmitter 100 can be used even when communication is congested, when a communication base station fails, or when radio waves from the communication base station do not reach; even in such cases this information can be obtained. It is also effective for a hearing-impaired person who has missed an emergency broadcast or cannot hear one.
  • the receiver 200 acquires the optical ID transmitted from the transmitter 100 by taking an image, and further acquires the AR image P33 and the recognition information associated with the optical ID from the server. Then, the receiver 200 recognizes the target area corresponding to the recognition information from the captured display image Ppre obtained by the above-described imaging, and superimposes the AR image P33 in the shape of an arrow on the target area. Thereby, receiver 200 can be used as the above-mentioned way finder (refer to Drawing 100).
  • FIG. 117 is a diagram illustrating a state in which an augmented reality object is displayed.
  • the stage 2718e that displays the augmented reality is configured as the transmitter 100 described above, and the light emitting patterns and position patterns of the light emitting units 2718a, 2718b, 2718c, and 2718d are the reference positions for displaying the augmented reality object information and the augmented reality object. Send.
  • the receiver 200 displays the augmented reality object 2718f, which is an AR image, superimposed on the captured image based on the received information.
  • note that the embodiments described above may be implemented as an apparatus, a system, a method, an integrated circuit, a computer program, or a recording medium such as a computer-readable CD-ROM, or any combination of these. For example, a computer program that executes the method according to one embodiment may be stored in a recording medium.
  • FIG. 118 is a diagram illustrating a configuration of a display system according to the fourth modification of the fourth embodiment.
  • This display system 500 performs object recognition and augmented reality (Augmented Reality / Mixed Reality) display using visible light signals.
  • the receiver 200 performs imaging, receives a visible light signal, and extracts feature quantities for object recognition or space recognition.
  • the feature amount extraction is extraction of an image feature amount from a captured image obtained by imaging.
  • the visible light signal may instead be a signal on a carrier adjacent to visible light, such as infrared light or ultraviolet light.
  • the receiver 200 is configured as a recognition device that recognizes an object on which an augmented reality image (that is, an AR image) is displayed.
  • the target object is, for example, the AR target object 501.
  • the transmitter 100 transmits information such as an ID for identifying itself or the AR object 501 as a visible light signal or a radio wave signal.
  • the ID is identification information such as the above-described optical ID, for example, and the AR object 501 is the above-described target area.
  • the visible light signal is a signal transmitted by a change in luminance of a light source included in the transmitter 100.
  • the receiver 200 or the server 300 holds the identification information transmitted from the transmitter 100 in association with the AR recognition information and the AR display information.
  • the association may be one-to-one or one-to-many.
  • the AR recognition information is the above-described recognition information, and is information for recognizing the AR object 501 in order to perform AR display. Specifically, the AR recognition information is the image feature amount (SIFT feature amount, SURF feature amount, ORB feature amount, or the like), color, shape, size, reflectance, transmittance, or three-dimensional model of the AR object 501, or the like.
  • the AR recognition information may include identification information or a recognition algorithm indicating which recognition method is used for recognition.
  • the AR display information is information for performing AR display, and is an image (that is, the above-described AR image), video, audio, three-dimensional model, motion data, display coordinates, display size, transmittance, or the like. Further, the AR display information may be the absolute value or change ratio of each of hue, saturation, and brightness.
  • the transmitter 100 may also function as the server 300. That is, the transmitter 100 may hold the AR recognition information and the AR display information and transmit the information by wired or wireless communication.
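  • The association held by the receiver 200 or the server 300 can be sketched as a small data structure. This is illustrative only; the field names are assumptions, not the patent's schema, and the one-to-many mapping is modeled as a list per identification information.

```python
from dataclasses import dataclass

@dataclass
class ARRecognitionInfo:
    """Information for recognizing the AR object 501 (illustrative fields)."""
    image_features: bytes            # e.g. SIFT/SURF/ORB feature amounts
    color: str = ""
    shape: str = ""
    size_mm: float = 0.0
    recognition_method: str = ""     # which recognition method to use

@dataclass
class ARDisplayInfo:
    """Information for performing the AR display (illustrative fields)."""
    ar_image: str = ""               # image, video, or 3D model reference
    display_coords: tuple = (0, 0)
    display_size: tuple = (0, 0)
    transmittance: float = 1.0

# Identification information may map to one or many entries (one-to-many).
DATABASE: dict = {}  # id -> list[(ARRecognitionInfo, ARDisplayInfo)]
```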
  • the receiver 200 captures an image with a camera (specifically, an image sensor).
  • the receiver 200 receives a visible light signal or a radio wave signal such as WiFi or Bluetooth (registered trademark). Further, the receiver 200 acquires position information obtained by GPS or the like, information from a gyro sensor or an acceleration sensor, and information such as sound from a microphone, and may recognize AR objects existing in the vicinity by integrating all or some of these pieces of information. The receiver 200 may also recognize the AR object using only one of these pieces of information, without integrating them.
  • FIG. 119 is a flowchart showing the processing operation of the display system according to the fourth modification of the fourth embodiment.
  • First, the receiver 200 determines whether or not a visible light signal has already been received (step S521). That is, for example, the receiver 200 determines whether or not a visible light signal indicating identification information has been acquired by photographing the transmitter 100, which transmits the visible light signal through the luminance change of its light source. At this time, a captured image of the transmitter 100 is acquired by this photographing.
  • If the receiver 200 determines that the visible light signal has already been received (Y in step S521), it determines, from the received information, the AR object, a reference point, or spatial coordinates in the space, and specifies the position and orientation of the receiver 200.
  • the receiver 200 recognizes the relative position of the AR object. This relative position is represented by the distance and direction from the receiver 200 to the AR object.
  • For example, the receiver 200 identifies the AR object (that is, a target area that is a bright line pattern area) based on the size and position of the bright line pattern area shown in FIG. 50, and recognizes the relative position of the AR object.
  • The receiver 200 transmits the information such as the ID included in the visible light signal and the relative position to the server 300, and, using the information and the relative position as a key, acquires the AR recognition information and AR display information registered in the server 300 (step S522).
  • The receiver 200 may acquire not only the information on the recognized AR object but also information on other AR objects existing in the vicinity of that AR object (that is, their AR recognition information and AR display information). Thereby, when another AR object existing in the vicinity is imaged by the receiver 200, the receiver 200 can recognize it quickly and without error. For example, the other AR objects present in the vicinity are objects different from the first recognized AR object.
  • the receiver 200 may acquire these pieces of information from the database in the receiver 200 instead of accessing the server 300.
  • The receiver 200 may discard these pieces of information after a certain period of time has elapsed from their acquisition, or after a specific process (for example, turning off the screen, pressing a button, ending or stopping an application, displaying an AR image, or recognizing another AR object).
  • The receiver 200 may also decrease the reliability of each piece of information every time a certain period of time has elapsed since its acquisition, and use the most reliable pieces among the plurality of pieces of information.
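One way to realize this is to keep a timestamp per piece of information and decay its reliability step-wise; a minimal sketch in Python, where the decay interval and factor are illustrative assumptions:

```python
import time

class AgedInfo:
    """A piece of AR information whose reliability decays over time."""
    DECAY_INTERVAL_S = 10.0  # assumed "certain period of time"
    DECAY_FACTOR = 0.5       # assumed per-interval decay of reliability

    def __init__(self, payload, reliability=1.0):
        self.payload = payload
        self.initial_reliability = reliability
        self.acquired_at = time.monotonic()

    def reliability(self):
        elapsed = time.monotonic() - self.acquired_at
        steps = int(elapsed // self.DECAY_INTERVAL_S)
        return self.initial_reliability * (self.DECAY_FACTOR ** steps)

def most_reliable(entries):
    """Use the most reliable piece among a plurality of pieces of information."""
    return max(entries, key=lambda e: e.reliability())
```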
  • The receiver 200 may preferentially acquire, based on the relative position to each AR object, the AR recognition information of the AR object that is most relevant at that relative position. For example, the receiver 200 acquires a plurality of visible light signals (that is, identification information) by photographing a plurality of transmitters 100 in step S521, and acquires a plurality of pieces of AR recognition information (that is, image feature amounts) corresponding to the plurality of visible light signals in step S522. At this time, in step S522, the receiver 200 selects, from among the plurality of AR objects, the image feature amount of the AR object closest to the receiver 200 imaging the transmitters 100. The selected image feature amount is then used to specify the one AR object (that is, the first object) identified using the visible light signal, as in the sketch below. Thereby, even if a plurality of image feature amounts are acquired, an appropriate image feature amount can be used for specifying the first object.
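A minimal sketch in Python of this nearest-object selection; the distance estimation is taken as given, and all names are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class ARCandidate:
    object_id: str        # identification information from one visible light signal
    image_feature: bytes  # AR recognition information for this object
    distance_m: float     # estimated distance from the receiver

def feature_for_first_object(candidates):
    """Select the image feature amount of the AR object closest to the receiver."""
    nearest = min(candidates, key=lambda c: c.distance_m)
    return nearest.image_feature
```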
  • The receiver 200 further determines whether or not the AR recognition information has already been acquired (step S523). If it is determined that the AR recognition information has not been acquired (N in step S523), the receiver 200 recognizes an AR object candidate without using identification information such as the ID indicated by the visible light signal, for example by image processing or by using other information such as position information or radio wave information (step S524). This process may be performed by the receiver 200 alone. Alternatively, the receiver 200 may transmit information such as a captured image, or an image feature amount of the captured image, to the server 300, and the server 300 may recognize the AR object candidate. As a result, the receiver 200 acquires the AR recognition information and AR display information corresponding to the recognized candidate from the server 300 or from its own database.
  • Next, the receiver 200 determines whether or not the AR object is also detected by another method that does not use the identification information indicated by the visible light signal, such as image recognition (step S525). That is, the receiver 200 determines whether or not the AR object is recognized by a plurality of methods. Specifically, the receiver 200 specifies the AR object (that is, the first object) from the captured image using the image feature amount acquired based on the identification information indicated by the visible light signal, and then determines whether or not an AR object (that is, the second object) is specified from the captured image by image processing that does not use this identification information.
  • If the AR objects recognized by the respective methods do not match, the recognition result obtained via the visible light signal is prioritized: the receiver 200 determines the AR object recognized by the visible light signal as the one AR object on which the AR image is superimposed in the captured image (step S526). That is, when the first object is different from the second object, the receiver 200 gives priority to the first object and recognizes it as the object on which the AR image is displayed.
  • the object on which the AR image is displayed is an object on which the AR image is superimposed.
  • Alternatively, the receiver 200 may prioritize the method with the highest priority, based on a priority order assigned to each of the plurality of methods. That is, from among the AR objects recognized by the respective methods, the receiver 200 determines the AR object recognized by the method with the highest priority as the one AR object on which the AR image is superimposed in the captured image. Alternatively, the receiver 200 may determine the one AR object on which the AR image is superimposed by a majority vote, or by a majority vote weighted by priority. If this process overturns the recognition result up to that point, the receiver 200 performs error handling.
  • The receiver 200 recognizes the state of the AR object in the captured image (specifically, its absolute position, its relative position from the receiver 200, its size, its angle, the illumination conditions, occlusion, or the like) based on the acquired AR recognition information (step S527). Then, the receiver 200 displays the AR display information (that is, the AR image) superimposed on the captured image in accordance with the recognition result (step S528). That is, the receiver 200 superimposes the AR display information on the recognized AR object in the captured image. Alternatively, the receiver 200 may display only the AR display information.
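The arbitration of steps S525 and S526 can be summarized as follows; a minimal sketch in Python under the assumption that each recognition method reports the identifier of the object it recognized (or None), listed in priority order with the visible-light result first:

```python
def arbitrate(results):
    """Decide the one AR object on which the AR image is superimposed.

    results: list of (method_name, object_id) pairs ordered by method
    priority, with the visible-light-signal result first; object_id is
    None when that method recognized nothing.
    """
    recognized = [obj for _, obj in results if obj is not None]
    if not recognized:
        return None                      # no method recognized an object
    if all(obj == recognized[0] for obj in recognized):
        return recognized[0]             # all methods agree
    # Methods disagree: give priority to the highest-priority method
    # (the visible light signal when it is listed first, as in step S526).
    return recognized[0]

# Example: visible light recognizes "poster_A", image processing "poster_B".
print(arbitrate([("visible_light", "poster_A"), ("image_processing", "poster_B")]))
# -> poster_A
```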
  • Recognition or detection that is difficult by image processing alone includes, for example, identification of AR objects that are visually similar (such as objects differing only in text content), detection of AR objects with few patterns, detection of AR objects with high reflectance or transmittance, detection of AR objects whose shape or pattern changes (for example, animals), and detection of AR objects from wide angles (in various directions). In this modification, recognition of these AR objects and their AR display are possible.
  • With image processing that does not use a visible light signal, as the number of AR objects to be recognized increases, searching the neighborhood of image feature amounts takes longer, which slows the recognition processing and also degrades the recognition rate.
  • With the method of this modification, by contrast, the increase in recognition time and the degradation of the recognition rate caused by an increase in recognition targets are absent or extremely small, and effective AR object recognition is possible.
  • In addition, efficient recognition is possible by using the relative position of the AR object. For example, by using the approximate distance to the AR object, the processing that makes the image feature amount calculation independent of the AR object's size can be omitted, or a size-dependent feature can be used.
  • Similarly, although it is usually necessary to evaluate image feature amounts for many angles, by using the angle of the AR object, only the image feature amount corresponding to that angle needs to be held and calculated, so the calculation speed or the memory efficiency can be improved.
  • FIG. 120 is a flowchart illustrating a recognition method according to an aspect of the present invention.
  • This recognition method is a method for recognizing an object on which an augmented reality image (AR image) is displayed, and includes steps S531 to S535.
  • In step S531, the receiver 200 acquires the identification information by photographing the transmitter 100, which transmits a visible light signal by a change in luminance of its light source.
  • the identification information is, for example, an optical ID.
  • In step S532, the receiver 200 transmits the identification information to the server 300 and acquires, from the server 300, the image feature amount corresponding to the identification information.
  • The image feature amount is referred to above as AR recognition information or recognition information.
  • In step S533, the receiver 200 specifies the first object from the captured image of the transmitter 100 using the image feature amount.
  • In step S534, the receiver 200 specifies the second object from the captured image of the transmitter 100 by image processing, without using the identification information (that is, the optical ID).
  • In step S535, when the first object specified in step S533 is different from the second object specified in step S534, the receiver 200 gives priority to the first object and recognizes it as the object on which the augmented reality image is displayed.
  • the augmented reality image, the captured image, and the target object correspond to the AR image, the captured display image, and the target region in the fourth embodiment and the modifications thereof, respectively.
  • In this way, when the first object, specified using the identification information indicated by the visible light signal, differs from the second object, specified by image processing without using that identification information, the first object is preferentially recognized as the object on which the augmented reality image is displayed. Therefore, the object on which the augmented reality image is displayed can be appropriately recognized from the captured image.
  • Further, the acquired image feature amount may include an image feature amount of a third object that is located in the vicinity of the first object and is different from the first object.
  • For example, in step S522 of FIG. 119, not only the image feature amount of the first object but also the image feature amount of the third object is acquired. As a result, when the third object appears in a captured image, it can be identified or recognized quickly.
  • There is also a case where the receiver 200 acquires a plurality of pieces of identification information by photographing a plurality of transmitters in step S531, and acquires a plurality of image feature amounts corresponding to the plurality of pieces of identification information in step S532.
  • In this case, the receiver 200 may use, for specifying the first object, the image feature amount of the object closest to the imaging receiver 200 among the plurality of objects corresponding to the plurality of transmitters.
  • Thereby, as in step S522 of FIG. 119, even if a plurality of image feature amounts are acquired, an appropriate image feature amount can be used to specify the first object.
  • The recognition device in the present modification is, for example, a device provided in the receiver 200 described above, and includes a processor and a recording medium. On this recording medium, a program for causing the processor to execute the recognition method shown in FIG. 120 is recorded. Moreover, the program in this modification is a program that causes a computer to execute the recognition method shown in FIG. 120.
  • FIG. 121 is a diagram showing an example of an operation mode of a visible light signal according to this embodiment.
  • As shown in FIG. 121, there are two operation modes for the physical (PHY) layer of the visible light signal.
  • the first operation mode is a mode in which packet PWM (Pulse Width Modulation) is performed, and the second operation mode is a mode in which packet PPM (Pulse-Position Modulation) is performed.
  • the transmitter according to each of the above-described embodiments or modifications thereof generates and transmits a visible light signal by modulating a signal to be transmitted according to any one of these operation modes.
  • RLL (Run-Length Limited) coding
  • FEC (forward error correction)
  • In packet PWM, the pulse width is modulated, and a pulse is represented by one of two brightness states.
  • The two brightness states are a bright state (Bright or High) and a dark state (Dark or Low); typically, these correspond to the light being turned on and off.
  • a chunk of a physical layer signal called a packet corresponds to a MAC (medium access control) frame.
  • The transmitter can repeatedly transmit PHY packets, and can transmit a set of a plurality of PHY packets in no particular order.
  • the packet PWM is used to generate a visible light signal transmitted from a normal transmitter.
  • In packet PPM, the pulse is the bright pulse of the two states — a bright pulse (High) and a dark pulse (Low) — and the position of this pulse is modulated.
  • The position of this pulse is indicated by the interval between the pulse and the next pulse.
  • Packet PPM realizes deep dimming.
  • the format, waveform, and characteristics of the packet PPM not described in each embodiment and its modification are the same as those of the packet PWM.
  • the packet PPM is used to generate a visible light signal transmitted from a transmitter having a light source that emits very bright light.
  • dimming in the physical layer of the visible light signal is controlled by the average luminance of the optional field.
  • FIG. 122A is a flowchart showing another visible light signal generation method according to Embodiment 5.
  • This visible light signal generation method is a method for generating a visible light signal transmitted by a change in luminance of a light source provided in a transmitter, and includes steps SE1 to SE3.
  • In step SE1, a preamble is generated; the preamble is data in which first and second luminance values, which are different luminance values, appear alternately along the time axis.
  • In step SE2, a first payload is generated as data in which the first and second luminance values appear alternately along the time axis, by determining each interval from when the first luminance value appears until the next first luminance value appears, according to a method that depends on the signal to be transmitted.
  • In step SE3, a visible light signal is generated by combining the preamble and the first payload.
  • FIG. 122B is a block diagram illustrating a configuration of another signal generation device according to Embodiment 5.
  • The signal generation device E10 is a signal generation device that generates a visible light signal transmitted by a luminance change of a light source provided in a transmitter, and includes a preamble generation unit E11, a payload generation unit E12, and a combining unit E13.
  • the signal generation device E10 executes the processing of the flowchart shown in FIG. 122A.
  • the preamble generation unit E11 generates a preamble that is data in which the first and second luminance values, which are different luminance values, appear alternately on the time axis.
  • The payload generation unit E12 generates a first payload by determining each interval from when the first luminance value appears until the next first luminance value appears, according to a method that depends on the signal to be transmitted.
  • the combining unit E13 generates a visible light signal by combining the preamble and the first payload.
  • the first and second luminance values are Bright (High) and Dark (Low), and the first payload is a PHY payload.
  • The time length of the first luminance value in each of the preamble and the first payload is 10 μs or less.
  • When the preamble is a header (SHR) for the first payload, the time length of the header includes three intervals, each from when the first luminance value appears until the next first luminance value appears. In mode 1 of the packet PPM, each of the three intervals is 160 μs; that is, this defines the pattern of intervals between the pulses included in the header (SHR) in mode 1.
  • Each of the pulses is a pulse having a first luminance value, for example.
  • Similarly, when the preamble is a header (SHR) for the first payload in mode 2 of the packet PPM, the time length of the header includes three intervals from when the first luminance value appears until the next first luminance value appears: the first interval is 160 μs, the second interval is 180 μs, and the third interval is 160 μs. That is, this defines the pattern of intervals between the pulses included in the header (SHR) in mode 2.
  • Likewise, in mode 3 of the packet PPM, the time length of the header (SHR) includes three intervals from when the first luminance value appears until the next first luminance value appears: the first interval is 80 μs, the second interval is 90 μs, and the third interval is 80 μs. That is, this defines the pattern of intervals between the pulses included in the header (SHR) in mode 3. These header patterns are summarized below.
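In summary, the header (SHR) pulse-interval patterns of the packet PPM are:

  • Mode 1: 160 μs, 160 μs, 160 μs
  • Mode 2: 160 μs, 180 μs, 160 μs
  • Mode 3: 80 μs, 90 μs, 80 μs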
  • the receiver can appropriately receive the first payload in the visible light signal.
  • In mode 1 of the packet PPM, the signal to be transmitted consists of 6 bits, from the first bit x0 to the sixth bit x5, and the time length of the first payload includes two intervals, each from when the first luminance value appears until the next first luminance value appears.
  • Each interval P_k (k = 0 or 1) is determined by y_k = x_(3k) + 2 × x_(3k+1) + 4 × x_(3k+2) and P_k = 180 + 30 × y_k [μs], which is the above-described method. That is, in mode 1 of the packet PPM, the signal to be transmitted is modulated as the intervals between the pulses included in the first payload (PHY payload).
  • In mode 2 of the packet PPM, the signal to be transmitted consists of 12 bits, from the first bit x0 to the twelfth bit x11, and the time length of the first payload includes four intervals, each from when the first luminance value appears until the next first luminance value appears.
  • Each interval P_k (k = 0, 1, 2, or 3) is determined by y_k = x_(3k) + 2 × x_(3k+1) + 4 × x_(3k+2) and P_k = 180 + 30 × y_k [μs], which is the above-described method. That is, in mode 2 of the packet PPM, the signal to be transmitted is modulated as the intervals between the pulses included in the first payload (PHY payload).
  • More generally, when the signal to be transmitted consists of 3n bits, from the first bit x0 to the 3n-th bit x(3n-1) (n is an integer of 2 or more), the time length of the first payload includes n intervals from when the first luminance value appears until the next first luminance value appears, and the first payload is generated accordingly.
  • In each of these modes, since the signal to be transmitted is modulated as the intervals between pulses, the receiver can appropriately demodulate the visible light signal into the signal of interest based on those intervals.
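As a concrete illustration of this interval modulation, here is a minimal sketch in Python; the function name is an illustrative assumption, and the constants follow the formulas y_k = x_(3k) + 2 × x_(3k+1) + 4 × x_(3k+2) and P_k = 180 + 30 × y_k [μs] given above:

```python
def payload_intervals(bits):
    """Map a 3n-bit signal to the n pulse intervals of the PHY payload.

    Each 3-bit group (x_(3k), x_(3k+1), x_(3k+2)) gives
    y_k = x_(3k) + 2*x_(3k+1) + 4*x_(3k+2) and an interval
    P_k = 180 + 30*y_k microseconds, per modes 1 and 2 above
    (mode 1: 6 bits -> 2 intervals; mode 2: 12 bits -> 4 intervals).
    """
    if len(bits) % 3 != 0 or len(bits) < 6:
        raise ValueError("bit length must be 3n with n >= 2")
    intervals = []
    for k in range(len(bits) // 3):
        x0, x1, x2 = bits[3 * k:3 * k + 3]
        y_k = x0 + 2 * x1 + 4 * x2        # value in 0..7
        intervals.append(180 + 30 * y_k)  # interval in microseconds
    return intervals

# Example: mode 1 payload (6 bits) -> two intervals
print(payload_intervals([1, 0, 1, 0, 1, 1]))  # [330, 360]
```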
  • a footer for the first payload may be further generated.
  • In the generation of the visible light signal, the footer may be combined next to the first payload. That is, in mode 3 of the packet PWM and the packet PPM, the footer (SFT) is transmitted following the first payload (PHY payload). Since the end of the first payload can then be clearly identified by the footer, visible light communication can be performed efficiently.
  • Alternatively, a header for the signal following the signal to be transmitted may be combined instead of the footer. That is, in mode 3 of the packet PWM and the packet PPM, instead of the footer (SFT), the header (SHR) for the next first payload is transmitted following the first payload (PHY payload).
  • each component may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
  • Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
  • the program causes the computer to execute the visible light signal generation method shown by the flowchart of FIG. 122A.
  • the visible light signal generation method has been described based on the above-described embodiments and modifications.
  • The present invention is not limited to the embodiments. Unless they deviate from the gist of the present invention, various modifications conceived by those skilled in the art and applied to the present embodiments, as well as forms constructed by combining components in different embodiments and modifications, may also be included within the scope of the present invention.
  • FIG. 123 is a diagram showing a format of a MAC frame in MPM.
  • The format of the MAC (medium access control) frame in MPM (Mirror Pulse Modulation) consists of an MHR (MAC header) and an MSDU (MAC service data unit).
  • the MHR field includes a sequence number subfield.
  • the MSDU includes a frame payload and has a variable length.
  • The bit length of the MPDU (MAC protocol data unit), which includes the MHR and the MSDU, is set as macMpmMpduLength.
  • the MPM is a modulation method in the fifth embodiment, for example, a method for modulating information or signals to be transmitted as shown in FIG.
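To make the frame layout concrete, here is a minimal sketch in Python of an MPDU composed of an MHR and an MSDU; everything beyond the field names taken from the text is an illustrative assumption:

```python
from dataclasses import dataclass

@dataclass
class MpmMacFrame:
    """MPM MAC frame: MHR (with sequence number subfield) + MSDU."""
    sequence_number_bits: str  # contents of the MHR sequence number subfield
    frame_payload: bytes       # MSDU: variable-length frame payload

    @property
    def mpdu_bit_length(self) -> int:
        # Corresponds to macMpmMpduLength: bits of MHR plus bits of MSDU.
        return len(self.sequence_number_bits) + 8 * len(self.frame_payload)

frame = MpmMacFrame(sequence_number_bits="101", frame_payload=b"\x12\x34")
print(frame.mpdu_bit_length)  # 3 + 16 = 19 bits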
  • FIG. 124 is a flowchart showing the processing operation of the encoding device for generating a MAC frame in MPM. Specifically, FIG. 124 is a diagram illustrating how to determine the bit length of the sequence number subfield. Note that the encoding device is provided in, for example, the above-described transmitter or transmission device that transmits a visible light signal.
  • the sequence number subfield includes a frame sequence number (also called a sequence number).
  • the bit length of the sequence number subfield is set as macMpmSnLength.
  • When the sequence number subfield has a variable length, the first bit in the sequence number subfield is used as the last frame flag. That is, in this case, the sequence number subfield includes a final frame flag and a bit string indicating the sequence number.
  • the final frame flag is set to 1 for the final frame, and is set to 0 for the other frames. That is, the final frame flag indicates whether or not the processing target frame is the final frame. This final frame flag corresponds to the above-described stop bit.
  • the sequence number corresponds to the above address.
  • the encoding device determines whether or not SN is set to a variable length (step S101a).
  • SN is the bit length of the sequence number subfield. That is, the encoding apparatus determines whether macMpmSnLength indicates 0xf. When macMpmSnLength indicates 0xf, SN is a variable length, and when macMpmSnLength indicates other than 0xf, SN is a fixed length.
  • If SN is not set to a variable length, that is, if SN is set to a fixed length (N in step S101a), the encoding apparatus determines SN to be the value indicated by macMpmSnLength (step S102a). At this time, the encoding apparatus does not use the final frame flag (that is, LFF).
  • If SN is set to a variable length (Y in step S101a), the encoding apparatus determines whether or not the processing target frame is the final frame (step S103a).
  • If the processing target frame is the final frame (Y in step S103a), the encoding apparatus determines SN to be 5 bits (step S104a). At this time, the encoding apparatus determines a final frame flag indicating 1 as the first bit in the sequence number subfield.
  • If the processing target frame is not the final frame (N in step S103a), the encoding apparatus determines which of 1-15 the value of the sequence number of the final frame is (step S105a).
  • Here, the sequence number is an integer assigned to each frame in ascending order from 0. In the case of N in step S103a, the number of frames is 2 or more; therefore, the value of the sequence number of the final frame can be any of 1-15, but not 0.
  • If the encoding apparatus determines in step S105a that the value of the sequence number of the final frame is 1, it determines SN to be 1 bit (step S106a). At this time, the encoding apparatus determines the value of the last frame flag, which is the first bit in the sequence number subfield, to be 0.
  • Thereby, the sequence number subfield of the last frame is represented as (1, 1), consisting of the last frame flag (1) and the sequence number value (1).
  • On the other hand, for the processing target frame, the encoding apparatus determines the bit length of the sequence number subfield to be 1 bit; that is, it determines a sequence number subfield including only the last frame flag (0).
  • If the encoding apparatus determines in step S105a that the value of the sequence number of the final frame is 2, it determines SN to be 2 bits (step S107a). At this time as well, the encoding apparatus determines the value of the final frame flag to be 0.
  • In this case, the sequence number subfield of the last frame is represented as (1, 0, 1), consisting of the last frame flag (1) and the sequence number value (2).
  • Here, the sequence number is indicated by a bit string in which the leftmost bit is the LSB (least significant bit) and the rightmost bit is the MSB (most significant bit); therefore, the sequence number value (2) is expressed as the bit string (0, 1).
  • The encoding apparatus determines the bit length of the sequence number subfield of the processing target frame to be 2 bits; that is, it determines a sequence number subfield including the last frame flag (0) and one bit, (0) or (1), indicating the sequence number.
  • If the encoding apparatus determines in step S105a that the value of the sequence number of the final frame is 3 or 4, it determines SN to be 3 bits (step S108a). At this time as well, the encoding apparatus determines the value of the final frame flag to be 0.
  • If the encoding apparatus determines in step S105a that the value of the sequence number of the final frame is any integer from 5 to 8, it determines SN to be 4 bits (step S109a). At this time as well, the encoding apparatus determines the value of the final frame flag to be 0.
  • If the encoding apparatus determines in step S105a that the value of the sequence number of the final frame is any integer from 9 to 15, it determines SN to be 5 bits (step S110a). At this time as well, the encoding apparatus determines the value of the final frame flag to be 0.
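The bit-length selection of steps S101a to S110a can be summarized as follows; a minimal sketch in Python covering the fixed-length branch and the variable-length branch for frames other than the final frame (the final-frame branch of steps S103a to S104a is omitted), with illustrative names:

```python
def sn_bit_length(mac_mpm_sn_length, final_frame_sn=None):
    """Determine SN, the bit length of the sequence number subfield.

    mac_mpm_sn_length: value of macMpmSnLength; 0xF selects variable length.
    final_frame_sn: sequence number of the final frame (1-15), needed only
    on the variable-length path for frames other than the final frame.
    """
    if mac_mpm_sn_length != 0xF:       # N in step S101a: fixed length
        return mac_mpm_sn_length       # step S102a (LFF is not used)
    if final_frame_sn is None or not 1 <= final_frame_sn <= 15:
        raise ValueError("final frame sequence number must be 1-15")
    if final_frame_sn == 1:
        return 1                       # step S106a
    if final_frame_sn == 2:
        return 2                       # step S107a
    if final_frame_sn in (3, 4):
        return 3                       # step S108a
    if final_frame_sn <= 8:
        return 4                       # step S109a: 5-8
    return 5                           # step S110a: 9-15

print(sn_bit_length(0xF, final_frame_sn=4))  # -> 3
```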
  • FIG. 125 is a flowchart showing the processing operation of the decoding device for decoding the MAC frame in MPM. Specifically, FIG. 125 is a diagram showing how to determine the bit length of the sequence number subfield. Note that the decoding device is provided, for example, in the above-described receiver or receiving device that receives a visible light signal.
  • The decoding apparatus determines whether or not SN is set to a variable length (step S201a). That is, the decoding apparatus determines whether macMpmSnLength indicates 0xf. If the decoding apparatus determines that SN is not set to a variable length, that is, that SN is set to a fixed length (N in step S201a), it determines SN to be the value indicated by macMpmSnLength (step S202a). At this time, the decoding apparatus does not use the final frame flag (that is, LFF).


Abstract

The invention relates to a communication method that enables communication between different devices. This communication method comprises: determining whether or not a terminal is capable of visible light communication (step SG11); when it is determined that the terminal is capable of visible light communication (Yes in step SG11), acquiring a decoding image by imaging a subject whose luminance changes, using an image sensor, and acquiring first identification information transmitted by the subject from a stripe pattern appearing in the decoding image (step SG12); and when it is determined in the visible-light-communication determination that the terminal is not capable of visible light communication (No in step SG11), acquiring a captured image by imaging the subject using the image sensor, specifying a predetermined specific region by edge detection on the captured image, and acquiring second identification information transmitted by the subject from a line pattern in the specific region (step SG13).