WO2014103340A1 - Information communication method - Google Patents
Information communication method
- Publication number
- WO2014103340A1 (PCT/JP2013/007708)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- signal
- information
- light
- luminance
- receiver
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B10/00—Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
- H04B10/11—Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
- H04B10/114—Indoor or close-range type systems
- H04B10/1149—Arrangements for indoor wireless networking of information
- H04B10/116—Visible light communication
- H04B10/50—Transmitters
- H04B10/516—Details of coding or modulation
- H04B10/54—Intensity modulation
- H04B10/541—Digital intensity or amplitude modulation
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
- H04M1/72412—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
- E—FIXED CONSTRUCTIONS
- E05—LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
- E05F—DEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION; CHECKS FOR WINGS; WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
- E05F15/00—Power-operated mechanisms for wings
- E05F15/40—Safety devices, e.g. detection of obstructions or end positions
- E05F15/42—Detection using safety edges
- E05F15/43—Detection using safety edges responsive to disruption of energy beams, e.g. light or sound
- E05F2015/434—Detection using safety edges responsive to disruption of energy beams, e.g. light or sound, with cameras or optical sensors
- E05F2015/435—Detection using safety edges responsive to disruption of energy beams, e.g. light or sound, by interruption of the beam
Definitions
- The present invention relates to a communication method between a mobile terminal, such as a smartphone, tablet, or mobile phone, and a home appliance, such as an air conditioner, lighting device, or rice cooker.
- Patent Document 1 describes a technique for efficiently realizing limited communication between devices in an optical space transmission apparatus that transmits information into free space using light, by performing communication with a plurality of monochromatic light sources of illumination light.
- However, this conventional method is limited to cases where the applied device has a three-color light source, such as a lighting fixture.
- The present invention solves such problems and provides an information communication method that enables communication between various devices, including devices with little computing power.
- An information communication method according to the present invention is a method for transmitting a signal by a change in luminance, comprising: a determining step of determining a pattern of luminance change by modulating the signal to be transmitted; and a transmitting step of transmitting the signal by changing the luminance of a plurality of light emitters in accordance with the determined pattern.
- In a plane on which the plurality of light emitters are disposed, a region whose luminance does not change, other than the plurality of light emitters, does not traverse the plane between any of the plurality of light emitters along at least one of the vertical and horizontal directions of the plane.
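The determining and transmitting steps summarized above can be sketched in a few lines. The following is only an illustrative sketch, not the modulation scheme claimed in the patent: it assumes a simple Manchester-style code (each bit becomes a low-high or high-low luminance pair), chosen here because it keeps the average luminance constant regardless of the data, so the blinking does not change the brightness perceived by human eyes.

```python
def modulate(data: bytes) -> list[int]:
    """Determining step: turn the signal to be transmitted into a
    pattern of luminance changes (0 = low, 1 = high).

    Manchester-style: bit 1 -> (low, high), bit 0 -> (high, low),
    so every bit contributes exactly one high slot and the average
    luminance stays constant."""
    pattern = []
    for byte in data:
        for i in range(7, -1, -1):          # most significant bit first
            bit = (byte >> i) & 1
            pattern.extend([0, 1] if bit else [1, 0])
    return pattern


def demodulate(pattern: list[int]) -> bytes:
    """Receiver side: recover bits from observed luminance pairs."""
    bits = [1 if (lo, hi) == (0, 1) else 0
            for lo, hi in zip(pattern[0::2], pattern[1::2])]
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)
```

A transmitter would drive each light emitter through `modulate(...)` at a rate faster than the human flicker-fusion threshold; a receiver sampling the emitter's luminance recovers the data with `demodulate(...)`.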
- FIG. 5 is a timing diagram of a transmission signal in the information communication apparatus of the first embodiment.
- Several figures showing the relationship between a transmission signal and a reception signal in Embodiment 1.
- FIG. 7 is a diagram showing the principle of a second embodiment.
- Several figures showing examples of operations in the second embodiment.
- Several figures showing examples of methods of observing the luminance of a light emitting unit in Embodiment 3.
- Several figures showing examples of signal modulation schemes in Embodiment 3.
- Several figures showing examples of light emitting unit detection methods in Embodiment 3.
- FIG. 16 is a diagram showing a timeline of a transmission signal and an image obtained by capturing a light emitting unit in Embodiment 3.
- FIG. 18 is a diagram showing an example of signal transmission by a position pattern in Embodiment 3.
- FIG. 21 is a diagram illustrating an example of a reception device in Embodiment 3.
- Several figures illustrating examples of a transmission device in Embodiment 3.
- Figures showing examples of a structure of a light emitting unit in Embodiment 3.
- FIG. 18 is a diagram showing an example of a signal carrier in the third embodiment.
- FIG. 18 is a diagram showing an example of an imaging unit in Embodiment 3.
- Several figures illustrating examples of position estimation of a reception device in Embodiment 3.
- Several figures showing examples of transmission information setting in Embodiment 3.
- FIG. 16 is a block diagram showing an example of components of a receiving device in a third embodiment.
- FIG. 16 is a block diagram showing an example of components of a transmission apparatus in a third embodiment.
- FIG. 18 is a diagram showing an example of a reception procedure in Embodiment 3.
- FIG. 18 is a diagram showing an example of a procedure of self-position estimation in the third embodiment.
- Several figures showing examples of a transmission control procedure in Embodiment 3.
- FIG. 18 is a diagram showing an example of information provision in a station yard according to the third embodiment.
- FIG. 18 is a diagram showing an example of a boarding service in a third embodiment.
- FIG. 18 is a diagram showing an example of in-store service in the third embodiment.
- FIG. 18 is a diagram showing an example of establishment of a wireless connection in Embodiment 3.
- FIG. 18 is a diagram showing an example of adjustment of the communication range in the third embodiment.
- FIG. 18 is a diagram showing an example of indoor use in the third embodiment.
- FIG. 18 is a diagram showing an example of outdoor use in the third embodiment.
- FIG. 17 is a diagram showing an example of directions of a route in the third embodiment.
- FIG. 21 is a diagram showing an example of use of a plurality of imaging devices in Embodiment 3.
- FIG. 18 is a diagram illustrating an example of transmission device autonomous control according to Embodiment 3.
- Several further figures showing examples of transmission information setting in Embodiment 3.
- FIG. 18 is a diagram showing an example of a combination with a two-dimensional barcode in Embodiment 3.
- FIG. 18 is a diagram showing an example of creation and use of a map in the third embodiment.
- FIG. 18 is a diagram showing an example of state acquisition and operation of the electronic device in the third embodiment.
- FIG. 18 is a diagram showing an example of recognition of the electronic device in the third embodiment.
- FIG. 18 is a diagram showing an example of display of an augmented reality object in the third embodiment.
- Several figures showing examples of a user interface in Embodiment 3.
- Figures illustrating an application example to ITS in Embodiment 4.
- FIG. 18 is a diagram showing an application example to a position information notification system and a facility system in a fourth embodiment.
- FIG. 21 is a diagram showing an application example to a system of a supermarket in a fourth embodiment.
- FIG. 21 is a diagram showing an application example to communication of a mobile phone terminal and a camera in a fourth embodiment.
- FIG. 18 is a diagram showing an application example to underwater communication in the fourth embodiment.
- Figures describing an example of service provision to a user in the fifth embodiment.
- FIG. 21 is a flowchart illustrating a case where a receiver in Embodiment 5 simultaneously processes a plurality of signals received from a transmitter.
- FIG. 21 is a diagram showing an example of realizing communication between devices by mutual communication in Embodiment 5.
- FIG. 21 is a diagram for describing a service that uses the characteristic of directivity in the fifth embodiment.
- FIG. 18 is a diagram for describing another example of service provision to a user in the fifth embodiment.
- FIG. 21 is a diagram illustrating an example of a format of a signal included in a light source emitted by a transmitter in a fifth embodiment.
- FIG. 18 is a diagram showing an example of the in-home environment in Embodiment 6.
- A diagram showing an example of communication between home electric appliances and a smartphone in Embodiment 6.
- FIG. 24 is a diagram showing an example of a configuration of a transmission side apparatus in a sixth embodiment.
- FIG. 18 is a diagram showing an example of a configuration of a reception side apparatus in a sixth embodiment.
- FIG. 18 is a diagram showing a flow of processing for transmitting information to the reception side device by blinking of the LED of the transmission side device in the sixth embodiment.
- FIG. 18 is a diagram showing a flow of processing for transmitting information to the reception side device by blinking of the LED of the transmission side device in the sixth embodiment.
- FIG. 18 is a diagram showing a flow of processing for transmitting information to the reception side device by blinking of the LED of the transmission side device in the sixth embodiment.
- FIG. 18 is a diagram showing a flow of processing for transmitting information to the reception side device by blinking of the LED of the transmission side device in the sixth embodiment.
- FIG. 18 is a diagram showing a flow of processing for transmitting information to the reception side device by blinking of the LED of the transmission side device in the sixth embodiment.
- FIG. 21 is a diagram for describing a procedure for performing communication between a user and an apparatus using visible light according to a seventh embodiment.
- FIG. 21 is a diagram for describing a procedure for performing communication between a user and an apparatus using visible light according to a seventh embodiment.
- FIG. 21 is a diagram for describing a procedure for performing communication between a user and an apparatus using visible light according to a seventh embodiment.
- FIG. 21 is a diagram for describing a procedure from the purchase of the device by the user in the seventh embodiment to the initial setting of the device.
- FIG. 21 is a diagram for describing a service dedicated to a serviceperson in the case where a device in Embodiment 7 fails.
- FIG. 21 is a diagram for describing a service for confirming a cleaning state using a vacuum cleaner and visible light communication according to a seventh embodiment.
- FIG. 25 is a schematic diagram of home delivery service support using optical communication in the eighth embodiment.
- FIG. 31 is a flowchart for describing home delivery service support using optical communication in the eighth embodiment.
- FIG. 31 is a flowchart for describing home delivery service support using optical communication in the eighth embodiment.
- FIG. 31 is a flowchart for describing home delivery service support using optical communication in the eighth embodiment.
- FIG. 31 is a flowchart for describing home delivery service support using optical communication in the eighth embodiment.
- FIG. 31 is a flowchart for describing home delivery service support using optical communication in the eighth embodiment.
- FIG. 35 is a diagram for describing a process of registering a mobile phone in use with a user in the server in the ninth embodiment.
- FIG. 45 is a diagram for describing a process of analyzing user voice characteristics in Embodiment 9.
- FIG. 35 is a diagram for describing a process of preparing for speech recognition in the ninth embodiment.
- FIG. 35 is a diagram for describing processing to collect sound from a nearby sound collection device in the ninth embodiment.
- FIG. 35 is a diagram for illustrating analysis processing of environmental sound characteristics in the ninth embodiment.
- FIG. 35 is a diagram for describing a process of canceling a sound from an audio output device present in the periphery in the ninth embodiment.
- FIG. 35 is a diagram for describing processing of selecting a cooking menu and setting operation content to a microwave according to a ninth embodiment.
- FIG. 35 is a diagram for illustrating processing of acquiring notification sound for a microwave according to Embodiment 9 from a DB of a server or the like and setting the sound to the microwave.
- FIG. 45 is a diagram for describing a process of adjusting notification sound of the microwave according to Embodiment 9.
- FIG. 40 is a diagram showing an example of a waveform of notification sound set in a microwave in Embodiment 9.
- A diagram for describing a process of displaying cooking content in Embodiment 9.
- FIG. 35 is a diagram for describing a process of recognizing notification sound of the microwave according to Embodiment 9.
- FIG. 35 is a diagram for illustrating processing of sound collection from a nearby sound collection device and recognition of a notification sound of a microwave in the ninth embodiment.
- FIG. 33 is a diagram for illustrating processing of notifying the user of the end of driving of the microwave according to the ninth embodiment.
- FIG. 35 is a diagram for describing a process of checking a mobile phone operation state in a ninth embodiment.
- FIG. 35 is a diagram for describing a process of tracking the position of the user in Embodiment 9.
- A diagram showing that, while sound from the audio output device is canceled, the notification sound of the home appliance is recognized, an electronic device capable of communication is made to recognize the current position of the user (operator), and, based on the recognition result of the user position, a device notifies the user.
- FIG. 21 is a diagram showing the contents of a database held in a server, a mobile phone, or a microwave according to a ninth embodiment.
- FIG. 40 is a diagram showing that the user, who started operation of the microwave according to Embodiment 9 and is waiting for the end of operation while simmering food or the like, has moved to another location.
- A diagram showing transmission of an instruction to detect the user to a device, such as a camera, a microphone, or a human sensor, that is connected to the mobile phone via a network and is capable of recognizing the position of the user or recognizing that the user is present.
- FIG. 35 is a diagram showing that a mobile phone recognizes a driving end sound of a microwave according to a ninth embodiment.
- A diagram showing that the mobile phone that recognizes the end of operation of the microwave transmits, to a device having a screen display function and an audio output function among the devices detecting the user, an instruction to notify the user of the end of operation of the microwave.
- FIG. 35 is a diagram showing that a mobile phone recognizes a driving end sound of a microwave according to Embodiment 9.
- A diagram showing that the mobile phone that recognizes the end of operation of the microwave transmits, to a device having a screen display function and an audio output function among the devices detecting the user, an instruction to notify the user of the end of operation of the microwave.
- FIG. 33 is a diagram showing that the device that receives the instruction in Embodiment 9 notifies the user of the notification content.
- A diagram showing that a microphone that is connected to the mobile phone via a network and is a device present in the vicinity of the microwave recognizes the notification sound.
- A diagram showing notification of the completion.
- A diagram showing notification of the completion to the user.
- A diagram showing transmission of information such as the end of driving.
- FIG. 7 is a diagram showing that information can be transmitted to the mobile phone via a personal computer or the like when direct communication from the microwave to the mobile phone serving as a hub is not possible.
- A diagram showing transmission of information such as an operation command from the mobile phone that received the communication of FIG. 180 to the microwave, tracing back the information communication path.
- FIG. 7 is a diagram showing that the user is notified of information when direct communication from the air conditioner, which is the information source device, to the mobile phone serving as the hub is not possible.
- FIG. 185 is a diagram showing a case where, in the same case as FIG. 185, a television on the second floor plays the role of the relay device in place of the device relaying between the notification recognition device and the information notification device.
- FIG. 21 is a diagram showing an example of an environment inside a house in the tenth embodiment.
- FIG. 33 is a diagram showing an example of communication between a home electric device and a smartphone in the tenth embodiment.
- FIG. 24 is a diagram showing a configuration of a transmission side apparatus in a tenth embodiment.
- FIG. 55 is a diagram showing a configuration of a reception side apparatus in Embodiment 10.
- FIG. 187 is a sequence diagram of the case where the transmitting side terminal (TV) performs wireless LAN authentication with the receiving side terminal (tablet) using optical communication.
- FIG. 21 is a sequence diagram of the case of performing authentication by an application in the tenth embodiment.
- FIG. 51 is a flowchart illustrating an operation of a transmitting side terminal in Embodiment 10.
- FIG. 51 is a flowchart showing the operation of the receiving side terminal in Embodiment 10.
- FIG. 55 is a sequence diagram in which the mobile AV terminal 1 transmits data to the mobile AV terminal 2 in the eleventh embodiment.
- FIG. 26 is a screen transition diagram when mobile AV terminal 1 transmits data to mobile AV terminal 2 in Embodiment 11.
- FIG. 26 is a screen transition diagram when mobile AV terminal 1 transmits data to mobile AV terminal 2 in Embodiment 11.
- FIG. 55 is a schematic system diagram when mobile AV terminal 1 in the eleventh embodiment is a digital camera.
- FIG. 55 is a schematic system diagram when mobile AV terminal 1 in the eleventh embodiment is a digital camera.
- FIG. 55 is a schematic system diagram when mobile AV terminal 1 in the eleventh embodiment is a digital camera.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 50 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 20 is a diagram illustrating the state of the receiver in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 20 is a diagram illustrating the state of the receiver in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 20 is a diagram illustrating the state of the receiver in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 20 is a diagram illustrating the state of the receiver in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 20 is a diagram illustrating the state of the receiver in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 20 is a diagram illustrating the state of the receiver in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 25 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 25 is a diagram showing a change in luminance of a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 25 is a diagram showing a change in luminance of a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 25 is a diagram showing a change in luminance of a transmitter in Embodiment 12.
- FIG. 100 is a flowchart illustrating an example of processing operation of a transmitter in Embodiment 12.
- FIG. 25 is a diagram showing a change in luminance of a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 100 is a flowchart illustrating an example of processing operation of a transmitter in Embodiment 12.
- FIG. 24 is a diagram illustrating an example of a configuration of a transmitter in Embodiment 12.
- FIG. 24 is a diagram illustrating an example of a configuration of a transmitter in Embodiment 12.
- FIG. 24 is a diagram illustrating an example of a configuration of a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 24 is a diagram illustrating an example of display and imaging by a receiver and a transmitter in Embodiment 12.
- FIG. 100 is a flowchart illustrating an example of processing operation of a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 20 is a diagram illustrating the state of the receiver in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 20 is a diagram illustrating the state of the receiver in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of a wavelength of a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 24 is a diagram illustrating a configuration example of a system including a receiver and a transmitter in Embodiment 12.
- FIG. 55 is a flowchart illustrating an example of process operations of a system in Embodiment 12.
- FIG. 24 is a diagram illustrating a configuration example of a system including a receiver and a transmitter in Embodiment 12.
- FIG. 55 is a flowchart illustrating an example of process operations of a system in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 24 is a diagram illustrating a configuration example of a system including a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 28 is a diagram illustrating an example of application of a receiver and a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 24 is a diagram illustrating a configuration example of a system including a receiver and a transmitter in Embodiment 12.
- FIG. 55 is a flowchart illustrating an example of process operations of a system in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver in Embodiment 12.
- FIG. 110 is a diagram illustrating an example of a configuration of a transmitter in Embodiment 12.
- FIG. 56 is a diagram illustrating another example of a configuration of a transmitter in Embodiment 12.
- FIG. 51 is a flowchart illustrating an example of process operations of a receiver and a transmitter in Embodiment 12.
- FIG. 20 is a flowchart illustrating an example of processing operations relating to a receiver and a transmitter in Embodiment 13.
- FIG. 20 is a flowchart illustrating an example of processing operations relating to a receiver and a transmitter in Embodiment 13.
- FIG. 20 is a flowchart illustrating an example of processing operations relating to a receiver and a transmitter in Embodiment 13.
- FIG. 20 is a flowchart illustrating an example of processing operations relating to a receiver and a transmitter in Embodiment 13.
- FIG. 20 is a flowchart illustrating an example of processing operations relating to a receiver and a transmitter in Embodiment 13.
- FIG. 117 is a diagram illustrating an example of application of a transmitter in Embodiment 13.
- FIG. 117 is a diagram illustrating an example of application of a transmitter in Embodiment 13.
- FIG. 117 is a diagram illustrating an example of application of a transmitter in Embodiment 13.
- FIG. 28 is a diagram illustrating an example of application of a transmitter and a receiver in Embodiment 13.
- FIG. 28 is a diagram illustrating an example of application of a transmitter and a receiver in Embodiment 13.
- FIG. 28 is a diagram illustrating an example of application of a transmitter and a receiver in Embodiment 13.
- FIG. 36 is a diagram illustrating an example of a transmission signal in Embodiment 13.
- FIG. 45 is a diagram illustrating another example of the transmission signal in Embodiment 13.
- FIG. 36 is a diagram illustrating an example of a transmission signal in Embodiment 13.
- FIG. 36 is a diagram illustrating an example of a transmission signal in Embodiment 13.
- FIG. 45 is a diagram illustrating another example of the transmission signal in Embodiment 13.
- FIG. 36 is a diagram illustrating an example of a transmission signal in Embodiment 13.
- FIG. 36 is a diagram illustrating an example of a transmission signal in Embodiment 13.
- FIG. 36 is a diagram illustrating an example of a transmission signal in Embodiment 13.
- FIG. 117 is a diagram illustrating an example of application of a transmitter in Embodiment 13.
- FIG. 117 is a diagram illustrating an example of application of a transmitter in Embodiment 13.
- FIG. 55 is a diagram for describing an imaging device in Embodiment 13.
- FIG. 55 is a diagram for describing an imaging device in Embodiment 13.
- FIG. 55 is a diagram for describing an imaging device in Embodiment 13.
- A flowchart showing the processing operation of the receiver (imaging device) according to a modification of each embodiment.
- A diagram comparing the normal imaging mode and the macro imaging mode according to a modification of each embodiment.
- A diagram showing a display apparatus that displays an image.
- FIG. 1 is a flowchart of an information communication method according to one aspect of the present invention.
- A block diagram of an information communication apparatus according to one aspect of the present invention.
- A flowchart of an information communication method according to one aspect of the present invention.
- A block diagram of an information communication apparatus according to one aspect of the present invention.
- A diagram showing an example of an image obtained by an information communication method according to one aspect of the present invention.
- A flowchart of an information communication method according to another aspect of the present invention.
- A block diagram of an information communication apparatus according to another aspect of the present invention.
- A flowchart of an information communication method according to still another aspect of the present invention.
- A block diagram of an information communication apparatus according to still another aspect of the present invention.
- FIG. 335 is a diagram illustrating an example of modes of a receiver in Embodiment 14.
- FIG. 336 is a diagram illustrating an example of imaging operation of a receiver in Embodiment 14.
- FIG. 337 is a diagram illustrating another example of imaging operation of a receiver in Embodiment 14.
- FIG. 338A is a diagram illustrating another example of imaging operation of a receiver in Embodiment 14.
- FIG. 338B is a diagram illustrating another example of imaging operation of a receiver in Embodiment 14.
- FIG. 338C is a diagram illustrating another example of imaging operation of a receiver in Embodiment 14.
- FIG. 339A is a diagram illustrating an example of camera arrangement of a receiver in Embodiment 14.
- FIG. 339B is a diagram illustrating another example of camera arrangement of a receiver in Embodiment 14.
- FIG. 340 is a diagram illustrating an example of a display operation of a receiver in Embodiment 14.
- FIG. 341 is a diagram illustrating an example of a display operation of a receiver in Embodiment 14.
- FIG. 342 is a diagram illustrating an example of operation of a receiver in Embodiment 14.
- FIG. 343 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 344 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 345 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 346 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 347 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 348 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 349 is a diagram illustrating an example of operation of a receiver, a transmitter, and a server in Embodiment 14.
- FIG. 350 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 351 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 352 is a diagram illustrating an example of initial setting of a receiver in Embodiment 14.
- FIG. 353 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 354 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 355 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 356 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 357 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 358 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 359A is a diagram illustrating a pen used to operate a receiver in Embodiment 14.
- FIG. 359B is a diagram illustrating operation of a receiver using a pen in Embodiment 14.
- FIG. 360 is a diagram illustrating an example of the appearance of a receiver in Embodiment 14.
- FIG. 361 is a diagram illustrating another example of the appearance of a receiver in Embodiment 14.
- FIG. 362 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 363A is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 363B is a diagram illustrating an application example using the receiver in Embodiment 14.
- FIG. 364A is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 364B is a diagram illustrating an application example using the receiver in Embodiment 14.
- FIG. 365A is a diagram illustrating an example of operation of a transmitter in Embodiment 14.
- FIG. 365B is a diagram illustrating another example of operation of a transmitter in Embodiment 14.
- FIG. 366 is a diagram illustrating another example of operation of a transmitter in Embodiment 14.
- FIG. 367 is a diagram illustrating another example of operation of a transmitter in Embodiment 14.
- FIG. 368 is a diagram illustrating an example of communication form between a plurality of transmitters and a receiver in Embodiment 14.
- FIG. 369 is a diagram illustrating an example of operation of a plurality of transmitters in the fourteenth embodiment.
- FIG. 370 is a diagram illustrating another example of communication form between a plurality of transmitters and a receiver in Embodiment 14.
- FIG. 371 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 372 is a diagram illustrating an example of application of a receiver in Embodiment 14.
- FIG. 373 is a diagram illustrating an example of application of a receiver in Embodiment 14.
- FIG. 374 is a diagram illustrating an example of application of a receiver in Embodiment 14.
- FIG. 375 is a diagram illustrating an example of application of a transmitter in Embodiment 14.
- FIG. 376 is a diagram illustrating an example of application of a transmitter in Embodiment 14.
- FIG. 377 is a diagram illustrating an application example of the reception method in Embodiment 14.
- FIG. 378 is a diagram illustrating an example of application of a transmitter in Embodiment 14.
- FIG. 379 is a diagram illustrating an example of application of a transmitter in Embodiment 14.
- FIG. 380 is a diagram illustrating an example of application of a transmitter in Embodiment 14.
- FIG. 381 is a diagram illustrating another example of operation of a receiver in Embodiment 14.
- FIG. 382 is a flowchart illustrating an example of operation of a receiver in Embodiment 15.
- FIG. 383 is a flowchart illustrating another example of operation of a receiver in Embodiment 15.
- FIG. 384A is a block diagram illustrating an example of a transmitter in Embodiment 15.
- FIG. 384B is a block diagram illustrating another example of a transmitter in Embodiment 15.
- FIG. 385 is a diagram illustrating a configuration example of a system including a plurality of transmitters in a fifteenth embodiment.
- FIG. 386 is a block diagram illustrating another example of a transmitter in Embodiment 15.
- FIG. 387A is a diagram illustrating an example of a transmitter in Embodiment 15.
- FIG. 387B is a diagram illustrating an example of a transmitter in Embodiment 15.
- FIG. 387C is a diagram illustrating an example of a transmitter in Embodiment 15.
- FIG. 388A is a diagram illustrating an example of a transmitter in Embodiment 15.
- FIG. 388B is a diagram illustrating an example of a transmitter in Embodiment 15.
- FIG. 389 is a diagram illustrating an example of processing operation of a receiver, a transmitter, and a server in Embodiment 15.
- FIG. 390 is a diagram illustrating an example of processing operation of a receiver, a transmitter, and a server in Embodiment 15.
- FIG. 391 is a diagram illustrating an example of processing operation of a receiver, a transmitter, and a server in Embodiment 15.
- FIG. 392A is an explanatory diagram for illustrating synchronization of a plurality of transmitters in Embodiment 15.
- FIG. 392B is an explanatory diagram for illustrating synchronization of a plurality of transmitters in Embodiment 15.
- FIG. 393 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 15.
- FIG. 394 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 15.
- FIG. 395 is a diagram illustrating an example of operation of a transmitter, a receiver, and a server in Embodiment 15.
- FIG. 396 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 15.
- FIG. 397 is a diagram illustrating an example of an appearance of a receiver in Embodiment 15.
- FIG. 398 is a diagram illustrating an example of operation of a transmitter, a receiver, and a server in Embodiment 15.
- FIG. 399 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 15.
- FIG. 400 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 15.
- FIG. 401 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 15.
- FIG. 402 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 15.
- FIG. 403A is a diagram illustrating an example of a structure of information transmitted by a transmitter in Embodiment 15.
- FIG. 403B is a diagram illustrating another example of a structure of information transmitted by a transmitter in Embodiment 15.
- FIG. 404 is a diagram illustrating an example of a 4-level PPM modulation scheme by a transmitter in Embodiment 15.
- FIG. 405 is a diagram illustrating an example of a PPM modulation scheme by a transmitter in Embodiment 15.
- FIG. 406 is a diagram illustrating an example of a PPM modulation scheme in a transmitter in Embodiment 15.
- FIG. 407A is a diagram illustrating an example of a luminance change pattern corresponding to the header (preamble unit) in Embodiment 15.
- FIG. 407B is a diagram illustrating an example of a luminance change pattern in Embodiment 15.
- FIG. 408A is a diagram illustrating an example of a luminance change pattern in Embodiment 15.
- FIG. 408B is a diagram illustrating an example of a luminance change pattern in Embodiment 15.
- FIG. 409 is a diagram illustrating an example of operation of a receiver in a situation in front of a store in Embodiment 16.
- FIG. 410 is a diagram illustrating another example of operation of a receiver in a situation in front of a store in Embodiment 16.
- FIG. 411 is a diagram illustrating an example of next operation of a receiver in a situation in front of a store in Embodiment 16.
- FIG. 412 is a diagram illustrating an example of next operation of a receiver in a situation in front of a store in Embodiment 16.
- FIG. 413 is a diagram illustrating an example of next operation of a receiver in a situation in front of a store in Embodiment 16.
- FIG. 414 is a diagram illustrating an example of operation of the display device in the in-store situation in Embodiment 16.
- FIG. 415 is a diagram showing an example of next operation of the display device in the in-store situation in Embodiment 16.
- FIG. 416 is a diagram showing an example of next operation of the display device in the in-store situation in Embodiment 16.
- FIG. 417 is a diagram illustrating an example of next operation of a receiver in an in-store situation in Embodiment 16.
- FIG. 418 is a diagram illustrating an example of next operation of a receiver in an in-store situation in Embodiment 16.
- FIG. 419 is a diagram illustrating an example of next operation of a receiver in an in-store situation in Embodiment 16.
- FIG. 420 is a diagram illustrating an example of next operation of a receiver in an in-store situation in Embodiment 16.
- FIG. 421 is a diagram illustrating an example of next operation of a receiver in an in-store situation in Embodiment 16.
- FIG. 422 is a diagram illustrating an example of next operation of a receiver in an in-store situation in Embodiment 16.
- FIG. 423 is a diagram illustrating an example of operation of a receiver in a store search situation in Embodiment 16.
- FIG. 424 is a diagram illustrating an example of next operation of a receiver in a store search situation in Embodiment 16.
- FIG. 425 is a diagram illustrating an example of next operation of a receiver in a store search situation in Embodiment 16.
- FIG. 426 is a diagram illustrating an example of operation of a receiver in a movie advertisement situation in Embodiment 16.
- FIG. 427 is a diagram illustrating an example of next operation of a receiver in a movie advertisement situation in Embodiment 16.
- FIG. 428 is a diagram illustrating an example of next operation of a receiver in a movie advertisement situation in Embodiment 16.
- FIG. 429 is a diagram illustrating an example of next operation of a receiver in a movie advertisement situation in Embodiment 16.
- FIG. 430 is a diagram illustrating an example of operation of a receiver in a situation in a museum in Embodiment 16.
- FIG. 431 is a diagram illustrating an example of next operation of a receiver in a situation in a museum in Embodiment 16.
- FIG. 432 is a diagram illustrating an example of next operation of a receiver in a situation in a museum in Embodiment 16.
- FIG. 433 is a diagram illustrating an example of next operation of a receiver in a situation in a museum in Embodiment 16.
- FIG. 434 is a diagram illustrating an example of next operation of a receiver in a situation in a museum in Embodiment 16.
- FIG. 435 is a diagram illustrating an example of next operation of a receiver in a situation in a museum in Embodiment 16.
- FIG. 436 is a diagram illustrating an example of operation of a receiver in a bus stop situation in Embodiment 16.
- FIG. 437 is a diagram illustrating an example of next operation of a receiver in a bus stop situation in Embodiment 16.
- FIG. 438 is a diagram for illustrating imaging in Embodiment 16.
- FIG. 439 is a diagram for illustrating transmission and imaging in Embodiment 16.
- FIG. 440 is a diagram for illustrating transmission in Embodiment 16.
- FIG. 441 is a diagram illustrating an example of operation of a transmitter in Embodiment 17.
- FIG. 442 is a diagram illustrating an example of operation of a transmitter in Embodiment 17.
- FIG. 443 is a diagram illustrating an example of operation of a transmitter in Embodiment 17.
- FIG. 444 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 445 is a diagram illustrating an example of operation of a receiver in Embodiment 17.
- FIG. 446 is a diagram illustrating an example of operation of a receiver in Embodiment 17.
- FIG. 447 is a diagram illustrating an example of operation of a system including a transmitter, a receiver, and a server in Embodiment 17.
- FIG. 448 is a block diagram showing a configuration of a transmitter in Embodiment 17.
- FIG. 449 is a block diagram illustrating a configuration of a receiver in Embodiment 17.
- FIG. 450 is a diagram illustrating an example of operation of a transmitter in Embodiment 17.
- FIG. 451 is a diagram illustrating an example of operation of a transmitter in Embodiment 17.
- FIG. 452 is a diagram illustrating an example of operation of a transmitter in Embodiment 17.
- FIG. 453 is a diagram illustrating an example of operation of a transmitter in Embodiment 17.
- FIG. 454 is a diagram illustrating an example of operation of a transmitter in Embodiment 17.
- FIG. 455 is a diagram illustrating an example of operation of a transmitter in Embodiment 17.
- FIG. 456 is a diagram illustrating an example of operation of a transmitter in Embodiment 17.
- FIG. 457 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 458 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 459 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 460 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 461 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 462 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 463 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 464 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 465 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 466 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 467 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 468 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 469 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 470 is a diagram illustrating a coding scheme in Embodiment 17.
- FIG. 471 is a diagram illustrating a coding scheme by which light can be received even when imaging is performed from an oblique direction in Embodiment 17.
- FIG. 472 is a diagram illustrating a coding scheme in which the amount of information differs depending on the distance in Embodiment 17.
- FIG. 473 is a diagram illustrating a coding scheme in which the amount of information differs depending on the distance in Embodiment 17.
- FIG. 474 is a diagram illustrating a coding scheme obtained by dividing data in Embodiment 17.
- FIG. 475 is a diagram showing an effect of inserting a reverse phase image in Embodiment 17.
- FIG. 476 is a diagram showing an effect of inserting a reverse phase image in Embodiment 17.
- FIG. 477 is a diagram showing super-resolution processing in Embodiment 17.
- FIG. 478 is a diagram showing a display corresponding to visible light communication in Embodiment 17.
- FIG. 479 is a diagram showing information acquisition using a visible light communication signal in Embodiment 17.
- FIG. 480 is a diagram showing a data format in Embodiment 17.
- FIG. 481 is a diagram illustrating reception by estimating a three-dimensional shape in Embodiment 17.
- FIG. 482 is a diagram illustrating reception by estimating a three-dimensional shape in Embodiment 17.
- FIG. 483 is a diagram showing a stereoscopic projection in Embodiment 17.
- FIG. 484 is a diagram showing a stereoscopic projection in Embodiment 17.
- FIG. 485 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 486 is a diagram illustrating an example of operation of a transmitter and a receiver in Embodiment 17.
- FIG. 487 is a diagram illustrating an example of a transmission signal in Embodiment 18.
- FIG. 488 is a diagram illustrating an example of a transmission signal in Embodiment 18.
- FIG. 489A is a diagram illustrating an example of a captured image (bright line image) of a receiver in Embodiment 18.
- FIG. 489B is a diagram illustrating an example of a captured image (bright line image) of a receiver in Embodiment 18.
- FIG. 489C is a diagram illustrating an example of a captured image (bright line image) of a receiver in Embodiment 18.
- FIG. 490A is a diagram illustrating an example of a captured image (bright line image) of a receiver in Embodiment 18.
- FIG. 490B is a diagram illustrating an example of a captured image (bright line image) of a receiver in Embodiment 18.
- FIG. 491A is a diagram illustrating an example of a captured image (bright line image) of a receiver in Embodiment 18.
- FIG. 491B is a diagram illustrating an example of a captured image (bright line image) of a receiver in Embodiment 18.
- FIG. 491C is a diagram illustrating an example of a captured image (bright line image) of a receiver in Embodiment 18.
- FIG. 492 is a diagram illustrating an example of a captured image (bright line image) of a receiver in Embodiment 18.
- FIG. 493 is a diagram illustrating an example of a transmission signal in Embodiment 18.
- FIG. 494 is a diagram illustrating an example of operation of a receiver in Embodiment 18.
- FIG. 495 is a diagram illustrating an example of an instruction to a user displayed on a screen of a receiver in Embodiment 18.
- FIG. 496 is a diagram illustrating an example of an instruction to a user displayed on a screen of a receiver in Embodiment 18.
- FIG. 497 is a diagram illustrating an example of a signal transmission method in Embodiment 18.
- FIG. 498 is a diagram illustrating an example of a signal transmission method in Embodiment 18.
- FIG. 499 is a diagram illustrating an example of a signal transmission method in Embodiment 18.
- FIG. 500 is a diagram illustrating an example of a signal transmission method in Embodiment 18.
- FIG. 501 is a diagram for describing a use case in Embodiment 18.
- FIG. 502 is a diagram of an information table transmitted to the server by the smartphone in Embodiment 18.
- FIG. 503 is a block diagram of a server in Embodiment 18.
- FIG. 504 is a flowchart showing the overall processing of the system in Embodiment 18.
- FIG. 505 is a diagram of an information table transmitted to a smartphone by the server in Embodiment 18.
- FIG. 506 is a diagram showing a screen flow displayed on the wearable device from when the user in Embodiment 18 receives information from the server in front of a store until when the user actually purchases a product.
- FIG. 507 is a diagram for describing another use case in Embodiment 18.
- FIG. 508 is a diagram illustrating a service providing system using the receiving method described in each embodiment.
- FIG. 509 is a flowchart showing the flow of service provision.
- FIG. 510 is a flowchart showing service provision in another example.
- FIG. 511 is a flowchart showing service provision in another example.
- FIG. 512A is a diagram for describing a modulation scheme that is easy to receive in Embodiment 20.
- FIG. 512B is a diagram for describing a modulation scheme that is easy to receive in Embodiment 20.
- FIG. 513 is a diagram for describing a modulation scheme that is easy to receive in Embodiment 20.
- FIG. 514 is a diagram for describing combined use of communication by bright line and image recognition in Embodiment 20.
- FIG. 515A is a diagram for describing a method of using an imaging device suitable for receiving a visible light signal in Embodiment 20.
- FIG. 515B is a diagram for describing a method of using an imaging device suitable for receiving a visible light signal in Embodiment 20.
- FIG. 515C is a diagram for describing a method of using an imaging device suitable for receiving a visible light signal in Embodiment 20.
- FIG. 515D is a diagram for describing a method of using an imaging device suitable for receiving a visible light signal in Embodiment 20.
- FIG. 515E is a flowchart for describing how to use the imaging device suitable for receiving a visible light signal in Embodiment 20.
- FIG. 516 is a diagram showing a size of a captured image suitable for receiving a visible light signal in Embodiment 20.
- FIG. 517A is a diagram showing a size of a captured image suitable for receiving a visible light signal in Embodiment 20.
- FIG. 517B is a flowchart showing an operation for switching to a captured image size suitable for reception of a visible light signal in Embodiment 20.
- FIG. 517C is a flowchart showing an operation for switching to a captured image size suitable for reception of a visible light signal in Embodiment 20.
- FIG. 518 is a diagram for describing reception of a visible light signal using a zoom in Embodiment 20.
- FIG. 519 is a diagram for describing an image data size compression method suitable for receiving a visible light signal in Embodiment 20.
- FIG. 520 is a diagram for describing a modulation scheme with high reception error detection accuracy in Embodiment 20.
- FIG. 521 is a diagram for describing change in operation of a receiver according to a difference in situation in Embodiment 20.
- FIG. 522 is a diagram for describing notification of visible light communication to humans in Embodiment 20.
- FIG. 523 is a diagram for describing expansion of the reception range by the diffuser plate in Embodiment 20.
- FIG. 524 is a diagram for describing a synchronization method of signal transmission from a plurality of projectors in Embodiment 20.
- FIG. 525 is a diagram for describing a synchronization method of signal transmission from a plurality of displays in Embodiment 20.
- FIG. 526 is a diagram for describing reception of a visible light signal by the illuminance sensor and the image sensor in Embodiment 20.
- FIG. 527 is a diagram for illustrating a trigger of reception start in Embodiment 20.
- FIG. 528 is a diagram for describing a reception start gesture in Embodiment 20.
- FIG. 529 is a diagram for illustrating an application example to a car navigation system in Embodiment 20.
- FIG. 530 is a diagram for illustrating an application example to a car navigation system in Embodiment 20.
- FIG. 531 is a diagram for describing an application example to content protection in Embodiment 20.
- FIG. 532 is a diagram for describing an application example as an electronic lock in Embodiment 20.
- FIG. 533 is a diagram for describing an application example as visit-store information transmission in Embodiment 20.
- FIG. 534 is a diagram for illustrating an application example of order control according to a place in Embodiment 20.
- FIG. 535 is a diagram for illustrating an application example to route guidance in Embodiment 20.
- FIG. 536 is a diagram for illustrating an application example to location contact in Embodiment 20.
- FIG. 537 is a diagram for describing an application example to usage log accumulation and analysis in Embodiment 20.
- FIG. 538 is a diagram for describing an application example to screen sharing in Embodiment 20.
- FIG. 539 is a diagram for describing an application example to screen sharing in Embodiment 20.
- FIG. 540 is a diagram for describing an application example of position estimation using a wireless access point in Embodiment 20.
- FIG. 541 is a diagram illustrating a configuration for performing position estimation by visible light communication and wireless communication in Embodiment 20.
- FIG. 542A is a flowchart of an information communication method according to an aspect of the present invention.
- FIG. 542B is a block diagram of an information communication device according to an aspect of the present invention.
- FIG. 543 is a diagram showing a watch equipped with an optical sensor.
- FIG. 544 is a diagram illustrating an application example of the information communication method according to an aspect of the present invention.
- FIG. 545 is a diagram illustrating an application example of the information communication method according to an aspect of the present invention.
- FIG. 546 is a diagram illustrating an application example of an information communication method according to an aspect of the present invention.
- FIG. 547 is a diagram illustrating an example of application of a transmitter and a receiver in Embodiment 21.
- FIG. 548 is a diagram illustrating an example of application of a transmitter in Embodiment 21.
- FIG. 549A is a diagram illustrating an example of application of a transmitter and a receiver in Embodiment 21.
- FIG. 549B is a flowchart illustrating operation of a receiver in Embodiment 21.
- FIG. 550 is a diagram illustrating an example of application of a transmitter and a receiver in Embodiment 21.
- FIG. 551 is a diagram illustrating an example of application of a transmitter in Embodiment 21.
- FIG. 552A is a diagram illustrating an example of application of a transmitter and a receiver in Embodiment 21.
- FIG. 552B is a flowchart illustrating operation of a receiver in Embodiment 21.
- FIG. 553 is a diagram illustrating operation of a receiver in Embodiment 21.
- FIG. 554 is a diagram illustrating an example of application of a transmitter in Embodiment 21.
- FIG. 555 is a diagram illustrating an example of application of a receiver in Embodiment 21.
- FIG. 556A is a flowchart illustrating an example of operation of a transmitter in Embodiment 21.
- FIG. 556B is a flowchart illustrating an example of operation of a transmitter in Embodiment 21.
- FIG. 557 is a flowchart illustrating an example of operation of a transmitter in Embodiment 21.
- FIG. 558 is a flowchart illustrating an example of operation of the imaging device in Embodiment 21.
- FIG. 559 is a flowchart illustrating an example of operation of the imaging device in Embodiment 21.
- FIG. 560 is a diagram illustrating an example of a signal transmitted by a transmitter in Embodiment 21.
- FIG. 561 is a diagram illustrating an example of a signal transmitted by a transmitter in Embodiment 21.
- FIG. 562 is a diagram illustrating an example of a signal transmitted by a transmitter in Embodiment 21.
- FIG. 563 is a diagram illustrating an example of a signal transmitted by a transmitter in Embodiment 21.
- FIG. 564 is a diagram illustrating an example of a system configuration including a transmitter and a receiver in Embodiment 21.
- FIG. 565 is a diagram illustrating an example of a system configuration including a transmitter and a receiver in Embodiment 21.
- FIG. 566 is a diagram illustrating an example of a system configuration including a transmitter and a receiver in Embodiment 21.
- FIG. 567A is a flowchart of an information communication method according to an aspect of the present invention.
- FIG. 567B is a block diagram of an information communication device according to an aspect of the present invention.
- FIG. 568A is a flowchart of an information communication method according to an aspect of the present invention.
- FIG. 568B is a block diagram of an information communication device according to an aspect of the present invention.
- FIG. 569 is a diagram illustrating an example of operation of a transmitter in Embodiment 21.
- FIG. 570 is a diagram illustrating an example of operation of a transmitter in Embodiment 21.
- FIG. 571 is a diagram illustrating an example of operation of a transmitter in Embodiment 21.
- FIG. 572 is a diagram illustrating an example of operation of a transmitter in Embodiment 21.
- FIG. 573 is a diagram illustrating an example of a receiver in Embodiment 21.
- FIG. 574 is a diagram illustrating an example of a receiver in Embodiment 21.
- FIG. 575 is a diagram illustrating an example of a reception system in Embodiment 21.
- FIG. 576 is a diagram illustrating an example of a reception system in Embodiment 21.
- FIG. 577A is a diagram illustrating an example of a modulation scheme in Embodiment 21.
- FIG. 577B is a diagram illustrating an example of a modulation scheme in Embodiment 21.
- FIG. 577C is a diagram illustrating an example of separation of mixed signals in Embodiment 21.
- FIG. 577D is a diagram illustrating an example of separation of mixed signals in Embodiment 21.
- FIG. 578 is a diagram illustrating an example of a visible light communication system in Embodiment 21.
- FIG. 579 is a flowchart showing a reception method in which interference is eliminated in Embodiment 21.
- FIG. 580 is a flowchart of an estimation method of a direction of a transmitter in Embodiment 21.
- FIG. 581 is a flowchart of a method of starting reception in Embodiment 21.
- FIG. 582 is a flowchart of a method of generating an ID using information of another medium in Embodiment 21.
- FIG. 583 is a flowchart of a method of selecting a reception method by frequency separation in Embodiment 21.
- FIG. 584 is a flowchart showing a signal reception method in the case where the exposure time is long in Embodiment 21.
- FIG. 585 is an image diagram showing a usage scene in Embodiment 22.
- FIG. 586 is a schematic diagram of a portable terminal in Embodiment 22.
- FIG. 587 is an image diagram when a portable terminal in Embodiment 22 is held horizontally.
- FIG. 588 is an image diagram when a portable terminal in Embodiment 22 is held vertically.
- FIG. 589 is a conceptual diagram of the in-store map in Embodiment 22.
- FIG. 590 is an image diagram of a product UI in Embodiment 22.
- FIG. 591 is an image diagram when a product UI in Embodiment 22 is operated.
- FIG. 592 is an image diagram of swinging the portable terminal from right to left in Embodiment 22.
- FIG. 593 is an image diagram of a watch-type device in Embodiment 22.
- FIG. 594 is a diagram showing an entire structure in Embodiment 22.
- FIG. 595 is a diagram of the structure of the product information storage unit A11016 in Embodiment 22.
- FIG. 596 is a layout image diagram of the product UI in Embodiment 22.
- FIG. 597 is a structural diagram of the map information storage unit A11017 in Embodiment 22.
- FIG. 598 is a flowchart of lighting equipment A11002 in Embodiment 22.
- FIG. 599 is a flowchart of portable terminal A11001 in Embodiment 22.
- FIG. 600 is a structural diagram of a state management unit A11019 in Embodiment 22.
- FIG. 601 is a flowchart of processing related to ceiling light in Embodiment 22.
- FIG. 602 is a flowchart of base-light related processing in Embodiment 22.
- FIG. 603 is a flowchart of UI related processing in Embodiment 22.
- FIG. 604 is a flowchart of map information oriented UI processing in Embodiment 22.
- FIG. 605 is a flowchart of product information oriented UI processing in Embodiment 22.
- FIG. 606 is a flowchart of the entire display processing in Embodiment 22.
- FIG. 607 is a flowchart of a process of pre-display update in Embodiment 22.
- FIG. 608 is a flowchart of display update processing in Embodiment 22.
- FIG. 609 is a structural diagram of a light reception control unit in Embodiment 23.
- FIG. 610 is a flowchart of illuminance pattern detection in Embodiment 23.
- FIG. 611 is a structural diagram of a light reception control unit in Embodiment 24.
- FIG. 612 is a flowchart of illuminance pattern detection in Embodiment 24.
- FIG. 613 is an image diagram of a movement of a sight line in Embodiment 25.
- FIG. 614 is a block diagram of a portable terminal in Embodiment 25.
- FIG. 615 is a diagram of a DB structure of a shelf identifier in Embodiment 25.
- FIG. 616 is a flowchart at the time of server inquiry in Embodiment 25.
- FIG. 617 is a structural diagram of a light reception control unit in Embodiment 26.
- FIG. 618 is a structural diagram of a light reception control unit in Embodiment 27.
- FIG. 619 is a diagram for describing a use case in Embodiment 28.
- FIG. 620 is a diagram showing system components in Embodiment 29.
- FIG. 621 is a flowchart of area detection processing of the mobile terminal (B 0101) in the twenty-ninth embodiment.
- FIG. 622 is a processing flowchart in the area ID information server (B0411) when area ID information is requested from the mobile terminal (B0101) in the twenty-ninth embodiment.
- FIG. 623 is a flowchart of a process when the mobile terminal (B0101) in the twenty-ninth embodiment receives area ID information from the area ID information server (B0411).
- FIG. 624 is a flowchart of a process when the mobile terminal (B0101) in the twenty-ninth embodiment receives an ID from the visible light transmitting device (B0120).
- FIG. 625 is a flowchart of a process when the mobile terminal (B0101) in the twenty-ninth embodiment requests visible light ID correspondence information.
- FIG. 626 is a flowchart of processing when the ID correspondence information server (B0111) in the twenty-ninth embodiment requests ID correspondence information from the mobile terminal (B0101).
- FIG. 627 is a flowchart of a process when the mobile terminal (B0101) in the twenty-ninth embodiment receives a short ID from a visible light transmitting device (B0120).
- FIG. 628 is a flowchart of a process of displaying a mobile terminal (B0101) according to the twenty-ninth embodiment.
- FIG. 629 is a processing flowchart of the interpolation ID generation unit (B0110) according to the twenty-ninth embodiment for generating an interpolation ID based on a user attribute.
- FIG. 630 is a flowchart of processing by which the interpolation ID generation means (B0110) in the twenty-ninth embodiment specifies the installation position of the visible light transmitting device (B0120) based on the sensing means (B0103) and the receiving camera information.
- FIG. 631 is a flowchart of processing in which the interpolation ID generation means (B0110) in the twenty-ninth embodiment generates an interpolation ID based on the installation position of the visible light transmitting device.
- FIG. 632 is a diagram showing an example in which the interpolation ID generation means (B0110) in the twenty-ninth embodiment specifies the position of the visible light transmission device (B0120).
- FIG. 633 is a diagram showing an example in which the interpolation ID generation means (B0110) in the twenty-ninth embodiment detects the attitude of the mobile terminal (B0101).
- FIG. 634 is a diagram showing an example of a table used by the interpolation ID generation means (B0110) in the twenty-ninth embodiment for selecting an interpolation ID based on a device position.
- FIG. 635 is a diagram showing an example of a user attribute table held by the user information holding means (B0151) in the twenty-ninth embodiment.
- FIG. 636 is a diagram showing an example of a table used by the interpolation ID generation means (B0110) in the twenty-ninth embodiment for selecting an interpolation ID based on a user attribute.
- FIG. 637 is a diagram showing an example of a data table held by the visible light ID correspondence information data holding means (B0114) in the twenty-ninth embodiment.
- FIG. 638 is a diagram showing an example of an area ID information table held by the area ID information server (B0411) in the twenty-ninth embodiment.
- FIG. 639 is a diagram showing a use case in the twenty-ninth embodiment.
- FIG. 640 is a diagram showing an example of the internal configuration of the inquiry ID to the ID correspondence information conversion server (B0111) by the mobile terminal (B0101) in the twenty-ninth embodiment.
- FIG. 641 is a diagram showing an example in which the mobile terminal (B0101) in the twenty-ninth embodiment generates a query ID.
- FIG. 642 is a diagram illustrating a detailed use case of example 2 of FIG. 641 in the twenty-ninth embodiment.
- FIG. 643 is a diagram illustrating a detailed use case of example 3 of FIG. 641 in the twenty-ninth embodiment.
- An information communication method according to an aspect of the present invention is a method for transmitting a signal by a change in luminance, and includes: a determination step of determining a pattern of luminance change by modulating the signal to be transmitted; and a transmission step of transmitting the signal by changing the luminance of a plurality of light emitters in accordance with the determined pattern.
- On the surface on which the plurality of light emitters are disposed, the non-luminance-change region outside the light emitters, whose luminance does not change, does not traverse the surface between at least one pair of the light emitters along at least one of the vertical and horizontal directions of the surface; the plurality of light emitters are disposed on the surface in this manner.
- As a result, the bright line region can be made continuous, so the signal to be transmitted can be easily received, and communication can be performed between various devices, including devices with limited computing power.
- Accordingly, the method may include a disposing step of disposing the plurality of light emitters on the surface so that the non-luminance-change region, whose luminance does not change, does not traverse the surface between at least one pair of the light emitters along the vertical or horizontal direction of the surface.
- Further, the information communication method may include a brightness determination step of determining whether the brightness level of at least one of the plurality of light emitters is smaller than a reference level that is a predetermined brightness, or is equal to or less than the reference level; in the transmission step, transmission of the signal to be transmitted may be stopped when the brightness level of the light emitter is determined to be smaller than, or equal to or less than, the reference level.
- Likewise, the brightness determination step may determine whether the brightness level of at least one of the plurality of light emitters is greater than, or equal to or greater than, the reference level.
- In the determination step, a first luminance change pattern corresponding to a body, which is a part of the signal to be transmitted, and a second luminance change pattern indicating a header for identifying the body may be determined.
- In the transmission step, the header and the body may be transmitted by changing the luminance according to the respective patterns in the order of the first luminance change pattern, the second luminance change pattern, and the first luminance change pattern.
- Further, in the determination step, a third luminance change pattern indicating another header different from the header may be determined; in the transmission step, the header, the body, and the other header may be transmitted by changing the luminance according to the respective patterns in the order of the first luminance change pattern, the second luminance change pattern, the first luminance change pattern, and the third luminance change pattern.
- As a result, if the header, the body, and the other header are received continuously at one time, the signal length of the body can be specified.
- The signal length can then be used to properly combine the portions of the body before and after the header.
- Without the other header, the signal length of the body could not be specified unless the header, two bodies, and another header were received in succession at one time; by transmitting the other header as described above, the stretch of signal that must be received in one pass in order to specify the signal length of the body can be shortened.
- Further, luminance change patterns that differ from one another in the timing at which a predetermined luminance value appears are set such that two patterns adjacent in that timing are not assigned to signal units of the same parity; in the determination step, for each signal unit included in the signal to be transmitted, the luminance change pattern assigned in advance to that signal unit may be determined.
- Further, the information communication method may include: an exposure time setting step of setting the exposure time of an image sensor so that, in an image obtained by photographing at least one of the plurality of light emitters with the image sensor, bright lines corresponding to the exposure lines of the image sensor appear according to the change in luminance of the light emitter; an imaging step of photographing the light emitter changing in luminance with the set exposure time; and an information acquisition step of acquiring the body from the pattern of the bright lines.
- The information acquisition step may include: a partial identification step of specifying, using the second luminance change pattern within the bright line pattern, a second portion and a third portion that sandwich a first portion corresponding to the header, in the direction perpendicular to the bright lines; and a step of obtaining the body by demodulating the data specified by the second and third portions.
- In the partial identification step, the second portion and the third portion may be specified such that the sum of the length of the second portion and the length of the third portion in the direction perpendicular to the bright lines is a length associated with the body.
- As a result, as shown in FIG. 512A and FIG. 512B, the body can be acquired appropriately.
- Further, the information communication method may include a flash determination step of determining whether or not a flash blinking in a predetermined rhythm is received; in the transmission step, when it is determined that the flash is received, the plurality of light emitters may change in luminance with the luminance raised.
- As a result, when the plurality of light emitters are dark and the receiver that receives signals by photographing them cannot receive the signals, emitting such a flash as described above raises the brightness of the plurality of light emitters so that the signal can be properly received.
- Further, the information communication method may include a blinking step of blinking at least one of the plurality of light emitters so as to be visually recognizable by the human eye, and in that light emitter, the transmission step and the blinking step may be alternately repeated.
- FIG. 1 is a timing chart of transmission signals in the information communication apparatus of the first embodiment.
- a reference waveform (a) is a clock signal of period T, which serves as a reference of the timing of the transmission signal.
- the transmission symbol (b) indicates a symbol string generated based on the data string to be transmitted.
- the transmission waveform (c) is a transmission waveform phase-modulated according to the transmission symbol with respect to the reference waveform, and the transmission light source is driven according to this waveform.
- the phase modulation is performed by shifting the phase of the reference waveform corresponding to the symbol, and in this example, symbol 0 is allocated at phase 0 ° and symbol 1 is allocated at phase 180 °.
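The modulation just described can be sketched in code. This is an illustrative sketch, not the patent's implementation; the sample resolution per period is an arbitrary choice.

```python
# Illustrative sketch of the phase modulation above (not the patent's code):
# symbol 0 -> phase 0 deg, symbol 1 -> phase 180 deg relative to the period-T
# reference clock; the light source is driven while the waveform is 1.
SAMPLES_PER_PERIOD = 8  # arbitrary time resolution per reference period T

def reference_waveform(t):
    """Square reference clock: 1 during the first half of each period."""
    return 1 if (t % SAMPLES_PER_PERIOD) < SAMPLES_PER_PERIOD // 2 else 0

def modulate(symbols):
    """Drive waveform for a symbol string, one sample per time step."""
    waveform = []
    for s in symbols:
        shift = SAMPLES_PER_PERIOD // 2 if s == 1 else 0  # 180 deg = T/2 shift
        waveform.extend(reference_waveform(k + shift)
                        for k in range(SAMPLES_PER_PERIOD))
    return waveform

assert modulate([0]) == [1, 1, 1, 1, 0, 0, 0, 0]
assert modulate([1]) == [0, 0, 0, 0, 1, 1, 1, 1]
```

A symbol string such as [0, 1, 1, 0] thus yields a waveform whose phase flips by half a period at each symbol transition.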
- FIG. 2 is a diagram showing the relationship between the transmission signal and the reception signal in the first embodiment.
- the transmission signal is the same as that of FIG. 1, and the light source emits light only during the period in which the transmission signal is 1, and the light emission period is indicated by the area of the lower right diagonal line.
- a band hatched with diagonally-upper-right lines represents a period during which a pixel of the image sensor is exposed (exposure time tE); signal charge is generated in the pixel in the region where this band overlaps the diagonally-lower-right (light emission) region.
- the pixel value p is proportional to the area of this overlapping area.
- Equation 1 holds between the exposure time tE and the cycle T.
- tE = (T / 2) × (2n + 1), where n is a non-negative integer (Equation 1)
- the received waveform indicates the pixel value p of each line.
- the value of the pixel value axis is normalized with the amount of light received for one cycle as one.
- since the exposure time tE spans T × (n + 1/2), the pixel value p always lies in the range n ≤ p ≤ n + 1; in the example of FIG. 2, n = 2, so 2 ≤ p ≤ 3.
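The bound n ≤ p ≤ n + 1 can be checked numerically with a small simulation. The assumptions (unit light intensity, ideal rectangular exposure window, discretized integration) are my own for illustration, not from the patent.

```python
# Numerical check of the claim above: with exposure time tE = T*(n + 1/2),
# the normalized pixel value p stays within [n, n+1] for any exposure start.
T = 1.0        # reference period
N = 2          # n = 2 -> tE = 2.5*T, matching the example 2 <= p <= 3
STEPS = 1000   # integration resolution per period

def light_on(t):
    """Drive waveform: light on during the first half of each period."""
    return (t % T) < T / 2

def pixel_value(exposure_start, tE):
    """Overlap of the exposure window with light-on time, normalized so the
    light received over one full period (T/2 of on-time) counts as 1."""
    dt = T / STEPS
    on_time = sum(dt for k in range(int(round(tE / dt)))
                  if light_on(exposure_start + k * dt))
    return on_time / (T / 2)

tE = T * (N + 0.5)
values = [pixel_value(s * T / 10, tE) for s in range(20)]
assert all(N - 0.02 <= v <= N + 1 + 0.02 for v in values)
```

Shifting the drive waveform by 180° moves p between its extremes, which is how the receiver later distinguishes pixel value 3 (symbol 0) from pixel value 2 (symbol 1) in the example of FIG. 2.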
- FIGS. 3 to 5 are diagrams showing the relationship between the transmission signal and the reception signal for symbol sequences different from those in FIG.
- the transmission signal has a preamble containing a consecutive identical symbol sequence (not shown), for example a run of consecutive symbol 0s; from this consecutive symbol sequence in the preamble, the receiver creates a reference signal and uses it as a timing signal for reading the symbol string out of the received waveform. Specifically, for a run of consecutive symbol 0s, the reception waveform constantly repeats 2 → 3 → 2 as shown in FIG. 2, and the clock signal is generated as the reference signal based on the timing at which the pixel value 3 is output.
- after that, the received signal in each section of the reference signal is read out: it is read as symbol 0 when the pixel value is 3 and as symbol 1 when the pixel value is 2.
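The readout rule can be sketched as a one-line decoder; the 2.5 threshold is an illustrative choice, not from the patent.

```python
# Illustrative decoder for the rule above: once per reference-clock section,
# a pixel value near 3 reads as symbol 0 and a value near 2 as symbol 1.
def decode(pixel_values, threshold=2.5):
    return [0 if p >= threshold else 1 for p in pixel_values]

assert decode([3, 2, 3, 3, 2]) == [0, 1, 0, 0, 1]
```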
- FIGS. 3 to 5 show how symbols located in the fourth period are read out.
- FIG. 6 is a diagram summarizing FIG. 2 to FIG. 5. Since the lines are densely arranged, the pixel division in the line direction is omitted and drawn continuously. Here, symbol values in the fourth to eighth periods are read out.
- when the intensity of the optical signal is averaged over a time sufficiently longer than the period of the reference wave, it is always constant. If the frequency of the reference wave is set suitably high, that averaging time can be made shorter than the time over which a person perceives a change in light quantity, so to the human eye the transmission light source appears to glow at a constant level. It is therefore not perceived as flicker of the light source, which has the advantage of avoiding the annoyance of flicker present in the previous embodiment.
- moreover, compared with the amplitude modulation (on-off modulation) of the above embodiment, the frequency (symbol rate) of the signal can be increased, because the rising and falling edges of the signal can still be detected in such a situation; a high signal transmission rate can therefore be realized.
- phase modulation means phase modulation with respect to the reference signal waveform.
- the carrier wave is light, and since it is modulated by amplitude modulation (on / off modulation) and transmitted, the modulation scheme in this signal transmission is a type of amplitude modulation.
- the above transmission signal is an example: the number of bits per symbol may be set to 2 or more, the correspondence between symbols and phase shifts is not limited to 0° and 180°, and an offset may be applied.
- FIG. 17 shows an example of receiving data such as position information from the optical signal of an illuminator by using the face camera, which is the camera on the display side of a mobile phone, and the in-camera on the opposite side.
- the vertical direction can be detected by gravity by using a 9-axis sensor.
- when receiving an optical signal with a mobile phone placed on a table in a restaurant, if the signal of the 9-axis sensor indicates that the front of the phone faces up, the face camera is activated; if it faces down, the phone switches to the in-camera to receive the optical signal. This reduces power consumption, allows the optical signal to be received quickly, and stops useless camera operation. The orientation of the phone on the table can also be detected from the brightness seen by each camera, with the same behavior. In addition, when a camera switches from shooting mode to optical-signal reception mode, a command to raise the shutter speed and a command to raise the sensitivity of the imaging device may be sent to the imaging circuit unit, increasing sensitivity and brightening the image.
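The orientation-based camera selection can be sketched as follows; the function name, input convention, and threshold are hypothetical, not from the patent.

```python
# Hypothetical sketch of orientation-based camera selection: the sign of the
# gravity component along the screen normal (from the 9-axis sensor) tells
# which face of the phone points up.
def choose_camera(accel_z_g):
    """accel_z_g: gravity along the screen normal, in g units.
    Near +1 g the display faces up -> face camera; near -1 g -> in-camera."""
    return "face_camera" if accel_z_g > 0 else "in_camera"

assert choose_camera(+0.98) == "face_camera"
assert choose_camera(-0.97) == "in_camera"
```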
- the present invention mainly generates an optical signal from a lighting device in an indoor space and receives it with a camera unit of a portable terminal that has a communication unit, an audio microphone, an audio speaker, a display unit, an in-camera, and a face camera, thereby obtaining position information and the like.
- when the user goes from indoors to outdoors, position information can instead be detected by GPS using satellites. By acquiring the position information of the boundary from the optical-signal area and then activating signal reception from GPS, this switching can be performed automatically, with the effect that position detection is seamless.
- conversely, when moving from outdoors to indoors, the boundary is detected from position information such as GPS, and automatic switching to position information from the optical signal is performed.
- response time is required when using a server.
- furthermore, mutual authentication can be performed by using the present invention to transmit an optical signal from the light emitting unit of the reader of a terminal such as a POS to the face camera unit of the mobile phone, thereby improving security.
- FIG. 7 shows the principle of the second embodiment. FIGS. 8 to 20 show examples of the operation of the second embodiment.
- in the image sensor shown in (a) of FIG. 7, the exposure time of each line is delayed line by line. At normal shutter speeds, the exposure periods of adjacent lines overlap in time, so optical signals of the same instant are mixed across lines and cannot be distinguished.
- if the shutter speed is raised until the exposure periods no longer overlap, the optical signals can be separated line by line as shown in (a) of FIG. 7.
- synchronization can be achieved by raising or lowering the line-access clock, adjusting it so as to obtain the highest contrast or the lowest data error rate in the signal of the light receiving element of the camera, as shown in FIG.
- if the line clock of the image sensor is faster than the optical signal, synchronization can be achieved by receiving one symbol of the optical signal over n lines, for example two or three lines, as shown in FIG.
- by photographing, with the camera of a mobile phone, a display such as the TV shown in FIG. 9 or the TV on the left in FIG. 10, or a light source divided vertically into n parts, for example 10, the transfer rate becomes 10 times (n times) higher.
- for example, HD video can be divided into 50 parts because it has 1920 pixels horizontally. The rate then becomes 1.5 Mbps, and moving-picture video data can be received; with 200 divisions, HD video can be transmitted.
- the shutter time needs to be half of 1/fp or less, because the blanking that occurs during photography is at most half the size of one frame. That is, since the blanking time is at most half of the photographing time, the actual photographing time is 1/(2 fp) at the shortest.
- communication is initially asynchronous. By scanning at a rate at least twice the optical signal clock, specifically 2 to 10 times it, compatible communication is realized, although the information rate is lowered.
- in the case of a lighting device that needs to prevent flicker, light is turned off or dimmed in only one of the four time slots of 4-value PPM, that is, one slot per four-slot symbol. The bit rate drops to half in this case, but since flicker is eliminated, the device can be used for lighting equipment and can transmit both light and data.
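A minimal sketch of such a 4-value PPM encoder follows; the mapping from bit pairs to dark-slot indices is my own illustrative choice, not the patent's.

```python
# Illustrative 4-value PPM: each pair of bits selects which one of 4 time
# slots is darkened, so every symbol has the same 3/4 duty cycle (no flicker).
def encode_4ppm(bits):
    assert len(bits) % 2 == 0
    slots = []
    for i in range(0, len(bits), 2):
        dark = bits[i] * 2 + bits[i + 1]  # bit pair -> dark slot index 0..3
        slots.extend(0 if k == dark else 1 for k in range(4))
    return slots

out = encode_4ppm([0, 1, 1, 0])
assert out == [1, 0, 1, 1, 1, 1, 0, 1]
assert sum(out) / len(out) == 0.75  # constant average luminance
```

Two bits are carried per four slots, which is why the bit rate halves while the average light amount stays constant.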
- FIG. 11 shows a situation in which, indoors, a common signal is sent from the entire illumination in the common time zone while individual sub-information is sent from the individual lighting apparatus L4 in the individual time zone, and the optical signals are received. Since L4 has a small area, it takes time to send a large amount of data; therefore only an ID of a few bits is sent in the individual time zone, while in the common time zone L1, L2, L3, L4, and L5 all send common information with the same content.
- to elaborate, in time zone A in the lower part of FIG. 12A, the main area M, which is all the lights in the room, and the partial areas S1, S2, S3, and S4 all transmit the same optical signal at the same time: "the position information of the reference position of the room, the arrangement information of the individual devices of each ID (position difference from the reference position), the URL of the server, data broadcasting, and LAN transmission data". Since the room is illuminated with the same light signal, the camera unit of the portable terminal can reliably receive the data during the common time zone.
- in time zone B, the main region M does not blink and continuously emits light at the normal 1/n light amount.
- in this way the average light amount does not change, and flicker is prevented. Blinking within a range where the average light amount does not change also produces no flicker, but it is undesirable because it becomes noise for the reception of the partial regions S1, S2, S3, and S4 in time zone B.
- in time zone B, S1, S2, S3, and S4 each transmit optical signals of different data content.
- since the main area M transmits no modulation signal, it is separated in position, as in the hand-held screen at the upper right.
- since the noise is therefore small, the stripes occurring in each region are easy to detect, and stable data is obtained.
- FIG. 12B is a diagram for describing the operation of the transmitter and the receiver in this embodiment.
- the transmitter 8161 is configured as, for example, signage, and changes luminance of an area A described as "A shop” and an area B described as "B shop". Thereby, the signal A and the signal B are transmitted from each area.
- each of the signal A and the signal B includes a common portion indicating common content and a unique portion indicating different content.
- the respective common parts of signal A and signal B are transmitted simultaneously.
- the receiver 8162 displays an image of the entire signage.
- the transmitter may simultaneously transmit the unique part of the signal A and the unique part of the signal B or may transmit at different timings.
- the receiver 8162 displays, for example, detailed information on the store corresponding to the area B described above.
- FIG. 12C is a diagram for describing operations of a transmitter and a receiver in this embodiment.
- the transmitter 8161 simultaneously transmits the common part of the signal A and the signal B, and then simultaneously transmits unique parts indicating different contents of each of the signal A and the signal B.
- Receiver 8162 receives signals from transmitter 8161 by imaging transmitter 8161.
- when the transmitter 8161 transmits the common part of signal A and signal B, the transmitter can be regarded as one large area rather than two divided areas.
- as a result, the receiver 8162 can receive the common part even at a position far from the transmitter 8161.
- the receiver 8162 acquires the information associated with the common part from the server and displays it.
- the server transmits to the receiver 8162 information on all stores shown in the signage that is the transmitter 8161.
- the server selects information on any store from those stores and transmits the information to the receiver 8162.
- the server preferentially transmits to the receiver 8162 information on stores that are paying the highest registration amount among those stores.
- the server transmits information on stores corresponding to the area (area A or area B) at the center of the area captured by the camera of the receiver 8162.
- the server randomly selects a store and transmits information on the store to the receiver 8162.
- receiver 8162 can receive the unique portion of signal A or signal B when in a position close to transmitter 8161. At this time, the receiver 8162 acquires information associated with the unique portion from the server.
- the individual IDs of L1, L2, L3, and L4 to L8 in FIG. 14A can be demodulated as 3 bits.
- the reception error can be reduced by allocating signals so that the reciprocals or the logarithms of the frequencies are equally spaced, rather than allocating frequencies to signals at equal intervals.
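One way to realize this allocation, as an illustrative sketch rather than the patent's method, is to space the signal periods (reciprocals of the frequencies) at equal intervals:

```python
# Illustrative: n signal frequencies whose periods (reciprocals) are equally
# spaced, instead of equally spaced frequencies.
def period_spaced_frequencies(f_min, f_max, n):
    t_max, t_min = 1.0 / f_min, 1.0 / f_max
    step = (t_max - t_min) / (n - 1)
    return [1.0 / (t_max - k * step) for k in range(n)]

freqs = period_spaced_frequencies(1000.0, 2000.0, 5)
periods = [1.0 / f for f in freqs]
diffs = [periods[k] - periods[k + 1] for k in range(len(periods) - 1)]
assert all(abs(d - diffs[0]) < 1e-12 for d in diffs)  # equal period spacing
```

The resulting frequencies cluster toward the low end of the band, where equal frequency steps would otherwise correspond to large period differences.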
- transmission can be performed at 60 bits per second.
- a general imaging device captures 30 frames per second, so if a signal is transmitted at the same frequency for 1/15 second, imaging can be performed reliably even if the transmitter appears in only part of the captured image.
- when the luminance sequence of the captured bright line pattern is Fourier-transformed, the frequency of the transmission signal appears as a peak.
- when a plurality of frequencies is captured in one frame, for example at a frequency-switching portion, a plurality of peaks is obtained, each weaker than the peak obtained when a single-frequency signal is Fourier-transformed.
- a protection part may be provided in order to avoid mixing of the preceding and following frequencies.
- in this way, the transmission frequency can be analyzed, and the signal can be received even if the frequency of the transmission signal is changed in a time shorter than 1/15 second or 1/30 second.
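The frequency analysis above can be sketched with a plain discrete Fourier transform over the per-line luminance sequence. The line rate, signal frequency, and search grid here are illustrative assumptions, not patent figures.

```python
import cmath

# Illustrative: recover the transmission frequency as the strongest DFT peak
# of the luminance sequence along the direction perpendicular to bright lines.
LINE_RATE = 9600.0  # exposure lines sampled per second (assumed)
N = 960             # 0.1 s worth of lines
signal = [1.0 if (k % 8) < 4 else 0.0 for k in range(N)]  # 1200 Hz square wave

mean = sum(signal) / N

def magnitude(freq):
    """DFT magnitude of the mean-removed luminance sequence at freq (Hz)."""
    w = -2j * cmath.pi * freq / LINE_RATE
    return abs(sum((s - mean) * cmath.exp(w * k) for k, s in enumerate(signal)))

candidates = [100.0 * m for m in range(1, 48)]  # 100..4700 Hz in 100 Hz steps
peak = max(candidates, key=magnitude)
assert peak == 1200.0
```

With two frequencies in one frame (e.g. a frequency-switching portion), the same spectrum shows two correspondingly weaker peaks, as noted above.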
- Fourier transform may be performed within a range shorter than one screen.
- the captured screens may be connected to perform Fourier transform in a range longer than one screen.
- the luminance value of the imaging blanking time is treated as unknown.
- the protection portion is a signal of a specific frequency or does not change in luminance (frequency 0 Hz).
- in time zone A, as shown in FIG. 11, by sending the distance difference d and the bearing between the light source of each ID and the reference position, the exact position of the lighting L4 can be found in centimeters.
- the position of the mobile phone can be determined with high accuracy.
- thus, by transmitting the common optical signal in time zone A and the individual optical signals in time zone B, the effect is obtained that large-capacity common information and small-capacity individual information such as IDs can be transmitted almost simultaneously.
- since camera shake easily moves the area S1 out of the scanning range of the image sensor, image stabilization as shown in FIG. 16 is important.
- the gyro sensor built into a mobile phone usually cannot detect a narrow, fine rotation range such as hand tremor.
- therefore, the in-camera is turned on, camera shake is detected from the in-camera image, and the scanning range or detection range is corrected, which reduces the influence of camera shake; this works because the camera shake of the face camera and that of the in-camera are the same.
- alternatively, the shutter speed can be lowered for the scan area of the face camera outside the light-signal pattern, a normal image can be obtained from that area, and shake correction can be performed from this image.
- camera shake detection and signal detection can be performed with one camera. This has the same effect when using the in-camera shown in FIG.
- light signal detection is performed by the face camera, and first, the position information of the terminal is obtained.
- FIG. 18 is a diagram showing how a data broadcast, which is common data, is received from the ceiling illumination in a station yard, and how the position of the user is obtained from the individual data.
- the light emitting unit of the shop terminal can also be made to emit light, and the mobile phone can receive this to perform mutual authentication and improve security; the authentication may be reversed.
- the obtained location information is transmitted to the terminal of the store via a wireless LAN or the like, and the position of the customer is displayed on the clerk's terminal, so the clerk can carry the ordered drink to the table at the position of the ordering customer.
- as shown in FIG. 20, in trains and aircraft, passengers learn their positions using the method of this embodiment and order products such as food at their terminals.
- the crew has a terminal of the present invention on the cart, and the ID number of each ordered item is displayed at the position of the ordering customer on its screen, so the item of the ordered ID can be delivered accurately to that customer.
- FIG. 10 is a diagram showing a case where the method or apparatus of this embodiment is used for the backlight of a display such as a TV. Since fluorescent lamps, LEDs, and organic EL devices can undergo luminance modulation with little lag, transmission according to this embodiment is possible. However, the scanning direction matters in practice: when the receiver is used vertically, as with a smartphone, scanning runs in the horizontal direction, so a horizontally long light emitting area is provided at the bottom of the screen, where its effect on the contrast of images such as TV pictures is lightened.
- the image sensor in either scanning direction can receive the signal.
- the communication speed can be significantly increased by synchronizing the scan line reading clock of the image sensor of the camera with the light emission pattern of the light emitting unit as shown in FIG.
- bi-directional reception can be performed without adding components in the lighting device by using this also for reception.
- the terminal side can transmit using a strobe light for a camera, or an inexpensive infrared light emitting unit may be separately provided; in this way, bi-directional communication is realized without adding many components.
- FIG. 21 shows an example in which the imaging elements arranged in one row are exposed at the same time, and imaging is performed while shifting the exposure start time in the order of rows.
- a row of imaging elements exposed simultaneously is called an exposure line, and the line of pixels on the image corresponding to those imaging elements is called a bright line.
- when an image of a blinking light source is captured with this method, bright lines (lines of bright and dark pixel values) along the exposure lines are generated in the captured image, as shown in the figure.
- by recognizing the pattern of these bright lines, the change in light-source luminance can be estimated at a speed exceeding the imaging frame rate.
- communication at a speed higher than the imaging frame rate can be performed.
- by assigning a signal to two luminance values of the light source, LO (low) and HI (high), information can be transmitted; LO may be a state in which the light source is not illuminated, or may merely be weaker than HI.
- the exposure time is set shorter than, for example, 10 milliseconds.
- FIG. 22 shows a case where the exposure of one exposure line is completed and then the exposure of the next exposure line is started.
- in this case, information can be transmitted based on whether each exposure line receives light at or above a predetermined level.
- if the frame rate is f frames per second and one image contains l exposure lines, information can be transmitted at a rate of f·l bits per second.
- communication can be performed at higher speed when exposure is performed with a time difference for each pixel instead of for each line.
- if exposure is instead performed with a time difference for each pixel, with m pixels per exposure line, the transmission speed is at most f·l·m bits per second.
- if the light emission time of the light emitting unit is controlled in units shorter than the exposure time of each exposure line, more information can be transmitted.
- information can then be transmitted at a rate of at most f·l·Elv bits per second, where Elv denotes the number of distinguishable light-emission states within one exposure line.
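Plugging illustrative numbers into these bounds makes their magnitudes concrete. The reading of the symbols (f = frames per second, l = exposure lines per frame, m = pixels per exposure line, Elv = distinguishable light-emission states per line) is my interpretation of the garbled text, and the figures are examples, not from the patent.

```python
# Illustrative magnitudes for the rate bounds quoted above.
f, l, m, Elv = 30, 1080, 1920, 4

rate_per_line = f * l         # 1 bit per exposure line:      f*l     bits/s
rate_per_pixel = f * l * m    # time difference per pixel:    f*l*m   bits/s
rate_sub_slots = f * l * Elv  # Elv states per line exposure: f*l*Elv bits/s

assert rate_per_line == 32_400
assert rate_per_pixel == 62_208_000
assert rate_sub_slots == 129_600
```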
- the basic period of transmission can be recognized by causing the light emitting unit to emit light at a timing slightly shifted from the exposure timing of each exposure line.
- FIG. 24A shows the case where the exposure of the next exposure line is started before the exposure of one exposure line is completed. That is, the exposure times of the adjacent exposure lines partially overlap in time.
- depending on the imaging device, the exposure times of adjacent exposure lines need not partially overlap in time; it is also possible for some exposure lines to have no partial temporal overlap.
- by configuring some exposure lines so that they have no partial temporal overlap, the generation of intermediate colors caused by overlapping exposure times on the imaging screen can be suppressed, and bright lines can be detected more appropriately.
- the exposure time is calculated from the brightness of each exposure line, and the light emission state of the light emitting unit is recognized.
- when the brightness of each exposure line is judged as a binary value of whether the luminance is at or above a threshold, the non-emitting state of the light emitting unit must continue for at least the exposure time of each line in order to be recognized.
- FIG. 24B shows the influence of the difference in exposure time when the exposure start time of each exposure line is equal.
- 7500a is the case where the exposure end time of the previous exposure line and the exposure start time of the next exposure line are equal
- 7500b is the case where the exposure time is longer than that.
- FIG. 24C shows the influence of the difference in exposure start time of each exposure line when the exposure time is equal.
- 7501a is the case where the exposure end time of the previous exposure line is equal to the exposure start time of the next exposure line
- 7501b is the case where the exposure of the next exposure line is started earlier than the completion of the exposure of the previous exposure line.
- By making the exposure times overlap it is possible to recognize the blinking of the light source shorter than the exposure time by using the difference in exposure amount between the adjacent exposure lines.
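The difference trick can be sketched numerically. This is an illustrative discrete model of my own: exposure windows of length tE starting tD apart, with tD much shorter than tE.

```python
# Illustrative: with overlapping exposures, the difference between adjacent
# line values isolates the light arriving in the short offset T_D between
# their windows, exposing blinking much faster than the exposure time T_E.
T_D = 1   # start-time offset between adjacent exposure lines (time steps)
T_E = 6   # exposure time, much longer than T_D
light = [1, 1, 0, 0] * 5  # light source blinking with period 4

def line_value(start):
    return sum(light[start:start + T_E])

diffs = [line_value(s + T_D) - line_value(s) for s in range(10)]
# With T_D = 1, each difference equals light[s + T_E] - light[s], so the
# period-4 blinking reappears in the diffs even though T_E = 6 blurs it.
assert diffs == [light[s + T_E] - light[s] for s in range(10)]
assert diffs == [-1, -1, 1, 1, -1, -1, 1, 1, -1, -1]
```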
- the exposure time is longer than that in the normal shooting mode. It becomes possible to dramatically improve the communication speed by using the bright line pattern generated by setting short for signal transmission.
- When the imaging frame frequency is f, the exposure time needs to satisfy exposure time < 1/(8f). The blanking which occurs at the time of photography is at most half the size of one frame.
- Since the blanking time is half or less of the photographing time, the actual photographing time is 1/(2f) at the shortest.
- Because it is necessary to receive four-value information within the time 1/(2f), the exposure time needs to be shorter than 1/(2f × 4).
- the normal frame rate is 60 frames / second or less
- By setting the exposure time to 1/480 second or less, it is possible to generate an appropriate bright line pattern in the image data and perform high-speed signal transmission.
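The derivation above can be collected into a small sketch (Python is used for illustration; the function name and the four-level assumption as a parameter are our own, not from the patent):

```python
def max_exposure_time(frame_rate_hz: float, signal_levels: int = 4) -> float:
    """Upper bound on the exposure time for bright-line reception.

    Blanking can occupy up to half of a frame, so the shortest actual
    photographing time per frame is 1 / (2 * f).  To receive
    `signal_levels`-value information within that window, the exposure
    time must be shorter than 1 / (2 * f * signal_levels).
    """
    return 1.0 / (2.0 * frame_rate_hz * signal_levels)

# At the usual maximum frame rate of 60 frames/second the bound is 1/480 s.
print(max_exposure_time(60) == 1 / 480)  # True
```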
- FIG. 24D shows the advantage of short exposure times when the exposure times of the exposure lines do not overlap.
- When the exposure time is long, even if the light source has a binary luminance change as in 7502a, the captured image contains intermediate-color parts as in 7502e, and it tends to be difficult to recognize the luminance change of the light source.
- By providing a predetermined non-exposure idle time (predetermined waiting time) t D2 from the completion of the exposure of one exposure line until the start of the exposure of the next exposure line, the luminance change of the light source becomes easier to recognize. That is, a more appropriate bright line pattern such as 7502f can be detected.
- The configuration in which the predetermined non-exposure idle time is provided can be realized by making the exposure time t E smaller than the time difference t D between the exposure start times of the exposure lines.
- When the normal shooting mode is configured so that the exposure times of adjacent exposure lines partially overlap, this can be realized by setting the exposure time shorter than in the normal shooting mode until the predetermined non-exposure idle time occurs.
- When the normal shooting mode is configured so that the exposure end time of one exposure line is equal to the exposure start time of the next exposure line, this can be realized by providing a predetermined idle time (predetermined wait time) t D2 before the exposure start of the next exposure line.
- In the configuration in which the exposure time is extended, a brighter image can be captured, and since noise is reduced, error tolerance is high.
- On the other hand, the number of exposure lines that can be exposed within a certain time decreases, so there is a disadvantage that the number of samples decreases, as in 7502h.
- the estimation error of the light source luminance change can be reduced by using the former configuration when the imaging target is bright and using the latter configuration when the imaging target is dark.
- The exposure times of adjacent exposure lines need not partially overlap in time in all the exposure lines; it is also possible to configure some of the exposure lines so that they do not partially overlap in time. Moreover, it is not necessary to provide the predetermined non-exposure idle time (predetermined waiting time) from the end of the exposure of one exposure line until the start of the exposure of the next exposure line in all the exposure lines; it is also possible for some of the exposure lines to partially overlap in time. With such a configuration, the advantages of each configuration can be obtained.
- The signal may be read out by the same readout circuit both in the normal imaging mode, in which imaging is performed at an ordinary frame rate (30 fps, 60 fps), and in the visible light communication mode, in which imaging is performed with an exposure time of 1/480 second or less for visible light communication.
- FIG. 24E shows the relationship among the minimum change time t S of light source luminance, the exposure time t E , the time difference t D of the exposure start time of each exposure line, and the captured image.
- When t E + t D < t S, one or more exposure lines are always imaged without the light source changing between the start and the end of the exposure, so an image with clear luminance such as 7503d is obtained, and the luminance change of the light source is easy to recognize.
- When 2t E > t S, a bright line with a pattern different from the luminance change of the light source may be obtained, and it becomes difficult to recognize the luminance change of the light source from the captured image.
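The two conditions can be expressed as a small helper (hypothetical function and return values; the thresholds are exactly those stated for FIG. 24E):

```python
def bright_line_quality(t_e: float, t_d: float, t_s: float) -> str:
    """Classify the imaging condition against the minimum change time t_s
    of the light source luminance (cf. FIG. 24E).

    t_e -- exposure time of one exposure line
    t_d -- time difference between exposure start times of adjacent lines
    t_s -- minimum time during which the light source luminance is constant
    """
    if t_e + t_d < t_s:
        # Some line is always exposed entirely within one luminance state.
        return "clear"
    if 2 * t_e > t_s:
        # Lines may straddle luminance changes; a misleading pattern can appear.
        return "ambiguous"
    return "intermediate"

print(bright_line_quality(t_e=1 / 480, t_d=1 / 10000, t_s=1 / 100))  # clear
```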
- FIG. 24F shows the relationship between the transition time t T of the light source luminance and the time difference t D of the exposure start time of each exposure line.
- When t D is larger than t T, the number of exposure lines that take an intermediate color decreases, and the light source luminance is easy to estimate.
- It is desirable that the intermediate-color exposure lines be two or fewer in a row.
- t T is about 1 microsecond or less when the light source is an LED, and about 5 microseconds when the light source is an organic EL device, so setting t D to 5 microseconds or more facilitates estimation of the light source luminance.
- FIG. 24G shows the relationship between the high frequency noise t HT of the light source luminance and the exposure time t E.
- When t E is larger than t HT, the effect of high frequency noise on the captured image decreases, and estimation of the light source luminance becomes easier.
- When t E is an integral multiple of t HT, the influence of high frequency noise is eliminated, and estimation of the light source luminance is easiest. For the estimation of the light source luminance, it is desirable that t E > t HT.
- The main cause of high frequency noise derives from the switching power supply circuit, and t HT is 20 microseconds or less in many switching power supplies for lamps, so setting t E to 20 microseconds or more makes it easy to estimate the light source luminance.
- FIG. 24H is a graph showing the relationship between the exposure time t E and the magnitude of high frequency noise when t HT is 20 microseconds.
- It can be confirmed from the graph that the efficiency is good when t E is determined as 15 microseconds or more, 35 microseconds or more, 54 microseconds or more, or 74 microseconds or more, values equal to those at which the noise amount takes a local maximum. From the viewpoint of reducing high frequency noise, it is desirable that t E be large.
- For example, when the period of the light source luminance change is 15 to 35 microseconds, t E may be set to 15 microseconds or more; when the period is 35 to 54 microseconds, to 35 microseconds or more; when the period is 54 to 74 microseconds, to 54 microseconds or more; and when the period is 74 microseconds or more, to 74 microseconds or more.
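The band selection above can be sketched as a lookup (hypothetical helper; the 15/35/54/74 microsecond thresholds are the values quoted from FIG. 24H):

```python
def min_exposure_time_us(luminance_change_period_us: float) -> float:
    """Return the minimum exposure time t_E in microseconds for the given
    period of the light source luminance change, per the bands above."""
    for threshold_us in (74, 54, 35, 15):
        if luminance_change_period_us >= threshold_us:
            return threshold_us
    # Below 15 us the text gives no band; fall back to the smallest value.
    return 15

print(min_exposure_time_us(40))  # a 35-54 us period calls for t_E >= 35 us
```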
- FIG. 24I shows the relationship between the exposure time t E and the recognition success rate. Since the exposure time t E has a relative meaning with respect to the time during which the luminance of the light source is constant, the horizontal axis represents the value (relative exposure time) obtained by dividing the period t S at which the light source luminance changes by the exposure time t E. From the graph, it can be seen that the relative exposure time should be 1.2 or less if the recognition success rate is to be approximately 100%. For example, when the transmission signal is 1 kHz, the exposure time may be set to about 0.83 millisecond or less.
- Similarly, the relative exposure time should be 1.25 or less for a slightly lower recognition success rate, and 1.4 or less if a recognition success rate of 80% or more is sufficient. In addition, since the recognition success rate decreases rapidly at a relative exposure time of around 1.5 and becomes almost 0% at 1.6, the relative exposure time should be set so as not to exceed 1.5. In addition, after the recognition rate becomes 0 at 7507c, it increases again at 7507d, 7507e, and 7507f. Therefore, when a bright image is desired, for example by extending the exposure time, an exposure time with a relative exposure time of 1.9 to 2.2, 2.4 to 2.6, or 2.8 to 3.0 may be used. For example, these exposure times may be used as intermediate modes in FIG.
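The usable ranges of the relative exposure time can be collected into one predicate (hypothetical helper; the band edges are the values read off FIG. 24I above):

```python
def usable_relative_exposure(r: float) -> bool:
    """True when the relative exposure time r (period of the light source
    luminance change divided by the exposure time) falls in a range where
    the recognition success rate is usable: below 1.5, or in the bands
    1.9-2.2, 2.4-2.6, and 2.8-3.0 where the rate rises again (7507d-7507f)."""
    if r < 1.5:  # the rate collapses around 1.5 and is almost 0% at 1.6
        return True
    return any(lo <= r <= hi for lo, hi in ((1.9, 2.2), (2.4, 2.6), (2.8, 3.0)))

print(usable_relative_exposure(1.2), usable_relative_exposure(1.6))  # True False
```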
- During blanking, the luminance of the light emitting unit cannot be observed.
- Transmission loss due to blanking can be prevented by having the light emitting unit repeatedly transmit the same signal two or more times, or by adding an error correction code.
- The light emitting unit transmits the signal at a period shorter than the period at which images are captured.
- When visible light is used as the carrier wave, the light emitting unit is caused to emit light so as to keep the moving average value of its luminance constant when the time resolution of human vision (about 5 milliseconds to 20 milliseconds) is taken as the window width.
- In this way, the light emitting unit of the transmitting device appears to humans to emit light with uniform luminance, and at the same time the receiving device can observe the luminance change of the light emitting unit.
- FIG. 27 shows a modulation method that causes the light emitting unit to emit light so as to keep the moving average value of the luminance of the light emitting unit constant when the time resolution of human vision is taken as the window width.
- When the modulated signal is 0 the light emitting unit does not emit light, and when it is 1 it emits light; if there is no bias in the transmission signal, the average value of the luminance of the light emitting unit is about 50% of the luminance at the time of light emission.
- FIG. 28 shows a modulation method that causes the light emitting unit to emit light so as to keep the moving average value of the luminance of the light emitting unit constant when the time resolution of human vision is taken as the window width.
- When the modulated signal is 0 the light emitting unit does not emit light, and when it is 1 it emits light; if there is no bias in the transmission signal, the average value of the luminance of the light emitting unit is about 75% of the luminance at the time of light emission.
- Compared with the modulation method shown in FIG. 27, the coding efficiency is equally 0.5, but the average luminance can be increased.
- FIG. 29 shows a modulation method that causes the light emitting unit to emit light so as to keep the moving average value of the luminance of the light emitting unit constant when the time resolution of human vision is taken as the window width.
- When the modulated signal is 0 the light emitting unit does not emit light, and when it is 1 it emits light; if there is no bias in the transmission signal, the average value of the luminance of the light emitting unit is about 87.5% of the luminance at the time of light emission.
- Although the coding efficiency, at 0.375, is inferior to that of the modulation methods shown in FIGS. 27 and 28, the average luminance can be kept high.
- FIG. 30 shows a modulation method that causes the light emitting unit to emit light so as to keep the moving average value of the luminance of the light emitting unit constant when the time resolution of human vision is taken as the window width.
- When the modulated signal is 0 the light emitting unit does not emit light, and when it is 1 it emits light; if there is no bias in the transmission signal, the average value of the luminance of the light emitting unit is about 25% of the luminance at the time of light emission.
- To a person, or to an imaging device with a long exposure time, the light emitting unit can thus be made to appear to be blinking.
- By using these modulation methods, it is possible to make the light emitting unit appear, to a human being or to an imaging device with a long exposure time, to emit light with any desired luminance change.
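The figures themselves are not reproduced in this text, so the codebooks below are a reconstruction, not the patent's exact tables: they are chosen only to match the stated numbers (average luminance 50%, 75%, 87.5%, and 25%, with coding efficiencies 0.5, 0.5, 0.375, and 0.5). A Manchester-like code stands in for FIG. 27 and fixed-weight codes for FIGS. 28 to 30:

```python
# Hypothetical codebooks consistent with the stated averages and efficiencies.
CODEBOOKS = {
    "fig27": {"0": "01", "1": "10"},                       # 50%, eff. 0.5
    "fig28": {"00": "0111", "01": "1011",
              "10": "1101", "11": "1110"},                 # 75%, eff. 0.5
    "fig29": {format(i, "03b"): "1" * i + "0" + "1" * (7 - i)
              for i in range(8)},                          # 87.5%, eff. 0.375
    "fig30": {"00": "1000", "01": "0100",
              "10": "0010", "11": "0001"},                 # 25%, eff. 0.5
}

def modulate(bits: str, scheme: str) -> str:
    """Map a bit string to light-emission slots ('1' = emit, '0' = off)."""
    book = CODEBOOKS[scheme]
    k = len(next(iter(book)))  # input bits consumed per symbol
    return "".join(book[bits[i:i + k]] for i in range(0, len(bits), k))

def average_luminance(slots: str) -> float:
    return slots.count("1") / len(slots)

print(average_luminance(modulate("0110", "fig28")))  # 0.75
```

Because every codeword of a given scheme has the same number of 1s, the average luminance is fixed regardless of the data, which is what keeps the moving average constant to the human eye.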
- When visible light is used as the carrier wave, as shown in FIG., the light emitting unit may be caused to emit light so as to periodically change the moving average value of its luminance when the time resolution of human vision is taken as the window width. In this way, at the same time as a human sees the light emitting unit of the transmitting device blinking or changing at an arbitrary rhythm, the receiving device can observe the light emission signal.
- The same effect can be obtained by causing the LED section of a liquid crystal TV that uses an LED light source as a backlight to emit light in this manner.
- In this case, optical communication with a low error rate becomes possible.
- the communication speed can be further increased if the entire screen or the screen portion used for communication is white.
- In this case, the moving average value of the luminance of the light emitting unit, when the time resolution of human vision is taken as the window width, is adjusted to the luminance of the image to be shown to humans.
- the receiver can observe the light emission signal at the same time as the human being sees a television picture as usual.
- When a signal is transmitted for each imaging frame, the moving average value of the luminance of the light emitting unit, taken with a window width of about the time of one frame of the captured image, is adjusted to the value of the signal.
- When imaging from a short distance, the light emission state of the transmitter is observed for each exposure line; when imaging from a long distance, it is observed for each imaging frame. Signals can thus be propagated at two speeds.
- FIG. 34 is a diagram showing how light emission is observed at each exposure time.
- Since the luminance of an imaged pixel is proportional to the average luminance of the imaging target during the time the imaging element is exposed, if the exposure time is short the light emission pattern 2217a is observed as it is, as in 2217b, and if the exposure time is long it is observed as in 2217c, 2217d, and 2217e.
- 2217a is a modulation scheme that repeatedly uses the modulation scheme shown in FIG. 28 in a fractal manner.
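This averaging can be simulated directly (hypothetical helper: each exposure line reports the mean of the pattern over its exposure window, with start times staggered one slot apart and the pattern treated as cyclic):

```python
def observe(pattern, exposure_slots):
    """Luminance seen by successive exposure lines: the mean of the light
    emission pattern over each line's exposure window (cf. 2217b-2217e)."""
    n = len(pattern)
    return [sum(pattern[(i + k) % n] for k in range(exposure_slots)) / exposure_slots
            for i in range(n)]

pattern = [1, 0, 1, 1, 1, 0, 1, 1]  # binary emission pattern like 2217a

print(observe(pattern, 1))  # short exposure: the pattern is seen as-is
print(observe(pattern, 4))  # long exposure: everything blurs to 0.75
```

With a window as long as the pattern's repetition period, every line reads the same average, which is exactly why a long-exposure camera (or the human eye) sees only uniform luminance.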
- The receiving device recognizes that 1 has been received when the luminance of the pixels at the estimated position of the light emitting unit is equal to or greater than a predetermined value over a certain number of exposure lines, and that 0 has been received when it is below that value.
- The transmitting apparatus may transmit a different digit when the same digit would otherwise continue for a fixed number of times.
- Alternatively, transmission may be divided into a header part that always includes both 1 and 0 and a body part that carries the signal. In this case, the same digit never appears in succession more than five times.
- When the light emitting unit is at a position where it does not appear in some of the exposure lines, or when there is blanking, the imaging device of the receiving apparatus cannot capture every aspect of the light emitting unit.
- As shown in FIG. 36, there is a method of transmitting a data part together with an address part indicating the position of the data.
- So that the light emission pattern is captured within one image captured by the receiving device, it is desirable to set the combined length of the data part and the address part sufficiently short.
- The transmitting device transmits a reference part and a data part.
- The receiving device recognizes the position of the data from the difference between the time at which the reference part was received and the time at which the data part was received.
- Alternatively, the transmitting apparatus transmits a reference part, an address pattern part, and a data part.
- The receiving apparatus obtains the data of the data part and the pattern of its position from the address pattern part that follows the reference part.
- For the light emission pattern of the header part, a pattern that does not appear in the address part or the data part is used.
- the light emission pattern of the header portion can be set to “0011”.
- When the pattern of the header part is "11110011", the average luminance is equal to that of the other parts, and flicker when viewed by the human eye can be suppressed. Since this header part has high redundancy, information can be superimposed on it. For example, when the pattern of the header part is "11101111", it is possible to indicate that content to be communicated between transmitting devices is being transmitted.
- So that the light emission pattern is captured within one image captured by the receiving device, it is desirable to set the combined length of the data part, the address part, and the header part sufficiently short.
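A frame built this way might be sketched as follows (hypothetical layout and helper names; only the header pattern "0011" and the fit-in-one-image requirement come from the text):

```python
HEADER = "0011"  # header pattern given in the text

def make_packet(address: str, data: str, header: str = HEADER) -> str:
    """Assemble a frame as header + address + data."""
    return header + address + data

def fits_in_one_image(packet: str, usable_lines: int) -> bool:
    """Check the requirement that the whole light emission pattern be
    captured within one image; assume one modulation slot per exposure
    line for simplicity."""
    return len(packet) <= usable_lines

pkt = make_packet(address="0101", data="11011010")
print(pkt, fits_in_one_image(pkt, usable_lines=1000))
```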
- the transmission apparatus determines the transmission order of information according to the priority.
- the number of transmissions is made proportional to the priority.
- Since the receiving device cannot receive signals continuously when the light emitting unit of the transmitting device is not fully captured by the imaging unit of the receiving device, or when there is blanking, information transmitted more frequently is easier to receive.
- FIG. 41 shows a pattern in which a plurality of nearby transmitters transmit information in synchronization.
- Each transmitting device transmits its individual information during a time zone in which the light emitting units of nearby transmitting devices emit uniform light (do not transmit a signal), so that it is not confused with the light emission patterns of the nearby transmitting devices.
- A transmitting device may learn the light emission patterns of nearby transmitting devices by receiving their transmission signals with a light receiving unit, and determine its own light emission pattern accordingly. A transmitting device may also determine its own light emission pattern in accordance with an instruction from another transmitting device, received via the light receiving unit. Alternatively, a transmitting device may determine its light emission pattern according to a command from a central control device.
- (Detection of the light emitting part) As a method of determining in which part of the captured image the light emitting part appears, there is a method, shown in FIG. 42, of counting the number of lines in which the light emitting part is imaged in the direction perpendicular to the exposure lines, and taking the row in which this count is largest as the row in which the light emitting part exists.
- Since the degree of light reception fluctuates near the edges of the light emitting part, making it easy to misjudge whether the light emitting part is imaged there, the signal is taken from the imaging results of the pixels in the central part of the light emitting part.
- As another method of determining in which part of the captured image the light emitting part appears, the midpoint of the part in which the light emitting part is imaged is found for each exposure line, and the light emitting part is estimated to exist on an approximate line (a straight line or a quadratic curve) connecting these midpoints.
- The estimated position of the light emitting unit in the previous frame may be used as a prior probability, and the estimated position of the light emitting unit may be updated from the information of the current frame.
- the current estimated position of the light emitting unit may be updated from the value of the 9-axis sensor or the gyro sensor during this period.
- By compositing the captured images, a composite image 2212f is obtained, and the position of the light emitting unit in the captured image can be specified.
- the receiving device detects on / off of light emission of the light emitting unit from the position of the specified light emitting unit.
- When the light emitting part emits light with probability 0.75 in each image, the probability that the light emitting part in a composite of n images, such as 2212f, appears to emit light is 1 − 0.25^n.
- For example, when n = 3, this probability is approximately 0.984.
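The probability follows from each image showing the part unlit with probability 0.25, independently across the n composited images. A one-line check:

```python
def seen_lit_probability(n_images: int, lit_probability: float = 0.75) -> float:
    """Probability that the light emitting part appears lit somewhere in a
    composite of n images, each showing it lit with `lit_probability`."""
    return 1 - (1 - lit_probability) ** n_images

print(seen_lit_probability(3))  # 0.984375, i.e. approximately 0.984
```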
- It is preferable to estimate the attitude of the imaging unit from the sensor values of a gyro sensor and a 9-axis sensor, compensate for the imaging direction, and then composite the images more accurately.
- When the imaging time is short, the adverse effect is small even if the imaging direction is not compensated.
- FIG. 46 is a diagram illustrating a case where the reception device captures a plurality of light emitting units.
- FIG. 47 shows a timeline of the transmission signal at this time and an image obtained by capturing the light emitting unit.
- the light emitting units 2216a, 2216c, and 2216e emit light uniformly, and the light emitting units 2216b, 2216d, and 2216f transmit signals according to light emission patterns. Note that the light emitting units 2216 b, 2216 d, and 2216 f may simply emit light so as to appear as a striped pattern if the receiving device captures an image for each exposure line.
- the light emitting units 2216 a to 2216 f may be the light emitting units of the same transmission device or different transmission devices.
- The transmitting device expresses the signal to be transmitted by the pattern (position pattern) of the positions of light emitting units that transmit a signal and light emitting units that do not.
- The transmitting device may transmit a signal by the position pattern in some time zones and by a light emission pattern in other time zones. For example, in some time zones, all the light emitting units can be synchronized to transmit the ID and position information of the transmitting device by a light emission pattern.
- Using the ID and position information of the transmitting device transmitted by the light emission pattern, the position of the receiving device estimated by the wireless base station, and the position information of the receiving device estimated by GPS, a gyro sensor, or a 9-axis sensor, the receiving device obtains from the server a list of position patterns present in the vicinity, and analyzes the position pattern based on the list.
- The signal expressed by the position pattern does not need to be unique worldwide; it suffices that the same position pattern does not exist nearby (within a radius of several meters to 300 meters). This solves the problem that a transmitter with few light emitting units can express only a small number of position patterns.
- The size, shape, and position information of the light emitting units can be acquired from the server, and the position of the receiving apparatus can be estimated from this information, from the size and shape of the captured position pattern, and from the lens characteristics of the imaging unit.
- (Receiving device) As communication devices that mainly perform reception, as shown in FIG. 49, a mobile phone, a digital still camera, a digital video camera, a head mounted display, a robot (for cleaning, nursing care, industry, etc.), a surveillance camera, and the like are conceivable.
- the receiving device is not limited to these.
- the receiving device is a communication device that mainly receives a signal, and may transmit a signal according to the scheme of this embodiment or another scheme.
- (Transmitting device) As communication devices that mainly perform transmission, as shown in FIG. 50, lighting (for homes, stores, offices, underground malls, streets, etc.), flashlights, home appliances, robots, and other electronic devices are conceivable. However, the transmitting device is not limited to these.
- the transmission device is a communication device that mainly transmits a signal, and may receive a signal according to the method of this embodiment or another method.
- The light emitting unit is desirably a device that switches between light emission and non-light emission at high speed, such as LED lighting or a liquid crystal display using an LED backlight.
- the light emitting unit is not limited to these.
- The light emitting unit may also be a fluorescent lamp, an incandescent lamp, a mercury lamp, or an organic EL display.
- the transmitting apparatus may include a plurality of light emitting units that emit light synchronously as shown in FIG.
- the light emitting units may be arranged in a line.
- The light emitting units may be arranged perpendicular to the exposure lines.
- the light emitting units may be arranged in a cross shape.
- a circular light emitting unit may be used, or the light emitting unit may be arranged in a circular shape.
- the transmission apparatus may cover the light emitting unit with a light diffusion plate as shown in FIG.
- the light emitting units that transmit different signals are spaced apart so as not to be imaged simultaneously.
- the light emitting units that transmit different signals have light emitting units that do not transmit signals between them so as not to be imaged simultaneously.
- FIG. 58 is a diagram showing a desirable structure of the light emitting unit.
- As the light (electromagnetic wave) carrying the signal, light (electromagnetic wave) in the frequency band from the near infrared band through the visible light band to the near ultraviolet band, shown in FIG. 59, which can be received by the receiving apparatus, is used.
- the imaging unit of the receiving apparatus detects a light emitting unit 2310 b emitting pattern light in an imaging range 2310 a.
- The imaging control unit acquires the captured image 2310d by repeatedly using the exposure line 2310c at the center position of the light emitting unit, instead of using the other exposure lines.
- the captured image 2310d is an image of the same place with different exposure times.
- the light emission pattern of the light emitting unit can be observed by scanning the pixel in which the light emitting unit of the captured image 2310 d is taken in the direction perpendicular to the exposure line.
- With this method, the signal can be read even when the light emitting unit is small or is imaged from a distance.
- This method makes it possible to observe every change in luminance of the light emitting unit, as long as the light emitting unit appears in at least a part of the image captured by the imaging apparatus.
- the same effect can be obtained by imaging using multiple exposure lines at the center of the light emitting part.
- The same effect can be obtained by performing imaging using only the point closest to the center of the light emitting unit, or only a plurality of points in its vicinity. At this time, by shifting the exposure start time of each pixel, the light emission state of the light emitting unit can be detected with a finer period.
- To compensate for camera shake, it is conceivable to use sensor values from a gyro sensor or a 9-axis sensor, or to use an image captured by an imaging device other than the one capturing the light emitting unit.
- A part closer to the center of the light emitting unit is preferably used as the exposure line or exposure pixel, because a part closer to the center than to the edge of the light emitting unit is less likely to move off the exposure line or exposure pixel when the camera shakes.
- It is desirable to use, as the exposure line or exposure pixel, a portion that is as far as possible from the periphery of the light emitting unit and that has high luminance.
- the transmitting device transmits the installed position information, the size of the light emitting device, the shape of the light emitting device, and the ID of the transmitting device.
- the position information includes the latitude, longitude, altitude, height from the floor surface, and the like of the central portion of the light emitting device.
- the receiving device estimates the imaging direction based on the information obtained from the 9-axis sensor and the gyro sensor.
- the receiving device estimates the distance from the receiving device to the light emitting device from the size and shape of the light emitting device transmitted from the transmitting device, the size and the shape of the light emitting device in the captured image, and the information of the imaging device.
- The information of the imaging device includes the focal length of the lens, the distortion of the lens, the size of the imaging element, the distance between the lens and the imaging element, and a comparison table relating the size, in the captured image, of an object of reference size to the distance from the imaging device to the imaged object.
- the receiving device estimates the positional information of the receiving device from the information transmitted from the transmitting device, the imaging direction, and the distance from the receiving device to the light emitting device.
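The distance step above can be sketched with a pinhole-camera model (our own simplification: lens distortion and the comparison table are ignored, and a flat 2-D map is assumed for the final position):

```python
import math

def distance_to_light(real_size_m: float, image_size_px: float,
                      focal_length_px: float) -> float:
    """Pinhole estimate: distance = focal length * real size / image size."""
    return focal_length_px * real_size_m / image_size_px

def receiver_position(light_xy, bearing_rad, distance_m):
    """Back off from the known light position along the imaging direction
    (from the 9-axis sensor and gyro sensor) by the estimated distance."""
    x, y = light_xy
    return (x - distance_m * math.cos(bearing_rad),
            y - distance_m * math.sin(bearing_rad))

d = distance_to_light(real_size_m=0.5, image_size_px=100, focal_length_px=1000)
print(d)  # 5.0
print(receiver_position((10.0, 20.0), math.pi / 2, d))
```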
- the transmitting device transmits the installed position information, the size of the light emitting unit, the shape of the light emitting unit, and the ID of the transmitting device.
- the position information includes the latitude, longitude, altitude, height from the floor surface, and the like of the central portion of the light emitting unit.
- the receiving device estimates the imaging direction based on the information obtained from the 9-axis sensor and the gyro sensor.
- the receiving device estimates the distance from the receiving device to the light emitting unit from the size and shape of the light emitting unit transmitted from the transmitting device, the size and shape of the light emitting unit in the captured image, and the information of the imaging device.
- The information of the imaging device includes the focal length of the lens, the distortion of the lens, the size of the imaging element, the distance between the lens and the imaging element, and a comparison table relating the size, in the captured image, of an object of reference size to the distance from the imaging device to the imaged object.
- the receiving device estimates the positional information of the receiving device from the information transmitted from the transmitting device, the imaging direction, and the distance from the receiving device to the light emitting unit.
- the receiving device estimates the moving direction and the moving distance based on the information obtained from the 9-axis sensor and the gyro sensor.
- the receiver estimates the position information of the receiver using the position information estimated at a plurality of points and the positional relationship between the points estimated from the movement direction and the movement distance.
- From the random field of the position information of the receiver estimated at one point and the random field of the movement direction and movement distance estimated when moving from that point to another, the random field of the finally estimated position information can be calculated.
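The idea can be sketched as a discrete Bayes-filter step (our own 1-D formulation: the prior at point A is convolved with the movement distribution, then multiplied by the independent estimate at point B and renormalised; the patent only states that the final random field is computable from these pieces):

```python
def convolve(p, q):
    """Full discrete convolution of two probability vectors."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def fuse(prior_at_a, movement, measured_at_b):
    """Random field of the finally estimated position: predict by moving
    the estimate made at A, then combine with the estimate made at B."""
    predicted = convolve(prior_at_a, movement)[: len(measured_at_b)]
    posterior = [p * m for p, m in zip(predicted, measured_at_b)]
    total = sum(posterior)
    return [v / total for v in posterior]

prior = [0.0, 1.0, 0.0, 0.0]    # position at A: certainly cell 1
move = [0.0, 0.5, 0.5, 0.0]     # moved 1 or 2 cells, equally likely
meas = [0.0, 0.0, 1.0, 1.0]     # estimate at B: cell 2 or 3
print(fuse(prior, move, meas))  # [0.0, 0.0, 0.5, 0.5]
```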
- the transmitting device may transmit the position information of its own and the ID of the transmitting device.
- the position information includes the latitude, longitude, altitude, height from the floor surface, and the like of the central portion of the light emitting device.
- the receiving device estimates the imaging direction based on the information obtained from the 9-axis sensor and the gyro sensor.
- The receiving device estimates the position information of the receiving device by triangulation.
- the transmitting device transmits the ID of the transmitting device.
- the receiving device receives the ID of the transmitting device, and obtains the location information of the transmitting device from the Internet, the size of the light emitting device, the shape of the light emitting device, and the like.
- the position information includes the latitude, longitude, altitude, height from the floor surface, and the like of the central portion of the light emitting device.
- the receiving device estimates the imaging direction based on the information obtained from the 9-axis sensor and the gyro sensor.
- the receiving device estimates the distance from the receiving device to the light emitting device from the size and shape of the light emitting device transmitted from the transmitting device, the size and the shape of the light emitting device in the captured image, and the information of the imaging device.
- the information of the imaging device includes the focal length of the lens, the distortion of the lens, the size of the image sensor, the distance between the lens and the image sensor, and a lookup table relating the size, in the captured image, of an object of reference size to the distance between the imaging device and that object.
- the receiving device estimates the position information of the receiving device from the information obtained from the Internet, the imaging direction, and the distance from the receiving device to the light emitting device.
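The distance estimation from the transmitted size of the light emitting device and its apparent size in the captured image can be sketched with a simple pinhole-camera model (an illustrative sketch only; a real implementation would also correct for the lens distortion mentioned above, and the numbers are assumptions):

```python
def size_on_sensor_mm(size_px, sensor_width_mm, image_width_px):
    """Convert an apparent size in pixels to millimetres on the sensor."""
    return size_px * sensor_width_mm / image_width_px

def distance_to_emitter_m(real_size_m, focal_length_mm, apparent_mm):
    """Pinhole model: real_size / distance = size_on_sensor / focal_length."""
    return real_size_m * focal_length_mm / apparent_mm

# A 0.6 m wide luminaire imaged 400 px wide on a 6 mm wide, 4000 px
# sensor behind a 4 mm lens lies about 4 m away.
d = distance_to_emitter_m(0.6, 4.0, size_on_sensor_mm(400, 6.0, 4000))  # -> 4.0
```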
- the transmitting device transmits its own installed position information and the ID of the transmitting device.
- the position information includes the latitude, longitude, altitude, height from the floor surface, and the like of the central portion of the light emitting device.
- the receiving device estimates the imaging direction based on the information obtained from the 9-axis sensor and the gyro sensor.
- the receiving device estimates positional information of the receiving device by a triangulation method.
- the transmitting device transmits the position information of its own and the ID of the transmitting device.
- the position information includes the latitude, longitude, altitude, height from the floor surface, and the like of the central portion of the light emitting device.
- the receiving device estimates the imaging direction based on the information obtained from the 9-axis sensor and the gyro sensor.
- the receiving device estimates positional information of the receiving device by a triangulation method.
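The triangulation step can be sketched in two dimensions: given the known positions of two transmitting devices and the bearing from the receiver toward each (obtained from the estimated imaging directions), the receiver lies at the intersection of the two reversed sight lines. This is an illustrative sketch; the names and coordinate conventions are assumptions:

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Receiver position from two known transmitter positions p1, p2
    and the bearings (radians, receiver -> transmitter). The receiver
    lies where the two sight lines, extended back from the
    transmitters, intersect."""
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    # Solve p1 - t1*d1 = p2 - t2*d2 for t1 by Cramer's rule.
    det = -d1[0] * d2[1] + d1[1] * d2[0]
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * d2[1] - ry * d2[0]) / det
    return (p1[0] - t1 * d1[0], p1[1] - t1 * d1[1])

# Transmitter at (1, 0) seen due east (bearing 0) and transmitter at
# (0, 1) seen due north (bearing pi/2) place the receiver at the origin.
pos = triangulate((1.0, 0.0), 0.0, (0.0, 1.0), math.pi / 2)
```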
- the receiving device estimates the posture change and movement of the receiving device from the gyro sensor or the 9-axis sensor.
- the receiving device may simultaneously perform zero adjustment and calibration of the 9-axis sensor.
- the reception device 2606c estimates the distance and direction of movement from changes in the captured image and sensor values of the 9-axis sensor and the gyro sensor.
- the receiving device captures an image of the light emitting unit of the transmitting device 2606a, estimates its center position, and transmits the position to the transmitting device.
- it is desirable that the transmitting device transmit the size information of the light emitting unit even if part of the other information to be transmitted must be omitted. If the size of the light emitting unit is unknown, the ceiling height is estimated from the distance between the transmitter 2606b and the receiver 2606c used by the receiver for position estimation, and that result is used to estimate the distance between the transmitter 2606a and the receiver 2606c.
- the transmission may be transmission by a light emission pattern, transmission by a sound pattern, or a transmission method by a radio wave.
- the light emission pattern of the transmission device and its time may be stored, and may be transmitted to the transmission device or the central control device later.
- the transmitting device or the central control device specifies the transmitting device that the receiving device was imaging from the light emission pattern and its time, and stores the position information in the transmitting device.
- the receiving device calculates the positional relationship from the position set point to the center of the light emitting unit of the transmitting device, and transmits the position obtained by adding the positional relationship to the position to be set to the transmitting device.
- the reception device receives the transmitted signal by imaging the transmission device. It communicates with a server or an electronic device based on the received signal.
- the receiving device acquires information of the transmitting device, the position / size of the transmitting device, service information related to the position, and the like from the server using the ID of the transmitting device included in the signal as a key.
- the receiving device estimates the position of the receiving device from the position of the transmitting device included in the signal, and acquires map information, service information related to the position, and the like from the server.
- the receiving device acquires the modulation scheme of the nearby transmitting device from the server using the current rough position as a key.
- the receiving device may register, on the server, the position information of the receiving device or the transmitting device, nearby information, and information on processing performed by the receiving device in the vicinity, using the ID of the transmitting device included in the signal as a key.
- the receiving device operates the electronic device using the ID of the transmitting device included in the signal as a key.
- FIG. 69 is a block diagram showing a receiving apparatus.
- the receiving apparatus is configured by the whole or a part including an imaging unit and a signal analysis unit. Note that blocks having the same name in FIG. 69 may be the same as or different from each other.
- the receiver in a narrow sense is included in a smartphone, a digital camera, or the like.
- the input unit 2400 h includes all or part of the user operation input unit 2400 i, the illuminance sensor 2400 j, the microphone 2400 k, the timing unit 2400 n, the position estimation unit 2400 m, and the communication unit 2400 p.
- the imaging unit 2400a includes all or part of the lens 2400b, the imaging device 2400c, the focus control unit 2400d, the imaging control unit 2400e, the signal detection unit 2400f, and the imaging information storage unit 2400g.
- the imaging unit 2400a starts imaging when triggered by a user operation, a change in illuminance, a sound or voice pattern, the arrival of a specific time, the movement of the receiving apparatus to a specific location, or an instruction received via the communication unit.
- the focus control unit 2400d performs control such as focusing on the light emitting unit 2400ae of the transmitting apparatus, or focusing so that the light emitting unit 2400ae appears large in the captured image.
- the exposure control unit 2400ak sets an exposure time and an exposure gain.
- the imaging control unit 2400e limits the position to be imaged to a specific pixel.
- the signal detection unit 2400 f detects, from the captured image, pixels in which the light emitting unit 2400 ae of the transmission device is included and pixels in which signal transmission by light emission is included.
- the imaging information storage unit 2400g stores control information of the focus control unit 2400d, control information of the imaging control unit 2400e, and information detected by the signal detection unit 2400f. When there are a plurality of imaging devices, imaging may be performed simultaneously, and one may be used to estimate the position and orientation of the receiving device.
- the light emission control unit 2400 ad transmits a signal by controlling the light emission pattern of the light emitting unit 2400 ae based on an input from the input unit 2400 h.
- the light emission control unit 2400 ad acquires the time when the light emitting unit 2400 ae emits light from the time measuring unit 2400 ac and records the time.
- the captured image storage unit 2400 w stores the image captured by the imaging unit 2400 a.
- the signal analysis unit 2400y acquires the transmitted signal from the captured image.
- the received signal storage unit 2400 z stores the signal analyzed by the signal analysis unit 2400 y.
- the sensor unit 2400 q includes all or part of the GPS 2400 r, the magnetic sensor 2400 t, the acceleration sensor 2400 s, and the gyro sensor 2400 u.
- Each of the magnetic sensor 2400t and the acceleration sensor 2400s may be a nine-axis sensor.
- the position estimation unit estimates the position and orientation of the receiving device from the information from the sensor unit, the captured image, and the received signal.
- the arithmetic unit 2400aa displays, on the display unit 2400ab, the received signal, the estimated position of the receiving device, and information obtained from the network 2400ah based on these (information related to a map or place, information related to the transmitting device, and the like).
- the arithmetic unit 2400 aa controls the transmission device based on the received signal and the information input to the input unit 2400 h from the estimated position of the reception device.
- the terminals may communicate with each other without passing through the network 2400ah, using a peer-to-peer connection method such as Bluetooth.
- the electronic device 2400 aj is controlled by the receiving device.
- the server 2400 ai stores the information of the transmission device, the position of the transmission device, and the information related to the position of the transmission device in association with the ID of the transmission device.
- the server 2400 ai stores the modulation scheme of the transmission device in association with the position.
- FIG. 70 is a block diagram of a transmitter.
- the transmission apparatus is configured by the whole of the configuration diagram or a part including the light emitting unit, the transmission signal storage unit, the modulation scheme storage unit, and the operation unit.
- the transmitter 2401ab in a narrow sense is provided in a light, an electronic device, or a robot.
- the lighting control switch 2401 n is a switch that switches on and off of lighting.
- the light diffusion plate 2401p is a member attached near the light emitting unit 2401q in order to diffuse light of the light emitting unit 2401q.
- the light emitting unit 2401q turns on and off at a speed at which the light emission pattern can be detected for each line, exploiting the difference in exposure time of each line of the imaging device of the receiving device.
- the light emitting unit 2401 q is configured by a light source such as an LED or a fluorescent lamp that can be turned on and off at high speed.
- the light emission control unit 2401 r controls lighting and extinguishing of the light emitting unit 2401 q.
- the light receiving unit 2401 s is configured of a light receiving element or an imaging element.
- the light receiving unit 2401 s converts the intensity of the received light into an electric signal.
- an imaging unit may be used instead of the light receiving unit 2401s.
- the signal analysis unit 2401 t acquires a signal from the pattern of light received by the light receiving unit 2401 s.
- the arithmetic unit 2401u converts the transmission signal stored in the transmission signal storage unit 2401d into a light emission pattern according to the modulation scheme stored in the modulation scheme storage unit 2401e.
- the arithmetic unit 2401u controls communication by editing information in the storage unit 2401a or controlling the light emission control unit 2401r based on the signal obtained from the signal analysis unit 2401t.
- the arithmetic unit 2401u controls communication by editing information in the storage unit 2401a or controlling the light emission control unit 2401r based on a signal from the attachment unit 2401w.
- the arithmetic unit 2401u likewise edits information in the storage unit 2401a or controls the light emission control unit 2401r based on a signal from the communication unit 2401v.
- the arithmetic unit 2401u edits the information in the storage unit 2401b of the attachment device 2401h.
- the arithmetic unit 2401u copies the information in the storage unit 2401b of the attachment device 2401h to the storage unit 2401a.
- the arithmetic unit 2401u controls the light emission control unit 2401r at a predetermined time.
- the arithmetic unit 2401u controls the electronic device 2401zz via the network 2401aa.
- the storage unit 2401a includes all or part of a transmission signal storage unit 2401d, a shape storage unit 2401f, a modulation scheme storage unit 2401e, and a device state storage unit 2401g.
- the transmission signal storage unit 2401 d stores a signal to be transmitted from the light emitting unit 2401 q.
- the modulation scheme storage unit 2401 e stores a modulation scheme for converting a transmission signal into a light emission pattern.
- the shape storage unit 2401 f stores the shapes of the transmission device and the light emitting unit 2401 q.
- the device state storage unit 2401 g stores the state of the transmission device.
- the mounting portion 2401 w is configured of a mounting bracket or a power supply port.
- the storage unit 2401 b of the attachment device 2401 h stores information to be stored in the storage unit 2401 a.
- the storage unit 2401a is not provided, and the storage unit 2401b of the attachment device 2401h or the storage unit 2401c of the central control device 2401m may be used.
- the terminals may communicate with each other without passing through the network 2401aa, using a peer-to-peer connection method such as Bluetooth.
- the server 2401 y stores the information of the transmission device, the position of the transmission device, and the information related to the position of the transmission device in association with the ID of the transmission device. In addition, the server 2401 y stores the modulation scheme of the transmission device in association with the position.
- In step 2800a, it is checked whether the receiving device has multiple imaging devices. If it does, the process proceeds to step 2800b, where the imaging device to be used is selected, and then proceeds to step 2800c; otherwise, the process proceeds directly to step 2800c.
- In step 2800d, the exposure gain is set.
- In step 2800e, imaging is performed.
- In step 2800f, for each exposure line, a portion in which a predetermined number or more of consecutive pixels have a luminance exceeding a predetermined threshold is found, and the center position of that portion is determined.
- In step 2800g, a linear or quadratic approximation line connecting the above center positions is calculated.
- In step 2800h, the luminance value of the pixel on the approximation line in each exposure line is taken as the signal value of that exposure line.
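The per-line signal extraction of steps 2800f-2800h can be sketched as follows. For brevity, this sketch samples the luminance at the centre of the brightest run in each exposure line (image row) directly, rather than fitting the approximation line across lines; the function and parameter names are illustrative assumptions:

```python
def line_signal_values(frame, threshold, min_run):
    """For each exposure line (row), locate the longest run of pixels
    brighter than the threshold; if it spans at least min_run pixels,
    sample the luminance at its centre as that line's signal value
    (None when no sufficiently long run exists)."""
    values = []
    for row in frame:
        best_start, best_len, run_start = 0, 0, None
        for i, px in enumerate(list(row) + [0]):     # sentinel closes a run
            if px > threshold and run_start is None:
                run_start = i
            elif px <= threshold and run_start is not None:
                if i - run_start > best_len:
                    best_start, best_len = run_start, i - run_start
                run_start = None
        values.append(row[best_start + best_len // 2]
                      if best_len >= min_run else None)
    return values

# Three exposure lines: a lit line, a dark line, a lit line.
frame = [[0, 0, 200, 210, 205, 0],
         [0, 0, 0, 0, 0, 0],
         [10, 220, 230, 240, 10, 0]]
signal = line_signal_values(frame, 100, 3)   # -> [210, None, 230]
```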
- In step 2800i, the time allotted per exposure line is calculated from imaging information such as the imaging frame rate, resolution, and blanking time.
- In step 2800j, if the blanking time is less than a predetermined time, the exposure line following the last exposure line of one imaging frame is regarded as the first exposure line of the next frame. Otherwise, it is regarded that between the last exposure line of one imaging frame and the first exposure line of the next frame there exist a number of unobservable exposure lines equal to the blanking time divided by the exposure time per exposure line.
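The per-line timing computation described above (the time allotted to each exposure line, and the number of exposure lines hidden inside the blanking interval) can be written out numerically; the parameter values are illustrative assumptions:

```python
def time_per_line_s(frame_rate_hz, lines_per_frame, blanking_time_s):
    """Time allotted to each exposure line within one frame."""
    return (1.0 / frame_rate_hz - blanking_time_s) / lines_per_frame

def unobservable_lines(blanking_time_s, line_time_s):
    """Exposure lines that cannot be observed during blanking."""
    return round(blanking_time_s / line_time_s)

# 30 fps, 1080 exposure lines, 1 ms blanking:
# roughly 29.9 us per line, and 33 lines hidden in the blanking interval.
t = time_per_line_s(30.0, 1080, 0.001)
n = unobservable_lines(0.001, t)   # -> 33
```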
- In step 2800k, the reference position pattern and the address pattern are read from the decode information.
- In step 2800m, a pattern indicating the reference position of the signal is detected from the signal values of the exposure lines.
- In step 2800n, the data part and the address part are calculated based on the detected reference position.
- In step 2800p, the transmission signal is obtained.
- In step 2801a, the position recognized as the current position of the receiving apparatus, or the probability map of the current position, is used as prior information on the self position.
- In step 2801b, the imaging unit of the receiving apparatus is directed toward the light emitting unit of the transmitting apparatus.
- In step 2801c, the azimuth and elevation angle in which the imaging device is pointed are calculated from the sensor values of the 9-axis sensor and the gyro sensor.
- In step 2801d, the light emission pattern is imaged and the transmission signal is acquired.
- In step 2801e, the distance between the imaging device and the light emitting unit is calculated from the information on the size and shape of the light emitting unit included in the transmission signal, the size of the captured light emitting unit, and the imaging magnification of the imaging device.
- In step 2801f, the relative angle between the direction from the imaging unit to the light emitting unit and the normal of the imaging plane is calculated from the position of the light emitting unit in the captured image and the lens characteristics.
- In step 2801g, the relative positional relationship between the imaging device and the light emitting unit is calculated from the values calculated so far.
- the position of the receiving device is calculated from the position of the light emitting unit included in the transmission signal and the relative positional relationship between the imaging device and the light emitting unit.
- when signals can be received from a plurality of transmitting devices, the position of the receiving device can be calculated with high accuracy by calculating the coordinates of the imaging device from the signal included in each transmitting device.
- in that case, a triangulation method can be used.
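Once the distance to the light emitting unit and the direction from the receiver toward it are known, the receiver position follows directly from the transmitter position carried in the signal. A minimal 3-D sketch, assuming azimuth/elevation describe the receiver-to-transmitter direction (names and conventions are illustrative):

```python
import math

def receiver_position(tx_pos, azimuth, elevation, distance):
    """Step back from the known transmitter position along the
    measured direction by the estimated distance."""
    dx = math.cos(elevation) * math.cos(azimuth)
    dy = math.cos(elevation) * math.sin(azimuth)
    dz = math.sin(elevation)
    return tuple(p - distance * d for p, d in zip(tx_pos, (dx, dy, dz)))

# A ceiling light 3 m directly overhead (elevation pi/2) at (0, 0, 3)
# places the receiver at approximately the origin.
pos = receiver_position((0.0, 0.0, 3.0), 0.0, math.pi / 2, 3.0)
```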
- In step 2801i, the current position of the receiving device, or its probability map, is updated from the prior information on the self position and the calculated position of the receiving device.
- In step 2801j, the imaging apparatus is moved.
- In step 2801k, the direction and distance of the movement are calculated from the sensor values of the 9-axis sensor and the gyro sensor.
- In step 2801m, the direction and distance of the movement are calculated from the captured image and the attitude of the imaging device, and the process returns to step 2801a.
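The sensor-based movement estimate of steps 2801k-2801m can be roughly sketched, per axis, as a double integration of acceleration samples. This is a deliberately crude illustration; a practical receiver would also use the gyro, the captured image, and drift correction, and the sample values are assumptions:

```python
def integrate_displacement(accels, dt):
    """Trapezoidal double integration of acceleration samples (m/s^2),
    taken every dt seconds and starting from rest; returns the
    displacement along that axis."""
    velocity = 0.0
    displacement = 0.0
    prev_a = 0.0
    for a in accels:
        velocity += 0.5 * (prev_a + a) * dt
        displacement += velocity * dt
        prev_a = a
    return displacement
```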
- In step 2802a, the user presses a button.
- In step 2802b, the light emitting unit is made to emit light.
- the signal may be represented by the light emission pattern.
- In step 2802c, the light emission start time, the end time, and the times at which specific patterns are transmitted are recorded.
- In step 2802d, an image is captured by the imaging device.
- In step 2802e, the light emission pattern of the transmitting device present in the captured image is analyzed, and the transmitted signal is acquired. The light emission pattern may be analyzed synchronously using the recorded times. The process then ends.
- In step 2803a, light is received by the light receiving device, or an image is captured by the imaging device.
- In step 2803b, it is checked whether the received light pattern is a specific pattern.
- If it is not, the process returns to step 2803a.
- In step 2803c, the start time at which the pattern was received or imaged, the end time, and the times at which the specific pattern appeared are recorded.
- In step 2803d, the transmission signal is read from the storage unit and converted into a light emission pattern.
- In step 2803e, the light emitting unit is caused to emit light according to the light emission pattern, and the process ends.
- the light emission may instead start after a predetermined time has elapsed from the recorded times, after which the process ends.
- In step 2804a, light is received by the light receiving device, and the received light energy is converted into electricity and stored.
- In step 2804b, it is checked whether the stored energy has reached a predetermined level or more.
- If it has not, the process returns to step 2804a.
- In step 2804c, the received light is analyzed, and the times at which the specific pattern appeared are recorded.
- In step 2804d, the transmission signal is read from the storage unit and converted into a light emission pattern.
- In step 2804e, the light emitting unit is caused to emit light according to the light emission pattern, and the process ends.
- the light emission may instead start after a predetermined time has elapsed from the recorded times, after which the process ends.
- FIG. 76 is a diagram for explaining the situation in which information provision is received in the station yard.
- the reception device 2700a captures the illumination installed in the station facility and reads the light emission pattern and the position pattern to receive the information transmitted by the illumination device.
- the receiving device 2700a acquires information on illumination and facilities from the server based on the received information, and further estimates the current position of the receiving device 2700a from the size and shape of the captured illumination.
- the receiving device 2700a displays information obtained based on the facility ID and the position information (2700b).
- the receiving device 2700a downloads the map of the facility based on the facility ID, and navigates to the boarding place from the ticket information purchased by the user (2700c).
- Although FIG. 76 shows an example at a railway station, the same applies to facilities such as airports, ports, and bus stops.
- FIG. 77 is a diagram showing a state of use in a vehicle.
- the reception device 2704a possessed by the passenger and the reception device 2704b possessed by the salesperson receive the signal transmitted by the illumination 2704e, and estimate their own current position.
- Each receiving device may acquire information necessary for self-position estimation from the illumination 2704e, or may acquire information transmitted from the illumination 2704e from the server using a key, or the boarding station or ticket gate It may be acquired in advance based on position information and the like.
- the receiving device 2704a may recognize, from the boarding time information of the ticket purchased by the user (passenger) and the current time, that the current position is inside the vehicle, and may download information associated with the vehicle.
- Each receiver notifies the server of its current position.
- the reception device 2704a notifies the server of the user (passenger) ID, the ID of the reception device, and the information of the ticket purchased by the user (passenger), so that the server can confirm that the person sitting in the seat is the holder of the reserved-seat right.
- the reception device 2704a displays the current position of the salesperson, so that the user (passenger) can consider the purchase timing of in-vehicle sales.
- when the passenger places an order for in-vehicle sales via the reception device 2704a, the reception device 2704a notifies the salesperson's reception device 2704b or the server of its own position, the order contents, and charging information.
- the salesperson's reception device 2704b displays a map 2704d indicating the position of the orderer.
- the passenger can also purchase a designated seat ticket and a connecting ticket via the reception device 2704a.
- the receiver 2704a displays the vacant seat information 2704c.
- the reception device 2704a notifies the server of purchase information and charge information of the designated seat ticket and the connecting ticket from the boarding section information of the ticket purchased by the user (passenger) and the current position of the user.
- Although FIG. 77 shows an example in a railway, the same applies to airplanes, ships, and buses.
- FIG. 78 is a diagram showing a state of use in a store.
- the reception devices 2707b, 2707c, and 2707d receive the signal transmitted by the illumination 2707a, estimate their own current position, and notify the server.
- Each receiving device may acquire the information necessary for self-position estimation and the address of the server from the illumination 2707a, may acquire the information transmitted from the illumination 2707a from another server using it as a key, or may acquire it from the accounting system.
- the accounting system associates the accounting information with the receiving device 2707d, displays the current position of the receiving device 2707d (2707c), and delivers the ordered item.
- the receiving device 2707 b displays the product information based on the information transmitted from the illumination 2707 a.
- the receiving device 2707b notifies the server of the product information, the charging information, and the current position.
- the seller can deliver the ordered product based on the position information of the reception device 2707b, and the purchaser can purchase the product while sitting at a seat.
- FIG. 79 is a diagram showing a state in which a certificate of wireless connection is communicated and wireless connection is established.
- the electronic device (digital camera) 2701 b operates as an access point for wireless connection, and transmits an ID and a password as light emission patterns as information necessary for the connection.
- the electronic device (smartphone) 2701a acquires transmission information from the light emission pattern, and establishes a wireless connection.
- the connection to be established may be a connection to a wired network.
- communication between the two electronic devices may be performed via a third electronic device.
- FIG. 80 is a diagram showing the range of communication based on the light emission pattern and the position pattern.
- In a communication method using radio waves, the radio waves reach the next room through the wall, so it is difficult to limit the communication range.
- FIG. 81 is a diagram showing a state of indoor use in an underground mall or the like.
- the receiver 2706a receives the signal transmitted by the light 2706b, and estimates its current position. In addition, the reception device 2706a displays the current position on the map to perform route guidance, and displays information on nearby stores.
- In an emergency, disaster information and evacuation information can be transmitted from the lighting 2706b, so that this information can be obtained even when communication is congested, a communication base station has failed, or the radio waves from a communication base station do not reach. This is effective for people who missed the emergency broadcast, and for hearing-impaired persons who cannot hear it.
- FIG. 82 is a diagram showing a state of outdoor use such as a street.
- the receiving device 2705a receives the signal transmitted by the streetlight 2705b, and estimates its own current position. In addition, the reception device 2705a displays the current position on a map to perform route guidance, and displays information on nearby stores.
- displaying movements of other vehicles or pedestrians on a map or notifying a user that there are vehicles or pedestrians approaching can help prevent an accident.
- FIG. 83 is a diagram showing a state of direction indication.
- the reception device 2703e can download a map in the vicinity, or estimate its own position with an accuracy of 1 cm to several tens of cm.
- Knowing the exact position of the reception device 2703e enables automatic driving of the wheelchair 2703d and safe passage of blind people.
- the receiving device in FIG. 84 includes an in-camera 2710 a, a touch panel 2710 b, a button 2710 c, an out-camera 2710 d, and a flash 2710 e.
- the camera shake can be corrected by estimating the movement and posture of the reception device from the image taken by the in-camera.
- By receiving signals from other transmitting devices with the in-camera, it is possible to simultaneously receive signals from a plurality of devices and to improve the accuracy of the self-position estimation of the receiving device.
- the transmitting device 1 receives light emission of the light emitting unit of the transmitting device 2 by the light receiving unit, and acquires the signal transmitted by the transmitting device 2 and the timing of transmission.
- the transmission device 1 transmits its signal by emitting light in the same pattern in synchronization with the light emission of the transmission device 2.
- the part common to the transmission signal of the transmission device 2 is transmitted by emitting light with the same pattern in synchronization with the light emission of the transmission device 2.
- the part that is not common is transmitted at times when the transmitter 2 is not transmitting a signal. If there is no time when the transmitter 2 does not transmit, a cycle is determined appropriately and the non-common part is transmitted according to that cycle. In this case, the transmitter 2 receives the light emission of the transmitter 1 with its light receiving unit, detects that different signals are being transmitted at the same time, and transmits the non-common part of its own signal at times when the transmitter 1 is not transmitting a signal.
- CSMA/CD (Carrier Sense Multiple Access with Collision Detection)
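The behaviour described above, emitting common symbols in synchronization with the other transmitter and deferring differing symbols to slots in which the other transmitter is silent, can be sketched with a slot-based model (None marks a silent slot; the names and the assumption that slots after the other signal ends are silent are illustrative):

```python
def plan_transmission(own, other):
    """Return {slot: symbol} for transmitter 1, given transmitter 2's
    per-slot symbols (None = silent). Symbols equal to transmitter 2's
    symbol in the same slot are emitted in sync; differing symbols are
    deferred to later slots where transmitter 2 is silent."""
    plan = {}
    deferred = []
    for s, (a, b) in enumerate(zip(own, other)):
        if b is None or a == b:
            plan[s] = a          # free slot, or synchronized emission
        else:
            deferred.append(a)   # would collide: defer
    for k, symbol in enumerate(deferred):
        plan[len(other) + k] = symbol   # collision-free slots
    return plan

# Slot 0 is common (sent in sync), slot 2 is free, and the differing
# symbol from slot 1 is deferred past the end of the other signal.
p = plan_transmission([1, 0, 1], [1, 1, None])   # -> {0: 1, 2: 1, 3: 0}
```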
- the transmission device 1 causes the light emitting unit to emit light using its own information as a light emission pattern.
- the transmitting device 2 acquires the information of the transmitting device 1 from the light receiving unit.
- a transmitting device creates an arrangement map of transmitting devices by exchanging information with each other by communicable transmitting devices.
- the transmission device obtains an optimum light emission pattern as a whole so that signal transmission by light emission does not collide.
- the transmission device acquires information obtained by another transmission device by communication between the transmission devices.
- when the transmission device is attached to the attachment device, or when the information stored in the storage unit of the attachment device is changed, the transmission device stores the information from the storage unit of the attachment device in its own storage unit.
- the information stored in the storage unit of the attachment device or the transmission device includes the transmission signal and the timing of transmission.
- the transmission device stores the information in the storage unit of the attachment device.
- Information in the storage unit of the attachment device and in the storage unit of the transmission device is edited from the central control unit or the switchboard. Power line communication is used for operation from the switchboard.
- the shape storage unit of the transmission device stores the positional relationship between the attachment portion of the transmission device and the center position of the light emission unit.
- when transmitting the position information, the transmitting device transmits the position information obtained by adding this positional relationship to the position information stored in the storage unit.
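The addition of the stored mounting position and the attachment-to-emitter offset held in the shape storage unit is a simple vector sum; a minimal sketch, in which the coordinate convention and values are assumptions:

```python
def emitter_center(mount_position, mount_to_center_offset):
    """Position actually transmitted: the stored position of the
    attachment portion plus the offset to the centre of the light
    emitting unit."""
    return tuple(p + o for p, o in zip(mount_position, mount_to_center_offset))

# Mounting bracket surveyed at (10.0, 5.0, 2.5) m; the emitter centre
# hangs 0.3 m below the bracket along z.
center = emitter_center((10.0, 5.0, 2.5), (0.0, 0.0, -0.3))
```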
- Information is stored in the storage unit of the mounting device at the time of construction of a building or the like.
- the exact position is stored by using a design drawing of a building or CAD data.
- the position can be confirmed, and can be used for automation of construction, confirmation of the use position of materials, and the like.
- the attachment device notifies the central control device of the information of the transmission device.
- the mounting device notifies the central control device that a device other than the transmitting device has been mounted.
- the light receiving unit receives light, the signal analysis unit acquires information from the light pattern, and the information is stored in the storage unit.
- the transmission device converts the information stored in the storage unit into a light emission pattern and causes the light emission unit to emit light.
- the transmission device stores the signal received by the communication unit in the storage unit.
- the transmission device converts the information stored in the storage unit into a light emission pattern and causes the light emission unit to emit light.
- the transmission device converts an appropriate signal into a light emission pattern and causes the light emission unit to emit light.
- the receiving device acquires a signal transmitted by the transmitting device from the imaging unit, and transmits the signal and information to be stored in the transmitting device to the transmitting device or the central control device via the communication unit.
- the transmitting device or the central control device stores the transmitted information in the storage unit of the transmitting device that has transmitted the same signal as the signal acquired from the imaging unit from the receiving device.
- the receiving device may transmit the time at which the signal transmitted by the transmitting device was captured, and the transmitting device or the central control device may use that time to identify the transmitting device imaged by the receiving device.
- the communication unit of the receiving device may be a light emitting unit
- the communication unit of the transmitting device may be a light receiving unit or an imaging unit, and information may be transmitted from the receiving device to the transmitting device using a light emission pattern.
- the communication unit of the receiving device may be a sound generation unit
- the communication unit of the transmitting device may be a sound collecting unit
- information may be transmitted from the receiving device to the transmitting device using an audio pattern.
- FIG. 89 is a diagram showing a case where it is used in combination with a two-dimensional barcode.
- the user holds the communication device 2714a facing the communication device 2714d.
- the communication device 2714a causes the display to display the transmission information as a two-dimensional barcode 2714c.
- the communication device 2714d reads the two-dimensional barcode 2714c with the two-dimensional barcode reader 2714f.
- the communication device 2714d expresses transmission information as a light emission pattern of the light emitting unit 2714e.
- the communication device 2714a captures an image of the light emitting unit with the imaging unit 2714b and reads the signal. This method enables direct two-way communication, and when the amount of data to be transmitted is small, communication can be faster than communication via a server.
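The light-emission side of this exchange can be sketched as follows: the transmission information is turned into a per-frame on/off schedule for the light emitting unit, which the other device's imaging unit then reads back. The preamble bits are an illustrative assumption, not values specified in this document.

```python
# Sketch of the blink-pattern direction of the FIG. 89 exchange.
# PREAMBLE is a hypothetical start-of-packet marker (not from the patent).
PREAMBLE = [1, 0, 1, 0, 1, 1, 0]

def to_blink_pattern(payload: bytes) -> list[int]:
    """Return a list of 1/0 luminance states, one per camera frame,
    consisting of the preamble followed by the payload bits (MSB first)."""
    bits = []
    for byte in payload:
        for i in range(8):
            bits.append((byte >> (7 - i)) & 1)
    return PREAMBLE + bits

# Example: one byte of transmission information.
pattern = to_blink_pattern(b"\xA5")
```

The receiving side simply samples the light emitting unit's region in each frame and reverses this mapping.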
- FIG. 90 is a diagram showing map creation and its use.
- the robot 2715a performs self-position estimation based on the signals transmitted from the illumination 2715d and the electronic device 2715c, thereby creating a map 2715f of the room, and stores the map information, the position information, and the IDs of the illumination 2715d and the electronic device 2715c in the server 2715e.
- the receiving device 2715b creates a map 2715f of the room from the signals transmitted by the illumination 2715d or the electronic device 2715c, the images captured during movement, and the sensor values of the gyro sensor or 9-axis sensor, and stores the map information, the position information, and the IDs of the illumination 2715d and the electronic device 2715c in the server 2715e.
- the robot 2715a performs cleaning and tidying efficiently based on the map 2715f acquired from the server 2715e.
- the receiving device 2715b instructs the robot 2715a on a cleaning place or a moving place based on the map 2715f acquired from the server 2715e, or operates an electronic device in the direction in which the receiving device is directed.
- FIG. 91 is a diagram showing the state acquisition and operation of the electronic device.
- the communication device 2716a converts the control information into a light emission pattern, and causes the light emitting unit to emit light toward the light receiving unit 2716d of the electronic device 2716b.
- the electronic device 2716b reads the control information from the light emission pattern, and operates according to the control information.
- the electronic device 2716b converts information indicating the state of the electronic device into a light emission pattern, and causes the light emitting unit 2716c to emit light.
- the electronic device 2716b converts the information into a light emission pattern, and causes the light emitting unit 2716c to emit light.
- the communication device 2716a captures an image of the light emitting unit 2716c, and obtains the transmitted signal.
- FIG. 92 is a diagram showing how an electronic device being imaged is recognized.
- the communication device 2717a has a communication path to the electronic device 2717b and the electronic device 2717e, and transmits an ID display instruction to each electronic device.
- the electronic device 2717b receives the ID display command, and transmits the ID signal in the light emission pattern of the light emitting unit 2717c.
- the electronic device 2717e receives the ID display command, and transmits an ID signal in a position pattern using the light emitting units 2717f, 2717g, 2717h, and 2717i.
- the ID signal transmitted by each electronic device may be an ID held by the electronic device, or may be content instructed by the communication device 2717a.
- the communication device 2717a recognizes which electronic device is being imaged, and the positional relationship between the electronic device and the receiving device, from the light emission pattern and the position pattern of the light emitting units present in the captured image.
- to transmit a position pattern, the electronic device preferably includes three or more light emitting units.
- FIG. 93 is a view showing how an augmented reality object is displayed.
- the stage 2718e for displaying the augmented reality transmits information of the augmented reality object and a reference position for displaying the augmented reality object in the light emission patterns and position patterns of the light emitting units 2718a, 2718b, 2718c, and 2718d.
- the receiving device superimposes the augmented reality object 2718f on the captured image and displays it based on the received information.
- when the light emitting unit is not at the center of the imaging range, a display prompting the user to direct the center of the imaging range toward the light emitting unit is shown, as in FIG. 94.
- similarly, a display prompting the user to direct the center of the imaging range toward the light emitting unit is shown, as in FIG. 95.
- even when the light emitting unit is outside the imaging range, its position can be estimated from the previous imaging result or from the 9-axis sensor, gyro sensor, microphone, or position sensor attached to the imaging terminal, and, as shown in FIG. 96, a display prompting the user to direct the center of the imaging range toward the light emitting unit is shown.
- the size of the graphic to be displayed is adjusted according to the distance by which the imaging range should be moved, as shown in FIG.
- when the light emitting unit is out of the center of the imaging range and the image of the light emitting unit is not large enough, a display prompting the user to direct the center of the imaging range toward the light emitting unit and to move closer to the light emitting unit is shown, as in FIG.
- a display prompting the user to direct the center of the imaging range toward the light emitting unit and to rotate the imaging range is shown.
- a display prompting the user to move closer to the light emitting unit and to rotate the imaging range is shown.
- when the light emitting unit is out of the center of the imaging range, the image of the light emitting unit is not large enough, and changing the angle between the light emitting unit and the imaging range would make the signal easier to receive, a display prompting the user to direct the center of the imaging range toward the light emitting unit, to move closer to the light emitting unit, and to rotate the imaging range is shown.
- an indication indicating that a signal is being received and an amount of information of the received signal are displayed.
- the progress bar displays the percentage of the signal that has been completely received and the amount of information.
- the progress bar displays the ratio of the signal that has been received, the location of reception, and the amount of information of the received signal.
- when a light emitting unit is detected, as shown in FIG. 108, for example, the displayed light emitting unit is made to blink to indicate that the object is a light emitting unit.
- the light emitting unit is blinked to indicate that the signal is received from the light emitting unit.
- when a plurality of light emitting units are detected, the user is prompted to tap one of them to designate the transmitting device from which a signal is received or on which an operation is performed.
- Embodiment 4 (Application to ITS)
- ITS Intelligent Transport Systems: Intelligent Transportation System
- according to the present embodiment, high-speed visible light communication is realized, making adaptation to the ITS field possible.
- FIG. 111 is a diagram for explaining communication between a traffic system equipped with a visible light communication function and vehicles and pedestrians.
- the traffic light 6003 is equipped with the visible light communication function of the present embodiment, and can communicate with the vehicle 6001 and the pedestrian 6002.
- Information transmission from the vehicle 6001 and the pedestrian 6002 to the traffic light 6003 is performed by the headlights of the vehicle or the flash light emitting unit of a portable terminal held by the pedestrian, and received by a camera sensor of the traffic light 6003.
- Information transmission from the traffic light 6003 to the vehicle 6001 and the pedestrian 6002 is performed by the lighting of the traffic light 6003, and received by a camera sensor of the vehicle 6001 or of the pedestrian's portable terminal.
- the traffic light 6003 provides road traffic information to the vehicle 6001.
- the road traffic information is information that assists driving such as traffic congestion information, accident information, and information on the peripheral service area.
- Since the traffic light 6003 is equipped with LED lighting, communication using this LED lighting makes it possible to provide information to the vehicle 6001 without installing a new device.
- Since the vehicle 6001 generally travels at high speed, only a small amount of data could be transmitted with conventional visible light communication technology; because the communication speed is improved according to the present embodiment, the amount of data that can be transmitted to the vehicle increases.
- With the traffic light 6003 and the lighting 6004, different information can be provided from each signal and light. This makes it possible to transmit information according to the position of the vehicle, such as transmitting information only to vehicles traveling in the right-turn lane, for example.
- By having the illumination distribute the current position information, it is possible to provide position information to the vehicle 6001 and the pedestrian 6002.
- Roofed facilities such as covered shopping streets and tunnels may make it difficult to acquire position information using GPS, but visible light communication has the merit that position information can be acquired even in such situations.
- the communication speed can be further increased as compared with the conventional case, so that information can be received while passing a specific spot such as a store or an intersection.
- the present embodiment is intended to speed up visible light communication, so that it can be applied to all other ITS systems using visible light communication.
- FIG. 112 is a schematic view of the case where the present invention is applied to inter-vehicle communication in which vehicles communicate with each other using visible light communication.
- the vehicle 6001 transmits information to the vehicle 6001a behind it through its brake lamps and other LED lighting. It is also possible to transmit data to the oncoming vehicle 6001b through headlights or other lights that illuminate the front.
- the communication speed of visible light communication has been improved by the present invention, there is an advantage that information can be transmitted while passing an oncoming vehicle.
- since the information transmission interval is shortened for the vehicles behind as well, it becomes possible to transmit information to many vehicles in a shorter period.
- by raising the communication speed it is also possible to send voice and image information. This enables richer information to be shared between vehicles.
- FIG. 113 is a schematic view of a position information notification system and a facility system using the visible light communication technology of the present embodiment.
- Here, a system will be described in which a robot delivers a patient's medical chart, transported goods, medicine, and the like.
- the robot 6101 has a visible light communication function.
- the lighting distributes location information.
- the robot 6101 can deliver medicines and other articles to a specific hospital room by acquiring the position information from the illumination. This reduces the burden on the doctor. In addition, since the light does not leak into the next room, there is the effect that the robot 6101 does not mistake the room.
- the system using visible light communication is not limited to a hospital, and can be applied to a system that distributes position information using a lighting fixture.
- position and guidance information may be transmitted from illumination of a guidance display board, or may be used for moving a cart in an airport.
- If location information is distributed using a wireless LAN when transmitting information about whether one is inside or outside a room, radio waves leak into adjacent rooms and corridors, so the outer walls needed a function to block radio waves so they would not leave the room. If radio waves are blocked at the outer walls, however, there is the problem that devices that communicate with the outside, such as mobile phones, can no longer be used.
- When position information is transmitted using visible light communication according to the present embodiment, communication is possible only within the reach of the illumination, so it is easy, for example, to transmit the position information of a specific room to the user. Also, since outer walls usually block light, there is the advantage that no special device is required.
- FIG. 114 illustrates a supermarket system in which a device equipped with the communication method of the present embodiment is mounted on a shopping cart in a store, and location information is acquired from illumination of a product rack or indoor illumination.
- the cart 6201 is mounted with a visible light communication apparatus using the communication method of this embodiment.
- the illumination 6100 distributes position information and information of its shelf by visible light communication.
- the cart can receive the product information distributed from the lighting. Also, by receiving the position information, it is possible to know at which shelf the cart is located. For example, if the position information of the shelves is stored in the cart, then when the user designates the shelf to go to or the desired product, the cart can display where to go.
- the cart transmits together with the cart information to the illumination using visible light communication, or transmits to the server using a wireless LAN or the like.
- the cart is equipped with a memory, and the data is collected after the store closes, so that the routes the carts have taken are aggregated on the server.
- FIG. 115 shows an application example using visible light communication according to this embodiment.
- the mobile phone terminal 6301 transmits data to the camera 6302 using a flashlight.
- the camera 6302 receives data transmitted from the mobile phone terminal 6301 from the light information received by the imaging unit.
- shooting settings are configured on the mobile phone terminal 6301 and transmitted to the camera 6302. This makes it possible to set up the camera using the rich user interface of the mobile phone terminal.
- it becomes possible to transmit the setting information from the mobile phone terminal to the camera without installing a new communication device such as a wireless LAN.
- FIG. 116 is a schematic view of the case where the communication system of this embodiment is applied to underwater communication. Since water does not transmit radio waves, divers underwater, ships on the surface, and vessels in the sea cannot communicate by radio. The visible light communication of this embodiment can also be used in water.
- the visible light communication method of the present embodiment it is possible to transmit data from an object or structure that emits light.
- if the light receiving unit is directed at a building, guide information and detailed information about the building can be acquired, so useful information can be provided to tourists.
- the visible light communication method of the present embodiment is applicable to communication between a lighthouse and a ship. Since communication with a larger capacity can be performed than ever before, more detailed information can be exchanged.
- communication control can be performed in units of rooms, such as communication in only a specific room.
- the present invention is applicable to applications in which the actual communication uses the communication method of the present embodiment, or in which the communication method of the present embodiment is used only to exchange key information while the actual communication is performed over a wireless LAN or the like.
- the communication method of the present embodiment can be applied to all communication using LEDs and imaging devices equipped with MOS sensors, and can be applied to, for example, digital cameras and smartphones.
- FIG. 117 is a diagram for describing an example of service provision to a user in the fifth embodiment.
- FIG. 117 shows a net server 4000a, transmitters 4000b, 4000d, 4000e, receivers 4000c, 4000f, and a building 4000g.
- Services can be provided to the user by receiving and processing signals from the plurality of transmitters 4000b, 4000d, and 4000e located inside and outside the house by the receivers 4000c and 4000f.
- the transmitter and the receiver may process signals independently to provide a service to the user, or may provide services to the user while changing their behavior or the signals they transmit in response to instructions from the network, in cooperation with the net server 4000a that constitutes the network.
- the transmitter and the receiver may be mounted on a moving object such as a person or a vehicle, may be mounted on a stationary object, or may be attached to an existing object later.
- FIG. 118 is a diagram for describing an example of service provision to a user in the fifth embodiment.
- a transmitter 4001a and a receiver 4001b are shown.
- the receiver 4001b receives a signal transmitted from the plurality of transmitters 4001a, and processes the information included in the signal to provide service to the user.
- the contents included in the signal include a device ID uniquely identifying the device, location information, maps, signs, tourist information, traffic information, regional services, coupons, advertisements, product descriptions, characters, music, video, photos, voice, menus, broadcasts, emergency guidance, timetables, guides, applications, news, bulletin boards, commands to devices, information identifying individuals, cash vouchers, credit cards, security information, URLs, and the like.
- the user may perform registration processing and the like on the net server in advance in order to use the information contained in these signals, and may then receive the service by actually receiving these signals with the receiver 4001b at the place where the transmitter 4001a transmits them; alternatively, the service may be provided without intervention of the net server.
- FIG. 119 is a flow chart showing a case where a receiver in this embodiment simultaneously processes a plurality of signals received from a transmitter.
- the process starts at step 4002a.
- in step 4002b, the receiver receives signals from multiple light sources.
- in step 4002c, the area in which each light source appears is determined, and a signal is extracted from each light source.
- in step 4002d, the process is repeated for the number of acquired signals until the count reaches zero, processing each signal in step 4002e based on the information it contains.
- the process ends at step 4002f.
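The flow of FIG. 119 can be sketched in a few lines: separate the image into one region per light source, then loop over the extracted signals until none remain. The region/signal representation and the `process` handler are hypothetical stand-ins for the patent's unspecified per-signal processing.

```python
# Minimal sketch of steps 4002a-4002f: process every signal extracted
# from the light-source regions of one captured image.

def process(signal: str) -> str:
    """Hypothetical per-signal handler (step 4002e)."""
    return f"handled:{signal}"

def extract_signals(regions: dict[str, str]) -> list[str]:
    """regions maps a light-source region id to its decoded signal."""
    results = []
    pending = list(regions.items())   # step 4002d: number of acquired signals
    while pending:                    # repeat until the count reaches zero
        region_id, signal = pending.pop()
        results.append(process(signal))
    return results                    # step 4002f: done

out = extract_signals({"lamp1": "ID=42", "lamp2": "coupon"})
```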
- FIG. 120 is a diagram showing an example in the case of realizing communication between devices by mutual communication in the fifth embodiment.
- FIG. 120 shows an example in which communication between devices is realized by a plurality of transceivers 4003a, 4003b, and 4003c, each having a transmitter and a receiver, communicating with one another.
- the transceivers may be able to communicate only between identical devices, as shown in FIG. 120, or may be able to communicate between different kinds of devices.
- an application is distributed to mobile phones, smartphones, personal computers, game machines, and the like using the communication means provided by the present embodiment, other networks, or removable storage, and services can then be provided to the user through the devices (LEDs, photodiodes, image sensors) already installed in those devices.
- the application may be installed in the device from the beginning.
- the present invention is applied to a public facility such as a movie theater, a concert hall, a museum, a hospital, a public hall, a school, a company, a shopping street, a department store, a government office, a food shop and the like.
- the present invention can lower the directivity of the signal sent from the transmitter to the receiver compared to conventional visible light communication, so information can be transmitted simultaneously to the large number of receivers present in a public facility.
- FIG. 121 is a diagram for illustrating a service using the characteristic of directivity in the fifth embodiment.
- FIG. 121 shows a screen 4004a, a receiver 4004b, and an illumination 4004c.
- the transmitter uses, as a signal, the light emitted from the picture projected on the screen 4004a displaying a movie or from lighting in the facility, and includes in the signal a command for controlling the receiver 4004b.
- the commands for controlling the receiver 4004b include ON/OFF and level adjustment of the power supply, the sound emitted on reception, the communication function, the LED display, vibration, and the like.
- the receiver can control the intensity of directivity by filtering the signal transmitted by the transmitter using the intensity of the light source or the like.
- the directivity by setting the directivity low, it is possible to simultaneously transmit commands and information to receivers present in the facility.
- to raise the directivity, the transmitter side may limit the amount of light, or the receiver side may lower its sensitivity for receiving the light source or restrict the amount of received light in processing.
- for example, an order sent from a transmitter at the user's hand can be detected by a receiver located at a position overlooking the store; the receiver receives the signal, including the symbol "1", and identifies which menu item the user has ordered.
- the service provider can provide a fair service to users by processing the orders in time order.
- the transmission path between the transmitter and the receiver may use a protocol such as SSL, which is used as a standard in the Internet, to suppress interception of signals from other devices.
- FIG. 122 is a diagram for describing another example of service provision to a user in the fifth embodiment. Specifically, FIG. 122 illustrates an example of a service in the case where the present embodiment is utilized using a camera 4005a mounted in a receiver such as a mobile phone, a smartphone, or a game machine. Here, FIG. 122 shows a camera 4005a, a light source 4005b, and contents to be superimposed 4005c.
- Signals 4005d transmitted by the plurality of light sources 4005b are extracted from the imaging result of the camera 4005a, and information included in the signals 4005d is superimposed and displayed on the camera 4005a.
- the content 4005c to be superimposed on the camera 4005a includes a character string, an image, a moving image, a character, an application, a URL, and the like. Note that not only the superposition with the camera but also voice, vibration and the like may be used to process the information contained in the signal.
- FIG. 123 is a diagram illustrating an example format of a signal included in a light source emitted by a transmitter.
- the light source characteristic 4006a, the service type 4006b, and the information 4006c related to the service are shown.
- the information 4006c related to the service, which the receiver superimposes on the camera image, is the result of filtering the information obtainable from the signal according to information such as the service type 4006b contained in the signal emitted by the transmitter and the distance from the camera to the light source.
- the content to be filtered by the receiver may be in accordance with the setting of the receiver set in advance, or may be in accordance with the preference of the user set in the receiver by the user.
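The format of FIG. 123 and the receiver-side filtering described above can be sketched as follows. The field names and the filter rule (match against the user's preferred service types) are illustrative assumptions; the patent only names the three fields 4006a-4006c.

```python
# Hedged sketch of the FIG. 123 signal format and receiver-side filtering.
from dataclasses import dataclass

@dataclass
class Signal:
    light_source_characteristic: str  # 4006a: intensity, color, type, ...
    service_type: str                 # 4006b
    service_info: str                 # 4006c: content to superimpose

def filter_for_display(signals: list[Signal], preferred_types: set[str]) -> list[str]:
    """Keep only the service information whose type matches the user's settings."""
    return [s.service_info for s in signals if s.service_type in preferred_types]

signals = [
    Signal("white LED, 800 lm", "coupon", "10% off"),
    Signal("red LED, 200 lm", "ad", "new product"),
]
shown = filter_for_display(signals, {"coupon"})
```

A distance-based filter would be layered on in the same way, dropping signals whose estimated light-source distance exceeds a user-set limit.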
- the receiver can estimate the distance to the transmitter that emits the signal and display the distance to the light source.
- the receiver performs digital signal processing on the amount of light emitted from the transmitter captured by the camera to estimate the distance to the transmitter.
- the amount of light from each transmitter imaged by the receiver's camera varies depending on the position and intensity of the light source; therefore, when the distance is estimated based only on the imaged light amount, it may deviate significantly.
- the light source characteristic 4006 a indicating the intensity, color, type, etc. of the light source is included in the signal emitted by the transmitter.
- the receiver can estimate the distance with high accuracy by performing digital signal processing in consideration of the characteristics of the light source included in the signal.
- when the intensities of the light sources imaged by the receiver are not all the same, the distance is not estimated from the light amount alone but is combined with other distance measuring means to estimate the distance from the transmitter to the receiver.
- the distance may be estimated using the parallax of images captured with a twin-lens camera, or using infrared rays, a millimeter-wave radar, or the like.
- the amount of movement of the receiver may be acquired with a 9-axis sensor mounted on the receiver or by imaging, and the moved distance may be combined with triangulation to estimate the distance.
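The light-amount estimate above can be sketched under the inverse-square assumption: if the transmitted intensity is known from the light source characteristic 4006a, the received light amount falls off as 1/d², so the receiver can solve for d. Combining with a second range measurement is shown here as a simple average; the patent leaves the combination method open, so both the model and the fusion rule are assumptions.

```python
import math

def distance_from_light(tx_intensity: float, rx_amount: float) -> float:
    """Inverse-square model: rx = tx / d^2, so d = sqrt(tx / rx)."""
    return math.sqrt(tx_intensity / rx_amount)

def fuse(d_light: float, d_other: float) -> float:
    """Naive fusion with another ranging means (parallax, radar, ...)."""
    return (d_light + d_other) / 2.0

d1 = distance_from_light(tx_intensity=100.0, rx_amount=4.0)  # 5.0
d = fuse(d1, 5.4)
```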
- the receiver may use the strength of the signal generated on the transmitter side and the distance not only as a means of filtering and displaying signals, but also as a means of adjusting the directivity of the signals it receives from the transmitter.
- FIG. 124 is a diagram showing an example of the in-home environment in the present embodiment.
- in addition to the television 1101, the microwave oven 1106, and the air purifier 1107, a smartphone 1105 and the like are present around the user.
- FIG. 125 is a diagram showing an example of communication between a home electric device and a smartphone in the present embodiment.
- FIG. 125 shows an example of information communication: a configuration in which the smartphone 1201 owned by the user obtains information output from each device, such as the television 1101 or the microwave oven 1106 in FIG. 124.
- each device transmits information using a blinking pattern of LEDs
- the smartphone 1201 receives information using an imaging function by a camera or the like.
- FIG. 126 is a diagram showing an example of a configuration of transmission-side apparatus 1301 in the present embodiment.
- the transmission side apparatus 1301 transmits information as a blinking pattern of light when a transmission instruction is given by the user pressing a button or via NFC (Near Field Communication), or when a change in state such as a failure inside the apparatus is detected. At this time, transmission is repeated for a fixed time.
- a shortened ID may be used as long as it is between devices registered in advance.
- authentication information required for the connection can be transmitted as a blinking pattern.
- the transmission speed determination unit 1309 grasps the performance of the clock generation device inside the apparatus, and can slow the transmission speed when the clock generation device is inexpensive and of low accuracy, and speed it up when the accuracy is high. When the performance of the clock generation device is poor, dividing the information itself into short pieces can also reduce errors caused by accumulated drift of the blinking interval during long transmissions.
- FIG. 127 is a diagram showing an example of a configuration of reception side apparatus 1401 in the present embodiment.
- the reception side apparatus 1401 determines, from the frame images acquired by the image acquisition unit 1404, the area in which the blinking of light can be seen. At this time, the blinking area may be tracked as an area in which high and low brightness levels are alternately observed.
- the blinking information acquisition unit 1406 acquires the transmitted information from the blinking pattern. When the information includes a device association such as a device ID, that information is used to query a related server on the cloud for accompanying information, or interpolation is performed using information about devices in the wireless communication area stored in advance or stored in the receiving device. This makes it possible, for example, to correct errors caused by noise when imaging the light emission pattern, or to shorten the time the user must hold the smartphone over the light emitting unit of the transmission side device to acquire information that has already been acquired.
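The core of the blinking information acquisition can be sketched as thresholding the per-frame luminance of the tracked area. A real decoder would also handle the preamble and frame-rate mismatch; the midpoint threshold used here is an assumption for illustration.

```python
# Sketch of the FIG. 127 receive path: binarize the luminance of the
# tracked blinking area, one sample per frame, to recover the bits.

def decode_blinks(luminances: list[float]) -> list[int]:
    """Threshold at the midpoint between the brightest and darkest samples."""
    mid = (max(luminances) + min(luminances)) / 2.0
    return [1 if v > mid else 0 for v in luminances]

bits = decode_blinks([0.9, 0.1, 0.8, 0.85, 0.2])
```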
- FIG. 128 will be described.
- FIG. 128 is a diagram showing a flow of processing for sending information to a reception-side device such as a smartphone by blinking of the LED of the transmission-side device in the present embodiment.
- here, the transmitting device is assumed to have a function of communicating with a smartphone by NFC, and an LED embedded in a part of the NFC communication mark of the transmitting device, whose light emission pattern also carries information.
- step 1001a the user purchases an electric home appliance, and for the first time, the power is connected to the outlet to be in an energized state.
- step 1001 b it is checked whether the initial setting information has been written. In the case of Yes, the process proceeds to circle 3 in the figure. On the other hand, in the case of No, the process proceeds to step 1001 c, and the mark blinks at a blink rate (eg, 1 to 2/5) that the user can easily understand.
- a blink rate eg, 1 to 2/5
- step 1001 d it is checked whether the user touches the smartphone with NFC communication to acquire the device information of the home appliance.
- the process proceeds to step 1001 e where the smartphone receives the device information in the cloud server and registers it in the cloud.
- in step 1001f, the abbreviated ID associated with the account of the smartphone user is received from the cloud, transmitted to the home appliance, and the process proceeds to step 1001g.
- in step 1001g, it is confirmed whether there is registration by NFC. In the case of Yes, the process proceeds to step 1001j, and after blue blinks twice, the blinking is finished in step 1001k.
- in the case of No in step 1001g, the process proceeds to step 1001h, where it is checked whether 30 seconds have passed.
- in the case of Yes in step 1001h, the process proceeds to step 1001i, the LED part outputs the device information (device model number, presence/absence of registration processing with NFC, unique ID of the device) by the blinking of light, and the process moves on to circle 2 in the figure.
- in the case of No in step 1001h, the process returns to step 1001d.
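The branching in steps 1001b to 1001k above can be condensed into a small state machine. The following is an illustrative sketch only; the state keys and blink-mode names are assumptions, not identifiers from the patent.

```python
def initial_setup_step(state):
    """One pass of the transmitter logic of FIG. 128 as a tiny state
    machine. State keys and blink-mode names are illustrative, not
    identifiers from the patent."""
    if not state["initial_setting_written"]:
        state["blinking"] = "slow"                 # step 1001c: user-visible rate
        if state["nfc_touched"]:                   # step 1001d: NFC touch
            state["registered_by_nfc"] = True      # steps 1001e-1001f
    if state["registered_by_nfc"]:
        state["blinking"] = "blue_twice_then_off"  # steps 1001j-1001k
    elif state.get("seconds_waited", 0) >= 30:     # step 1001h: 30 s elapsed
        state["blinking"] = "device_info"          # step 1001i: emit device info
    return state
```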
- FIGS. 129 to 132 are diagrams showing a flow of processing of transmitting information to the reception side device by blinking of the LED of the transmission side device.
- FIG. 129 will be described.
- step 1002 a the user activates an application for acquiring light blink information of the smartphone.
- in step 1002b, the image acquisition unit acquires the blinking of light. Then, the blinking area determination unit determines the blinking area from the time-series change of the image.
- step 1002c the blink information acquisition unit determines the blink pattern of the blink area, and waits for preamble detection.
- step 1002d when the preamble can be detected, information of the blinking area is acquired.
- in step 1002e, when the device ID information can be acquired, the information is sent to the server on the cloud side even while reception continues, and the information interpolation unit performs interpolation while comparing the information obtained from the cloud side with the information in the blinking information acquisition unit.
- in step 1002f, when all the information including the interpolated information is prepared, the smartphone or the user is notified. At this time, by displaying related sites and GUIs obtained from the cloud, rich and easy-to-understand notification becomes possible, and the process proceeds to circle 4 in FIG. 130.
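Steps 1002c and 1002d above (waiting for a preamble, then reading the payload) amount to scanning the decoded blink sequence for a known marker. A minimal sketch, assuming the blinks have already been decoded into a bit list; the preamble pattern shown is invented for illustration.

```python
PREAMBLE = [1, 0, 1, 0, 1, 1]  # hypothetical preamble pattern

def wait_for_preamble(bits, preamble=PREAMBLE):
    """Scan a decoded blink sequence for the preamble and return the
    index of the first payload bit, or None if no preamble is found."""
    n = len(preamble)
    for i in range(len(bits) - n + 1):
        if bits[i:i + n] == preamble:
            return i + n  # payload starts right after the preamble
    return None
```

`wait_for_preamble` returns the index where payload bits begin, mirroring "waits for preamble detection" followed by "information of the blinking area is acquired".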
- FIG. 130 will be described.
- in step 1003a, when the home appliance generates a message to notify the user, such as a breakdown, the number of times of use, or the room temperature, the information transmission mode is started.
- in step 1003b, the mark blinks at 1 to 2 times per second.
- at the same time, the LED also starts transmitting information.
- step 1003c it is confirmed whether communication by NFC has been started. In the case of No, the process proceeds to circle 7 in FIG. In the case of Yes, the process proceeds to step 1003 d to stop the blinking of the LED.
- step 1003e the smartphone accesses the cloud server to display related information.
- in step 1003f, in the case of a failure that requires on-site response, the server side searches for a serviceman who can provide support, using the home appliance type, the installation location, and position information.
- step 1003 g the service person enters the support mode by pressing the button of the home appliance in a predetermined order.
- in step 1003h, some or all of the LEDs of the home appliance other than the mark (those simultaneously visible from the smartphone when the blinking of the mark is visible) also blink so as to interpolate the information, and the process proceeds to circle 5 in FIG. 131.
- FIG. 131 will be described.
- in step 1004a, the serviceman presses the setting button when the receiving terminal he or she owns has performance capable of detecting high-speed blinking (e.g., 1000 times per second).
- step 1004 b the LED of the home appliance blinks in the high-speed mode and proceeds to circle 6.
- FIG. 132 will be described.
- step 1005a blinking continues.
- step 1005b the user comes to obtain LED blink information on the smartphone.
- step 1005 c the user activates an application for acquiring light blink information of the smartphone.
- in step 1005d, the image acquisition unit acquires the blinking of light. Then, the blinking area determination unit determines the blinking area from the time-series change of the image.
- step 1005e the blink information acquisition unit determines the blink pattern of the blink area, and waits for preamble detection.
- step 1005f when the preamble can be detected, the information of the blinking area is acquired.
- in step 1005g, when the device ID information can be acquired, the information is sent to the server on the cloud side even while reception continues, and the information interpolation unit performs interpolation while comparing the information obtained from the cloud side with the information in the blinking information acquisition unit.
- in step 1005h, when all the information including the interpolated information is prepared, the smartphone or the user is notified. At this time, by displaying related sites and GUIs obtained from the cloud, richer and easier-to-understand notification becomes possible.
- a transmitting device such as a home appliance can transmit information to the smartphone by blinking the LED.
- Even devices that do not have a wireless function or a communication means such as NFC can transmit information, and can also provide users with rich information in a server on the cloud via a smartphone.
- when a device has both a two-way communication function (for example, NFC communication) and a one-way communication function (for example, communication by a change in luminance of LEDs), and data transmission and reception from one device to the other is achieved by the two-way communication, the one-way communication can be stopped. This is efficient because the power consumption otherwise required for the one-way communication is eliminated.
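The rule above (stop the one-way LED channel once a two-way link such as NFC carries the data, to save power) can be sketched as follows; the class and method names are illustrative, not from the patent.

```python
class Device:
    """Channel-selection sketch: prefer the two-way link (e.g. NFC)
    when it is up, and stop the one-way LED channel to save power."""
    def __init__(self):
        self.led_transmitting = False

    def send(self, data, nfc_link_up):
        if nfc_link_up:
            self.led_transmitting = False  # stop one-way LED channel
            return ("nfc", data)           # deliver over the two-way link
        self.led_transmitting = True       # fall back to LED blinking
        return ("led", data)
```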
- the information communication apparatus includes an information management unit that manages device information including its own unique ID and device status information, a light emitting element, and an optical transmission unit that transmits the information as a blinking pattern of the light emitting element.
- the optical transmission unit converts the device information into a blinking pattern of light and transmits the light when there is a change in the internal state of the device.
- an activation history management unit for storing information sensed in the device, such as its own activation state and the user's usage history, may further be provided, and the optical transmission unit may acquire previously registered performance information of the clock generation device to be used and change the transmission speed accordingly.
- a second light emitting element may be disposed around a first light emitting element that transmits information by the blinking of light, and when the information transmission by the blinking of the first light emitting element is repeated a certain number of times, the second light emitting element may emit light between the end and the start of the information transmission.
- FIG. 133 and FIG. 134 are diagrams for explaining the procedure for communicating between the user and the device using visible light in the present embodiment.
- FIG. 133 will be described.
- step 2001a the user turns on the power.
- step 2001b it is checked whether initial setting such as installation setting and NW setting is performed as the start-up process.
- in the case of Yes, the process proceeds to step 2001f, the normal operation is started, and the process ends as indicated by circle 3.
- in the case of No, the process proceeds to step 2001c, and the user is notified that initial setting is necessary by "LED normal light emission" and "buzzer sound".
- step 2001 d device information (product number and serial number) is collected to prepare for visible light communication.
- in step 2001e, the user is notified that visible light communication of the device information (product number and serial number) is possible, by "LED communication light emission", "icon display on the display", "buzzer sound", and "lighting multiple LEDs".
- FIG. 134 will be described.
- step 2002a the proximity sensor, the illuminance sensor, and the human sensor detect the approach of the visible light receiving terminal.
- step 2002b visible light communication is started using the sensing as a trigger.
- step 2002c the user acquires device information at the visible light receiving terminal.
- in step 2002f, the turning off of the room lights is sensed by the "illuminance sensor" or by "cooperation with the light control device", the light emission of the device information is stopped, and the process ends as shown by circle 5.
- step 2002g the visible light receiving terminal notifies that the device information has been sensed to be acquired by “NFC communication” and “NW communication”, and ends the process.
- in step 2002h, the departure of the visible light receiving terminal is detected, the light emission of the device information is stopped, and the process ends. In the case of proceeding to step 2002i, after a fixed time has elapsed, the light emission of the device information is stopped and the process ends.
- if no approach is detected in step 2002a, the process proceeds to step 2002d, and after a fixed time has elapsed, the notification that visible light communication is possible is enhanced by "brightening the light", "making the sound louder", "moving the icon", and so on.
- the process returns to step 2002d.
- the process proceeds to step 2002e, and further proceeds to step 2002i after a predetermined time has elapsed.
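The start/stop logic of FIG. 134 (begin transmitting when a receiving terminal approaches; stop when it leaves, when the room goes dark, or after a fixed time) can be modeled as a simple event loop. The event names and the 30-second timeout default are assumptions for illustration, not values from the patent.

```python
def transmission_controller(sensor_events, timeout=30.0):
    """Toy event loop for the FIG. 134 flow: start visible light
    transmission when a receiver approaches; stop when it leaves,
    when the room lights go off, or after `timeout` seconds."""
    transmitting = False
    started = None
    log = []
    for t, event in sensor_events:                 # (timestamp, event) pairs
        if event == "approach" and not transmitting:
            transmitting, started = True, t
            log.append((t, "start"))               # steps 2002a-2002b
        elif transmitting and (
                event in ("leave", "lights_off")   # steps 2002h / 2002f
                or t - started >= timeout):        # step 2002i
            transmitting = False
            log.append((t, "stop"))
    return log
```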
- FIG. 135 is a diagram for describing a procedure until the user in the present embodiment purchases a device and performs initial setting of the device.
- step 2003a position information of the smartphone that has received the device information is acquired by GPS (Global Positioning System).
- step 2003b if there is user information in the smartphone, user information such as a user name, a telephone number, and an e-mail address is collected in the terminal.
- step 2003c if there is no user information in the smartphone, user information is collected from the peripheral device through the NW.
- step 2003d the device information, the user information, and the position information are transmitted to the cloud server.
- step 2003e information necessary for initial setting and activation information are collected using the device information and the position information.
- step 2003 f linkage information such as IP, an authentication method, and available services necessary for setting for linkage with a user-registered device is collected.
- step 2003g the device information and setting information are transmitted to the user-registered device through the NW to perform cooperation setting with the peripheral device.
- step 2003h user setting is performed using the device information and the user information.
- in step 2003i, the initial setting information, the activation information, and the cooperation setting information are transmitted to the smartphone.
- step 2003 j the initial setting information, the activation information, and the cooperation setting information are transmitted to the home appliance by NFC.
- step 2003k the device is set by the initial setting information, the activation information, and the cooperation setting information.
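Steps 2003a to 2003d above collect position information, user information (falling back to peripheral devices over the NW), and device information into a single request to the cloud server. The following is a hedged sketch of that aggregation; all dictionary field names are hypothetical.

```python
def build_setup_request(device_info, phone, peers):
    """Assemble the payload sent to the cloud server in steps
    2003a-2003d. All dictionary field names here are hypothetical."""
    # step 2003b: prefer user information already on the smartphone;
    # step 2003c: otherwise collect it from a peripheral device via the NW
    user = phone.get("user") or next(
        (p["user"] for p in peers if "user" in p), None)
    return {
        "device": device_info,     # received over visible light / NFC
        "user": user,              # user name, telephone number, e-mail
        "position": phone["gps"],  # step 2003a: GPS position of the phone
    }
```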
- FIG. 136 is a diagram for describing a service dedicated to the serviceperson in the case where the device in this embodiment fails.
- step 2004a history information such as an operation log generated during normal operation of the device and a user operation log is recorded on the local storage medium.
- in step 2004b, simultaneously with the occurrence of a failure, error information such as an error code and error details is recorded, and the user is notified that visible light communication is possible by LED abnormal light emission.
- step 2004c the serviceman's special command is executed to set the LED high-speed light emission mode to start high-speed communication of visible light.
- step 2004d it is determined whether the terminal in proximity is a smartphone or a dedicated reception terminal of a serviceman.
- in the case of a smartphone, the process proceeds to step 2004e, error information is acquired, and the process ends.
- step 2004f in the case of the serviceman, the dedicated receiving terminal acquires error information and history information.
- step 2004g the device information, error information, and history information are transmitted to the cloud, and a repair method is acquired.
- the LED high-speed light emission mode is canceled by the special command execution of the serviceman, and the process ends.
- step 2004i the related product of the device information, the product information of the similar product, the sales price of the nearest store, and the new product information are acquired from the cloud server.
- step 2004j user information is acquired through visible light communication with the smartphone of the user and the dedicated terminal of the serviceman, and an item is ordered from the nearest store through the cloud server.
- FIG. 137 is a diagram for describing a service for confirming a cleaning state using the vacuum cleaner and the visible light communication according to the present embodiment.
- step 2005a cleaning information during normal operation of the device is recorded.
- in step 2005b, dirt information is created in combination with the room arrangement information, and is encrypted and compressed.
- using the compression of the dirt information as a trigger, the dirt information is recorded on the local storage medium.
- using a temporary stop of the cleaning (stop of the suction process) as a trigger, the dirt information is transmitted to the lighting apparatus by visible light communication.
- using the recording of the dirt information as a trigger, the dirt information is transmitted to the home local server and the cloud server via the NW.
- step 2005f using the transmission and recording of the dirt information as a trigger, the device information, the storage location, and the decryption key are transmitted to the smartphone by visible light communication.
- step 2005g dirt information is obtained and decoded through the NW and NFC.
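Step 2005b (combining room arrangement information with cleaning information, then encrypting and compressing it) could look like the following sketch. The XOR "encryption" is a placeholder only, chosen so the example stays self-contained; a real device would use a proper cipher, and the field names are assumptions.

```python
import json
import zlib

def pack_dirt_info(room_map, cleaning_log, key=0x5A):
    """Combine room-arrangement info with cleaning info, then compress
    and (trivially) encrypt, mirroring step 2005b. XOR is a stand-in
    for real encryption; field names are placeholders."""
    combined = {"room": room_map, "dirt": cleaning_log}
    raw = json.dumps(combined).encode()
    compressed = zlib.compress(raw)
    return bytes(b ^ key for b in compressed)

def unpack_dirt_info(blob, key=0x5A):
    """Reverse of pack_dirt_info, as the smartphone would do in
    step 2005g after receiving the storage location and key."""
    return json.loads(zlib.decompress(bytes(b ^ key for b in blob)))
```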
- according to the seventh embodiment, it is possible to realize a visible light communication system including an information communication apparatus that enables communication between various devices, including devices having little computing power.
- specifically, the visible light communication system including the information communication apparatus of the present embodiment includes a visible light transmission possibility determination unit that determines whether preparation for visible light transmission is completed, and a visible light transmission notification unit that notifies the user that visible light transmission is in progress; it is a visible light communication system that notifies the user visually and aurally when visible light communication becomes possible.
- as a result, the convenience of the user can be improved by adding "emission color", "sound", "icon display", and "multiple LED emission" to the light emission mode of the LED so as to notify the user of the situation in which visible light can be received.
- in addition, a terminal proximity sensing unit that senses the approach of the visible light receiving terminal and a visible light transmission determination unit that determines the start and stop of visible light transmission according to the position of the visible light receiving terminal may be mounted;
- the visible light communication system may be a visible light communication system in which visible light transmission is started, triggered by the terminal proximity sensing unit sensing the proximity of the visible light receiving terminal.
- the visible light communication system may be configured to stop visible light transmission when triggered by the terminal proximity sensing unit detecting that the visible light receiving terminal has left.
- a peripheral illumination sensing unit that senses the turning off of the room is mounted, and visible light transmission is stopped with the sensing of the turning off of the room by the ambient illumination sensing unit. It may be a visible light communication system.
- in addition, a visible light communication time monitoring unit that measures the time during which visible light transmission is performed and a visible light transmission notification unit that notifies the user that visible light transmission is in progress may be mounted.
- it may be a visible light communication system that enhances the visual and auditory notification to the user, triggered by the fact that visible light transmission has been performed for more than a certain time without any approach of a visible light receiving terminal.
- it may also be a visible light communication system in which visible light transmission is stopped, triggered by the fact that the visible light transmission time exceeds a certain time after the visible light transmission notification unit strengthens the notification but no visible light receiving terminal approaches.
- as a result, by stopping transmission when reception does not occur even though the visible light transmission time exceeds a certain time, it is possible to prevent the user from forgetting to receive and from forgetting to turn off the light emission, improving the user's convenience.
- the visible light communication system including the information communication apparatus of the present embodiment includes a visible light reception determination unit that determines that visible light communication has been received, a reception terminal position acquisition unit that acquires the terminal position,
- and a device setting information collection unit that acquires the device information and the position information to obtain the device setting information; it may be a visible light communication system that, using visible light reception as a trigger, acquires the position of the receiving terminal and collects the information necessary for the device setting.
- as a result, acquisition of device information by visible light communication triggers automatic collection and setting of the device information, position information, and user information necessary for user registration, so that the user's input and registration procedure can be omitted, improving convenience.
- in addition, a device information management unit that manages device information, a device relationship management unit that manages the similarity between devices, a store information management unit that manages information on stores that sell devices,
- and a nearest store search unit that searches for the nearest store from the position information may be mounted; it may be a visible light communication system that, triggered by receiving the device information and the position information, acquires the nearest store at which similar equipment is sold and its price.
- as a result, by collecting the sales situation and sales stores of devices related to the device information, the time for searching for devices is saved, and the convenience of the user can be improved.
- the visible light communication system including the information communication apparatus of the present embodiment includes a user information monitoring unit that monitors whether user information is stored in the terminal, a user information collection unit that collects user information from peripheral devices through the NW,
- and a user registration processing unit that acquires the user information and device information and performs user registration; it may be a visible light communication system that, triggered by the absence of user information, collects user information from accessible peripheral devices and performs user registration together with the device information. As a result, acquisition of device information by visible light communication triggers automatic collection and setting of the device information, position information, and user information necessary for user registration, so that the user's input and registration procedure can be omitted, improving convenience.
- the visible light communication system including the information communication apparatus of the present embodiment includes a command determination unit that receives a special command, and a visible light communication speed adjustment unit that adjusts the visible light communication frequency and operates multiple LEDs in linkage.
- it may be a visible light communication system that accelerates visible light communication by adjusting the frequency of the visible light communication and the number of transmission LEDs upon receiving the special command.
- in addition, a terminal type determination unit that determines the type of the nearby terminal by NFC communication and a transmission information type determination unit that determines the information to be transmitted according to the terminal type may be mounted.
- the visible light communication system may change the amount of information to be transmitted and the visible light communication speed according to the nearby terminal.
- the visible light communication system including the information communication device of the present embodiment includes a cleaning information recording unit that records cleaning information, a room arrangement information recording unit that records room arrangement information,
- an information combining unit that generates dirty-location information by superimposing the room arrangement information and the cleaning information, and an operation monitoring unit that monitors the stop of normal operation; it may be a visible light communication system that transmits the dirty-location information by visible light, using detection of the stop of the device as a trigger.
- in this embodiment, a home delivery service is taken as an example to explain cooperation between devices and Web information using optical communication.
- FIG. 138 shows an outline of this embodiment. That is, FIG. 138 is a schematic diagram of home delivery service support using optical communication in the present embodiment.
- the orderer places an order for a product from the product purchase site using the mobile terminal 3001a.
- an order number is issued from the product purchase site.
- the mobile terminal 3001a that has received the order number transmits it to the intercom indoor unit 3001b using NFC communication.
- the intercom indoor unit 3001b displays the order number received from the mobile terminal 3001a on the monitor of its own device or the like to indicate to the user that the transmission has been completed.
- the intercom indoor unit 3001 b transmits, to the intercom outdoor unit 3001 c, a blink instruction and an ON / OFF pattern of an LED built in the intercom outdoor unit 3001 c.
- the blinking pattern is generated by the intercom indoor unit 3001b according to the order number received from the mobile terminal 3001a.
- the intercom outdoor unit 3001 c blinks the LED according to the blink pattern specified by the intercom indoor unit 3001 b.
- a home network may be used besides NFC communication.
- the mobile terminal 3001a may transmit the order number directly to the intercom outdoor unit 3001c without intermediating the intercom indoor unit 3001b.
- the home delivery order reception server 3001 e transmits the order number to the home delivery mobile terminal 3001 f.
- optical communication using an LED blink pattern generated based on the order number is performed in both directions between the home delivery mobile terminal 3001f and the intercom outdoor unit 3001c.
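In this flow, the order number is carried as an LED on/off pattern that each side compares against the number it holds. The following is a minimal sketch of such an encoding and check; the bit width and most-significant-bit-first layout are assumptions for illustration, since the description does not specify the modulation.

```python
def order_to_blinks(order_number, bits=16):
    """Encode an order number as an LED on/off pattern,
    most significant bit first. The 16-bit width is an assumption."""
    return [(order_number >> i) & 1 for i in reversed(range(bits))]

def matches_order(observed, order_number):
    """Compare a blink pattern recognized by the camera with the
    pattern expected for the locally held order number."""
    return observed == order_to_blinks(order_number, bits=len(observed))
```

The intercom outdoor unit and the courier mobile terminal would each call `matches_order` on the pattern their camera recognized, mirroring the collation steps in FIGS. 141 to 144.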
- FIGS. 139 to 144 are flowcharts for explaining home delivery service support using optical communication in the embodiment 3 of the invention.
- FIG. 139 shows the flow until the orderer places an order and the order number is issued.
- FIG. 139 will be described.
- the orderer mobile terminal 3001a implements home delivery reservation using the web browser or application of the smartphone. Then, the process proceeds to circle 1 in FIG.
- step 3002c it is checked whether the order number transmission destination device has been touched. In the case of Yes, the process proceeds to step 3002 d where the order number is NFC-touched to the doorphone indoor unit and transmitted (if the doorphone is in the same network as the smartphone, there is also a method of transmitting via the network). On the other hand, in the case of No, the process returns to step 3002b.
- step 3002 e the intercom indoor unit 3001 b waits for a request for LED blinking from the other terminal. Subsequently, in step 3002f, the order number is received from the smartphone. Subsequently, in step 3002 g, an instruction to blink the LED of the doorphone outdoor unit is issued according to the received order number. Then, the process proceeds to circle 3 in FIG.
- step 3002 h the intercom outdoor unit 3001 c waits for an LED blink instruction from the intercom indoor unit. Then, the process proceeds to circle 7 in FIG.
- the courier mobile terminal 3001f waits for an order notification.
- step 3002j it is confirmed whether or not notification has been made from the order notification and the home delivery order server.
- the process returns to step 3002i.
- step 3002 k information such as an order number and a delivery address is received.
- in step 3002n, the process waits for the user's instruction for LED light emission of the received order number, or for camera activation for recognizing the LED light emission of another device. Then, the process proceeds to circle 5 in FIG.
- FIG. 140 shows a flow until the orderer makes a delivery order on the orderer mobile terminal 3001a.
- FIG. 140 will be described.
- step 3003a the home delivery order server 3001e waits for an order number. Subsequently, in step 3003b, it is checked whether a home delivery order has been received. Here, in the case of No, the process returns to step 3003a. In the case of Yes, the process proceeds to step 3003c, and an order number is issued for the received home delivery order. Subsequently, in step 3003d, the home delivery person is notified that the home delivery order has been received, and the process ends.
- the orderer mobile terminal 3001a selects the order contents from the menu presented by the home delivery order server in step 3003e following the circle 1 in FIG. Subsequently, in step 3003f, the order is confirmed and transmitted to the home delivery server. Subsequently, in step 3003g, it is confirmed whether or not the order number has been received. Here, in the case of No, the process returns to step 3003 f. In the case of Yes, the process proceeds to step 3003 h, and the received order number is displayed to display a message prompting a touch on the door phone indoor unit. Then, the process proceeds to circle 2 in FIG.
- FIG. 141 shows a flow in which the deliverer performs optical communication with the door phone outdoor unit 3001 c of the delivery destination using the deliverer mobile terminal 3001 f.
- FIG. 141 will be described.
- the courier mobile terminal 3001 f checks in step 3004 a whether or not to activate the camera in order to recognize the LED of the door phone outdoor unit 3001 c of the delivery destination.
- the process returns to circle 5 in FIG.
- step 3004b the blinking of the LED on the doorphone home door of the delivery destination is confirmed by the camera of the courier mobile terminal.
- step 3004c the LED emission of the door phone external unit is recognized and compared with the order number.
- step 3004d it is checked whether the flashing of the LED on the door phone outdoor unit matches the order number.
- the process proceeds to circle 6 in FIG.
- FIG. 142 shows a flow of order number collation between the door phone indoor unit 3001 b and the door phone outdoor unit 3001 c. Hereinafter, FIG. 142 will be described.
- the intercom outdoor unit 3001c checks in step 3005a whether an instruction to blink the LED has been issued from the intercom indoor unit. If No, return to the circle in FIG. In the case of Yes, the process proceeds to step 3005b, where the LED blinks in accordance with the LED blinking instructed by the door phone indoor unit. Then, the process proceeds to circle 8 in FIG.
- the door phone outdoor unit 3001c notifies the door phone indoor unit of the flashing of the LED recognized by the camera of the door phone outdoor unit in step 3005 c following the circle 9 in FIG. Then, the process proceeds to circle 10 in FIG.
- in step 3005d, the door phone indoor unit 3001b instructs the door phone outdoor unit to blink the LED according to the order number.
- step 3005e the camera of the door phone outdoor unit waits until it recognizes LED flashing of the courier mobile terminal.
- step 3005 f it is checked whether the notification of the recognition of the LED blinking has been received from the door phone outdoor unit.
- step 3005 g the LED flashing on the door phone outdoor unit and the order number are collated.
- step 3005 h it is checked whether the LED flashing on the door phone outdoor unit and the order number match. In the case of Yes, the process proceeds to circle 11 in FIG.
- step 3005i the door phone outdoor unit is instructed to cancel the LED blinking and the process is ended.
- FIG. 143 shows a flow between the intercom outdoor unit 3001c and the courier mobile terminal 3001f after order number verification.
- FIG. 143 will be described.
- the courier mobile terminal 3001 f starts to blink the LED according to the order number held by the courier mobile terminal in step 3006 a.
- the LED blinking portion is arranged in a range where the camera can shoot from the door phone outdoor unit.
- in step 3006c, the camera of the door phone outdoor unit recognizes the LED blinking of the courier mobile terminal, and it is checked whether the LED blinking of the courier mobile terminal matches the order number held by the door phone indoor unit.
- step 3006e the matching result is displayed on the courier mobile terminal and the processing is ended.
- FIG. 144 shows the flow between the intercom outdoor unit 3001c and the courier mobile terminal 3001f after order number verification.
- FIG. 144 will be described.
- step 3007a the intercom outdoor unit 3001c confirms whether or not notification has been made as to whether the LED flashing notified by the intercom indoor unit matches the order number.
- the process returns to circle 11 in FIG.
- step 3007b the doorphone external unit performs LED blinking indicating whether or not the match is obtained, and the process ends.
- following circle 10 in the figure, the door phone indoor unit 3001b notifies the orderer, in step 3007c, with a display indicating that the delivery person has arrived.
- subsequently, the door phone outdoor unit is instructed to stop the current LED blinking and to blink the LED in a pattern indicating that the order numbers match. Then the process ends.
- a delivery box for storing the delivery item may be installed at the entrance or the like.
- the delivery person stores the delivery item in the delivery box.
- FIG. 145 is a diagram for describing a process of registering a mobile phone in use with the user in the server in the present embodiment. Hereinafter, FIG. 145 will be described.
- step 4001 b the user starts an application.
- step 4001c the server is queried for information on this user and mobile phone.
- step 4001 d it is checked whether the user information and the information of the mobile phone in use are registered in the DB of the server.
- the process then proceeds to step 4001f, and as parallel processing (processing a), analysis of the user voice characteristics is started; the process proceeds to B in the figure.
- FIG. 146 is a diagram for describing a process of analyzing user voice characteristics in the present embodiment. Hereinafter, FIG. 146 will be described.
- step 4002a sound is collected from the microphone.
- step 4002b it is checked whether the voice collected as a result of speech recognition is estimated to be the user's voice.
- in the case of No at step 4002b, the process returns to step 4002a.
- step 4002c in the case of Yes at step 4002b, it is checked whether the content of the utterance is a keyword used in this application ("Next", "Return", etc.).
- step 4002f in the case of Yes at step 4002c, the voice data is registered in the user keyword voice table of the server, and the process proceeds to step 4002d.
- in the case of No at step 4002c, the process proceeds to step 4002d.
- step 4002d voice characteristics (frequency, sound pressure, speech rate) are analyzed.
- step 4002e the analysis result is registered in the user voice characteristic table of the mobile phone and the server.
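The analysis of step 4002d can be sketched as follows. This is an illustrative stand-in, not the disclosed implementation: frequency is estimated by zero-crossing counting and sound pressure by RMS amplitude, and speech-rate estimation is omitted.

```python
import math

def analyze_voice_characteristics(samples, sample_rate):
    """Estimate voice characteristics from a mono sample buffer:
    dominant frequency via zero-crossing counting and sound pressure
    via RMS amplitude (speech-rate analysis is omitted here)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    crossings = sum(1 for a, b in zip(samples, samples[1:])
                    if (a < 0) != (b < 0))
    duration = len(samples) / sample_rate
    frequency = crossings / (2 * duration)  # two crossings per cycle
    return frequency, rms

# Example: a 200 Hz sine tone sampled at 8 kHz for one second.
tone = [math.sin(2 * math.pi * 200 * t / 8000) for t in range(8000)]
freq, rms = analyze_voice_characteristics(tone, 8000)
```

The returned pair corresponds to the values registered in the user voice characteristic table at step 4002e.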
- FIG. 147 is a diagram for describing a process of preparing for the speech recognition process in the present embodiment. Hereinafter, FIG. 147 will be described.
- step 4003a (user operation) an operation to display the cooking menu list is performed.
- step 4003 b the cooking menu list is acquired from the server.
- step 4003c the cooking menu list is displayed on the portable screen.
- step 4003d sound collection is started from the microphone connected to the mobile phone.
- step 4003e sound collection from sound collection devices in the vicinity is started as parallel processing (processing b).
- step 4003 f analysis of environmental sound characteristics is started as parallel processing (processing c).
- step 4003 g cancellation of voices from voice output devices present in the periphery is started as parallel processing (processing d).
- step 4003h the user voice characteristic is acquired from the DB of the server.
- step 4003i user speech recognition is started, and the process proceeds to C of FIG.
- FIG. 148 is a diagram for describing a process of collecting sound from a nearby sound collection device in the present embodiment. Hereinafter, FIG. 148 will be described.
- step 4004a a device (sound collecting device) that can communicate and collect sound from a mobile phone is searched.
- step 4004b it is checked whether or not a sound collection device has been found.
- step 4004c the position information of the sound collector and the microphone characteristic information are acquired from the server.
- step 4004d it is checked whether the information exists in the server.
- step 4004e in the case of Yes at step 4004d, the process proceeds to step 4004e, where it is checked whether the installation position of the sound collection device is close enough to the position of the mobile phone to collect the user's voice. In the case of No at step 4004e, the process returns to step 4004a. On the other hand, in the case of Yes at step 4004e, the process proceeds to step 4004f to cause the sound collection device to start sound collection. Subsequently, in step 4004g, the sound collected by the sound collection device is transmitted to the mobile phone until an instruction to end the sound collection processing is received. Note that, instead of sending the collected voice to the mobile phone as it is, the result of voice recognition may be sent to the mobile phone. The voice transmitted to the mobile phone is processed in the same manner as voice collected from the microphone connected to the mobile phone, and the process returns to step 4004a.
- in the case of No at step 4004d, the process advances to step 4004h to cause the sound collection device to start sound collection.
- step 4004i a signal sound is output from the mobile phone.
- step 4004 j the sound collected by the sound collection device is transmitted to the mobile phone.
- step 4004k it is checked whether the signal sound can be recognized from the sound sent from the sound collection device.
- in the case of Yes at step 4004k, the process proceeds to step 4004g; in the case of No, the process returns to step 4004a.
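The check of step 4004k, whether the signal sound output by the mobile phone can be recognized in the sound sent from the sound collection device, might be sketched as a normalized cross-correlation against the known tone. The 0.8 threshold is an illustrative assumption, not a value from the disclosure.

```python
import math

def contains_signal_tone(collected, tone):
    """Slide the known reference tone over the recording and test the
    normalized cross-correlation against a threshold (illustrative)."""
    n = len(tone)
    tone_energy = math.sqrt(sum(s * s for s in tone))
    best = 0.0
    for offset in range(len(collected) - n + 1):
        window = collected[offset:offset + n]
        win_energy = math.sqrt(sum(s * s for s in window))
        if win_energy == 0:
            continue  # silent window: nothing to correlate
        corr = sum(a * b for a, b in zip(window, tone)) / (tone_energy * win_energy)
        best = max(best, corr)
    return best > 0.8

# Example: a 5-cycle reference tone embedded in silence is detected.
signal_tone = [math.sin(2 * math.pi * t / 20) for t in range(100)]
found = contains_signal_tone([0.0] * 50 + signal_tone + [0.0] * 50, signal_tone)
```

When the tone is found, the flow of FIG. 148 proceeds to step 4004g; otherwise it returns to step 4004a.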
- FIG. 149 is a diagram for illustrating analysis processing of environmental sound characteristics in the present embodiment. Hereinafter, FIG. 149 will be described.
- step 4005f a list of the devices owned by the user, excluding those located sufficiently far from the position of the microwave, is acquired, and data of the sounds emitted by these devices is acquired from the DB.
- step 4005 g the characteristics (frequency, sound pressure, etc.) of the acquired sound data are analyzed and held as environmental sound characteristics.
- note that the sound emitted by a rice cooker or the like in the vicinity of the microwave oven is particularly likely to be misrecognized, and is therefore held with a high importance setting.
- step 4005a sound is collected from the microphone.
- step 4005b it is checked whether the collected voice is the user's voice. In the case of Yes, the process returns to step 4005a. In the case of No, the process proceeds to step 4005c, where the characteristics (frequency, sound pressure) of the collected sound are analyzed.
- step 4005d the environmental sound characteristic is updated from the analysis result.
- step 4005e it is checked whether the end flag is set, and if it is Yes, the process ends. In the case of No, the process returns to step 4005a.
- FIG. 150 is a diagram for illustrating the process of canceling the sound from the audio output device present in the periphery in the present embodiment. Hereinafter, FIG. 150 will be described.
- step 4006a a search is made for a communicable device capable of audio output (audio output device).
- step 4006b it is checked whether or not an audio output device has been found, and if No, the process ends. In the case of Yes, the process proceeds to step 4006c, and a signal sound including various frequencies is output to the audio output device.
- step 4006d the mobile phone and the sound collection device (each sound collection device) of FIG. 148 collect sound, and the signal sound output from the voice output device is collected.
- step 4006e it is checked whether the signal sound can be collected and recognized. In the case of No, the process ends. In the case of Yes, the process proceeds to step 4006f, where the transmission characteristics from the audio output device to each sound collection device (the relationship between the output volume and the collected volume for each frequency, and the delay time from signal sound output to sound collection) are analyzed.
- step 4006 g it is checked whether the audio data output from the audio output device can be accessed from the mobile phone.
- step 4006h in the case of Yes at step 4006g, the sound source data, output timing, and volume output from the audio output device are acquired until a cancel processing end command is given, and, taking the transmission characteristics into account, the audio output from the audio output device is canceled from the audio collected by each sound collection device. The process then returns to step 4006a.
- step 4006i in the case of No at step 4006g, the voice output from the audio output device is acquired until the cancel processing end command is given, and, taking the transmission characteristics into account, the voice output from the audio output device is canceled from the voice collected by each sound collection device, and the process returns to step 4006a.
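The cancellation of steps 4006h and 4006i can be sketched as subtracting the known device output, shifted and scaled according to the analyzed transmission characteristics. As an illustrative simplification, the characteristics are reduced here to a single gain and delay; the per-frequency handling described above is omitted.

```python
def cancel_device_audio(collected, reference, gain, delay):
    """Subtract a known device's audio from a recording, modeling the
    transmission characteristic as one gain and one sample delay."""
    out = list(collected)
    for i, s in enumerate(reference):
        j = i + delay
        if j < len(out):
            out[j] -= gain * s  # remove the attenuated, delayed copy
    return out

# Example: a reference signal arriving 2 samples late at half volume
# is removed completely from the recording.
cleaned = cancel_device_audio([0.0, 0.0, 0.5, 1.0, 1.5],
                              [1.0, 2.0, 3.0], gain=0.5, delay=2)
```

After cancellation, only the user's voice and other environmental sounds remain in the collected signal for recognition.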
- FIG. 151 is a diagram for describing processing of selecting a cooking menu and setting operation contents to a microwave according to the present embodiment. Hereinafter, FIG. 151 will be described.
- step 4007a (user operation) a menu to be cooked is selected.
- step 4007b (user operation) recipe parameters (amount to be cooked, taste intensity, baking, etc.) are set.
- step 4007c recipe data and microwave oven operation content setting command are acquired from the server according to the recipe parameters.
- step 4007d the user is prompted to cause the non-contact IC tag embedded in the microwave to touch the mobile phone.
- step 4007e it is checked whether a touch on the microwave has been detected. In the case of No, the process returns to step 4007e.
- step 4007f the microwave setting command acquired from the server is transmitted to the microwave.
- step 4007g (process e) a notification sound for a microwave is acquired from a DB such as a server and set in the microwave.
- step 4007h (processing f) adjustment of notification sound of the microwave is performed, and the process proceeds to D in FIG.
- FIG. 152 is a diagram for describing a process of obtaining notification sound for the microwave according to the present embodiment from a DB of a server or the like and setting the notification sound to the microwave.
- FIG. 152 will be described.
- step 4008b the microwave is queried as to whether notification sound data for this mobile phone (data of the sound output when the microwave operates or at the end of its operation) is registered in the microwave.
- step 4008c it is checked whether notification sound data for the mobile phone is registered in the microwave.
- step 4008d it is checked whether notification sound data for this mobile phone is registered in the mobile phone.
- step 4008 h notification sound data registered in the mobile phone is registered in the microwave, and the process ends.
- step 4008e in the case of No at step 4008d, the DB of the server, the mobile phone, or the microwave is referred to.
- step 4008f if notification sound data for this mobile phone (notification sound data that this mobile phone can easily recognize) exists in the DB, that data is acquired; if not, notification sound data for general mobile phones (notification sound data that mobile phones can generally recognize easily) is acquired from the DB.
- step 4008 g the acquired notification sound data is registered in the mobile phone.
- step 4008 h notification sound data registered in the mobile phone is registered in the microwave, and the process ends.
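The lookup of step 4008f, preferring notification sound data for this phone model and falling back to the data for general mobile phones, can be sketched as follows. The dictionary keys and sound names are illustrative assumptions.

```python
def get_notification_sound(db, phone_model):
    """Step 4008f sketch: return the notification sound data for this
    phone model if registered, otherwise the general-purpose data."""
    return db.get(phone_model, db["general"])

# Hypothetical DB contents.
sound_db = {"model_x": "sound_for_model_x", "general": "sound_general"}
```

The acquired data would then be registered in the mobile phone (step 4008g) and in the microwave (step 4008h).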
- FIG. 153 is a diagram for describing a process of adjusting notification sound of the microwave according to the present embodiment. Hereinafter, FIG. 153 will be described.
- step 4009a notification sound data of this microwave registered in the mobile phone is acquired.
- step 4009b it is checked whether the overlap between the frequency of the notification sound for the terminal and the frequency of the environmental sound is equal to or more than a predetermined value.
- step 4009c in the case of Yes at step 4009b, the volume of the notification sound is set sufficiently higher than that of the environmental sound, or the frequency of the notification sound is changed.
- then, if the microwave can output the sound of (c) of FIG. 154, a notification sound is created with the pattern of (c), and the process ends. If (c) is not possible but (b) is possible, a notification sound is created with the pattern of (b), and the process ends. If only (a) is possible, a notification sound is created with the pattern of (a), and the process ends.
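The decision of steps 4009b and 4009c might be sketched as follows, comparing the occupied frequency bands of the notification sound and the environmental sound. The 0.5 overlap threshold and the candidate-band list are illustrative assumptions.

```python
def adjust_notification_sound(tone_bands, env_bands, candidate_bands, limit=0.5):
    """If the band overlap is below the limit, keep the sound as-is;
    otherwise shift to free candidate bands when enough exist, or
    raise the volume well above the environmental sound."""
    overlap = len(tone_bands & env_bands) / len(tone_bands)
    if overlap < limit:
        return ("keep", tone_bands)
    free = [b for b in candidate_bands if b not in env_bands]
    if len(free) >= len(tone_bands):
        return ("shift_frequency", set(free[:len(tone_bands)]))
    return ("raise_volume", tone_bands)
```

Band values are frequency-band centers in Hz; in the embodiment these would come from the environmental sound characteristics held in FIG. 149.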
- FIG. 154 is a diagram showing an example of a waveform of notification sound set in the microwave according to the present embodiment.
- the waveform shown in (a) of FIG. 154 is a simple rectangular wave and can be output by most audio output devices. Since it is easily confused with sounds other than the notification sound, it may be output several times, and the notification sound may be regarded as heard if some of the outputs are recognized.
- the waveform shown in (b) of FIG. 154 is a waveform in which the waveform of (a) is finely divided by a rectangular wave for a short time, and can be output if the operation clock of the audio output device is sufficiently fast.
- This sound sounds similar to the sound of (a) to the human ear, but in machine recognition, it has the property that the amount of information is larger than that of (a) and it is difficult to be confused with the sound other than the notification sound.
- the waveform shown in (c) of FIG. 154 is obtained by changing the time length of the audio output portion, and is called a PWM waveform. Although it is more difficult to output than (b), the amount of information is larger than (b), and it is also possible to improve the recognition rate or to simultaneously transmit information to be transmitted from the microwave to the mobile phone.
- the waveforms in (b) and (c) of FIG. 154 are less likely to be misrecognized than that of (a) of FIG. 154, but by repeating the same waveform several times as in (a) of FIG. 154, the recognition rate can be further improved.
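The three waveform patterns of FIG. 154 can be sketched as sample generators: (a) a plain rectangular wave, (b) the same wave with its high portions finely chopped by a short-period rectangular wave, and (c) a PWM waveform whose high-time per period carries one bit. The periods and bit encoding below are illustrative.

```python
def square_wave(n, period):
    """(a) simple rectangular wave: half of each period high, half low."""
    return [1 if (t % period) < period // 2 else 0 for t in range(n)]

def chopped_wave(n, period, chop):
    """(b) the wave of (a) finely divided by a short rectangular wave;
    requires a sufficiently fast operation clock."""
    return [v if (t % chop) < chop // 2 else 0
            for t, v in enumerate(square_wave(n, period))]

def pwm_wave(bits, period):
    """(c) PWM: the high-time of each period encodes one bit, so extra
    information can ride on the notification sound (period % 4 == 0)."""
    out = []
    for b in bits:
        high = (3 * period // 4) if b else (period // 4)
        out += [1] * high + [0] * (period - high)
    return out
```

A phone-side recognizer would distinguish (b) and (c) from ordinary sounds by their fine structure, matching the description that they carry more information than (a).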
- FIG. 155 is a diagram for describing a process of displaying the contents of cooking in the present embodiment. Hereinafter, FIG. 155 will be described.
- following D in the figure, the contents of cooking are displayed in step 4011a.
- step 4011 b it is confirmed whether the cooking content is the operation of the microwave.
- step 4011c in the case of Yes at step 4011b, the user is notified to put the food in the microwave and press the operation start button, and the process proceeds to E in the figure.
- step 4011d in the case of No at step 4011b, the cooking content is displayed, and the process proceeds to F in the figure or to step 4011e.
- step 4011e it is confirmed what the user's operation is. In the case of application end, the process ends.
- step 4011f in the case of a display content change, it is checked whether cooking is finished.
- in the case of Yes at step 4011f, the process proceeds to step 4011g, where the user is notified of cooking completion, and the process ends.
- in the case of No at step 4011f, the process returns to step 4011a.
- FIG. 156 is a diagram for describing a process of recognizing notification sound of the microwave according to the present embodiment. Hereinafter, FIG. 156 will be described.
- step 4012a as parallel processing (processing g), collection of sound from sound collection devices in the vicinity and recognition of notification sound of microwave oven are started.
- step 4012 f confirmation of the mobile phone operation state is started as parallel processing (processing i).
- step 4012 g tracking of the user position is started as parallel processing (processing j).
- step 4012 b the recognition content is confirmed.
- step 4012 c the setting change is registered, and the process returns to step 4012 b.
- in the case of a display change, the process proceeds to F in the figure. If the display instructs the user to put the food in the microwave and press the operation start button, and an operation end notification sound (or, when the operation time has passed, the sound of the microwave door opening) is recognized, in step 4012e (process h) the user is notified of the end of the operation of the microwave, and the process proceeds to G in the figure.
- step 4012d otherwise, the process waits until the operation time elapses, then proceeds to step 4012e, where (process h) the user is notified of the end of the operation of the microwave.
- FIG. 157 is a diagram for describing a process of performing sound collection from a surrounding sound collection device and recognition of a notification sound of a microwave in the present embodiment. Hereinafter, FIG. 157 will be described.
- step 4013a a device (sound collecting device) that can communicate with the mobile phone and can collect sound is searched.
- step 4013b it is confirmed whether or not a sound collection device has been found.
- step 4013c the position information of the sound collector and the microphone characteristic information are acquired from the server.
- step 4013d it is checked whether the information exists in the server.
- step 4013r in the case of Yes at step 4013d, it is checked whether the installation position of the sound collection device is close enough to the microwave to collect the notification sound.
- in the case of No at step 4013r, the process returns to step 4013a.
- step 4013s in the case of Yes at step 4013r, it is checked whether the computing device of the sound collection device is capable of voice recognition.
- step 4013u in the case of Yes at step 4013s, information for recognizing the notification sound of the microwave is transmitted to the sound collection device.
- step 4013v the sound collection device starts sound collection and speech recognition, and the recognition result is transmitted to the mobile phone.
- step 4013 q the notification sound of the microwave is recognized until it proceeds to the next cooking stage, and the recognition result is transmitted to the mobile phone.
- step 4013t in the case of No at step 4013s, the sound collection device starts sound collection and transmits the collected sound to the mobile phone.
- step 4013j the collected sound is transmitted to the mobile phone until the next cooking stage is performed, and the notification sound of the microwave is recognized by the mobile phone.
- step 4013e in the case of No at step 4013d, the process proceeds to step 4013e, where it is checked whether the computing device of the sound collection device is capable of voice recognition.
- step 4013k in the case of Yes at step 4013e, the process proceeds to step 4013k, and information for recognizing the notification sound of the microwave is transmitted to the sound collection device.
- step 4013 m the sound collection device is made to start sound collection and speech recognition, and the recognition result is transmitted to the mobile phone.
- step 4013n the notification sound of the microwave is sounded.
- step 4013p it is checked whether the notification sound has been recognized by the sound collection device. In the case of Yes at step 4013p, the process proceeds to step 4013q, where recognition of the notification sound of the microwave is performed until the next cooking stage, the recognition result is transmitted to the mobile phone, and the process returns to step 4013a. In the case of No at step 4013p, the process returns to step 4013a.
- step 4013f in the case of No at step 4013e, the process advances to step 4013f to cause the sound collection device to start sound collection and to transmit the collected sound to the mobile phone.
- step 4013 g the notification sound of the microwave is sounded.
- step 4013h recognition processing is performed on the voice sent from the sound collection device.
- step 4013i it is checked whether the notification sound has been recognized.
- in the case of Yes at step 4013i, the process proceeds to step 4013j, where the collected sound is transmitted to the mobile phone until the next cooking stage, the notification sound of the microwave is recognized by the mobile phone, and the process returns to step 4013a. In the case of No, the process returns to step 4013a.
- FIG. 158 is a diagram for describing a process of notifying the user of the end of the operation of the microwave according to the present embodiment. Hereinafter, FIG. 158 will be described.
- step 4014a it is checked whether it can be determined from the sensor data that the mobile phone is in use or moving. In the case of Yes, the process proceeds to step 4014m, where the user is notified of the end of the operation of the microwave using the screen display, voice, vibration, and the like of the mobile phone, and the process ends.
- in the case of No at step 4014a, the process proceeds to step 4014b, where a search is made for a device being operated by the user, such as a PC to which the user is logged in (device under user operation).
- step 4014c it is checked whether a device under user operation has been found. In the case of Yes, the user is notified of the end of the operation of the microwave using the screen display of the device being operated, and the process ends.
- in the case of No at step 4014c, the process proceeds to step 4014e, where a search is made for a device capable of acquiring an image (imaging device) that can communicate with the mobile phone.
- step 4014f it is checked whether the imaging device has been found.
- in the case of Yes at step 4014f, the process proceeds to step 4014p, where the imaging device is caused to capture an image, the user's face data is transmitted to the imaging device, and the user's face is recognized.
- alternatively, the captured image may be sent to the mobile phone or a server, and the user's face recognized at the transmission destination of the image.
- step 4014q it is checked whether the user's face has been recognized. In the case of No, the process returns to step 4014e. In the case of Yes, the process proceeds to step 4014r, where it is checked whether the device that found the user (discovery device) has a display device or an utterance device. In the case of Yes at step 4014r, the process proceeds to step 4014s, where the user is notified of the end of the operation of the microwave using the device attached to the discovery device, and the process ends.
- in the case of No at step 4014f, the process advances to step 4014g to search for a device (sound collection device) that can communicate with the mobile phone and can collect sound.
- in the case where a sound collection device is found at step 4014h, the process advances to step 4014i, where the position of the user is identified by means such as the sound of the user operating another device or the user's walking vibration. Subsequently, the process proceeds to step 4014m to notify the user of the end of the operation of the microwave using the screen display, voice, vibration, and the like of the mobile phone, and the process ends.
- from step 4014i, the process advances to step 4014r to check whether the device that found the user (discovery device) has a display device or an utterance device.
- in the case of No at step 4014r, the process proceeds to step 4014t, where the position information of the discovery device is acquired from the server.
- step 4014 u a device (notification device) which is near the discovery device and has a display device or a voice generation device is searched.
- step 4014 v in consideration of the distance from the notification device to the user, the user is notified of the end of the operation of the microwave with screen display or a sound of sufficient volume, and the process ends.
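Steps 4014t to 4014v, choosing a notification device near the user, can be sketched as selecting the nearest device that has a display or utterance capability. One-dimensional positions and the device list are illustrative; the embodiment would use the position data registered in the server DB.

```python
def choose_notification_device(user_pos, devices):
    """Pick the notification device nearest the user. Each device is a
    (name, position, has_display_or_speaker) tuple; devices without a
    display or speaker cannot notify and are skipped."""
    usable = [d for d in devices if d[2]]
    if not usable:
        return None  # no device nearby can notify the user
    return min(usable, key=lambda d: abs(d[1] - user_pos))[0]

# Hypothetical devices near the discovery device.
devices = [("tv", 4.0, True), ("human_sensor", 5.0, False), ("pc", 10.0, True)]
```

The chosen device is then instructed to notify the user with a screen display or a sound of sufficient volume, taking its distance from the user into account.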
- FIG. 159 is a diagram for describing a process of confirming a mobile phone operation state in the present embodiment. Hereinafter, FIG. 159 will be described.
- step 4015a it is checked whether the mobile phone is being operated, the mobile phone is moving, there is input/output to an input/output device connected to the mobile phone, video or music is being played, a device located near the mobile phone is being operated, or the camera or various sensors of a device located near the mobile phone recognize the user.
- in the case of Yes at step 4015a, the process proceeds to step 4015b, where it is recognized that the position of the user is likely to be close to the mobile phone, and the process returns to step 4015a.
- in the case of No at step 4015a, the process proceeds to step 4015c, where it is checked whether the user is operating a device located far from the mobile phone, or whether the camera or various sensors of a device located far from the mobile phone recognize the user.
- step 4015c in the case of Yes, the process proceeds to step 4015d, recognizing that the position of the user is likely to be far from this mobile phone, and the process returns to step 4015a. In the case of No at step 4015c, the process returns to step 4015a.
- FIG. 160 is a diagram for describing a process of tracking the position of the user in the present embodiment. Hereinafter, FIG. 160 will be described.
- step 4016a it is checked whether the mobile phone is determined to be moved by the azimuth sensor, the position sensor or the 9-axis sensor.
- the nine-axis sensor is a sensor including at least one of an acceleration sensor, an angular velocity sensor, and a geomagnetic sensor.
- in the case of Yes at step 4016a, the process proceeds to step 4016b, where the positions of the mobile phone and the user are registered in the DB, and the process returns to step 4016a.
- in the case of No at step 4016a, the process proceeds to step 4016c, where a search is made for a device that can communicate with the mobile phone and can detect the position or presence of the user, such as a camera, a microphone, or a human sensor (user detection device).
- step 4016d it is checked whether a user detection device has been found. In the case of No at step 4016d, the process returns to step 4016a.
- in the case of Yes at step 4016d, the process proceeds to step 4016e, where it is checked whether the user detection device detects the user. In the case of No at step 4016e, the process returns to step 4016a.
- in the case of Yes at step 4016e, the process proceeds to step 4016f, where the detection of the user is transmitted to the mobile phone.
- step 4016 g the presence of the user near the user detection device is registered in the DB.
- step 4016h if position information of the user detection device exists in the DB, the position of the user is identified from that position and acquired, and the process returns to step 4016a.
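The judgment of step 4016a, whether the 9-axis sensor indicates that the mobile phone is being moved, might be sketched as a variance test on accelerometer magnitude samples. The variance threshold is an illustrative assumption.

```python
def phone_is_moving(accel_samples, threshold=0.05):
    """Judge from accelerometer magnitudes (one axis of the 9-axis
    sensor) whether the phone is being moved: a stationary phone shows
    nearly constant readings, a carried phone shows high variance."""
    n = len(accel_samples)
    mean = sum(accel_samples) / n
    variance = sum((a - mean) ** 2 for a in accel_samples) / n
    return variance > threshold
```

When this returns True, the flow of FIG. 160 registers the positions of the mobile phone and the user in the DB (step 4016b); otherwise it falls back to the user detection devices.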
- FIG. 161 is a diagram showing that, while the sound from the audio output devices is canceled, the notification sound of the home appliance is recognized, communicable electronic devices recognize the current position of the user (operator), and, based on the user position recognition result, a device at a position near the user is made to notify the user.
- FIG. 162 is a diagram showing the contents of a database held in a server, a mobile phone, or a microwave according to the present embodiment.
- for each microwave model, data (speaker characteristics, modulation method, etc.) specifying the sounds that the microwave can output is held, and, for each mobile phone model, notification sound data having characteristics that the model can easily recognize is held in association with notification sound data that general mobile phones can easily recognize on average.
- each individual mobile phone, the model of the mobile phone, the user who uses the mobile phone, and data indicating the position of the mobile phone are stored in association with each other.
- in the mobile device type table 4040c, the mobile device type and sound collection characteristics, such as those of the microphone attached to that device type, are held in association.
- the user and the acoustic characteristic of the user's voice are associated and held.
- the user and the voice waveform data when the user utters a keyword such as “next” or “return” to be recognized by the mobile phone are associated and held.
- This data may not be speech waveform data itself, but may be analyzed and transformed into a form that can be easily handled.
- the user owned device position table 4040 f holds the user, the device owned by the user, and the position data of the device in association with each other.
- the user owned device sound table 4040g holds the user, the device owned by the user, and sound data such as the notification sound and driving sound emitted by the device in association with one another.
- the user and data of the position of the user are associated and held.
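Using the tables of FIG. 162, the query of step 4005f (sounds of the user's devices that are not sufficiently far from the microwave) can be sketched as follows. The table contents, coordinates, and distance unit are illustrative stand-ins for the server DB.

```python
import math

# Hypothetical in-memory stand-ins for the tables of FIG. 162.
device_position = {("user1", "rice_cooker"): (2.0, 1.5),
                   ("user1", "tv"): (8.0, 6.0)}
device_sound = {("user1", "rice_cooker"): "lid-close beep",
                ("user1", "tv"): "startup chime"}

def environmental_sound_candidates(user, microwave_pos, max_dist):
    """Collect the sounds of the user's devices located within
    max_dist of the microwave (step 4005f style query)."""
    result = {}
    for (owner, device), pos in device_position.items():
        if owner != user:
            continue
        if math.dist(pos, microwave_pos) <= max_dist:
            result[device] = device_sound[(owner, device)]
    return result
```

The sounds returned here are the ones whose characteristics are analyzed and held as environmental sound characteristics in step 4005g.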
- FIG. 163 is a diagram showing that the user according to the present embodiment performs cooking based on the cooking procedure displayed on the mobile phone, and operates the display contents of the mobile phone by voice such as "next" and "return".
- FIG. 164 is a diagram showing that the user has moved to another location after starting the operation of the microwave according to the present embodiment and while waiting for the operation to end, or while simmering food or the like.
- FIG. 165 is a diagram showing that the mobile phone transmits, via a network, a command to detect the user to devices that are connected to the mobile phone and can recognize the position of the user or recognize that the user is present, such as a camera, a microphone, and a human sensor.
- as an example of user detection, a camera attached to a television recognizes the user's face, and a human sensor of an air conditioner recognizes the movement and presence of the user.
- the recognition process may be performed by a television or an air conditioner, or image data may be transmitted to a mobile phone or a server, and the recognition process may be performed at the transmission destination.
- note that, for privacy protection, it is preferable that the user's data not be sent to an external server.
- FIG. 167 is a diagram showing that a device which has detected the user transmits, to the mobile phone, the fact that the user has been detected and the relative position from the detecting device to the user.
- the position information of the device that has detected the user is present in the DB, it is possible to specify the position of the user.
- FIG. 168 is a diagram showing that the mobile phone recognizes the operation end sound of the microwave according to the present embodiment.
- FIG. 169 is a diagram showing that the mobile phone, which has recognized the end of the operation of the microwave, transmits a command to notify the user of the end of the operation to a device, among the devices that have detected the user, that has a screen display function and an audio output function (in this figure, the television in front of the user).
- FIG. 170 is a diagram showing that the device that has received the above-described command notifies the user of the notification content (in the figure, the screen of the television displays that the operation of the microwave has ended).
- FIG. 171 is a diagram showing that the operation end sound of the microwave is recognized by devices that are connected to the mobile phone via the network, are equipped with a microphone, and are present in the vicinity of the microwave.
- FIG. 172 is a diagram showing that the device which has recognized the end of the operation of the microwave notifies the mobile phone of the end of the operation.
- FIG. 173 is a diagram showing that, when the mobile phone receives the notification of the end of the operation of the microwave, if the mobile phone is in the vicinity of the user, the user is notified of the end of the operation of the microwave using the screen display and voice output of the mobile phone.
- FIG. 174 is a diagram showing that, when the mobile phone receives the notification of the end of the operation of the microwave, it transmits a command to notify the user of the end of the operation of the microwave to a device, among the devices detecting the user, that has a screen display function and a voice output function (in this figure, the television in front of the user), and the device receiving the command notifies the user of the end of the operation of the microwave.
- note that the mobile phone is often located neither near the microwave oven nor near the user, so the situation shown in this figure is likely to occur.
- FIG. 175 is a diagram showing that the user who has received the notification of the end of the operation of the microwave moves to the kitchen.
- the mobile phone is displaying the next cooking contents.
- the mobile phone may recognize that the user has moved to the kitchen by voice or the like, and may start explaining the next cooking procedure at just the right time.
- FIG. 176 is a diagram showing that information such as the end of operation is transmitted from the microwave to the mobile phone by wireless communication, the mobile phone issues a notification command to the television being watched by the user, and the user is notified by the screen display or sound of the television.
- communication between the information source device (in the figure, the microwave) and the mobile phone, and communication between the mobile phone and the device that notifies the user (in the figure, the television), may use an in-home LAN, direct wireless communication, or wireless communication in the 700 MHz to 900 MHz band.
- although a mobile phone is used as the hub here, another device having communication capability may be used instead of the mobile phone.
- FIG. 177 is a diagram showing that information such as the end of operation is transmitted from the microwave directly to the television watched by the user by wireless communication, and the user is notified using the screen display or sound of the television. This shows operation without passing through the mobile phone that serves as the hub in FIG. 176.
- FIG. 178 is a diagram showing that, when the air conditioner on the first floor performs some kind of information notification, the information is transmitted to the air conditioner on the second floor, from the air conditioner on the second floor to the mobile phone, and from the mobile phone to the television viewed by the user.
- FIG. 179 is a diagram showing that information is notified to a user who is at a remote location. Specifically, FIG. 179 shows that information is notified to a user at a remote place through the Internet or carrier communication from a mobile phone notified of by a microwave via voice, light, wireless communication, etc. ing.
- FIG. 180 is a diagram showing that, when direct communication from the microwave oven to the mobile phone serving as the hub is not possible, information can be transmitted to the mobile phone via a personal computer or the like.
- FIG. 181 is a diagram showing that the mobile phone that received the communication in FIG. 180 traces the information communication path backward to transmit information such as an operation command to the microwave oven.
- The mobile phone may transmit information automatically upon receiving the information shown in FIG. 180, or it may notify the user and then transmit information about the operation the user performs.
- FIG. 182 is a diagram showing that the user is notified of information when direct communication from the air conditioner, which is the information source device, to the mobile phone serving as the hub is not possible.
- When direct communication from the air conditioner (the information source device) to the mobile phone serving as the hub is not possible, in step 1 the air conditioner first transmits the information to a device that can serve as a communication relay, such as a personal computer.
- In circles 2 and 3, the information is transmitted from the personal computer to the mobile phone through the Internet or a carrier communication network, and the mobile phone processes the information automatically or notifies the user holding it.
- In circles 4 and 5, a response is transmitted back to the personal computer through the Internet or the carrier communication network, and in circle 6 the personal computer sends a notification instruction to a device that can notify the user (the television in this figure).
- In circle 7, the user is notified of the information using the television's screen display and sound.
- Although communication between the personal computer and the mobile phone goes through the Internet or a carrier communication network in this figure, it may also be performed over the home LAN, by direct communication, or the like.
- FIG. 183 is a diagram for describing a system using communication devices that use radio waves in the 700 to 900 MHz band (such a device is hereinafter referred to as a G device, and such a radio wave as a G radio wave).
- Information is transmitted from the microwave oven to the third-floor mobile phone with a G device using G radio waves, from the third-floor mobile phone to the second-floor mobile phone without a G device over the home network, and the second-floor mobile phone notifies the user of the information.
- For registration and authentication of communication between devices with G devices, a method using NFC mounted on both devices is conceivable.
- Alternatively, in a registration mode, the output of the G radio waves is lowered so that communication is possible only at a distance of about 10 to 20 cm; the two devices are brought close together, and if communication succeeds, the G-device link is registered and authenticated.
- The information source device may be any device other than a microwave oven as long as it has a G device.
- The device relaying between the information source device and the information notification device need not be a mobile phone; as long as it can access G radio waves and the home network, a device such as a personal computer, an air conditioner, or a smart meter may be used.
- The information notification device may likewise be a device such as a personal computer or a television instead of a mobile phone, as long as it can access the home network and can notify the user using screen display or audio output.
- FIG. 184 is a diagram showing a mobile phone at a remote location notifying the user of information. Specifically, FIG. 184 shows information being transmitted from an air conditioner with a G device to an in-home mobile phone with a G device, and from the in-home mobile phone to the remote mobile phone through the Internet or a carrier communication network, with the remote mobile phone notifying its user.
- The information source device may be any device other than a microwave oven as long as it has a G device.
- The device relaying between the information source device and the information notification device need not be a mobile phone; as long as it can access G radio waves and the Internet or a carrier communication network, a personal computer, an air conditioner, a smart meter, or another device may be used.
- The information notification device may be a device such as a personal computer or a television instead of a mobile phone, as long as it can access the Internet or a carrier communication network and can notify the user using screen display or audio output.
- FIG. 185 is a diagram showing a mobile phone at a remote location notifying the user of information.
- Specifically, FIG. 185 shows that the television with a G device recognizes the notification sound of a microwave oven without a G device, the television transmits the information to the in-home mobile phone with a G device via G radio waves, the information is transmitted from the in-home mobile phone to a mobile phone at a remote location via the Internet or a carrier communication network, and the remote mobile phone notifies its user.
- Even if the notification from the information source device (the microwave oven in the figure) is not a sound but a light emission state or the like, the method by which the notification recognition device (the television in the figure) recognizes it is the same.
- The same applies when the notification recognition device is another device with a G device. As long as the device relaying between the notification recognition device and the information notification device (the mobile phone at a remote location in this figure) can access G radio waves and the Internet or a carrier communication network, a computer, an air conditioner, a smart meter, or another device may be used instead of the in-home mobile phone.
- The information notification device may be a device such as a personal computer or a television instead of a mobile phone, as long as it can access the Internet or a carrier communication network and can notify the user using screen display or audio output.
- FIG. 186 is similar to FIG. 185, except that the device relaying between the notification recognition device (the second-floor television in this figure) and the information notification device (the mobile phone at a remote location) differs: the second-floor television plays the role of the relay device in place of the in-home mobile phone of FIG. 185.
- The device of the present embodiment realizes the following functions:
- A function to send the sound collected by the sound collection device to the mobile phone as it is, or as a speech recognition result
- A function to make settings on the microwave oven
- A function to search for the user using devices that can be communicated with from the mobile phone and can detect the user, such as cameras, microphones, and human presence sensors, and to send the user's current position to the mobile phone or store it in a database
- A function to notify the user from a nearby device using the user position stored in the database
- A function to estimate whether the user is near the mobile phone from the state of the mobile phone (operation state, sensor values, charging state, data link state, etc.)
- Although a microwave oven is used as the example here, the electronic device that emits the notification sound to be recognized need not be a microwave oven; the same effect can be obtained with a washing machine, a rice cooker, a vacuum cleaner, a refrigerator, an air purifier, an electric kettle, a dishwasher/dryer, an air conditioner, a personal computer, a mobile phone, a television, an automobile, a telephone, a mail receiving device, and the like.
- Although the microwave oven, the mobile phone, and the device that notifies the user (such as a television) communicate directly here, if there is a problem with direct communication, the communication may be performed indirectly through another device.
- Communication over the home LAN is mainly assumed, but the same functions can be realized with direct wireless communication between devices, communication via the Internet, or communication via a carrier communication network.
- In response to a simultaneous inquiry from the mobile phone, the location of the user is identified by a television camera or the like, and personal information is sent encrypted to that person's mobile phone, which has the effect of preventing leaks. Even when there are multiple people in the house, by storing the data from human presence sensors in air conditioners, air purifiers, and refrigerators in a position management database on a mobile phone or the like, and tracking with those sensors a person once recognized as the operator, the position of the operator can be estimated.
- If the physical sensors of the mobile phone detect no movement for a certain period of time, it can be detected that the user has left the phone behind.
- The user's separation from the phone is detected using a human presence sensor of a home appliance or lighting device, button operations, a camera such as a television's, the microphone of a mobile phone, or the like, and the user's position is registered in a user position database on the mobile phone or on an in-home server.
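The position-tracking idea above can be illustrated with a small sketch, which is not part of the patent disclosure: sensor events from appliances update a user position database, and the phone's motion sensor staying still marks the user as having left the phone. The device names, the class layout, and the 60-second stillness threshold are all assumptions made for this sketch.

```python
# Sketch of a user-position database fed by appliance sensors, plus a
# stillness check for detecting that the phone was left behind.
# Names and thresholds are illustrative assumptions, not from the patent.

class UserPositionDB:
    def __init__(self):
        self.last_seen = {}          # user -> (room, timestamp)

    def sensor_event(self, user, room, ts):
        # A human presence sensor (air conditioner, refrigerator, camera)
        # reports seeing the tracked person in a room.
        self.last_seen[user] = (room, ts)

    def position(self, user):
        entry = self.last_seen.get(user)
        return entry[0] if entry else None

def phone_separated(accel_samples, still_threshold=0.02, min_still=60):
    # The phone is judged left behind if motion stays below the threshold
    # for min_still consecutive samples (one sample per second assumed).
    run = 0
    for a in accel_samples:
        run = run + 1 if abs(a) < still_threshold else 0
        if run >= min_still:
            return True
    return False

db = UserPositionDB()
db.sensor_event("alice", "kitchen", ts=100)
assert db.position("alice") == "kitchen"
assert phone_separated([0.0] * 60)            # phone motionless for 60 s
assert not phone_separated([0.5, 0.0, 0.5] * 30)  # phone being carried
```

A nearby notification device would then be chosen by looking up `position(user)` in this database.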
- As described above, the ninth embodiment can realize an information communication apparatus (recognition apparatus) that enables communication between devices.
- Specifically, the information communication apparatus may include a recognition device that searches, among the electronic devices that can be communicated with from the operation terminal, for an electronic device having a sound collection function (sound collection device), and recognizes the notification sound of another electronic device using the sound collection function of that sound collection device.
- The recognition device may use the sound collection function of only those sound collection devices capable of collecting the signal sound emitted from the operation terminal.
- The information communication apparatus may also include a device that searches, among the electronic devices that can be communicated with from the operation terminal, for an electronic device having an audio output function (audio output device), analyzes the sound transfer characteristics between the audio output device and the sound collection device, acquires the output audio data of the audio output device, and cancels the audio output from the audio output device out of the collected sound using the transfer characteristics and the output audio data.
- The information communication apparatus of the present embodiment may also include a recognition device that adjusts the notification sound of the electronic device whose notification sound is to be recognized, so that the sound is not buried in the environmental sound.
- The information communication apparatus may hold in a database the data of sounds that can be output by electronic devices owned by the user (owned electronic devices) together with the position data of those devices, and may include a recognition device that adjusts the notification sound of the electronic device to be recognized so that the sounds output by the owned electronic devices and that notification sound can be easily distinguished.
- The recognition device may further adjust the speech recognition processing so as to easily distinguish the sound output from an owned electronic device from the notification sound of the electronic device to be recognized.
- The recognition device may also recognize whether the operation terminal and the operator are close to each other, using the operation state of the operation terminal, the values of its physical sensors, its data link state, and its charging state.
- The recognition device may further recognize the position of the user using the operation states of the electronic devices that can be communicated with from the operation terminal, cameras, microphones, human presence sensors, and the position data of the electronic devices held in the database.
- The recognition device may further be included in an information notification device that, using the result of recognizing the user's position and the position data stored in the database, notifies the user of information through an electronic device capable of notifying the user by means such as screen display or audio output (notification device).
- In the tenth embodiment, note that various simple authentication methods are being studied for wireless communication.
- For example, WPS for wireless LAN, formulated by the Wi-Fi Alliance, defines a push-button method, a PIN input method, an NFC method, and so on.
- These methods judge that the user intends to authenticate either by limiting the time window or by confirming that the user is within a distance at which both devices can be touched directly.
- However, limiting the time window is not safe if a malicious user is within a certain distance.
- Also, directly touching both devices may be difficult or troublesome, for example for devices installed in fixed positions.
- FIG. 187 is a diagram showing an example of the in-home environment in the present embodiment.
- FIG. 188 is a diagram illustrating an example of communication between a home electric device and a smartphone in the present embodiment.
- FIG. 189 is a diagram showing a configuration of a transmission side apparatus according to the present embodiment.
- FIG. 190 is a diagram showing a configuration of a reception side apparatus in the present embodiment.
- FIGS. 187 to 190 are the same as FIGS. 124 to 127, and their detailed description is omitted.
- The indoor environment is assumed to be one in which, as shown in FIG. 187, the tablet held by the user in the kitchen and the television in the living room are authenticated. Both terminals can connect to a wireless LAN and implement a WPS module.
- FIG. 191 is a sequence diagram of the case where the transmitting terminal (TV) performs wireless LAN authentication with the receiving terminal (tablet) in FIG. 187 using optical communication.
- FIG. 191 will be described.
- First, the transmitting terminal creates a random number as shown in FIG. 189 (step 5001a). Then it registers the random number with the WPS registrar (step 5001b). Further, it causes the light emitting element to emit light according to the pattern of the random number registered in the registrar (step 5001c).
- Meanwhile, the receiving terminal activates its camera in the optical authentication mode.
- The optical authentication mode is a mode in which the camera can recognize that the light emitting element is lit for authentication, and refers to a video shooting mode capable of capturing frames in step with the cycle of the emitting side.
- The user then captures the light emitting element of the transmitting terminal (step 5001d).
- The receiving terminal receives the random number by this capture (step 5001e).
- The receiving terminal that has received the random number inputs it as the WPS PIN (step 5001f).
- The two terminals, now sharing the PIN, perform the authentication process as defined by WPS (step 5001g).
- After authentication is completed, the transmitting terminal deletes the random number from the registrar so that authentication from multiple terminals is not accepted (step 5001h).
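The sequence of steps 5001a to 5001h can be illustrated with a small simulation. This is only a sketch of the message flow, not the WPS protocol itself: the 8-digit PIN length, the 4-bit-per-digit blink encoding, and the class names are assumptions, and the real WPS key negotiation is replaced by a direct PIN comparison.

```python
import random

# Sketch of FIG. 191: the TV shares a random number as a blink pattern,
# and the tablet uses the decoded number as the WPS PIN.

class Transmitter:
    def __init__(self):
        self.registrar_pin = None

    def start_auth(self):
        # Steps 5001a/b: create a random number, register it as the PIN.
        self.registrar_pin = f"{random.randrange(10**8):08d}"
        # Step 5001c: emit it as a blink pattern (4 bits per digit here).
        return [int(b) for ch in self.registrar_pin
                for b in f"{ord(ch) - ord('0'):04b}"]

    def authenticate(self, pin):
        # Steps 5001g/h: accept the PIN once, then delete it so that
        # authentication from other terminals is not accepted.
        ok = pin is not None and pin == self.registrar_pin
        self.registrar_pin = None
        return ok

class Receiver:
    def decode(self, pattern):
        # Steps 5001d/e/f: recover the digits from the captured pattern.
        groups = [pattern[i:i + 4] for i in range(0, len(pattern), 4)]
        return "".join(str(int("".join(map(str, g)), 2)) for g in groups)

tv, tablet = Transmitter(), Receiver()
pattern = tv.start_auth()
pin = tablet.decode(pattern)
assert tv.authenticate(pin)        # the first terminal succeeds
assert not tv.authenticate(pin)    # the random number was deleted (5001h)
```

Because the PIN travels only as light visible in the room, a radio eavesdropper outside cannot capture it during the window in which the registrar accepts it.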
- Note that this method is applicable not only to wireless LAN authentication but to any wireless authentication method that uses a shared key.
- This method is also not limited to wireless authentication.
- For example, it can be applied to authentication of an application installed on both the TV and the tablet.
- FIG. 192 is a sequence diagram in the case of performing authentication by the application in the present embodiment. Hereinafter, FIG. 192 will be described.
- First, the transmitting terminal creates a transmitting-side ID according to the state of the terminal (step 5002a).
- The transmitting-side ID may be a random number or a key for encryption, and may also include the terminal ID (MAC address, IP address) of the transmitting terminal. The transmitting terminal then emits light according to the pattern of the transmitting-side ID (step 5002b).
- The receiving terminal receives the transmitting-side ID in the same way as in the wireless authentication case (step 5002f). On receiving the transmitting-side ID, the receiving terminal creates a receiving-side ID that can prove that the transmitting-side ID was received (step 5002g). For example, this may be the terminal ID of the receiving terminal encrypted with the transmitting-side ID, and it may also include the process ID and a password of the application running on the receiving terminal. The receiving terminal then broadcasts the receiving-side ID wirelessly (step 5002h). If the transmitting-side ID includes the terminal ID of the transmitting terminal, the receiving-side ID may be unicast instead.
- The transmitting terminal that has wirelessly received the receiving-side ID (step 5002c) performs authentication with the terminal that sent it, using the transmitting-side ID shared with that terminal (step 5002d).
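The exchange of FIG. 192 can be sketched as follows. The description leaves the cryptography open (the receiving-side ID "may be the terminal ID of the receiving terminal encrypted with the transmitting-side ID"), so this sketch substitutes an HMAC over the receiver's terminal ID keyed by the transmitted nonce; the terminal IDs and field names are illustrative assumptions.

```python
import hashlib
import hmac
import secrets

# Sketch of the FIG. 192 app-level exchange. HMAC-SHA256 stands in for
# the unspecified encryption; "tv-01" and "tablet-7" are made-up IDs.

def make_sender_id():
    # Step 5002a: a random value, optionally carrying the sender's address.
    return {"nonce": secrets.token_hex(8), "terminal_id": "tv-01"}

def make_receiver_id(sender_id, my_terminal_id):
    # Step 5002g: prove receipt of the sender ID by keying a MAC over the
    # receiver's terminal ID with the nonce received by light.
    tag = hmac.new(sender_id["nonce"].encode(),
                   my_terminal_id.encode(), hashlib.sha256).hexdigest()
    return {"terminal_id": my_terminal_id, "proof": tag}

def verify(sender_id, receiver_id):
    # Step 5002d: the sender recomputes the MAC with its own nonce.
    expected = hmac.new(sender_id["nonce"].encode(),
                        receiver_id["terminal_id"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, receiver_id["proof"])

sid = make_sender_id()                   # emitted as a light pattern
rid = make_receiver_id(sid, "tablet-7")  # broadcast (or unicast) by radio
assert verify(sid, rid)
```

Only a terminal that actually saw the light pattern can produce a valid proof, which is the property step 5002g asks for.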
- FIG. 193 is a flowchart showing the operation of the transmitting terminal in the present embodiment. Hereinafter, FIG. 193 will be described.
- First, the transmitting terminal emits an ID by light according to the state of the terminal (step 5003a).
- Next, it checks whether there is a wireless response corresponding to the emitted ID (step 5003c). If there is a response (Yes in step 5003c), authentication processing is performed with the terminal that responded (step 5003d). If there is no response in step 5003c, the terminal waits for the timeout period (step 5003i), displays "no response", and ends (step 5003j).
- In step 5003e, it is checked whether the authentication processing succeeded. If it succeeded (Yes in step 5003e) and the emitted ID includes a command other than authentication (Yes in step 5003f), processing according to that command is performed (step 5003g).
- If the authentication fails in step 5003e, an authentication error is displayed (step 5003h) and the process ends.
- FIG. 194 is a flowchart showing an operation of the receiving side terminal in the present embodiment. Hereinafter, FIG. 194 will be described.
- First, the receiving terminal activates its camera in the optical authentication mode (step 5004a).
- Next, it checks whether light can be received in the specific pattern (step 5004b). If so (Yes in step 5004b), it creates a receiving-side ID that can prove that the transmitting-side ID was received (step 5004c). If not (No in step 5004b), it waits for the timeout period (Yes in step 5004i), displays a timeout (step 5004j), and ends.
- Next, it checks whether the transmitting-side ID includes the terminal ID of the transmitting terminal (step 5004k). If it does (Yes in step 5004k), the receiving-side ID is unicast to that terminal (step 5004d). If it does not (No in step 5004k), the receiving-side ID is broadcast (step 5004l).
- Then authentication processing is started from the transmitting terminal side (step 5004e). When the authentication processing succeeds (Yes in step 5004e), it is judged whether the received ID includes a command (step 5004f). If it does (Yes in step 5004f), processing according to the ID is performed (step 5004g).
- If the authentication fails in step 5004e (No in step 5004e), an authentication error is displayed (step 5004h) and the process ends.
- FIG. 195 is a sequence diagram in which mobile AV terminal 1 transmits data to mobile AV terminal 2 in the present embodiment. Specifically, it shows a sequence of data transmission and reception using NFC and wireless LAN communication. FIG. 195 is described below.
- First, mobile AV terminal 1 displays on its screen the data to be transmitted to mobile AV terminal 2.
- Next, mobile AV terminal 1 displays on its screen a confirmation screen asking whether to transmit the data.
- This confirmation screen may ask the user to select "Yes/No" together with the text "Do you want to transmit the data?", or it may be an interface that starts data transmission when the screen of mobile AV terminal 1 is touched again.
- When the data is to be transmitted, mobile AV terminal 1 and mobile AV terminal 2 exchange, by NFC, information on the data to be transmitted and information for establishing high-speed wireless communication. The exchange of the information on the data may also be performed by wireless LAN communication.
- The information for establishing wireless LAN communication may be a communication channel, an SSID, and encryption key information, or randomly generated ID information may be exchanged so that a secure communication path is established based on it.
- When the wireless LAN communication is established, mobile AV terminal 1 and mobile AV terminal 2 communicate over the wireless LAN, and the data to be transmitted from mobile AV terminal 1 is sent to mobile AV terminal 2.
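The parameters handed over by NFC before switching to the wireless LAN might be packed as follows. The text lists a channel, an SSID, encryption key information, and optionally a random session ID as candidates; the field names and example values in this sketch are assumptions, not a defined record format.

```python
import secrets

# Sketch of the NFC handover information exchanged before the bulk
# transfer moves to the faster wireless LAN link. Field names and the
# example SSID are illustrative only.

def make_handover_record():
    return {
        "channel": 36,                        # Wi-Fi channel to use
        "ssid": "DIRECT-mobileAV1",           # network terminal 2 joins
        "key": secrets.token_hex(16),         # encryption key material
        "session": secrets.token_hex(4),      # random ID pairing the transfer
        "payload": {"kind": "photo", "bytes": 2_457_600},  # data description
    }

record = make_handover_record()
# Terminal 2 reads this record over NFC, joins the network with the key,
# then the photo data flows over wireless LAN instead of slow NFC.
assert {"channel", "ssid", "key", "session"} <= record.keys()
```

Exchanging only this small record over NFC keeps the touch interaction short while the large payload uses the fast link.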
- FIG. 196 is a screen transition diagram when mobile AV terminal 1 transmits data to mobile AV terminal 2 in the present embodiment.
- FIG. 197 is a screen transition diagram when the mobile AV terminal 1 transmits data to the mobile AV terminal 2 in the present embodiment.
- First, the user activates an application for playing back video and still images.
- This application displays the still image and video data in mobile AV terminal 1.
- Next, NFC communication is performed by bringing mobile AV terminal 1 and mobile AV terminal 2 nearly into contact.
- This NFC communication is the process for starting the exchange of the still image and video data held in mobile AV terminal 1.
- First, a confirmation screen asking whether the data may be transmitted is displayed on the screen of mobile AV terminal 1.
- The confirmation screen may be an interface that lets the user touch the screen to start data transmission as shown in FIG. 196, or an interface that lets the user select whether to allow data transmission.
- If the data transmission start determination is Yes, that is, when mobile AV terminal 1 is to transmit data to mobile AV terminal 2, mobile AV terminal 1 sends mobile AV terminal 2 information on the data to be exchanged and information on starting high-speed wireless communication by wireless LAN. The exchange of the information on the data may also be performed using the high-speed wireless communication.
- After transmitting and receiving the information on starting high-speed wireless communication by wireless LAN, mobile AV terminal 1 and mobile AV terminal 2 perform the process of establishing a wireless LAN connection.
- This process includes determining which channel to use for the communication, which terminal is the parent and which is the child in the communication topology, and mutually exchanging password information, SSIDs, and terminal information.
- When the wireless LAN connection is established, mobile AV terminal 1 and mobile AV terminal 2 transmit the data by wireless LAN communication.
- During the data transmission, mobile AV terminal 1 displays the video playback screen as usual, while mobile AV terminal 2, the receiving side, displays a screen indicating that data is being received.
- By displaying on mobile AV terminal 2 a screen indicating that data is being received, the received data can be displayed immediately upon completion of the reception.
- When the data reception is completed, mobile AV terminal 2 displays the received data on its screen.
- FIGS. 198 to 200 are schematic diagrams of the system in the case where mobile AV terminal 1 in the present embodiment is a digital camera.
- Even when mobile AV terminal 1 is a digital camera, the procedure described above applies.
- However, a digital camera often has a means of accessing the Internet via wireless LAN but no means of accessing it via mobile phone communication.
- It is therefore desirable for the digital camera to transmit captured image data to a photo sharing service over the wireless LAN in environments where wireless LAN communication is available, and, where it is not, to first transmit the data to mobile AV terminal 2 over the wireless LAN so that mobile AV terminal 2 forwards the received data as it is to the photo sharing service by mobile phone communication.
- Since wireless LAN communication is faster than mobile phone communication, photos can be transmitted to the photo sharing service at high speed whenever wireless LAN communication is possible.
- Since the mobile phone communication network generally has a wider service area than wireless LAN networks, relaying through mobile AV terminal 2 makes it possible to transmit data to the photo sharing service by mobile phone communication when there is no wireless LAN environment, so photos can be sent to the photo sharing service immediately from a wide variety of places.
- As described in this embodiment, data can be exchanged using NFC communication and high-speed wireless communication.
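The routing rule of FIGS. 198 to 200, upload directly when the camera has wireless LAN connectivity to the Internet and otherwise relay through the phone's mobile link, can be sketched as a small decision function. The function and route names are illustrative, not from the patent.

```python
# Sketch of the upload-routing rule for the digital camera scenario:
# prefer the fast wireless LAN path, fall back to relaying through the
# phone's cellular link, and hold the data if neither is available.

def choose_route(camera_has_wlan_internet, phone_reachable_over_wlan):
    if camera_has_wlan_internet:
        return "camera -> photo sharing service (wireless LAN)"
    if phone_reachable_over_wlan:
        return "camera -> phone (wireless LAN) -> service (cellular)"
    return "hold data until a link is available"

assert choose_route(True, True).startswith("camera -> photo")
assert "cellular" in choose_route(False, True)
```

The first branch captures the speed advantage of wireless LAN; the second captures the wider coverage of the mobile phone network.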
Abstract
Description
(Signal transmission by phase modulation)
FIG. 1 is a timing diagram of transmission signals in the information communication apparatus of the first embodiment.
FIGS. 2 to 6 show the case of n = 2, that is, tE = 2.5T.
FIG. 7 is a diagram showing the principle of the second embodiment. FIGS. 8 to 20 are diagrams showing examples of the operation of the second embodiment.
The third embodiment is described below.
When capturing a single image, instead of exposing all the image sensor elements at the same timing, an imaging method is proposed in which exposure starts and ends at different times for different elements. FIG. 21 shows an example in which the elements in one row are exposed simultaneously, and imaging is performed while shifting the exposure start time in order of adjacent rows. Here, a row of elements exposed simultaneously is called an exposure line, and the line of pixels on the image corresponding to those elements is called a bright line.
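Under this exposure-line model, a blinking transmitter appears in the captured frame as horizontal bright and dark stripes, one per exposure line. The following sketch recovers one bit per line from the average row luminance; the one-bit-per-line mapping and the midpoint threshold are simplifying assumptions (a real receiver must also recover the symbol clock), not the patent's exact demodulator.

```python
def decode_bright_lines(row_means, threshold=None):
    """Recover one bit per exposure line from average row luminance.

    row_means: average pixel value of each image row, top to bottom.
    Rows exposed while the emitter was on read high; rows exposed while
    it was off read low. The midpoint threshold is an assumption.
    """
    if threshold is None:
        threshold = (max(row_means) + min(row_means)) / 2
    return [1 if m > threshold else 0 for m in row_means]

# Example frame: bright rows around 200-210, dark rows around 55-61.
rows = [210, 204, 55, 60, 199, 58, 61, 205]
assert decode_bright_lines(rows) == [1, 1, 0, 0, 1, 0, 0, 1]
```

Because each row samples the emitter at a slightly different time, one frame yields many samples, which is what allows communication faster than the frame rate.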
When visible light is used as the carrier, the light emitting unit is made to emit light so that the moving average of its luminance is kept constant when the window width is set to the temporal resolution of human vision (about 5 to 20 milliseconds). As shown in FIG. 26, the light emitting unit of the transmitter then appears to a human to emit light with uniform luminance, while the receiver can observe the luminance changes of the light emitting unit.
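This constraint can be illustrated with a simple modulation in which every data bit is sent as a two-chip slot containing exactly one "on" chip (a Manchester-style mapping chosen here for illustration; the patent does not fix this particular code). Averaged over a window spanning several slots, standing in for the 5 to 20 ms resolution of human vision, the brightness stays near a constant one half, while a fast exposure-line receiver still sees the chip pattern.

```python
# Sketch: each bit occupies a two-chip slot with exactly one "on" chip,
# so long-window moving averages of the luminance are nearly constant.
# The mapping is an illustrative choice, not the patent's exact code.

SLOT = {0: [1, 0], 1: [0, 1]}

def modulate(bits):
    chips = []
    for b in bits:
        chips.extend(SLOT[b])     # every slot sums to exactly 1
    return chips

def moving_average(xs, width):
    return [sum(xs[i:i + width]) / width for i in range(len(xs) - width + 1)]

chips = modulate([1, 0, 0, 1, 1, 0, 1, 1, 0, 0])   # 20 chips
width = 10                                          # "human vision" window
# A window covering five slots can stray from 1/2 by at most 1/width,
# because only the two partial slots at its edges are uncertain.
assert all(abs(a - 0.5) <= 1.0 / width for a in moving_average(chips, width))
```

Widening the window (slower human perception relative to the chip rate) tightens the bound, which is why the flicker becomes invisible while the signal survives.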
As a method of determining in which part of the captured image the light emitting unit appears, as shown in FIG. 42, there is a method of counting, in the direction perpendicular to the exposure lines, the number of lines in which the light emitting unit is captured, and taking the column in which the light emitting unit appears most often as the column where the light emitting unit is present.
In FIG. 48, light emitting units 2216a, 2216c, and 2216e emit light uniformly, while light emitting units 2216b, 2216d, and 2216f transmit signals by their light emission patterns. Note that light emitting units 2216b, 2216d, and 2216f may simply emit light so that they appear as a striped pattern when the receiver captures them exposure line by exposure line.
As shown in FIG. 49, communication devices that mainly perform reception include mobile phones, digital still cameras, digital video cameras, head-mounted displays, robots (for cleaning, nursing care, industrial use, etc.), surveillance cameras, and the like. However, the receiver is not limited to these.
As shown in FIG. 50, communication devices that mainly perform transmission include lighting (for homes, stores, offices, underground malls, streets, etc.), flashlights, home appliances, robots, and other electronic devices. However, the transmitter is not limited to these.
FIG. 58 is a diagram showing a desirable structure of the light emitting unit.
The light (electromagnetic wave) carrying the signal uses light (electromagnetic waves) in the frequency bands from near infrared through visible light to near ultraviolet shown in FIG. 59, which the receiver can receive.
In FIG. 60, the imaging unit of the receiver detects a light emitting unit 2310b emitting light in a pattern within an imaging range 2310a.
In FIG. 61, the transmitter transmits its own installation position information, the size of the light emitting device, the shape of the light emitting device, and the ID of the transmitter. Here, the position information includes the latitude, longitude, altitude, height from the floor, and so on of the central part of the light emitting device.
In FIG. 66, the receiver 2606c captures the light emission pattern of the transmitter 2606b to acquire the transmitted signal, and estimates the position of the receiver.
FIG. 69 is a configuration diagram showing a receiver. The receiver is composed of all of these blocks, or of a part of them including the imaging unit and the signal analysis unit. In FIG. 69, blocks with the same name may be the same component or different components.
FIG. 70 is a block diagram showing a transmitter.
Describing FIG. 71: in step 2800a, it is checked whether the receiver has multiple imaging devices. In the case of No, the process proceeds to step 2800b, selects the imaging device to be used, and proceeds to step 2800c. In the case of Yes, the process proceeds directly to step 2800c.
Describing FIG. 72: first, in step 2801a, the position recognized as the current position of the receiver, or a probability map of the current position, is taken as prior information of the self-position.
Describing FIG. 73: first, in step 2802a, the user presses a button.
Describing FIG. 74: first, in step 2803a, light is received by the light receiving device, or an image is captured by the imaging device.
Describing FIG. 75: first, in step 2804a, light is received by the light receiving device, and the received light energy is converted into electricity and stored.
FIG. 76 is a diagram illustrating a situation in which information provision is received inside a station.
FIG. 77 is a diagram showing use inside a vehicle.
FIG. 78 is a diagram showing use inside a store.
FIG. 79 is a diagram showing a situation in which wireless connection credentials are communicated and a wireless connection is established.
FIG. 80 is a diagram showing the range of communication by light emission patterns and position patterns.
FIG. 81 is a diagram showing indoor use, such as in an underground mall.
FIG. 82 is a diagram showing outdoor use, such as on a street.
FIG. 83 is a diagram showing directions being given.
The receiver in FIG. 84 includes an in-camera 2710a, a touch panel 2710b, a button 2710c, an out-camera 2710d, and a flash 2710e.
In FIG. 85, transmitter 1 receives the light emitted by the light emitting unit of transmitter 2 with its own light receiving unit, and acquires the signal transmitted by transmitter 2 and its transmission timing.
In FIG. 86, when the transmitter is attached to the attachment device, or when the information stored in the storage unit of the attachment device is changed, the transmitter stores the information held in the storage unit of the attachment device into its own storage unit. The information stored in the storage units of the attachment device and the transmitter includes the transmission signal and the transmission timing.
FIG. 89 is a diagram showing the case of use in combination with a two-dimensional barcode.
FIG. 90 is a diagram showing map creation and its use.
FIG. 91 is a diagram showing acquisition of the state of an electronic device and its operation.
FIG. 92 is a diagram showing recognition of the electronic device being imaged.
FIG. 93 is a diagram showing display of an augmented reality object.
When the light emitting unit is away from the center of the imaging range, a display prompting the user to point the center of the imaging range toward the light emitting unit is shown, as in FIG. 94.
(Application to ITS)
As an application example of the present invention, ITS (Intelligent Transport Systems) is described below. The present embodiment realizes high-speed visible light communication and can be applied to the ITS field.
FIG. 113 shows a schematic diagram of a position information notification system and a facility system using the visible light communication technology of the present embodiment. As a representative example, a system in which robots deliver patient charts, transported items, medicines, and the like inside a hospital is described.
FIG. 114 illustrates a supermarket system in which a device equipped with the communication method of the present embodiment is mounted on a shopping cart in a store, and position information is acquired from the product shelf lighting or the indoor lighting.
FIG. 115 shows an application example using the visible light communication of the present embodiment.
FIG. 116 shows a schematic diagram of the case where the communication method of the present embodiment is applied to underwater communication. Since water does not pass radio waves, divers underwater, or a ship on the sea and a vessel under the sea, cannot communicate with each other by radio. The visible light communication of the present embodiment can also be used underwater.
(Service provision examples)
In the present embodiment, as an application example of the present invention, an example of providing services to users is described with reference to FIG. 117. FIG. 117 is a diagram for explaining an example of providing services to users in the fifth embodiment. FIG. 117 shows a net server 4000a, transmitters 4000b, 4000d, and 4000e, receivers 4000c and 4000f, and a building 4000g.
As another application example of the present invention, a service using the directivity characteristics of the present embodiment is described. Specifically, this is an example of using the present invention in public facilities such as movie theaters, concert halls, museums, hospitals, community centers, schools, companies, shopping streets, department stores, government offices, and food shops. Since the present invention can make the directivity of the signal the transmitter emits toward receivers lower than in conventional visible light communication, information can be transmitted simultaneously to the many receivers present in a public facility.
As another application example, a service is described that provides services to the user by superimposing the real world captured by the camera and information from the world on the Internet.
The following describes the flow of processing for communicating using a smartphone camera, by transmitting information as the blink pattern of an LED provided in a device.
In the present embodiment, taking a vacuum cleaner as an example, the communication procedure between a device and the user using visible light communication is described, covering initial setup using visible light communication, repair support service at the time of failure, and service cooperation using the vacuum cleaner.
In the present embodiment, taking a home delivery service as an example, web information and device cooperation using optical communication are described.
The ninth embodiment is described below.
FIG. 145 is a diagram for explaining the process of registering the user and the mobile phone in use to the server in the present embodiment. FIG. 145 is described below.
FIG. 146 is a diagram for explaining the process of analyzing the user's voice characteristics in the present embodiment. FIG. 146 is described below.
FIG. 147 is a diagram for explaining the process of preparing the speech recognition processing in the present embodiment. FIG. 147 is described below.
FIG. 148 is a diagram for explaining the process of collecting sound from surrounding sound collection devices in the present embodiment. FIG. 148 is described below.
FIG. 149 is a diagram for explaining the process of analyzing environmental sound characteristics in the present embodiment. FIG. 149 is described below.
FIG. 150 is a diagram for explaining the process of canceling the sound from audio output devices present in the surroundings in the present embodiment. FIG. 150 is described below.
FIG. 151 is a diagram for explaining the process of selecting a cooking menu and setting the operation details on the microwave oven in the present embodiment. FIG. 151 is described below.
FIG. 152 is a diagram for explaining the process of acquiring a notification sound for the microwave oven from a database such as a server and setting it on the microwave oven in the present embodiment. FIG. 152 is described below.
FIG. 153 is a diagram for explaining the process of adjusting the notification sound of the microwave oven in the present embodiment. FIG. 153 is described below.
FIG. 155 is a diagram showing examples of waveforms of notification sounds set on the microwave oven in the present embodiment. FIG. 155 is described below.
FIG. 156 is a diagram for explaining the process of recognizing the notification sound of the microwave oven in the present embodiment. FIG. 156 is described below.
FIG. 157 is a diagram for explaining the process of collecting sound from surrounding sound collection devices and recognizing the microwave oven notification sound in the present embodiment. FIG. 157 is described below.
FIG. 158 is a diagram for explaining the process of notifying the user of the end of operation of the microwave oven in the present embodiment. FIG. 158 is described below.
FIG. 159 is a diagram for explaining the process of checking the operation state of the mobile phone in the present embodiment. FIG. 159 is described below.
FIG. 160 is a diagram for explaining the process of tracking the user's position in the present embodiment. FIG. 160 is described below.
・A function of learning the user's voice characteristics through use of the application
・A function of finding, among devices that can communicate with the mobile phone and have a sound collection function, a sound collecting device capable of collecting the sound emitted from the mobile phone
・A function of finding, among devices that can communicate with the mobile phone and have a sound collection function, a sound collecting device capable of collecting the sound emitted from an electronic device
・A function of causing the sound collected by a sound collecting device to be transmitted to the mobile phone as-is, or as a speech recognition result
・A function of analyzing the characteristics of environmental sound to improve the accuracy of speech recognition
・A function of acquiring, from a DB, the sounds that can be output by devices owned by the user, to improve the accuracy of speech recognition
・A function of finding, among devices that can communicate with the mobile phone and have an audio output function, an audio output device whose emitted sound can be collected by the mobile phone or a sound collecting device
・A function of acquiring the audio data output from an audio output device and subtracting it from the collected sound in consideration of the transmission characteristics, thereby canceling unnecessary sound from the collected sound
・A function of receiving parameter input for a cooking recipe and acquiring from a server the cooking procedure for instructing the user and the control data for controlling the cooking appliance
・A function of setting the notification sound emitted by a device, based on data on the sounds the device can output, so that the mobile phone or a sound collecting device can easily recognize it
・A function of improving the recognition accuracy of the user's voice by adjusting the recognition function based on the user's voice characteristics
・A function of recognizing the user's voice using a plurality of sound collecting devices
・A function of recognizing the notification sound of an electronic device using a plurality of sound collecting devices
・A function of obtaining necessary information from an electronic device via a contactless IC card or the like of the mobile phone and the electronic device, and configuring the microwave oven, so that a series of operations can be performed with a single operation
・A function of searching for the user using a device that can communicate with the mobile phone and can detect the user, such as a camera, a microphone, or a human presence sensor, and causing the user's current position to be transmitted to the mobile phone or stored in a DB
・A function of notifying the user from a nearby device, using the user position stored in the DB
・A function of estimating whether the user is near the mobile phone from the state of the mobile phone (operation state, sensor values, charging state, data link state, etc.)
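The sound-cancellation function listed above (subtracting a known device sound from the collected signal in consideration of the transmission characteristics) can be sketched minimally as follows. This is an illustration only: the transmission path is assumed here to be a single gain and an integer-sample delay, which is a deliberate simplification of the "transmission characteristics" the embodiment refers to.

```python
# Minimal sketch of canceling a known output sound from a collected signal.
# The path from the audio output device to the microphone is modeled as a
# gain plus an integer-sample delay (a hypothetical simplification).

def cancel_known_audio(collected, reference, gain, delay):
    """Subtract gain * reference (shifted by delay samples) from collected."""
    cleaned = list(collected)
    for i, r in enumerate(reference):
        j = i + delay
        if 0 <= j < len(cleaned):
            cleaned[j] -= gain * r
    return cleaned

# Example: a reference tone leaks into the microphone at half amplitude,
# two samples late, on top of the user's voice (all-ones here).
reference = [1.0, -1.0, 1.0, -1.0]
voice = [1.0] * 8
collected = list(voice)
for i, r in enumerate(reference):
    collected[i + 2] += 0.5 * r

cleaned = cancel_known_audio(collected, reference, gain=0.5, delay=2)
print(cleaned)  # the leaked tone is removed, leaving only the voice
```

In a real system the gain and delay would have to be estimated (e.g. by adaptive filtering) rather than assumed known.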
Currently, various simple authentication schemes are being studied for wireless communication. For example, WPS for wireless LAN, specified by the Wi-Fi Alliance, defines a push-button method, a PIN input method, an NFC method, and so on. These simple wireless authentication schemes determine whether the user operating a device intends to perform authentication, either by limiting the time window or by confirming that the user is within a distance at which both devices can be touched directly, and then perform the authentication.
The above embodiment described the flow when exchanging data using NFC communication and high-speed wireless communication, but the flow is not limited to this. This embodiment can, of course, also use flows such as those shown in FIGS. 195 to 197.
In this embodiment, application examples using a receiver such as a smartphone in Embodiments 1 to 11 above and a transmitter that transmits information as an LED blink pattern are described.
In this embodiment, application examples using a receiver such as a smartphone in Embodiments 1 to 12 above and a transmitter that transmits information as the blink pattern of an LED or organic EL element are described.
Here, modifications of and supplements to each of the above embodiments are described.
An information communication method according to an aspect of the present invention is an information communication method for obtaining information from a subject, the method including: a first imaging step of obtaining a first image by capturing the subject using an image sensor having a plurality of exposure lines; a detection step of detecting, from the first image, a range in which the subject is captured; a determination step of determining, from among the plurality of exposure lines, predetermined exposure lines that capture the range in which the subject is captured; an exposure time setting step of setting an exposure time of the image sensor so that, in a second image obtained using the predetermined exposure lines, bright lines corresponding to the predetermined exposure lines appear according to a change in luminance of the subject; a second imaging step of obtaining the second image including the bright lines by capturing the subject changing in luminance, using the predetermined exposure lines, with the set exposure time; and an information obtainment step of obtaining information by demodulating data specified by a pattern of the bright lines included in the obtained second image.
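As a rough illustration of the information obtainment step above, the sketch below thresholds per-exposure-line brightness into a bright-line pattern and demodulates it as plain binary. The threshold value and the one-bit-per-line mapping are assumptions for illustration, not the encoding actually specified in the embodiments.

```python
# Sketch: demodulate a bright-line pattern from per-exposure-line luminance.
# Each exposure line yields one average luminance value; bright lines map
# to 1 and dark lines to 0 (a simplified stand-in for the real encoding).

def demodulate_bright_lines(line_luminances, threshold):
    bits = [1 if v >= threshold else 0 for v in line_luminances]
    # Pack the bits (MSB first) into an integer ID.
    value = 0
    for b in bits:
        value = (value << 1) | b
    return bits, value

# Eight exposure lines covering the subject, captured with a short
# exposure time so the luminance change appears as stripes.
luminances = [210, 40, 215, 220, 35, 30, 205, 45]
bits, value = demodulate_bright_lines(luminances, threshold=128)
print(bits, value)  # [1, 0, 1, 1, 0, 0, 1, 0] 178
```

A real receiver would additionally locate headers, correct for the line scan rate, and handle lines that straddle a luminance transition.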
In this embodiment, application examples using a receiver such as a smartphone in Embodiments 1 to 13 above and a transmitter that transmits information as the blink pattern of an LED, an organic EL element, or the like are described.
The information communication method in this embodiment is an information communication method for obtaining information from a subject, including: a first exposure time setting step of setting an exposure time of an image sensor so that, in an image obtained by capturing the subject with the image sensor, bright lines corresponding to exposure lines included in the image sensor appear according to a change in luminance of the subject; a bright line image obtainment step of obtaining a bright line image, which is an image including the bright lines, by the image sensor capturing the subject changing in luminance with the set exposure time; an image display step of displaying, based on the bright line image, a display image in which the subject and its surroundings are shown in a manner that allows the spatial position of the portion where the bright lines appear to be identified; and an information obtainment step of obtaining transmission information by demodulating data specified by the pattern of the bright lines included in the obtained bright line image.
In this embodiment, application examples using a receiver such as a smartphone in Embodiments 1 to 14 above and a transmitter that transmits information as the blink pattern of an LED, an organic EL element, or the like are described.
The information communication method in this embodiment is an information communication method for transmitting a signal by a change in luminance, including: a determination step of determining a pattern of the change in luminance by modulating the signal to be transmitted; and a transmission step of transmitting the signal to be transmitted by a light emitter changing in luminance according to the determined pattern, wherein the pattern of the change in luminance is a pattern in which one of two mutually different luminance values appears at each arbitrary position within a predetermined time width, and in the determination step, the pattern of the change in luminance is determined so that, for each of the mutually different signals to be transmitted, the luminance change position, which is the rising or falling position of luminance within the time width, differs from the others, and the integral of the luminance of the light emitter over the time width has the same value corresponding to a preset brightness.
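One way to read the determination step above: every symbol occupies the same time width and has the same total on-time (so the average brightness is constant regardless of the data), but the position of the luminance fall differs per symbol. A minimal 4PPM-style sketch, where the slot granularity (4 slots per symbol, 3 HIGH) is purely an illustrative assumption:

```python
# Sketch: map 2-bit symbols to luminance patterns of equal duration and
# equal integral (same number of HIGH slots), differing only in where the
# luminance falls. Slot counts are illustrative, not the patent's values.

PATTERNS = {
    0b00: [1, 1, 1, 0],  # fall after slot 3
    0b01: [1, 1, 0, 1],  # fall after slot 2
    0b10: [1, 0, 1, 1],  # fall after slot 1
    0b11: [0, 1, 1, 1],  # fall before slot 1
}

def modulate(symbols):
    out = []
    for s in symbols:
        out.extend(PATTERNS[s])
    return out

waveform = modulate([0b00, 0b11, 0b01])
print(waveform)  # [1, 1, 1, 0, 0, 1, 1, 1, 1, 1, 0, 1]

# Every pattern integrates to 3, so the perceived brightness is the
# same no matter which symbols are transmitted.
assert all(sum(p) == 3 for p in PATTERNS.values())
```

Changing the HIGH-slot count per symbol set would be one way to realize the "preset brightness" the text mentions, at the cost of a different symbol alphabet per brightness level.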
In this embodiment, application examples for individual situations are described, using a receiver such as a smartphone in Embodiments 1 to 15 above and a transmitter that transmits information as the blink pattern of an LED, an organic EL element, or the like.
First, an application example in a situation where a user carrying a receiver is in front of a store displaying an advertising signboard configured as a transmitter is described with reference to FIGS. 409 to 413.
Next, an application example in a situation where the user carrying the receiver 8300 has entered the store corresponding to the displayed advertising information (service information) is described with reference to FIGS. 414 to 422.
Next, an application example in a situation where the user carrying the receiver 8300 is searching for a store of interest is described with reference to FIGS. 423 to 425.
Next, an application example in a situation where the user carrying the receiver 8300 is in front of signage on which an advertisement for a movie of interest is posted is described with reference to FIGS. 426 to 429.
Next, an application example in a situation where the user carrying the receiver 8300 enters an art museum and views each of the exhibits in the museum is described with reference to FIGS. 430 to 435.
Next, an application example in a situation where the user carrying the receiver 8300 is at a bus stop is described with reference to FIGS. 436 and 437.
When the scanning direction on the imaging side is the vertical (up-down) direction of the mobile terminal and imaging is performed with a shortened exposure time, bright lines, which are white/black patterns running in the same direction as the scanning direction, can be captured in response to the ON/OFF of the entire LED lighting device, as shown in (a) of FIG. 438. In (a) of FIG. 438, the long-side direction of the vertically elongated LED lighting device is imaged so as to be perpendicular to the scanning direction on the imaging side (the left-right direction of the mobile terminal), so many bright lines of the white/black pattern can be captured in the same direction as the scanning direction. That is, the amount of information that can be transmitted and received can be increased. On the other hand, when the vertically elongated LED lighting device is imaged so as to be parallel to the scanning direction on the imaging side (the up-down direction of the mobile terminal), as shown in (b) of FIG. 438, fewer bright lines of the white/black pattern can be captured. That is, the amount of information that can be transmitted becomes smaller.
The service provision method in this embodiment is a service provision method for providing a service to a user of a terminal device including an image sensor having a plurality of exposure lines, the method including: an image obtainment step of obtaining image data by capturing a subject with an exposure time of 1/480 second or less, with the exposure of each exposure line of the image sensor starting sequentially at different times and the exposure times of adjacent exposure lines partially overlapping in time; a visible light communication step of obtaining identification information of the subject by demodulating the bright line pattern corresponding to each exposure line that appears in the image data; and a service presentation step of presenting to the user service information associated with the identification information of the subject.
In this embodiment, application examples using a receiver such as a smartphone in Embodiments 1 to 16 above and a transmitter that transmits information as the blink pattern of an LED, an organic EL element, or the like are described.
FIG. 459 shows an example of a usage mode of the present invention on a train platform. The user holds a mobile terminal over an electronic bulletin board or a light and, by visible light communication, acquires the information displayed on the electronic bulletin board, or the train information and station yard information of the station where the board is installed. Here, the information displayed on the electronic bulletin board itself may be transmitted to the mobile terminal by visible light communication, or ID information corresponding to the electronic bulletin board may be transmitted to the mobile terminal, and the mobile terminal may obtain the information displayed on the board by querying a server with the acquired ID information. When the ID information is sent from the mobile terminal, the server transmits the content displayed on the electronic bulletin board to the mobile terminal based on the ID information. The train ticket information stored in the memory of the mobile terminal is compared with the information displayed on the electronic bulletin board, and when ticket information corresponding to the user's ticket is displayed on the board, an arrow indicating the way to the platform where the train the user plans to board arrives is displayed on the display of the mobile terminal. At the destination, a route to the car closest to the exit or the transfer route may be displayed. When a seat is reserved, a route to that seat may be displayed. When displaying the arrow, displaying it in the same color as the color of the train line on the map or in the train guidance information makes the display easier to understand. The user's reservation information (platform number, car number, departure time, seat number) can also be displayed together with the arrow. Displaying the user's reservation information together makes it possible to prevent misrecognition. When the ticket information is stored on a server, either the mobile terminal queries the server to acquire and compare the ticket information, or the server compares the ticket information with the information displayed on the electronic bulletin board, so that information related to the ticket information can be obtained. The target line may be estimated from the user's transfer search history, and the route displayed. In addition to the content displayed on the electronic bulletin board, the train information and yard information of the station where the board is installed may be acquired and compared. In the representation of the electronic bulletin board on the display, information relevant to the user may be highlighted, or rewritten before being displayed. When the user's boarding plan is unknown, arrows guiding the user to the boarding point of each line may be displayed. When station yard information has been acquired, arrows guiding the user to kiosks, restrooms, and the like may be shown on the display. The user's behavioral characteristics may be managed in advance on a server, and when the user often stops at kiosks or restrooms within station premises, arrows guiding the user to kiosks, restrooms, and the like may be shown on the display. Since the arrows guiding to kiosks, restrooms, and the like are displayed only for users having the behavioral characteristic of stopping at kiosks or restrooms, and are not displayed for other users, the amount of processing can be reduced. The color of the arrows guiding to kiosks, restrooms, and the like may differ from that of the arrow guiding to the platform. When both arrows are displayed at the same time, using different colors makes it possible to prevent misrecognition. Although FIG. 459 shows a train example, the display can be performed with a similar configuration for airplanes, buses, and the like.
FIG. 460 shows an example of acquiring information by visible light communication from an electronic guide display board installed at an airport, within a station, at a tourist site, in a hospital, or the like. By visible light communication, the displayed content information is acquired from the electronic guide display board, translated into the language set in the mobile terminal, and then shown on the display of the mobile terminal. Since the information is displayed translated into the user's language, the user can understand it easily. The language translation may be performed within the mobile terminal or on a server. When the translation is performed on a server, the displayed content information acquired by visible light communication and the language information of the mobile terminal may be transmitted to the server, the server may perform the translation and transmit the result to the mobile terminal, and the result may be shown on the display of the mobile terminal. When ID information is acquired from the electronic guide display board, the ID information may be transmitted to the server, and the displayed content information corresponding to the ID information may be acquired from the server. Furthermore, an arrow guiding the user to the place to head to next may be displayed based on the nationality information, ticket information, or baggage check-in information stored in the mobile terminal.
FIG. 461 shows an example in which, when the user approaches a store, coupon information acquired by visible light communication is displayed, or a pop-up is shown on the display of the mobile terminal. Using the mobile terminal, the user acquires the store's coupon information from an electronic bulletin board or the like by visible light communication. Next, when the user enters a predetermined range around the store, the store's coupon information or a pop-up is displayed. Whether the user has entered the predetermined range around the store is determined using the GPS information of the mobile terminal and the store information included in the coupon information. The information is not limited to coupon information and may be ticket information. Since an alert is given automatically when a store where the coupon or ticket can be used comes near, the user can use coupons and tickets appropriately.
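The proximity check above can be computed from the two coordinate pairs alone; a haversine-distance sketch (the store coordinates and the 100 m radius are arbitrary example values, not from the embodiment):

```python
import math

# Sketch: decide whether the terminal's GPS position is within a given
# radius of the store position carried in the coupon information.

def within_range(terminal, store, radius_m):
    """terminal/store are (latitude, longitude) pairs in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*terminal, *store))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + \
        math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    distance_m = 2 * 6371000 * math.asin(math.sqrt(a))  # haversine formula
    return distance_m <= radius_m

# Example: alert when the user comes within 100 m of the store.
store = (34.6937, 135.5023)
print(within_range((34.6940, 135.5025), store, 100))  # nearby -> True
print(within_range((34.7100, 135.5023), store, 100))  # ~1.8 km away -> False
```

In practice the check would run periodically against the terminal's current GPS fix, triggering the coupon pop-up on the transition into range.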
FIG. 463 shows an example in which the user acquires information from a home appliance by visible light communication using a mobile terminal. When ID information or information about the home appliance is acquired from the appliance by visible light communication, an application for operating the appliance is launched automatically. FIG. 463 shows an example using a television. With this configuration, simply holding the mobile terminal over the home appliance can launch the application for operating it.
FIG. 464 shows an example in which, when the barcode reader 8405a reads the barcode of a product, data communication for visible light communication is suspended in the vicinity of the barcode reader 8405a. Stopping visible light communication during barcode reading can prevent the barcode reader 8405a from misreading the barcode. The barcode reader 8405a transmits a transmission stop signal to the visible light signal transmitter 8405b when the barcode reading button is pressed, and transmits a transmission resume signal to the visible light signal transmitter 8405b when the finger is released from the button, or when a predetermined time has elapsed after the finger is released from the button. The transmission stop signal and transmission resume signal are sent by wired/wireless communication, infrared communication, or communication by sound waves. The barcode reader 8405a may transmit the transmission stop signal when it estimates from the measurement values of an acceleration sensor provided in the barcode reader 8405a that the reader has been moved, and transmit the transmission resume signal when it estimates that the reader has not been moved for a predetermined time. The barcode reader 8405a may transmit the transmission stop signal when it estimates from the measurement values of an electrostatic sensor or illuminance sensor provided in the barcode reader 8405a that the reader has been gripped, and transmit the transmission resume signal when it estimates that the hand has been released. The barcode reader 8405a may detect that it has been lifted, from a switch provided on its base surface being released from the pressed state, and transmit the transmission stop signal, and detect that it has been set down, from the switch being pressed, and transmit the transmission resume signal. The barcode reader 8405a may detect that it has been lifted, from a switch at the barcode reader stand or from the measurement values of an infrared sensor, and transmit the transmission stop signal, and detect that it has been returned, and transmit the transmission resume signal. The register 8405c may transmit the transmission stop signal when operation is started, and transmit the transmission resume signal when checkout is performed.
FIG. 466 shows an example of a usage mode of the present invention.
FIG. 467 shows an example of the structure of a database held by a server that manages the IDs transmitted by transmitters.
FIG. 468 shows an example of a gesture operation for starting reception by this communication scheme.
FIG. 469 shows an example of a transmitter of the present invention.
FIG. 470 is a diagram for describing one encoding scheme for a visible light communication image.
FIG. 471 is a diagram for describing one encoding scheme for a visible light communication image.
FIGS. 472 and 473 are diagrams for describing one encoding scheme for a visible light communication image.
FIG. 474 is a diagram for describing one encoding scheme for a visible light communication image.
FIGS. 475 and 476 are diagrams for describing one encoding scheme for a visible light communication image.
FIG. 477 is a diagram for describing one encoding method for a visible light communication image.
FIG. 478 is a diagram for describing one operation of the transmitter.
FIG. 479 is a diagram for describing one application example of visible light communication.
FIG. 480 is a diagram for describing one format of visible light communication data.
FIGS. 481 and 482 are diagrams for describing one application example of visible light communication.
FIGS. 483 and 484 are diagrams for describing one method of displaying a visible light communication image.
FIG. 485 is a diagram showing an example of the operation of a transmitter and a receiver in Embodiment 17.
FIG. 486 is a diagram showing an example of the operation of a receiver and a transmitter in Embodiment 17.
The information communication method in this embodiment is an information communication method for transmitting signals by changes in luminance, including: a determination step of determining a plurality of luminance change patterns by modulating each of a plurality of signals to be transmitted; and a transmission step in which each of a plurality of light emitters changes in luminance according to any one of the determined plurality of luminance change patterns, thereby transmitting the signal to be transmitted corresponding to that one pattern, wherein, in the transmission step, each of two or more light emitters among the plurality of light emitters changes in luminance at a mutually different frequency, such that one of two types of light with mutually different luminances is output for each time unit predetermined for that light emitter, and such that the time units predetermined for the two or more light emitters differ from one another.
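A receiver can tell such transmitters apart by frequency. The sketch below builds two square waves at different blink frequencies, mixes them as a single brightness signal, and recovers the two frequencies by correlating against candidates (a tiny DFT). The sample rate, frequencies, candidate set, and detection threshold are all illustrative assumptions:

```python
import math

# Sketch: two light emitters blink at different frequencies; a receiver
# sampling the mixed brightness separates them by correlating against
# candidate frequencies. All numeric choices here are illustrative.

RATE = 2400  # samples per second

def blink(freq_hz, n):
    # 0/1 square wave: HIGH for the first half of each period.
    return [1.0 if (t * freq_hz / RATE) % 1.0 < 0.5 else 0.0
            for t in range(n)]

def magnitude(signal, freq_hz):
    re = sum(v * math.cos(2 * math.pi * freq_hz * t / RATE)
             for t, v in enumerate(signal))
    im = sum(v * math.sin(2 * math.pi * freq_hz * t / RATE)
             for t, v in enumerate(signal))
    return math.hypot(re, im)

n = RATE  # one second of samples
mixed = [a + b for a, b in zip(blink(120, n), blink(200, n))]
candidates = [100, 120, 160, 200, 240]
scores = {f: magnitude(mixed, f) for f in candidates}
detected = sorted(f for f in candidates
                  if scores[f] > 0.25 * max(scores.values()))
print(detected)  # -> [120, 200]
```

Because a 50%-duty square wave has energy only at its odd harmonics, the two fundamentals stand out cleanly against the other candidates over a whole-second window.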
In this embodiment, application examples using a receiver such as a smartphone in Embodiments 1 to 17 above and a transmitter that transmits information as an LED blink pattern are described.
As can be seen from FIG. 492, clear bright lines can be observed as long as the exposure time is up to about three times the modulation period. Since the modulation frequency is 480 Hz or higher, the exposure time in the intermediate imaging mode (intermediate mode) is desirably 1/160 second or less.
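The 1/160 s bound follows directly from the two numbers given: three modulation periods at the minimum modulation frequency of 480 Hz. A one-line check:

```python
# The guideline above: bright lines stay clear while the exposure time is
# at most ~3 modulation periods, and the modulation frequency is >= 480 Hz.
min_modulation_hz = 480
max_exposure_s = 3 / min_modulation_hz  # three periods at the slowest rate
print(max_exposure_s, 1 / 160)  # both 0.00625 s
assert max_exposure_s == 1 / 160
```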
FIG. 508 is a diagram showing a service provision system that uses the reception methods described in the embodiments already explained.
(Modulation schemes that facilitate reception)
FIGS. 512A, 512B, and 513 are diagrams showing an example of signal encoding in Embodiment 20.
FIG. 514 is a diagram showing an example of a captured image in Embodiment 20.
FIGS. 515A to 515C are diagrams showing an example of the configuration and operation of a receiver in Embodiment 20.
FIG. 515D is a diagram showing an example of a signal reception method in Embodiment 21.
FIG. 515E is a flowchart showing an example of a signal reception method in Embodiment 21.
FIGS. 516 and 517A are diagrams showing an example of a reception method in Embodiment 20.
FIG. 518 is a diagram showing an example of a reception method in Embodiment 20.
FIG. 519 is a diagram showing an example of a reception method in Embodiment 20.
FIG. 520 is a diagram showing an example of a signal modulation method in Embodiment 20.
FIG. 521 is a diagram showing an example of the operation of a receiver in Embodiment 20.
FIG. 522 is a diagram showing an example of the operation of a transmitter in Embodiment 20.
FIG. 523 is a diagram showing an example of a receiver in Embodiment 20.
FIGS. 524 and 525 are diagrams showing an example of a transmission system in Embodiment 20.
FIG. 526 is a diagram showing an example of the operation of a receiver in Embodiment 20.
FIG. 527 is a diagram showing an example of the operation of a receiver in Embodiment 20.
FIG. 528 is a diagram showing an example of a gesture operation for starting reception by this communication scheme.
FIGS. 529 and 530 are diagrams showing an application example of a transmission and reception system in Embodiment 20.
FIG. 531 is a diagram showing an application example of a transmission and reception system in Embodiment 20.
FIG. 532 is a diagram showing an application example of a transmission and reception system in Embodiment 20.
FIG. 533 is a diagram showing an application example of a transmission and reception system in Embodiment 20.
FIG. 534 is a diagram showing an application example of a transmission and reception system in Embodiment 20.
FIG. 535 is a diagram showing an application example of a transmission and reception system in Embodiment 20.
FIG. 536 is a diagram showing an application example of a transmission and reception system in Embodiment 20.
FIG. 537 is a diagram showing an application example of a transmission and reception system in Embodiment 20.
FIGS. 538 and 539 are diagrams showing an application example of a transmission and reception system in Embodiment 20.
FIG. 540 is a diagram showing an application example of a transmission and reception system in Embodiment 20.
FIG. 541 is a diagram showing a configuration for performing position estimation by visible light communication and wireless communication. That is, FIG. 541 shows a configuration for estimating the position of a terminal using visible light communication and wireless communication.
An information communication method according to an aspect of the present invention is an information communication method for obtaining information from a subject, including: a first exposure time setting step of setting a first exposure time of an image sensor so that, in an image obtained by capturing the subject with the image sensor, a plurality of bright lines corresponding to a plurality of exposure lines included in the image sensor appear according to a change in luminance of the subject; a first image obtainment step of obtaining a bright line image including the plurality of bright lines by the image sensor capturing the subject changing in luminance with the set first exposure time; and an information obtainment step of obtaining information by demodulating data specified by the pattern of the plurality of bright lines included in the obtained bright line image, wherein, in the first image obtainment step, each of the plurality of exposure lines starts exposure sequentially at a different time, and starts exposure after a predetermined blank time has elapsed since the end of exposure of the exposure line adjacent to it.
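The staggered line timing in the first image obtainment step can be written out numerically. The durations below (in microseconds) are illustrative assumptions, chosen only so that each line starts exposure after the previous line's exposure has ended plus the blank interval:

```python
# Sketch: exposure start/end times for consecutive exposure lines, where
# each line starts exposure only after the adjacent line's exposure has
# ended and a predetermined blank time has passed. Durations (us) are
# illustrative.

EXPOSURE_US = 50
BLANK_US = 10

def line_schedule(num_lines):
    schedule = []
    start = 0
    for _ in range(num_lines):
        schedule.append((start, start + EXPOSURE_US))
        # The next line begins after this line's exposure plus the blank.
        start = start + EXPOSURE_US + BLANK_US
    return schedule

sched = line_schedule(4)
print(sched)  # -> [(0, 50), (60, 110), (120, 170), (180, 230)]

# Consecutive lines never expose simultaneously: each start is exactly
# one blank time after the previous line's end.
assert all(sched[i][1] + BLANK_US == sched[i + 1][0]
           for i in range(len(sched) - 1))
```

With no temporal overlap between adjacent lines, each bright line samples the subject's luminance over a distinct interval, which is what lets the stripe pattern encode the luminance change cleanly.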
In this embodiment, application examples using a receiver such as a smartphone in each of the above embodiments and a transmitter that transmits information as the blink pattern of an LED or organic EL element are described.
FIGS. 569 and 570 are diagrams showing an example of the operation of a transmitter in Embodiment 21.
FIGS. 571 and 572 are diagrams showing an example of the operation of a transmitter in Embodiment 21.
(Reception of signals from multiple directions by a plurality of light receiving units)
FIG. 573 is a diagram showing an example of a receiver in Embodiment 21.
FIG. 574 is a diagram showing an example of a receiver in Embodiment 21.
FIG. 575 is a diagram showing an example of a reception system in Embodiment 21.
FIG. 576 is a diagram showing an example of a reception system in Embodiment 21.
FIGS. 577A and 577B are diagrams showing an example of a modulation scheme in Embodiment 21.
FIGS. 577C and 577D are diagrams showing an example of separating mixed signals in Embodiment 21.
FIG. 578 is a diagram showing an example of a visible light communication system in Embodiment 21.
FIG. 579 is a flowchart showing a reception method that eliminates interference in Embodiment 21.
FIG. 580 is a flowchart showing a method of estimating the direction of a transmitter in Embodiment 21.
FIG. 581 is a flowchart showing a method of starting reception in Embodiment 21.
FIG. 582 is a flowchart showing a method of generating an ID that additionally uses information from another medium in Embodiment 21.
FIG. 583 is a flowchart showing a method of selecting a reception scheme by frequency separation in Embodiment 21.
FIG. 584 is a flowchart showing a signal reception method for the case where the exposure time is long in Embodiment 21.
This embodiment relates to an apparatus that switches between light receiving units for visible light communication, and is assumed to be mounted on a mobile terminal such as a smartphone. First, scenes in which a mobile terminal equipped with this apparatus is used are described.
In addition to Embodiment 22, this embodiment clarifies the configuration and flow of the light reception control unit.
In addition to Embodiment 22, this embodiment clarifies the configuration and flow of the light reception control unit.
This embodiment is based on Embodiment 22 and aims to reflect the user's intention more accurately on the display unit A11018 by applying gaze detection technology.
This embodiment is based on Embodiment 25 and clarifies the configuration of the light reception control unit A11012. FIG. 617 is a structural diagram of the light reception control unit in Embodiment 26. The light reception control unit A11012 in Embodiment 26 includes: a first light receiving unit A12501; a second light receiving unit A12502; a selector A12503 for selecting which of these two light receiving units to operate; an acceleration detection unit A12504 that supplies information necessary for the selection processing in the selector A12503; a received light signal storage unit A12505 that stores the received light signal obtained from the first light receiving unit A12501 or the second light receiving unit A12502 via the selector A12503; a decoding unit A12506 that reads the received light signal from the received light signal storage unit A12505, performs decoding processing to generate an ID, and stores the generated ID in the ID storage unit A11013; and a threshold storage unit A12507 in which a first threshold and a second threshold are stored in advance for determining, from the acceleration obtained by the selector A12503, whether the mobile terminal A11011 equipped with the light reception control unit A11012 is horizontal with respect to the ground; and further includes a gaze detection unit A13301 that receives a video signal from the light receiving unit A12501, detects the position of the user's gaze, and stores gaze classification information in the gaze information storage unit A13001. In this embodiment, the light receiving unit A12501 is a video imaging device such as a camera device.
This embodiment is based on Embodiment 25 and clarifies the configuration of the light reception control unit A11012. FIG. 618 is a structural diagram of the light reception control unit in Embodiment 27. The light reception control unit A11012 in Embodiment 27 includes: a first light receiving unit A12501; a second light receiving unit A12502; a selector A12503 for selecting which of these two light receiving units to operate; an acceleration detection unit A12504 that supplies information necessary for the selection processing in the selector A12503; a received light signal storage unit A12505 that stores the received light signal obtained from the first light receiving unit A12501 or the second light receiving unit A12502 via the selector A12503; a decoding unit A12506 that reads the received light signal from the received light signal storage unit A12505, performs decoding processing to generate an ID, and stores the generated ID in the ID storage unit A11013; and a threshold storage unit A12507 in which a first threshold and a second threshold are stored in advance for determining, from the acceleration obtained by the selector A12503, whether the mobile terminal A11011 equipped with the light reception control unit A11012 is horizontal with respect to the ground; and further includes an imaging unit A13401 and a gaze detection unit A13402 that receives a video signal from the imaging unit A13401, detects the position of the user's gaze, and stores gaze classification information in the gaze information storage unit A13001. In this embodiment, the imaging unit A13401 is a video imaging device such as a camera device, and the light receiving unit A12501 is a sensor device such as an illuminance sensor.
FIG. 619 is a diagram for describing a use case in Embodiment 28. With reference to FIG. 619, an embodiment is described in which the reception unit 1028 uses a modulation scheme of the present invention such as a PPM scheme, an FDM scheme, an FSK scheme, or a frequency allocation scheme.
A user intending to use a microwave oven or the like receives the optical signal from the light emitter 1003 with the in-camera unit 1017 of the mobile terminal, and, at the in-camera processing unit 1026, receives the light emitter ID and the light emitter authentication ID in advance with the reception unit 1027. The light emitter IDs receivable by the electronic device 1040 may be detected from position information obtained using mobile phone radio waves such as 3G or using Wi-Fi, together with the light emitter IDs present at that position recorded in the cloud 1032 or inside the mobile terminal (1025).
When receiving a visible light ID from a visible light transmitting device, reception of the visible light ID takes time, so the problems are that the user must wait a certain period for the visible light ID to be received, and that reception of the visible light ID may fail. These problems arise because a uniquely identifiable visible light ID has a long bit string. They occur particularly frequently in visible light transmitting devices that employ a frequency modulation scheme.
1105 Smartphone
1106 Microwave oven
1107 Air purifier
1201 Smartphone
1301 Transmission-side apparatus
1309 Transmission speed determination unit
1401 Reception-side apparatus
1404 Image obtainment unit
1406 Blink information obtainment unit
3001a Mobile terminal
3001b Intercom indoor unit
3001c Intercom outdoor unit
3001e Delivery order server
3001f Deliverer's mobile terminal
4040a Microwave oven table
4040b Mobile phone table
4040c Mobile phone model table
4040d User voice characteristics table
4040e User keyword voice table
4040f User-owned device position table
4040h User position table
A11001 Mobile terminal
A11002 First lighting device
A11003 Second lighting device
A11004 First ID storage unit
A11005 First encoding unit
A11006 First luminance pattern storage unit
A11007 First light emitting unit
A11008 Second ID storage unit
A11009 Second encoding unit
A11010 Second luminance pattern storage unit
A11011 Second light emitting unit
A11012 Light reception control unit
A11013 ID storage unit
A11014 DB management unit
A11015 Server apparatus
A11016 Product information storage unit
A11017 Map information storage unit
A11018 Display unit
A11019 State management unit
A11020 UI unit
A12501 First light receiving unit
A12502 Second light receiving unit
A12503 Selector
A12504 Acceleration detection unit
A12505 Received light signal storage unit
A12506 Decoding unit
A12507 Threshold storage unit
A12701 First light receiving unit
A12702 Second light receiving unit
A12703 Selector
A12704 Timer control unit
A12705 Received light signal storage unit
A12706 Decoding unit
A12707 Threshold storage unit
A13001 Gaze information storage unit
A13002 Light reception control unit
A13003 DB management unit
A13004 Server apparatus
A13301 Gaze detection unit
A13401 Imaging unit
A13402 Gaze detection unit
B0101 Mobile terminal
B0102 Area detection means
B0103 Sensing means
B0104 Query ID generation means
B0105 Visible light ID reception processing means
B0106 Front camera
B0107 Rear camera
B0108 Communication means
B0109 Display means
B0110 Interpolation ID generation means
B0111 ID correspondence information conversion server
B0112 Communication means
B0113 Conversion information determination means
B0114 ID correspondence information holding means
B0120 Visible light transmitting device
B0130 Public network, or ad hoc network temporarily established between devices
B0141 Area ID information server
B0142 Communication means
B0143 Area information determination means
B0144 Area ID information holding means
B0151 User information holding means
B1301 Mobile terminal angle
B1302 Visible light transmitting device such as a luminaire installed on the ceiling
B1303 Visible light transmitting device such as signage installed in front of the user
B1304 Visible light transmitting device such as signage installed behind the user
B1305 Visible light transmitting device such as a luminaire installed on the floor
B1401 Light emission from the visible light emitting device
B2101 ID bit string indicating an area
B2102 ID bit string indicating a specific location within the area
K10 Information communication apparatus
K11 Determination unit
K12 Transmission unit
K20 Information communication apparatus
K21 Exposure time setting unit
K22 Bright line image obtainment unit
K23 Information obtainment unit
K24 Door control unit
Claims (10)
- An information communication method for transmitting a signal by a change in luminance, the method comprising:
a determination step of determining a pattern of the change in luminance by modulating the signal to be transmitted; and
a transmission step of transmitting the signal to be transmitted by a plurality of light emitters changing in luminance according to the determined pattern of the change in luminance,
wherein the plurality of light emitters are arranged on a surface such that a non-luminance-change region, which lies outside the plurality of light emitters and does not change in luminance, does not cross the surface between the plurality of light emitters along at least one of the vertical direction and the horizontal direction of the surface.
- The information communication method according to claim 1,
wherein the transmission step includes a brightness determination step of determining whether a brightness level of at least one light emitter among the plurality of light emitters is lower than a reference level that is a predetermined brightness, or is lower than or equal to the reference level, and
in the transmission step, when it is determined that the brightness level of the light emitter is lower than the reference level or is lower than or equal to the reference level, transmission of the signal to be transmitted from the light emitter is stopped.
- The information communication method according to claim 1 or 2,
wherein the transmission step includes a brightness determination step of determining whether a brightness level of at least one light emitter among the plurality of light emitters is higher than a reference level that is a predetermined brightness, or is higher than or equal to the reference level, and
in the transmission step, when it is determined that the brightness level of the light emitter is higher than the reference level or is higher than or equal to the reference level, transmission of the signal to be transmitted from the light emitter is started.
- The information communication method according to any one of claims 1 to 3,
wherein in the determination step, a first luminance change pattern corresponding to a body, which is a part of the signal to be transmitted, and a second luminance change pattern indicating a header for identifying the body are determined, and
in the transmission step, the header and the body are transmitted by changing in luminance according to the first luminance change pattern, the second luminance change pattern, and the first luminance change pattern, in this order.
- The information communication method according to claim 4,
wherein in the determination step, a third luminance change pattern indicating another header different from the header is further determined, and
in the transmission step, the header, the body, and the other header are transmitted by changing in luminance according to the first luminance change pattern, the second luminance change pattern, the first luminance change pattern, and the third luminance change pattern, in this order.
- The information communication method according to any one of claims 1 to 5,
wherein in the determination step, luminance change patterns that differ from one another in the timing at which a predetermined luminance value appears are allocated in advance to respective mutually different signal units, such that two luminance change patterns whose timings are adjacent to each other are not allocated to signal units of the same parity, and
for each signal unit included in the signal to be transmitted, the luminance change pattern allocated to that signal unit is determined.
- The information communication method according to claim 4, further comprising:
an exposure time setting step of setting an exposure time of an image sensor so that, in an image obtained by capturing at least one light emitter among the plurality of light emitters with the image sensor, bright lines corresponding to exposure lines included in the image sensor appear according to the change in luminance of the light emitter;
a bright line image obtainment step of obtaining a bright line image, which is an image including the bright lines, by the image sensor capturing the light emitter changing in luminance with the set exposure time; and
an information obtainment step of obtaining information by demodulating data specified by a pattern of the bright lines included in the obtained bright line image,
wherein the information obtainment step includes:
a part identification step of identifying a second part and a third part that sandwich, in a direction perpendicular to the bright lines, a first part of the pattern of the bright lines corresponding to the second luminance change pattern; and
a body obtainment step of obtaining the body by demodulating data specified by the second and third parts, and
in the part identification step, the second part and the third part are identified such that the sum of the length of the second part and the length of the third part in the direction perpendicular to the bright lines is a length associated with the body.
- The information communication method according to any one of claims 1 to 7, further comprising
a flash determination step of determining whether a flash of light shining in a predetermined rhythm has been received,
wherein in the transmission step, when it is determined that the flash has been received, the plurality of light emitters raise their luminance while changing in luminance.
- The information communication method according to any one of claims 1 to 8, further comprising
a blinking step in which at least one light emitter among the plurality of light emitters blinks so as to be visible to the human eye,
wherein in the light emitter, the transmission step and the blinking step are performed alternately and repeatedly.
- An information communication apparatus that transmits a signal by a change in luminance, the apparatus comprising:
a determination unit that determines a pattern of the change in luminance by modulating the signal to be transmitted; and
a transmission unit that transmits the signal to be transmitted by a plurality of light emitters changing in luminance according to the determined pattern of the change in luminance,
wherein the plurality of light emitters are arranged on a surface such that a non-luminance-change region, which lies outside the plurality of light emitters and does not change in luminance, does not cross the surface between the plurality of light emitters along at least one of the vertical direction and the horizontal direction of the surface.
Priority Applications (13)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BR112015014733A BR112015014733A2 (pt) | 2012-12-27 | 2013-12-27 | método de comunicação de informações |
JP2014554163A JP6294235B2 (ja) | 2012-12-27 | 2013-12-27 | プログラム、制御方法および情報通信装置 |
AU2013367893A AU2013367893B2 (en) | 2012-12-27 | 2013-12-27 | Information communication method |
CN201380067922.8A CN105874728B (zh) | 2012-12-27 | 2013-12-27 | 信息通信方法及信息通信装置 |
EP13869275.1A EP2940902B1 (en) | 2012-12-27 | 2013-12-27 | Information communication method |
SG11201505027UA SG11201505027UA (en) | 2012-12-27 | 2013-12-27 | Information communication method |
MX2015008253A MX343578B (es) | 2012-12-27 | 2013-12-27 | Metodo de comunicacion de informacion. |
CN201480056651.0A CN105637783B (zh) | 2013-12-27 | 2014-12-25 | 信息处理方法、接收方法和信息处理装置 |
JP2015554573A JP6345697B2 (ja) | 2013-12-27 | 2014-12-25 | 情報処理プログラム、受信プログラムおよび情報処理装置 |
AU2014371943A AU2014371943B2 (en) | 2013-12-27 | 2014-12-25 | Information processing program, receiving program and information processing device |
CA2927809A CA2927809A1 (en) | 2013-12-27 | 2014-12-25 | Information processing program, receiving program and information processing device |
EP14874981.5A EP3089381B1 (en) | 2013-12-27 | 2014-12-25 | Information processing program, receiving program and information processing device |
PCT/JP2014/006448 WO2015098108A1 (ja) | 2013-12-27 | 2014-12-25 | 情報処理プログラム、受信プログラムおよび情報処理装置 |
Applications Claiming Priority (36)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261746315P | 2012-12-27 | 2012-12-27 | |
JP2012286339 | 2012-12-27 | ||
JP2012-286339 | 2012-12-27 | ||
US61/746,315 | 2012-12-27 | ||
US201361805978P | 2013-03-28 | 2013-03-28 | |
JP2013-070740 | 2013-03-28 | ||
JP2013070740 | 2013-03-28 | ||
US61/805,978 | 2013-03-28 | ||
US201361810291P | 2013-04-10 | 2013-04-10 | |
JP2013082546 | 2013-04-10 | ||
US61/810,291 | 2013-04-10 | ||
JP2013-082546 | 2013-04-10 | ||
JP2013110445 | 2013-05-24 | ||
JP2013-110445 | 2013-05-24 | ||
US201361859902P | 2013-07-30 | 2013-07-30 | |
JP2013158359 | 2013-07-30 | ||
US61/859,902 | 2013-07-30 | ||
JP2013-158359 | 2013-07-30 | ||
US201361872028P | 2013-08-30 | 2013-08-30 | |
JP2013180729 | 2013-08-30 | ||
JP2013-180729 | 2013-08-30 | ||
US61/872,028 | 2013-08-30 | ||
US201361895615P | 2013-10-25 | 2013-10-25 | |
JP2013222827 | 2013-10-25 | ||
US61/895,615 | 2013-10-25 | ||
JP2013-222827 | 2013-10-25 | ||
US201361896879P | 2013-10-29 | 2013-10-29 | |
JP2013-224805 | 2013-10-29 | ||
JP2013224805 | 2013-10-29 | ||
US61/896,879 | 2013-10-29 | ||
US201361904611P | 2013-11-15 | 2013-11-15 | |
JP2013-237460 | 2013-11-15 | ||
JP2013237460 | 2013-11-15 | ||
US61/904,611 | 2013-11-15 | ||
JP2013242407 | 2013-11-22 | ||
JP2013-242407 | 2013-11-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014103340A1 true WO2014103340A1 (ja) | 2014-07-03 |
Family
ID=51020454
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/007708 WO2014103340A1 (ja) | 2012-12-27 | 2013-12-27 | 情報通信方法 |
PCT/JP2013/007709 WO2014103341A1 (ja) | 2012-12-27 | 2013-12-27 | 情報通信方法 |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/007709 WO2014103341A1 (ja) | 2012-12-27 | 2013-12-27 | 情報通信方法 |
Country Status (10)
Country | Link |
---|---|
US (6) | US9085927B2 (ja) |
EP (2) | EP2940902B1 (ja) |
JP (3) | JP5590431B1 (ja) |
CN (2) | CN105874728B (ja) |
AU (1) | AU2013367893B2 (ja) |
BR (1) | BR112015014733A2 (ja) |
CL (1) | CL2015001829A1 (ja) |
MX (1) | MX343578B (ja) |
SG (2) | SG11201505027UA (ja) |
WO (2) | WO2014103340A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017169066A1 (ja) * | 2016-03-28 | 2017-10-05 | ソニー株式会社 | 電子機器 |
Families Citing this family (159)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11265082B2 (en) | 2007-05-24 | 2022-03-01 | Federal Law Enforcement Development Services, Inc. | LED light control assembly and system |
US9414458B2 (en) | 2007-05-24 | 2016-08-09 | Federal Law Enforcement Development Services, Inc. | LED light control assembly and system |
US9100124B2 (en) | 2007-05-24 | 2015-08-04 | Federal Law Enforcement Development Services, Inc. | LED Light Fixture |
US9455783B2 (en) | 2013-05-06 | 2016-09-27 | Federal Law Enforcement Development Services, Inc. | Network security and variable pulse wave form with continuous communication |
US20080317475A1 (en) | 2007-05-24 | 2008-12-25 | Federal Law Enforcement Development Services, Inc. | Led light interior room and building communication system |
US8890773B1 (en) | 2009-04-01 | 2014-11-18 | Federal Law Enforcement Development Services, Inc. | Visible light transceiver glasses |
EP2538584B1 (en) * | 2011-06-23 | 2018-12-05 | Casio Computer Co., Ltd. | Information Transmission System, and Information Transmission Method |
JP5845462B2 (ja) * | 2011-11-07 | 2016-01-20 | パナソニックIpマネジメント株式会社 | 通信システムおよびそれに用いる伝送ユニット |
US9166810B2 (en) | 2012-05-24 | 2015-10-20 | Panasonic Intellectual Property Corporation Of America | Information communication device of obtaining information by demodulating a bright line pattern included in an image |
US8988574B2 (en) | 2012-12-27 | 2015-03-24 | Panasonic Intellectual Property Corporation Of America | Information communication method for obtaining information using bright line image |
US10523876B2 (en) | 2012-12-27 | 2019-12-31 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US8913144B2 (en) | 2012-12-27 | 2014-12-16 | Panasonic Intellectual Property Corporation Of America | Information communication method |
JP5606655B1 (ja) | 2012-12-27 | 2014-10-15 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 情報通信方法 |
US10303945B2 (en) | 2012-12-27 | 2019-05-28 | Panasonic Intellectual Property Corporation Of America | Display method and display apparatus |
US9608725B2 (en) | 2012-12-27 | 2017-03-28 | Panasonic Intellectual Property Corporation Of America | Information processing program, reception program, and information processing apparatus |
US10951310B2 (en) | 2012-12-27 | 2021-03-16 | Panasonic Intellectual Property Corporation Of America | Communication method, communication device, and transmitter |
WO2014103329A1 (ja) | 2012-12-27 | 2014-07-03 | パナソニック株式会社 | 可視光通信信号表示方法及び表示装置 |
US10530486B2 (en) | 2012-12-27 | 2020-01-07 | Panasonic Intellectual Property Corporation Of America | Transmitting method, transmitting apparatus, and program |
JP5608834B1 (ja) | 2012-12-27 | 2014-10-15 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 映像表示方法 |
JP5590431B1 (ja) | 2012-12-27 | 2014-09-17 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 情報通信方法 |
WO2014103333A1 (ja) | 2012-12-27 | 2014-07-03 | パナソニック株式会社 | 表示方法 |
US8922666B2 (en) | 2012-12-27 | 2014-12-30 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US9088360B2 (en) | 2012-12-27 | 2015-07-21 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US9087349B2 (en) | 2012-12-27 | 2015-07-21 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US9560284B2 (en) | 2012-12-27 | 2017-01-31 | Panasonic Intellectual Property Corporation Of America | Information communication method for obtaining information specified by striped pattern of bright lines |
US9608727B2 (en) | 2012-12-27 | 2017-03-28 | Panasonic Intellectual Property Corporation Of America | Switched pixel visible light transmitting method, apparatus and program |
US9843386B2 (en) * | 2013-04-19 | 2017-12-12 | Philips Lighting Holding B.V. | Receiving coded visible light in presence of interference |
US10379551B2 (en) | 2013-07-10 | 2019-08-13 | Crowdcomfort, Inc. | Systems and methods for providing augmented reality-like interface for the management and maintenance of building systems |
US10070280B2 (en) | 2016-02-12 | 2018-09-04 | Crowdcomfort, Inc. | Systems and methods for leveraging text messages in a mobile-based crowdsourcing platform |
US10541751B2 (en) * | 2015-11-18 | 2020-01-21 | Crowdcomfort, Inc. | Systems and methods for providing geolocation services in a mobile-based crowdsourcing platform |
US11394462B2 (en) | 2013-07-10 | 2022-07-19 | Crowdcomfort, Inc. | Systems and methods for collecting, managing, and leveraging crowdsourced data |
US9625922B2 (en) | 2013-07-10 | 2017-04-18 | Crowdcomfort, Inc. | System and method for crowd-sourced environmental system control and maintenance |
US10796085B2 (en) * | 2013-07-10 | 2020-10-06 | Crowdcomfort, Inc. | Systems and methods for providing cross-device native functionality in a mobile-based crowdsourcing platform |
US10841741B2 (en) | 2015-07-07 | 2020-11-17 | Crowdcomfort, Inc. | Systems and methods for providing error correction and management in a mobile-based crowdsourcing platform |
CN103476169B (zh) * | 2013-07-18 | 2016-08-24 | 浙江生辉照明有限公司 | 一种基于led照明装置的室内导航控制系统及方法 |
CN105408939B (zh) * | 2013-07-23 | 2021-08-24 | 皇家飞利浦有限公司 | 用于配准成像设备与跟踪设备的配准系统 |
DE102013014536B4 (de) * | 2013-09-03 | 2015-07-09 | Sew-Eurodrive Gmbh & Co Kg | Verfahren zur Übertragung von Information und Vorrichtung zur Durchführung des Verfahrens |
CN103593340B (zh) * | 2013-10-28 | 2017-08-29 | 余自立 | 自然表达信息处理方法、处理及回应方法、设备及系统 |
WO2015075937A1 (ja) | 2013-11-22 | 2015-05-28 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 情報処理プログラム、受信プログラムおよび情報処理装置 |
CA2934784A1 (en) * | 2013-12-27 | 2015-07-02 | Panasonic Intellectual Property Corporation Of America | Visible light communication method, identification signal, and receiver |
US10024679B2 (en) | 2014-01-14 | 2018-07-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US9915545B2 (en) | 2014-01-14 | 2018-03-13 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US10248856B2 (en) | 2014-01-14 | 2019-04-02 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US9578307B2 (en) | 2014-01-14 | 2017-02-21 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US10360907B2 (en) | 2014-01-14 | 2019-07-23 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US9668294B2 (en) * | 2014-01-14 | 2017-05-30 | Qualcomm Incorporated | Method and apparatus for bluetooth low energy suspend and resume |
US9629774B2 (en) | 2014-01-14 | 2017-04-25 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US20150198941A1 (en) | 2014-01-15 | 2015-07-16 | John C. Pederson | Cyber Life Electronic Networking and Commerce Operating Exchange |
WO2015112702A1 (en) * | 2014-01-22 | 2015-07-30 | Hamilton Christopher Chad | Portable social communication client |
KR102180236B1 (ko) * | 2014-02-20 | 2020-11-18 | 삼성전자 주식회사 | 전자 장치의 입력 처리 방법 및 장치 |
CN105100633A (zh) * | 2014-05-22 | 2015-11-25 | 宁波舜宇光电信息有限公司 | 虹膜识别应用中的补光方法及装置 |
US9479250B2 (en) * | 2014-05-30 | 2016-10-25 | Comcast Cable Communications, Llc | Light based location system |
US20150358079A1 (en) * | 2014-06-04 | 2015-12-10 | Grandios Technologies, Llc | Visible light communication in a mobile electronic device |
CN104038899B (zh) * | 2014-06-11 | 2019-02-22 | 北京智谷睿拓技术服务有限公司 | 邻近关系确定方法及装置 |
EP2961157A1 (en) * | 2014-06-23 | 2015-12-30 | Thomson Licensing | Message inserting method in a rendering of a video content by a display device, reading method, devices and programs associated |
US9735868B2 (en) * | 2014-07-23 | 2017-08-15 | Qualcomm Incorporated | Derivation of an identifier encoded in a visible light communication signal |
JP6379811B2 (ja) * | 2014-07-30 | 2018-08-29 | カシオ計算機株式会社 | 表示装置、表示制御方法及び表示制御プログラム |
US10024667B2 (en) | 2014-08-01 | 2018-07-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable earpiece for providing social and environmental awareness |
JP6494214B2 (ja) * | 2014-08-11 | 2019-04-03 | キヤノン株式会社 | 固体撮像装置、撮像システム及び固体撮像装置の駆動方法 |
WO2016025488A2 (en) | 2014-08-12 | 2016-02-18 | Abl Ip Holding Llc | System and method for estimating the position and orientation of a mobile communications device in a beacon-based positioning system |
US9922236B2 (en) | 2014-09-17 | 2018-03-20 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable eyeglasses for providing social and environmental awareness |
US10024678B2 (en) | 2014-09-17 | 2018-07-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable clip for providing social and environmental awareness |
JP6448121B2 (ja) * | 2014-09-25 | 2019-01-09 | 池上通信機株式会社 | 光無線通信装置、光無線通信方法および光無線通信システム |
JP6670996B2 (ja) * | 2014-09-26 | 2020-03-25 | パナソニックIpマネジメント株式会社 | 表示装置及び表示方法 |
KR101571719B1 (ko) * | 2014-10-02 | 2015-11-25 | 엘지전자 주식회사 | 로봇 청소기 |
KR20160041147A (ko) * | 2014-10-06 | 2016-04-18 | 삼성전자주식회사 | 제어 방법 및 그 방법을 처리하는 전자장치 |
EP3208779B1 (en) * | 2014-10-15 | 2022-05-04 | Sony Group Corporation | System and device for secured communication of identification information between two devices |
WO2016063730A1 (ja) * | 2014-10-21 | 2016-04-28 | ソニー株式会社 | 送信装置及び送信方法、受信装置及び受信方法、並びに、プログラム |
US10798428B2 (en) * | 2014-11-12 | 2020-10-06 | Sony Corporation | Method and system for providing coupon |
WO2016075948A1 (ja) | 2014-11-14 | 2016-05-19 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 再生方法、再生装置およびプログラム |
EP3023807B1 (en) * | 2014-11-18 | 2016-12-28 | Siemens Aktiengesellschaft | A method for determining a distance between an FMCW ranging device and a target |
US20160171504A1 (en) * | 2014-12-11 | 2016-06-16 | Schneider Electric Industries Sas | Blink code product registration |
JP6357427B2 (ja) * | 2015-01-16 | 2018-07-11 | 株式会社デンソー | 車両用制御システム |
US9576460B2 (en) | 2015-01-21 | 2017-02-21 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable smart device for hazard detection and warning based on image and audio data |
US10490102B2 (en) | 2015-02-10 | 2019-11-26 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for braille assistance |
US10560188B2 (en) * | 2015-02-17 | 2020-02-11 | Kookmin University Industry Academy Cooperation Foundation | Image sensor communication system and communication method using rolling shutter modulation |
US9851091B2 (en) * | 2015-02-18 | 2017-12-26 | Lg Electronics Inc. | Head mounted display |
US9586318B2 (en) | 2015-02-27 | 2017-03-07 | Toyota Motor Engineering & Manufacturing North America, Inc. | Modular robot with smart device |
US9354318B1 (en) | 2015-03-05 | 2016-05-31 | Horizon Hobby, LLC | Optical spread spectrum detection and ranging |
JP6425173B2 (ja) | 2015-03-06 | 2018-11-21 | パナソニックIpマネジメント株式会社 | 照明装置及び照明システム |
US9677901B2 (en) | 2015-03-10 | 2017-06-13 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for providing navigation instructions at optimal times |
US9811752B2 (en) | 2015-03-10 | 2017-11-07 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable smart device and method for redundant object identification |
US9972216B2 (en) | 2015-03-20 | 2018-05-15 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for storing and playback of information for blind users |
US20160316046A1 (en) * | 2015-04-21 | 2016-10-27 | Jianhui Zheng | Mobile phone with integrated retractable image capturing device |
US10412173B2 (en) | 2015-04-22 | 2019-09-10 | Panasonic Avionics Corporation | Passenger seat pairing system |
US9793987B2 (en) * | 2015-07-02 | 2017-10-17 | Nokia Technologies Oy | Method and apparatus for recognizing a device |
US10715653B2 (en) | 2015-07-07 | 2020-07-14 | Crowdcomfort, Inc. | Systems and methods for providing geolocation services |
US10171646B2 (en) | 2015-07-07 | 2019-01-01 | Crowdcomfort, Inc. | Systems and methods for providing geolocation services |
US9898039B2 (en) | 2015-08-03 | 2018-02-20 | Toyota Motor Engineering & Manufacturing North America, Inc. | Modular smart necklace |
WO2017025854A1 (en) * | 2015-08-07 | 2017-02-16 | Tridonic Gmbh & Co Kg | Commissioning device for commissioning installed building technology devices |
US20170046950A1 (en) * | 2015-08-11 | 2017-02-16 | Federal Law Enforcement Development Services, Inc. | Function disabler device and system |
JP6579884B2 (ja) * | 2015-09-24 | 2019-09-25 | Canon Inc. | Communication device, control method, and program |
CN105243799B (zh) * | 2015-09-30 | 2018-07-27 | Xiaomi Inc. | Safety reminder processing method and device |
CN107113058B (zh) | 2015-11-06 | 2020-12-18 | Panasonic Intellectual Property Corporation of America | Visible light signal generation method, signal generation device, and medium |
EP3376772B1 (en) * | 2015-11-12 | 2023-01-25 | Panasonic Intellectual Property Corporation of America | Display method, program and display device |
WO2017122924A1 (ko) * | 2016-01-12 | 2017-07-20 | Kookmin University Industry Academy Cooperation Foundation | S2-PSK optical wireless communication method and apparatus |
JP6240235B2 (ja) * | 2016-02-19 | 2017-11-29 | Yahoo Japan Corporation | Determination device, determination method, and determination program |
US10024680B2 (en) | 2016-03-11 | 2018-07-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Step based guidance system |
TWI564680B (zh) * | 2016-03-23 | 2017-01-01 | | The control method of the scanning light source of the exposure machine and the computer program product |
US20170323252A1 (en) * | 2016-05-05 | 2017-11-09 | Wal-Mart Stores, Inc. | Rf permeability measure of product out of stocks |
US9958275B2 (en) | 2016-05-31 | 2018-05-01 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for wearable smart device communications |
JP6792351B2 (ja) * | 2016-06-01 | 2020-11-25 | Canon Inc. | Encoding device, imaging device, encoding method, and program |
JP2018000308A (ja) * | 2016-06-28 | 2018-01-11 | FOVE, Inc. | Video display device system, heartbeat identification method, and heartbeat identification program |
US20180012318A1 (en) * | 2016-07-06 | 2018-01-11 | Panasonic Intellectual Property Management Co., Ltd. | Method and system for remote order submission via a light identifier |
EP3486714B1 (en) * | 2016-07-15 | 2021-03-10 | Nec Corporation | Transmitter and bias adjustment method |
US10561519B2 (en) | 2016-07-20 | 2020-02-18 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable computing device having a curved back to reduce pressure on vertebrae |
CN108344988B (zh) * | 2016-08-30 | 2022-05-10 | Li Yanfei | Distance measurement method, device, and system |
US10432851B2 (en) | 2016-10-28 | 2019-10-01 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable computing device for detecting photography |
CN110114988B (zh) | 2016-11-10 | 2021-09-07 | Panasonic Intellectual Property Corporation of America | Transmission method, transmission device, and recording medium |
TWI736702B (zh) * | 2016-11-10 | 2021-08-21 | Panasonic Intellectual Property Corporation of America | Information communication method, information communication device, and program |
US10012505B2 (en) | 2016-11-11 | 2018-07-03 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable system for providing walking directions |
US10521669B2 (en) | 2016-11-14 | 2019-12-31 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for providing guidance or feedback to a user |
US10218855B2 (en) | 2016-11-14 | 2019-02-26 | Alarm.Com Incorporated | Doorbell call center |
CA3043678A1 (en) * | 2016-11-16 | 2018-05-24 | Meir GOLAN | System, methods and software for user authentication |
EP3456154A1 (en) * | 2016-12-20 | 2019-03-20 | Wizconnected Company Limited | A device, system and method for controlling operation of lighting units |
KR102000067B1 (ko) * | 2017-01-16 | 2019-09-17 | LG Electronics Inc. | Mobile robot |
US10172760B2 (en) | 2017-01-19 | 2019-01-08 | Jennifer Hendrix | Responsive route guidance and identification system |
CA3057002A1 (en) * | 2017-03-20 | 2018-09-27 | Telefonaktiebolaget Lm Ericsson (Publ) | Secure network connection resume |
US10484091B2 (en) * | 2017-06-29 | 2019-11-19 | Osram Sylvania Inc. | Light-based fiducial communication |
JP7213494B2 (ja) * | 2017-07-11 | 2023-01-27 | Research Organization of Information and Systems | Information transmission system |
US20200211223A1 (en) * | 2017-07-19 | 2020-07-02 | Signify Holding B.V. | System and method for providing spatial information of an object to a device |
EP3658947B1 (en) | 2017-07-26 | 2021-04-07 | Signify Holding B.V. | A system for communicating a presence of a device via a light source |
US10242390B2 (en) | 2017-07-31 | 2019-03-26 | Bank Of America Corporation | Digital data processing system for controlling automated exchange zone systems |
JP2019036400A (ja) * | 2017-08-10 | 2019-03-07 | Panasonic Intellectual Property Management Co., Ltd. | Lighting system, operation device, and mapping method for lighting system |
CN109410891B (zh) * | 2017-08-17 | 2021-01-01 | Innolux Corporation | Display and operation method thereof |
US20190088108A1 (en) * | 2017-09-18 | 2019-03-21 | Qualcomm Incorporated | Camera tampering detection |
WO2019076596A1 (en) * | 2017-10-19 | 2019-04-25 | Telefonaktiebolaget Lm Ericsson (Publ) | TRANSMITTER, NETWORK NODE, METHOD AND COMPUTER PROGRAM FOR TRANSMITTING BINARY INFORMATION |
WO2019086178A1 (en) | 2017-11-03 | 2019-05-09 | Telefonaktiebolaget Lm Ericsson (Publ) | Receiver, communication apparatus, method and computer program for receiving binary information |
US10909830B1 (en) | 2017-11-07 | 2021-02-02 | Pica Product Development, Llc | Personal emergency alert system, method and device |
US10999560B2 (en) * | 2017-11-07 | 2021-05-04 | Readiness Systems, LLC | Remote electronic monitoring infrastructure |
US10694338B2 (en) | 2017-11-07 | 2020-06-23 | Pica Product Development, Llc | Cellular automated external defibrillator (AED) tracker |
US10798541B2 (en) | 2017-11-07 | 2020-10-06 | Pica Product Development, Llc | Systems, methods and devices for remote trap monitoring |
JP2019132673A (ja) * | 2018-01-31 | 2019-08-08 | Oki Electric Industry Co., Ltd. | Terminal device and position detection system |
JP6990819B2 (ja) * | 2018-03-07 | 2022-01-12 | Fujifilm Healthcare Corporation | Ultrasonic imaging apparatus and method |
CN109040423B (zh) * | 2018-06-27 | 2021-06-25 | Nubia Technology Co., Ltd. | Notification information processing method, device, and computer-readable storage medium |
WO2020031260A1 (ja) * | 2018-08-07 | 2020-02-13 | Mitsubishi Electric Corporation | Control device, control system, notification method, and program |
JP7006539B2 (ja) * | 2018-08-23 | 2022-01-24 | Nippon Telegraph and Telephone Corporation | Reception device, reception method, and program |
DE102018006988B3 (de) | 2018-09-04 | 2019-08-14 | Sew-Eurodrive Gmbh & Co Kg | System comprising a first communication unit and a second communication unit, and method for operating the system |
CN109600771B (zh) * | 2018-11-26 | 2020-09-08 | Tsinghua University | Cross-protocol communication method and device from WiFi devices to ZigBee devices |
CN109872241A (zh) * | 2019-01-28 | 2019-06-11 | Taicang Yuhe Network Technology Co., Ltd. | Dating-platform data distribution system and distribution method |
US10699576B1 (en) * | 2019-01-30 | 2020-06-30 | Po-Han Shih | Travel smart collision avoidance warning system |
US11552706B2 (en) | 2019-03-29 | 2023-01-10 | Advanced Functional Fabrics Of America, Inc. | Optical communication methods and systems using motion blur |
US11928682B2 (en) * | 2019-05-15 | 2024-03-12 | Worldpay, Llc | Methods and systems for generating a unique signature based on user movements in a three-dimensional space |
CN110146105B (zh) * | 2019-05-29 | 2022-05-20 | Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. | Route navigation method, smart home device, server, and electronic device |
JP7298329B2 (ja) * | 2019-06-24 | 2023-06-27 | Omron Corporation | Master module and control program for device control apparatus |
WO2021006433A1 (ko) | 2019-07-08 | 2021-01-14 | Kookmin University Industry Academy Cooperation Foundation | Communication method and apparatus for optical camera communication system |
JP7345100B2 (ja) * | 2019-08-02 | 2023-09-15 | Panasonic Intellectual Property Management Co., Ltd. | Position estimation device, position estimation system, and position estimation method |
JP7475830B2 (ja) * | 2019-09-17 | 2024-04-30 | Canon Inc. | Imaging control device and imaging control method |
DE102019007311B3 (de) | 2019-10-21 | 2020-09-24 | SEW-EURODRIVE GmbH & Co. KG | Receiver for a light transmission system, light transmission system, and method for operating a light transmission system |
US11418956B2 (en) | 2019-11-15 | 2022-08-16 | Panasonic Avionics Corporation | Passenger vehicle wireless access point security system |
CN111240471B (zh) * | 2019-12-31 | 2023-02-03 | Vivo Mobile Communication Co., Ltd. | Information interaction method and wearable device |
US11445369B2 (en) * | 2020-02-25 | 2022-09-13 | International Business Machines Corporation | System and method for credential generation for wireless infrastructure and security |
WO2021190856A1 (de) * | 2020-03-24 | 2021-09-30 | Sew-Eurodrive Gmbh & Co. Kg | Receiver for a light transmission system, light transmission system, and method for operating a light transmission system |
CN114064963A (zh) * | 2020-08-03 | 2022-02-18 | Beijing Zitiao Network Technology Co., Ltd. | Information display method and device |
CN112613117B (zh) * | 2020-12-11 | 2022-08-12 | Chengdu Aircraft Industrial (Group) Co., Ltd. | Design method for rapidly constructing an aircraft access cover in 3D from its developed dimensions |
US11907988B2 (en) * | 2020-12-15 | 2024-02-20 | Crowdcomfort, Inc. | Systems and methods for providing geolocation services in a mobile-based crowdsourcing platform |
US20230327853A1 (en) * | 2022-04-07 | 2023-10-12 | Bank Of America Corporation | System and method for generating a block in a blockchain network using a voice-based hash value generated by a voice signature |
DE102023002931A1 (de) | 2022-08-10 | 2024-02-15 | Sew-Eurodrive Gmbh & Co Kg | Method for determining a suitable line-scan frequency, and light transmission system |
WO2024137696A1 (en) * | 2022-12-22 | 2024-06-27 | Arizona Board Of Regents On Behalf Of The University Of Arizona | Method and system for optical transmission and crosstalk reduction in lidar using polarized light |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002290335A (ja) | 2001-03-28 | 2002-10-04 | Sony Corp | Optical space transmission device |
JP2006121466A (ja) * | 2004-10-22 | 2006-05-11 | Nec Corp | Imaging element, imaging module, and portable terminal |
JP2010226172A (ja) * | 2009-03-19 | 2010-10-07 | Casio Computer Co Ltd | Information restoration device and information restoration method |
JP2012010269A (ja) * | 2010-06-28 | 2012-01-12 | Outstanding Technology:Kk | Visible light communication transmitter |
JP2012169189A (ja) * | 2011-02-15 | 2012-09-06 | Koito Mfg Co Ltd | Light-emitting module and vehicle lamp |
JP2013223047A (ja) * | 2012-04-13 | 2013-10-28 | Toshiba Corp | Transmission system, transmission device, and reception device |
Family Cites Families (248)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4120171A (en) | 1977-01-13 | 1978-10-17 | Societe Nationale Elf Aquitaine (Production) | Apparatus and method of connecting a flexible line to a subsea station |
JPS5931477B2 (ja) | 1977-01-27 | 1984-08-02 | NEC Corporation | Printing device |
JPS595896B2 (ja) | 1977-06-15 | 1984-02-07 | Epson Corporation | Afterimage-effect display device |
JPS5521125A (en) | 1978-08-02 | 1980-02-15 | Hitachi Ltd | Method of mounting semiconductor device |
JPS5541153A (en) | 1978-09-15 | 1980-03-22 | Fujitsu Ltd | Dc power supply system |
US6062481A (en) | 1986-04-18 | 2000-05-16 | Cias, Inc. | Optimal error-detecting, error-correcting and other coding and processing, particularly for bar codes, and applications therefor such as counterfeit detection |
JPH087567B2 (ja) | 1986-08-12 | 1996-01-29 | Hitachi, Ltd. | Image display device |
US4807031A (en) | 1987-10-20 | 1989-02-21 | Interactive Systems, Incorporated | Interactive video method and apparatus |
US6347163B2 (en) | 1994-10-26 | 2002-02-12 | Symbol Technologies, Inc. | System for reading two-dimensional images using ambient and/or projected light |
JPH08509582A (ja) | 1993-05-03 | 1996-10-08 | Pinjarro Proprietary Limited | Subliminal message display system |
US6345104B1 (en) | 1994-03-17 | 2002-02-05 | Digimarc Corporation | Digital watermarks and methods for security documents |
JPH07200428A (ja) | 1993-12-28 | 1995-08-04 | Canon Inc | Communication device |
CN2187863Y (zh) | 1994-02-03 | 1995-01-18 | Tsinghua University | Tracking camera/video-recording device for observing fast-moving object flows |
US5484998A (en) | 1994-03-16 | 1996-01-16 | Decora Industries, Inc. | Bar-coded card with coding and reading system |
ATE418233T1 (de) | 1995-05-08 | 2009-01-15 | Digimarc Corp | Method for embedding machine-readable steganographic code |
JP3949679B2 (ja) | 1995-05-08 | 2007-07-25 | Digimarc Corporation | Steganography system |
US5822310A (en) * | 1995-12-27 | 1998-10-13 | Ericsson Inc. | High power short message service using broadcast control channel |
US5765176A (en) | 1996-09-06 | 1998-06-09 | Xerox Corporation | Performing document image management tasks using an iconic image having embedded encoded information |
US5974348A (en) | 1996-12-13 | 1999-10-26 | Rocks; James K. | System and method for performing mobile robotic work operations |
US20050169643A1 (en) | 1997-01-02 | 2005-08-04 | Franklin Philip G. | Method and apparatus for the zonal transmission of data using building lighting fixtures |
WO1999044336A1 (fr) | 1998-02-26 | 1999-09-02 | Sony Corporation | Data processing apparatus and computer-readable medium |
WO2001093473A2 (en) | 2000-05-31 | 2001-12-06 | Optinetix (Israel) Ltd. | Systems and methods for distributing information through broadcast media |
KR100434459B1 (ko) * | 2000-06-27 | 2004-06-05 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling packet transmission in a mobile communication system |
JP2002144984A (ja) | 2000-11-17 | 2002-05-22 | Matsushita Electric Ind Co Ltd | In-vehicle electronic device |
US6831643B2 (en) | 2001-04-16 | 2004-12-14 | Lucent Technologies Inc. | Method and system for reconstructing 3D interactive walkthroughs of real-world environments |
US20020171639A1 (en) | 2001-04-16 | 2002-11-21 | Gal Ben-David | Methods and apparatus for transmitting data over graphic displays |
US20030026422A1 (en) | 2001-06-19 | 2003-02-06 | Usa Video Interactive Corporation | Method and apparatus for digitally fingerprinting videos |
JP3861113B2 (ja) | 2001-08-30 | 2006-12-20 | Hitachi Plasma Patent Licensing Co., Ltd. | Image display method |
JP2003179556A (ja) | 2001-09-21 | 2003-06-27 | Casio Comput Co Ltd | Information transmission scheme, information transmission system, imaging device, and information transmission method |
EP1439649A4 (en) | 2001-10-23 | 2008-03-19 | Sony Corp | DATA COMMUNICATION SYSTEM, DATA TRANSMITTER AND DATA RECEIVER |
US8054357B2 (en) | 2001-11-06 | 2011-11-08 | Candela Microsystems, Inc. | Image sensor with time overlapping image output |
JP3829743B2 (ja) | 2002-03-22 | 2006-10-04 | Denso Wave Inc. | Optical information recording medium and optical information reader |
JP4207490B2 (ja) | 2002-08-06 | 2009-01-14 | Sony Corporation | Optical communication device, optical communication data output method, optical communication data analysis method, and computer program |
JP4200435B2 (ja) | 2002-09-10 | 2008-12-24 | Sony Corporation | Information processing device and method, recording medium, and program |
EP1564914B1 (en) | 2002-10-24 | 2007-09-05 | Nakagawa Laboratories, Inc. | Illumination light communication device |
JP3827082B2 (ja) | 2002-10-24 | 2006-09-27 | Nakagawa Laboratories, Inc. | Broadcast system, light bulb, and lighting device |
US20040101309A1 (en) | 2002-11-27 | 2004-05-27 | Beyette Fred R. | Optical communication imager |
JP4233371B2 (ja) | 2003-04-10 | 2009-03-04 | Kyosan Electric Mfg. Co., Ltd. | Level-crossing obstacle detection device |
JP2004334269A (ja) | 2003-04-30 | 2004-11-25 | Sony Corp | Image processing device and method, recording medium, and program |
JP4389871B2 (ja) | 2003-06-27 | 2009-12-24 | Nikon Corporation | Reference pattern extraction method and device, pattern matching method and device, position detection method and device, and exposure method and device |
JP3974629B2 (ja) | 2003-11-19 | 2007-09-12 | Nanao Corporation | Aging compensation method for a liquid crystal display device, aging compensation device for a liquid crystal display device, computer program, and liquid crystal display device |
JP4082689B2 (ja) | 2004-01-23 | 2008-04-30 | Hitachi Displays, Ltd. | Liquid crystal display device |
KR101110009B1 (ko) * | 2004-02-27 | 2012-02-06 | Kyocera Corporation | Imaging device and image generation method |
DE112005000738T5 (de) | 2004-03-29 | 2007-04-26 | Evolution Robotics, Inc., Pasadena | Method and device for position determination using reflected light sources |
US20050265731A1 (en) | 2004-05-28 | 2005-12-01 | Samsung Electronics Co.; Ltd | Wireless terminal for carrying out visible light short-range communication using camera device |
KR100617679B1 (ko) | 2004-05-28 | 2006-08-28 | Samsung Electronics Co., Ltd. | Wireless terminal performing visible-light short-range communication using a camera device |
KR100818392B1 (ko) | 2004-05-31 | 2008-04-02 | Casio Computer Co., Ltd. | Information receiving device, information transmitting device, information receiving method, and recording medium |
US7706917B1 (en) | 2004-07-07 | 2010-04-27 | Irobot Corporation | Celestial navigation system for an autonomous robot |
US7830357B2 (en) | 2004-07-28 | 2010-11-09 | Panasonic Corporation | Image display device and image display system |
WO2006013755A1 (ja) | 2004-08-05 | 2006-02-09 | Japan Science And Technology Agency | Information processing system using free-space optical communication, and free-space optical communication system |
US20060044741A1 (en) | 2004-08-31 | 2006-03-02 | Motorola, Inc. | Method and system for providing a dynamic window on a display |
WO2006033263A1 (ja) * | 2004-09-22 | 2006-03-30 | Kyocera Corporation | Optical transmission device and optical communication system |
JP2006092486A (ja) | 2004-09-27 | 2006-04-06 | Nippon Signal Co Ltd:The | LED signal lamp |
US7728893B2 (en) | 2004-11-02 | 2010-06-01 | Japan Science And Technology Agency | Imaging device and method for reading signals from such device |
CN101099186B (zh) | 2004-11-12 | 2012-01-18 | VFS Technologies Limited | Particle detector, system, and method |
US7787012B2 (en) | 2004-12-02 | 2010-08-31 | Science Applications International Corporation | System and method for video image registration in a heads up display |
CN101088295A (zh) | 2004-12-22 | 2007-12-12 | Koninklijke Philips Electronics N.V. | Scalable coding |
CA2609877C (en) | 2005-01-25 | 2015-05-26 | Tir Technology Lp | Method and apparatus for illumination and communication |
KR20060087744A (ко) | 2005-01-31 | 2006-08-03 | Samsung Electronics Co., Ltd. | Apparatus and method for displaying data using an afterimage effect in a mobile communication terminal |
JP3939327B2 (ja) | 2005-02-03 | 2007-07-04 | Mitsubishi Electric Corporation | Data transmission and reception method, and data transmission and reception device |
JP4789481B2 (ja) | 2005-02-16 | 2011-10-12 | Sharp Corporation | Image display device and data transmission system |
JP4506502B2 (ja) | 2005-02-23 | 2010-07-21 | Panasonic Electric Works Co., Ltd. | Illumination light transmission system |
JP4627084B2 (ja) * | 2005-04-12 | 2011-02-09 | Pioneer Corporation | Communication system, communication device and method, and computer program |
JP2006319545A (ja) | 2005-05-11 | 2006-11-24 | Fuji Photo Film Co Ltd | Display device and visible-light transmission/reception system |
JP4692991B2 (ja) | 2005-05-20 | 2011-06-01 | Nakagawa Laboratories, Inc. | Data transmission device and data reception device |
JP4660818B2 (ja) | 2005-06-03 | 2011-03-30 | Shimizu Corporation | Lighting system |
US20080290988A1 (en) * | 2005-06-18 | 2008-11-27 | Crawford C S Lee | Systems and methods for controlling access within a system of networked and non-networked processor-based systems |
WO2007004530A1 (ja) | 2005-06-30 | 2007-01-11 | Pioneer Corporation | Illumination light communication device and illumination light communication method |
JP2007019936A (ja) | 2005-07-08 | 2007-01-25 | Fujifilm Holdings Corp | Visible light communication system, imaging device, visible light communication preparation method, and visible light communication preparation program |
JP2007036833A (ja) | 2005-07-28 | 2007-02-08 | Sharp Corp | Digital watermark embedding method and embedding device, and digital watermark detection method and detection device |
JP4765027B2 (ja) | 2005-07-29 | 2011-09-07 | Nara Institute of Science and Technology | Information processing device and information processing system |
US7570246B2 (en) | 2005-08-01 | 2009-08-04 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Method and apparatus for communication using pulse-width-modulated visible light |
JP2007049584A (ja) | 2005-08-12 | 2007-02-22 | Casio Comput Co Ltd | Advertising support system and program |
JP4483744B2 (ja) | 2005-08-26 | 2010-06-16 | Sony Corporation | Imaging device and imaging control method |
JP4643403B2 (ja) | 2005-09-13 | 2011-03-02 | Toshiba Corporation | Visible light communication system and method |
JP2007082098A (ja) | 2005-09-16 | 2007-03-29 | Nakagawa Kenkyusho:Kk | Transmission data allocation method and optical communication system |
JP4939024B2 (ja) | 2005-09-27 | 2012-05-23 | Kyocera Corporation | Optical communication device and optical communication method |
JP4325604B2 (ja) | 2005-09-30 | 2009-09-02 | NEC Corporation | Visible light control device, visible light communication device, visible light control method, and program |
JP4739914B2 (ja) | 2005-10-28 | 2011-08-03 | Kyocera Corporation | Communication device, communication system, and communication method |
JP2007150643A (ja) | 2005-11-28 | 2007-06-14 | Sony Corp | Solid-state imaging element, method for driving a solid-state imaging element, and imaging device |
JP4371108B2 (ja) | 2005-12-27 | 2009-11-25 | Sony Corporation | Imaging device and method, recording medium, and program |
JP4600297B2 (ja) | 2006-01-11 | 2010-12-15 | Sony Corporation | System for recording information related to an object, method for recording information related to an object, television receiver, and display control method |
JP4868217B2 (ja) | 2006-01-25 | 2012-02-01 | Sony Corporation | Imaging device and method, recording medium, and program |
US20060242908A1 (en) * | 2006-02-15 | 2006-11-02 | Mckinney David R | Electromagnetic door actuator system and method |
JP2007221570A (ja) | 2006-02-17 | 2007-08-30 | Casio Comput Co Ltd | Imaging device and program therefor |
US7835649B2 (en) * | 2006-02-24 | 2010-11-16 | Cisco Technology, Inc. | Optical data synchronization scheme |
JP2007228512A (ja) | 2006-02-27 | 2007-09-06 | Kyocera Corp | Visible light communication system and information processing device |
JP4980633B2 (ja) | 2006-03-16 | 2012-07-18 | NTT Communications Corporation | Image display device, reception device, image display control method, and data reception method |
JP2007256496A (ja) | 2006-03-22 | 2007-10-04 | Fujifilm Corp | Liquid crystal display device |
JP5045980B2 (ja) | 2006-03-28 | 2012-10-10 | Casio Computer Co., Ltd. | Information transmission system, mobile-object control device, mobile-object control method, and program |
JP4610511B2 (ja) | 2006-03-30 | 2011-01-12 | Kyocera Corporation | Visible light reception device and visible light reception method |
JP2007274566A (ja) | 2006-03-31 | 2007-10-18 | Nakagawa Kenkyusho:Kk | Illumination light communication device |
JP4767747B2 (ja) | 2006-04-27 | 2011-09-07 | Kyocera Corporation | Light-emitting device for visible light communication and control method therefor |
US7599789B2 (en) | 2006-05-24 | 2009-10-06 | Raytheon Company | Beacon-augmented pose estimation |
DE102006024421B3 (de) | 2006-05-24 | 2007-10-25 | Siemens Ag | Method and arrangement for transmitting data using at least two radiation sources |
US9323055B2 (en) | 2006-05-26 | 2016-04-26 | Exelis, Inc. | System and method to display maintenance and operational instructions of an apparatus using augmented reality |
JP5162850B2 (ja) | 2006-07-10 | 2013-03-13 | Seiko Epson Corporation | Projector and image display system |
JP5256552B2 (ja) | 2006-07-10 | 2013-08-07 | NLT Technologies, Ltd. | Liquid crystal display device, and drive control circuit and drive method used in the liquid crystal display device |
JP4873623B2 (ja) | 2006-07-28 | 2012-02-08 | KDDI Corporation | Method and device for embedding a barcode in a color image, and computer program |
EP1887526A1 (en) | 2006-08-11 | 2008-02-13 | Seac02 S.r.l. | A digitally-augmented reality video system |
WO2008023583A1 (fr) | 2006-08-21 | 2008-02-28 | Panasonic Corporation | Optical space transmission device using an image sensor |
US7965274B2 (en) | 2006-08-23 | 2011-06-21 | Ricoh Company, Ltd. | Display apparatus using electrophoretic element |
JP4996175B2 (ja) * | 2006-08-29 | 2012-08-08 | Toshiba Corporation | Room entry management system and room entry management method |
US7714892B2 (en) | 2006-11-08 | 2010-05-11 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Systems, devices and methods for digital camera image stabilization |
US20100020970A1 (en) | 2006-11-13 | 2010-01-28 | Xu Liu | System And Method For Camera Imaging Data Channel |
JP2008124922A (ja) | 2006-11-14 | 2008-05-29 | Matsushita Electric Works Ltd | Lighting device and lighting system |
US20080122994A1 (en) * | 2006-11-28 | 2008-05-29 | Honeywell International Inc. | LCD based communicator system |
JP5288579B2 (ja) | 2006-12-13 | 2013-09-11 | Renesas Electronics Corporation | Display device and controller driver |
JP2008187615A (ja) | 2007-01-31 | 2008-08-14 | Canon Inc | Imaging element, imaging device, control method, and program |
JP4265662B2 (ja) | 2007-02-06 | 2009-05-20 | Denso Corporation | Vehicle communication device |
US20080205848A1 (en) | 2007-02-28 | 2008-08-28 | Victor Company Of Japan, Ltd. | Imaging apparatus and reproducing apparatus |
US8144990B2 (en) | 2007-03-22 | 2012-03-27 | Sony Ericsson Mobile Communications Ab | Translation and display of text in picture |
JP5031427B2 (ja) | 2007-03-30 | 2012-09-19 | Samsung Electronics Co., Ltd. | Visible light transmission device, visible light reception device, visible light communication system, and visible light communication method |
JP2008252466A (ja) | 2007-03-30 | 2008-10-16 | Nakagawa Kenkyusho:Kk | Optical communication system, transmission device, and reception device |
JP2008269486A (ja) | 2007-04-24 | 2008-11-06 | Olympus Corp | Imaging device and authentication method therefor |
JP4935492B2 (ja) | 2007-05-11 | 2012-05-23 | Toyota Central R&D Labs., Inc. | Optical communication device |
JP2008292397A (ja) | 2007-05-28 | 2008-12-04 | Shimizu Corp | Position information provision system using visible light communication |
WO2009008418A1 (ja) | 2007-07-11 | 2009-01-15 | Sony Corporation | Display device, video signal processing method, and program |
JP2009033338A (ja) | 2007-07-25 | 2009-02-12 | Olympus Imaging Corp | Imaging device |
JP2009036571A (ja) | 2007-07-31 | 2009-02-19 | Toshiba Corp | Position measurement system using a visible light communication system, position measurement device, and position measurement method |
JP4867874B2 (ja) | 2007-09-12 | 2012-02-01 | Fujitsu Limited | Image processing program, image processing device, and image processing method |
JP5048440B2 (ja) | 2007-09-27 | 2012-10-17 | Toyota Central R&D Labs., Inc. | Optical communication system |
JP2009117892A (ja) | 2007-11-01 | 2009-05-28 | Toshiba Corp | Visible light communication device |
JP4998232B2 (ja) | 2007-11-27 | 2012-08-15 | Seiko Epson Corporation | Imaging device and video recording device |
US9058764B1 (en) | 2007-11-30 | 2015-06-16 | Sprint Communications Company L.P. | Markers to implement augmented reality |
JP5285301B2 (ja) | 2008-02-26 | 2013-09-11 | Panasonic Corporation | Optical transmission system |
JP2009212768A (ja) | 2008-03-04 | 2009-09-17 | Victor Co Of Japan Ltd | Visible light communication optical transmission device, information provision device, and information provision system |
US8587680B2 (en) | 2008-03-10 | 2013-11-19 | Nec Corporation | Communication system, transmission device and reception device |
WO2009113415A1 (ja) | 2008-03-10 | 2009-09-17 | NEC Corporation | Communication system, control device, and reception device |
JP2009232083A (ja) | 2008-03-21 | 2009-10-08 | Mitsubishi Electric Engineering Co Ltd | Visible light communication system |
US8542906B1 (en) | 2008-05-21 | 2013-09-24 | Sprint Communications Company L.P. | Augmented reality image offset and overlay |
JP5171393B2 (ja) | 2008-05-27 | 2013-03-27 | Panasonic Corporation | Visible light communication system |
WO2009144853A1 (ja) | 2008-05-30 | 2009-12-03 | Sharp Corporation | Lighting device, display device, and light guide plate |
US20100107189A1 (en) | 2008-06-12 | 2010-04-29 | Ryan Steelberg | Barcode advertising |
US8731301B1 (en) | 2008-09-25 | 2014-05-20 | Sprint Communications Company L.P. | Display adaptation based on captured image feedback |
JP2010103746A (ja) | 2008-10-23 | 2010-05-06 | Hoya Corp | Imaging device |
JP2010117871A (ja) | 2008-11-13 | 2010-05-27 | Sony Ericsson Mobile Communications Ab | Pattern image reading method, pattern image reading device, information processing method, and pattern image reading program |
US8690335B2 (en) | 2008-11-17 | 2014-04-08 | Nec Corporation | Display apparatus which displays the first image and the synthesized image with circular polarizations having different senses of rotation from each other |
JP5185087B2 (ja) | 2008-11-25 | 2013-04-17 | Samsung Electronics Co., Ltd. | Visible light communication system and signal transmission method |
KR20100059502A (ko) | 2008-11-26 | 2010-06-04 | Samsung Electronics Co., Ltd. | Broadcasting service method and system in a visible light communication system |
GB2465793A (en) | 2008-11-28 | 2010-06-02 | Sony Corp | Estimating camera angle using extrapolated corner locations from a calibration pattern |
JP5307527B2 (ja) | 2008-12-16 | 2013-10-02 | Renesas Electronics Corporation | Display device, display panel driver, and backlight driving method |
JP5447391B2 (ja) | 2008-12-18 | 2014-03-19 | NEC Corporation | Display system, control device, display method, and program |
JP2010152285A (ja) | 2008-12-26 | 2010-07-08 | Fujifilm Corp | Imaging device |
JP2010232912A (ja) | 2009-03-26 | 2010-10-14 | Panasonic Electric Works Co Ltd | Illumination light transmission system |
JP5193124B2 (ja) | 2009-04-23 | 2013-05-08 | Hitachi Information & Control Solutions, Ltd. | Digital watermark embedding method and device |
JP2010268264A (ja) | 2009-05-15 | 2010-11-25 | Panasonic Corp | Imaging element and imaging device |
JP2010278573A (ja) | 2009-05-26 | 2010-12-09 | Panasonic Electric Works Co Ltd | Lighting control device, surreptitious-filming prevention system, and projector |
KR20100133806A (ко) | 2009-06-12 | 2010-12-22 | Samsung Electronics Co., Ltd. | Image display method and device |
JP5537841B2 (ja) | 2009-06-15 | 2014-07-02 | B-Core Inc. | Light emitter, light receiver, and related methods |
JP5515472B2 (ja) | 2009-07-13 | 2014-06-11 | Casio Computer Co., Ltd. | Imaging device, imaging method, and program |
CN101959016B (zh) | 2009-07-14 | 2012-08-22 | Altek Corporation | Power-saving method for an image capture device |
US8879735B2 (en) | 2012-01-20 | 2014-11-04 | Digimarc Corporation | Shared secret arrangements and optical data transfer |
JP5414405B2 (ja) | 2009-07-21 | 2014-02-12 | Canon Inc. | Image processing device, imaging device, and image processing method |
JP5394843B2 (ja) | 2009-07-24 | 2014-01-22 | Samsung Electronics Co., Ltd. | Transmission device, reception device, visible light communication system, and visible light communication method |
JP2011055288A (ja) | 2009-09-02 | 2011-03-17 | Toshiba Corp | Visible light communication device and data reception method |
US8731406B2 (en) | 2009-09-16 | 2014-05-20 | Samsung Electronics Co., Ltd. | Apparatus and method for generating high resolution frames for dimming and visibility support in visible light communication |
KR101621095B1 (ко) | 2009-09-16 | 2016-05-16 | Samsung Electronics Co., Ltd. | Method and apparatus for providing additional information through a display |
KR101615762B1 (ко) * | 2009-09-19 | 2016-04-27 | Samsung Electronics Co., Ltd. | Method and apparatus for outputting a visible frame in a visible light communication system providing multiple communication modes |
TWI559763B (zh) | 2009-10-01 | 2016-11-21 | Sony Semiconductor Solutions Corporation | Image acquisition device and camera system |
JP2011097141A (ja) | 2009-10-27 | 2011-05-12 | Renesas Electronics Corp | Imaging device, imaging device control method, and program |
US8831279B2 (en) | 2011-03-04 | 2014-09-09 | Digimarc Corporation | Smartphone-based methods and systems |
US8175617B2 (en) | 2009-10-28 | 2012-05-08 | Digimarc Corporation | Sensor-based mobile search, related methods and systems |
KR101654934B1 (ко) | 2009-10-31 | 2016-09-23 | Samsung Electronics Co., Ltd. | Visible light communication method and apparatus |
JP5246146B2 (ja) | 2009-12-01 | 2013-07-24 | Konica Minolta Business Technologies, Inc. | Image forming device and image reading device |
US8848059B2 (en) | 2009-12-02 | 2014-09-30 | Apple Inc. | Systems and methods for receiving infrared data with a camera designed to detect images based on visible light |
US8798479B2 (en) | 2009-12-03 | 2014-08-05 | Samsung Electronics Co., Ltd. | Controlling brightness of light sources used for data transmission |
CN101710890B (zh) | 2009-12-15 | 2013-01-02 | East China University of Science and Technology | Dual data modulation method using pulses and OFDM |
US8855496B2 (en) * | 2010-01-05 | 2014-10-07 | Samsung Electronics Co., Ltd. | Optical clock rate negotiation for supporting asymmetric clock rates for visible light communication |
JP5698764B2 (ja) | 2010-01-15 | 2015-04-08 | Koninklijke Philips N.V. | Data detection for visible light communication using a conventional camera sensor |
US8964298B2 (en) | 2010-02-28 | 2015-02-24 | Microsoft Corporation | Video display modification based on sensor input for a see-through near-to-eye display |
US8217997B2 (en) | 2010-03-16 | 2012-07-10 | Interphase Corporation | Interactive display system |
JP5436311B2 (ja) | 2010-04-02 | 2014-03-05 | Mitsubishi Electric Corporation | Information display system, information content distribution server, and display device |
JP5802997B2 (ja) | 2010-05-05 | 2015-11-04 | Digimarc Corporation | Hidden image signaling |
JP5750837B2 (ja) | 2010-05-28 | 2015-07-22 | Casio Computer Co., Ltd. | Information transmission system, information transmission method, light receiving device, light receiving method, and program |
US9183560B2 (en) | 2010-05-28 | 2015-11-10 | Daniel H. Abelow | Reality alternate |
JP2011254317A (ja) * | 2010-06-02 | 2011-12-15 | Sony Corp | Transmission device, transmission method, reception device, reception method, communication system, and communication method |
CN102474570B (zh) | 2010-06-08 | 2016-07-06 | Panasonic Intellectual Property Corporation of America | Information display device, display control integrated circuit, and display control method |
US9183675B2 (en) | 2010-08-06 | 2015-11-10 | Bizmodeline Co., Ltd. | Apparatus and method for augmented reality |
JP5561860B2 (ja) | 2010-08-19 | 2014-07-30 | Nippon Telegraph and Telephone West Corporation | Advertisement distribution device and method, and program |
WO2012026039A1 (ja) | 2010-08-27 | 2012-03-01 | Fujitsu Limited | Digital watermark embedding device, digital watermark embedding method, computer program for digital watermark embedding, and digital watermark detection device |
US8682245B2 (en) * | 2010-09-23 | 2014-03-25 | Blackberry Limited | Communications system providing personnel access based upon near-field communication and related methods |
US8891977B2 (en) * | 2010-09-29 | 2014-11-18 | Supreme Architecture Ltd. | Receiver chip and method for on-chip multi-node visible light communication |
US8523075B2 (en) * | 2010-09-30 | 2013-09-03 | Apple Inc. | Barcode recognition using data-driven classifier |
US8634725B2 (en) | 2010-10-07 | 2014-01-21 | Electronics And Telecommunications Research Institute | Method and apparatus for transmitting data using visible light communication |
JP2012095214A (ja) | 2010-10-28 | 2012-05-17 | Canon Inc | Imaging device |
JP5343995B2 (ja) | 2010-11-25 | 2013-11-13 | Casio Computer Co., Ltd. | Imaging device, imaging control method, and program |
US9112606B2 (en) | 2010-12-15 | 2015-08-18 | Electronics And Telecommunications Research Institute | Method and apparatus for transmitting and receiving data using visible light communication |
TWM404929U (en) | 2011-01-03 | 2011-06-01 | Univ Kun Shan | LED luminaries with lighting and communication functions |
US8553146B2 (en) | 2011-01-26 | 2013-10-08 | Echostar Technologies L.L.C. | Visually imperceptible matrix codes utilizing interlacing |
US9571888B2 (en) | 2011-02-15 | 2017-02-14 | Echostar Technologies L.L.C. | Selection graphics overlay of matrix code |
CN102654400A (zh) | 2011-03-01 | 2012-09-05 | Ding Mei | Pseudo-random barcode applied to digital level barcode staffs |
WO2012120853A1 (ja) | 2011-03-04 | 2012-09-13 | The University of Tokushima | Information provision method and information provision device |
WO2012123572A1 (en) | 2011-03-16 | 2012-09-20 | Siemens Aktiengesellschaft | A method and device for notification in a system for visible-light communication |
JP2012195763A (ja) | 2011-03-16 | 2012-10-11 | Seiwa Electric Mfg Co Ltd | Electronic equipment and data collection system |
EP2503852A1 (en) * | 2011-03-22 | 2012-09-26 | Koninklijke Philips Electronics N.V. | Light detection system and method |
JP2012205168A (ja) | 2011-03-28 | 2012-10-22 | Toppan Printing Co Ltd | Video processing device, video processing method, and video processing program |
US9721388B2 (en) | 2011-04-20 | 2017-08-01 | Nec Corporation | Individual identification character display system, terminal device, individual identification character display method, and computer program |
US8256673B1 (en) | 2011-05-12 | 2012-09-04 | Kim Moon J | Time-varying barcode in an active display |
US9667823B2 (en) * | 2011-05-12 | 2017-05-30 | Moon J. Kim | Time-varying barcode in an active display |
JP2012244549A (ja) | 2011-05-23 | 2012-12-10 | Nec Commun Syst Ltd | Image sensor communication device and method |
JP2013029816A (ja) | 2011-06-20 | 2013-02-07 | Canon Inc | Display device |
EP2538584B1 (en) | 2011-06-23 | 2018-12-05 | Casio Computer Co., Ltd. | Information Transmission System, and Information Transmission Method |
US8866391B2 (en) | 2011-07-26 | 2014-10-21 | ByteLight, Inc. | Self identifying modulated light source |
US8334901B1 (en) * | 2011-07-26 | 2012-12-18 | ByteLight, Inc. | Method and system for modulating a light source in a light based positioning system using a DC bias |
US8964016B2 (en) * | 2011-07-26 | 2015-02-24 | ByteLight, Inc. | Content delivery based on a light positioning system |
JP2013042221A (ja) | 2011-08-11 | 2013-02-28 | Panasonic Corp | Communication terminal, communication method, marker device, and communication system
US9337926B2 (en) | 2011-10-31 | 2016-05-10 | Nokia Technologies Oy | Apparatus and method for providing dynamic fiducial markers for devices |
GB2496379A (en) * | 2011-11-04 | 2013-05-15 | Univ Edinburgh | A freespace optical communication system which exploits the rolling shutter mechanism of a CMOS camera |
KR101961887B1 (ko) | 2011-11-30 | 2019-03-25 | 삼성전자주식회사 | Wireless optical communication system and wireless optical communication method using the same
KR20130093699A (ko) * | 2011-12-23 | 2013-08-23 | 삼성전자주식회사 | Optical information transmitting device and optical information receiving device
US20130169663A1 (en) | 2011-12-30 | 2013-07-04 | Samsung Electronics Co., Ltd. | Apparatus and method for displaying images and apparatus and method for processing images |
US20130212453A1 (en) | 2012-02-10 | 2013-08-15 | Jonathan Gudai | Custom content display application with dynamic three dimensional augmented reality |
JP2013197849A (ja) | 2012-03-19 | 2013-09-30 | Toshiba Corp | Visible light communication transmitting device, visible light communication receiving device, and visible light communication system
US9450671B2 (en) * | 2012-03-20 | 2016-09-20 | Industrial Technology Research Institute | Transmitting and receiving apparatus and method for light communication, and the light communication system thereof |
JP2013201541A (ja) * | 2012-03-23 | 2013-10-03 | Toshiba Corp | Receiving device, transmitting device, and communication system
KR101887548B1 (ko) | 2012-03-23 | 2018-08-10 | 삼성전자주식회사 | Method and apparatus for processing media files for an augmented reality service
US8794529B2 (en) | 2012-04-02 | 2014-08-05 | Mobeam, Inc. | Method and apparatus for communicating information via a display screen using light-simulated bar codes |
JP2013223043A (ja) | 2012-04-13 | 2013-10-28 | Toshiba Corp | Light receiving device and transmission system
JP2013223209A (ja) | 2012-04-19 | 2013-10-28 | Panasonic Corp | Imaging processing device
CN102684869B (zh) | 2012-05-07 | 2016-04-27 | 深圳光启智能光子技术有限公司 | Decryption method and system based on visible light communication
WO2013166958A1 (zh) | 2012-05-07 | 2013-11-14 | 深圳光启创新技术有限公司 | Encryption, decryption, and encryption/decryption methods and systems based on visible light communication
JP5902995B2 (ja) | 2012-05-10 | 2016-04-13 | 株式会社フジクラ | Movement system using LED tubes, movement method, and LED tube
WO2013171954A1 (ja) | 2012-05-17 | 2013-11-21 | パナソニック株式会社 | Imaging device, semiconductor integrated circuit, and imaging method
US9166810B2 (en) | 2012-05-24 | 2015-10-20 | Panasonic Intellectual Property Corporation Of America | Information communication device of obtaining information by demodulating a bright line pattern included in an image |
CN102811284A (zh) | 2012-06-26 | 2012-12-05 | 深圳市金立通信设备有限公司 | Method for automatically translating voice input into a target language
KR101391128B1 (ko) | 2012-07-06 | 2014-05-02 | 주식회사 아이디로 | OLED display device for visible light communication
US20140055420A1 (en) | 2012-08-23 | 2014-02-27 | Samsung Electronics Co., Ltd. | Display identification system and display device |
US20140079281A1 (en) | 2012-09-17 | 2014-03-20 | Gravity Jack, Inc. | Augmented reality creation and consumption |
CN104704535A (zh) | 2012-10-02 | 2015-06-10 | 索尼公司 | Augmented reality system
WO2014057632A1 (ja) | 2012-10-09 | 2014-04-17 | パナソニック株式会社 | Lighting fixture and visible light communication system using the same
US9667865B2 (en) | 2012-11-03 | 2017-05-30 | Apple Inc. | Optical demodulation using an image sensor |
US8988574B2 (en) | 2012-12-27 | 2015-03-24 | Panasonic Intellectual Property Corporation Of America | Information communication method for obtaining information using bright line image |
US8922666B2 (en) | 2012-12-27 | 2014-12-30 | Panasonic Intellectual Property Corporation Of America | Information communication method |
JP5608834B1 (ja) | 2012-12-27 | 2014-10-15 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | Video display method
US9088360B2 (en) | 2012-12-27 | 2015-07-21 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US9087349B2 (en) | 2012-12-27 | 2015-07-21 | Panasonic Intellectual Property Corporation Of America | Information communication method |
WO2014103333A1 (ja) | 2012-12-27 | 2014-07-03 | パナソニック株式会社 | Display method
US8913144B2 (en) | 2012-12-27 | 2014-12-16 | Panasonic Intellectual Property Corporation Of America | Information communication method |
JP5590431B1 (ja) | 2012-12-27 | 2014-09-17 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | Information communication method
US9560284B2 (en) | 2012-12-27 | 2017-01-31 | Panasonic Intellectual Property Corporation Of America | Information communication method for obtaining information specified by striped pattern of bright lines |
JP5606655B1 (ja) | 2012-12-27 | 2014-10-15 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | Information communication method
US9608725B2 (en) | 2012-12-27 | 2017-03-28 | Panasonic Intellectual Property Corporation Of America | Information processing program, reception program, and information processing apparatus |
WO2014103329A1 (ja) | 2012-12-27 | 2014-07-03 | パナソニック株式会社 | Visible light communication signal display method and display device
CN105027473B (zh) | 2013-03-12 | 2017-12-08 | 飞利浦灯具控股公司 | Communication system, lighting system, and method of transmitting information
US9705594B2 (en) | 2013-03-15 | 2017-07-11 | Cree, Inc. | Optical communication for solid-state light sources |
US9407367B2 (en) | 2013-04-25 | 2016-08-02 | Beijing Guo Cheng Wan Tong Information Co. Ltd | Methods and devices for transmitting/obtaining information by visible light signals |
JP6183802B2 (ja) * | 2013-06-04 | 2017-08-23 | ユニバーリンク株式会社 | Visible light reception method and device therefor
- 2013
- 2013-12-27 JP JP2014514660A patent/JP5590431B1/ja active Active
- 2013-12-27 WO PCT/JP2013/007708 patent/WO2014103340A1/ja active Application Filing
- 2013-12-27 BR BR112015014733A patent/BR112015014733A2/pt active Search and Examination
- 2013-12-27 SG SG11201505027UA patent/SG11201505027UA/en unknown
- 2013-12-27 MX MX2015008253A patent/MX343578B/es active IP Right Grant
- 2013-12-27 EP EP13869275.1A patent/EP2940902B1/en active Active
- 2013-12-27 US US14/142,372 patent/US9085927B2/en active Active
- 2013-12-27 EP EP13867350.4A patent/EP2940893B1/en active Active
- 2013-12-27 CN CN201380067922.8A patent/CN105874728B/zh active Active
- 2013-12-27 SG SG10201610410WA patent/SG10201610410WA/en unknown
- 2013-12-27 WO PCT/JP2013/007709 patent/WO2014103341A1/ja active Application Filing
- 2013-12-27 AU AU2013367893A patent/AU2013367893B2/en active Active
- 2013-12-27 JP JP2014554163A patent/JP6294235B2/ja active Active
- 2013-12-27 US US14/142,413 patent/US9341014B2/en active Active
- 2013-12-27 CN CN201380067923.2A patent/CN104956609B/zh active Active
- 2015
- 2015-06-24 CL CL2015001829A patent/CL2015001829A1/es unknown
- 2016
- 2016-03-03 US US15/060,027 patent/US9467225B2/en active Active
- 2016-08-11 US US15/234,135 patent/US9571191B2/en active Active
- 2016-12-20 US US15/384,481 patent/US10148354B2/en active Active
- 2018
- 2018-02-15 JP JP2018025434A patent/JP6616440B2/ja active Active
- 2018-10-05 US US16/152,995 patent/US10447390B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002290335A (ja) | 2001-03-28 | 2002-10-04 | Sony Corp | Optical space transmission device
JP2006121466A (ja) * | 2004-10-22 | 2006-05-11 | Nec Corp | Imaging element, imaging module, and portable terminal
JP2010226172A (ja) * | 2009-03-19 | 2010-10-07 | Casio Computer Co Ltd | Information restoration device and information restoration method
JP2012010269A (ja) * | 2010-06-28 | 2012-01-12 | Outstanding Technology:Kk | Visible light communication transmitter
JP2012169189A (ja) * | 2011-02-15 | 2012-09-06 | Koito Mfg Co Ltd | Light emitting module and vehicle lamp
JP2013223047A (ja) * | 2012-04-13 | 2013-10-28 | Toshiba Corp | Transmission system, transmitting device, and receiving device
Non-Patent Citations (2)
Title |
---|
Dai Yamanaka et al.: "An Investigation for the Adoption of Subcarrier Modulation to Wireless Visible Light Communication using Imaging Sensor", IEICE Technical Report (Tsushin Hoshiki), vol. 106, no. 450, 4 January 2007 (2007-01-04), pages 25-30, XP008177080 *
See also references of EP2940902A4 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017169066A1 (ja) * | 2016-03-28 | 2017-10-05 | ソニー株式会社 | Electronic apparatus
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6616440B2 (ja) | Program, control method, and information communication device | |
JP6524132B2 (ja) | Information communication method, information communication device, and program | |
JP5606653B1 (ja) | Information communication method | |
JP6378511B2 (ja) | Information communication method, information communication device, and program | |
WO2014103156A1 (ja) | Information communication method | |
JP5608307B1 (ja) | Information communication method | |
JP5607277B1 (ja) | Information communication method | |
JP6849773B2 (ja) | Program, control method, and control device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13869275 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2014554163 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: P815/2015 Country of ref document: AE |
WWE | Wipo information: entry into national phase |
Ref document number: 2013869275 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2015/008253 Country of ref document: MX |
NENP | Non-entry into the national phase |
Ref country code: DE |
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112015014733 Country of ref document: BR |
ENP | Entry into the national phase |
Ref document number: 2013367893 Country of ref document: AU Date of ref document: 20131227 Kind code of ref document: A |
ENP | Entry into the national phase |
Ref document number: 112015014733 Country of ref document: BR Kind code of ref document: A2 Effective date: 20150619 |