CN112118045A - Position acquisition device, position acquisition method, and recording medium

Info

Publication number: CN112118045A
Application number: CN202010422144.1A
Authority: CN (China)
Prior art keywords: light, image, light receiving, receiving surface, set value
Legal status: Granted; Active
Priority date: 2019-06-21
Filing date: 2020-05-18
Publication date: 2020-12-22 (CN112118045A); 2023-10-27 (grant, CN112118045B)
Other languages: Chinese (zh)
Other versions: CN112118045B (en)
Inventors: 宫本直知, 山村贤二
Current Assignee: Casio Computer Co Ltd
Original Assignee: Casio Computer Co Ltd
Application filed by: Casio Computer Co Ltd

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B10/00Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/11Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
    • H04B10/114Indoor or close-range type systems
    • H04B10/116Visible light communication

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a position acquisition device, a position acquisition method, and a recording medium that improve the accuracy with which the position of emitted light is obtained by a sensor that receives the light. When a member or the like that reflects light is present near the installation position of the light source (102a) or the like and reflected light is detected for one ID in addition to the light from the light source (102a) or the like, the server (200) determines, among the plurality of image positions corresponding to that ID, the image position whose luminance value is equal to or greater than a set value and is the maximum to be the light from the light source (102a) or the like. The server (200) selects the image position having the maximum luminance value and uses it in calculating the installation position of the light source (102a) or the like.

Description

Position acquisition device, position acquisition method, and recording medium
Technical Field
The invention relates to a position acquisition device, a position acquisition method and a recording medium.
Background
In recent years, techniques have been studied for transmitting information by modulating it onto the luminance and color of light in the visible wavelength region.
Further, as described in Japanese Patent Application Laid-Open No. 2003-179556, there is also a technique in which an image pickup device receives the modulated visible light simultaneously and in parallel, and associates each light-receiving position on the image sensor with the demodulated information.
However, in the above-described technique, when a member or the like that reflects light is present around the installation position of the communication indicator that transmits information, it can be difficult for the receiving-side apparatus to distinguish whether the light received by the image sensor originates from the light emission of the communication indicator or from reflected light.
Disclosure of Invention
The present invention has been made in view of the above problem, and an object of the present invention is to improve the accuracy of obtaining the position where light is emitted.
In order to achieve the above object, a position acquisition device according to the present invention includes: a light receiving unit having a light receiving surface for receiving light; a determination unit that determines whether or not a plurality of regions having brightness equal to or greater than a set value are present in the light-receiving surface within a set range; and an acquisition unit that acquires a position of a brightest region on the light receiving surface when the determination unit determines that there are a plurality of regions having brightness equal to or greater than the set value.
In order to achieve the above object, a position acquisition method according to the present invention includes: a determination step of determining whether or not a plurality of regions having brightness equal to or higher than a set value exist within a set range on a light receiving surface of the light receiving portion that receives light; and an obtaining step of obtaining a position of a brightest region on the light receiving surface when it is determined in the determining step that there are a plurality of regions having brightness equal to or higher than the set value.
In order to achieve the above object, a recording medium of the present invention is a recording medium that records a program readable by a computer provided in a light receiving device, the program causing the computer to function as: a determination unit that determines whether or not a plurality of regions having brightness equal to or greater than a set value are present in a set range on a light receiving surface of the light receiving device; and an acquisition unit that acquires a position of a brightest region on the light receiving surface when the determination unit determines that there are a plurality of regions having brightness equal to or greater than the set value.
Advantageous Effects of Invention
According to the present invention, the accuracy of obtaining the position where light is emitted can be improved.
Drawings
Fig. 1 is a diagram showing an example of a visible light communication system according to an embodiment of the present invention.
Fig. 2 is a diagram showing an example of the configuration of the server according to the embodiment.
Fig. 3 is a diagram showing an example of a light-emitting region according to this embodiment.
Fig. 4 is a flowchart showing an example of image position selection by the server according to the embodiment.
Fig. 5 is a flowchart showing an example of calculation of the installation position by the server according to the embodiment.
Detailed Description
A visible light communication system according to an embodiment of the present invention will be described below with reference to the drawings.
Fig. 1 is a diagram showing the configuration of the visible light communication system. As shown in fig. 1, the visible light communication system 1 includes devices 100a, 100b, and 100c provided in a space 500 (hereinafter referred to collectively as "device 100" when they need not be distinguished) and a server 200 corresponding to the position acquisition device.
The devices 100a, 100b, and 100c are disposed within the space 500. The device 100a is equipped with a light source 102a as a communication indicator, the device 100b with a light source 102b, and the device 100c with a light source 102c (hereinafter referred to collectively as "light source 102" when they need not be distinguished). The server 200 is equipped with cameras 201a, 201b, 201c, and 201d, which correspond to the light receiving unit and the imaging unit (hereinafter referred to collectively as "camera 201" when they need not be distinguished). The light source 102 includes an LED (Light Emitting Diode), not shown. The light source 102 corresponds to a position information acquisition target.
In the present embodiment, the light source 102 attached to the device 100 emits light modulated with the various pieces of information to be transmitted, such as the state of the device 100. The server 200, in turn, demodulates the changes in emission color in the images obtained by time-series continuous imaging by the camera 201 and thereby acquires the various pieces of transmitted information.
In the present embodiment, the positions and imaging directions of the cameras 201a to 201d are known. The server 200 sets combinations of 2 cameras 201 (camera pairs) from among the cameras 201a to 201d and holds, for each camera pair, a conversion matrix for converting the positions (two-dimensional coordinate information: image positions) in the captured images into a position (installation position) in the space 500.
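For illustration only, the following minimal sketch in Python (not part of the disclosure) shows how the 6 camera pairs formed from the cameras 201a to 201d could be enumerated and associated with per-pair conversion data; the 3x4 projection matrices used as per-camera placeholders are an assumption, since the embodiment only states that a conversion matrix is held for each camera pair.

# Minimal sketch (assumption, not the disclosed implementation): enumerate the
# 6 camera pairs that can be formed from cameras 201a to 201d and associate
# each pair with per-camera 3x4 projection matrices from which an
# image-position-to-installation-position conversion can be derived.
from itertools import combinations

import numpy as np

CAMERA_IDS = ("201a", "201b", "201c", "201d")

# Hypothetical placeholders: real values would come from the known positions
# and imaging directions of the cameras (and change when the lens 203 moves).
PROJECTION_MATRICES = {cam: np.eye(3, 4) for cam in CAMERA_IDS}

CAMERA_PAIRS = list(combinations(CAMERA_IDS, 2))          # 6 pairs
PAIR_CONVERSIONS = {
    pair: (PROJECTION_MATRICES[pair[0]], PROJECTION_MATRICES[pair[1]])
    for pair in CAMERA_PAIRS
}
assert len(CAMERA_PAIRS) == 6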
Fig. 2 is a diagram showing an example of the configuration of the server 200. As shown in fig. 2, the server 200 includes a control unit 202, an image input unit 204, a memory 205, an operation unit 206, a display unit 207, and a communication unit 208. The cameras 201a to 201d are connected to the server 200 by wiring.
The camera 201a includes a lens 203a, the camera 201b includes a lens 203b, the camera 201c includes a lens 203c, and the camera 201d includes a lens 203d (hereinafter referred to collectively as "lens 203" when they need not be distinguished). The lens 203 is constituted by a zoom lens or the like. The lens 203 is moved by a zoom control operation from the operation unit 206 and by focus control by the control unit 202. The movement of the lens 203 controls the imaging angle of view and the optical image captured by the camera 201, and the conversion matrix changes accordingly.
The cameras 201a to 201d each have a light receiving surface on which a plurality of light receiving elements are regularly arrayed in two dimensions. The light receiving elements are imaging devices such as CCDs (Charge Coupled Devices) or CMOS (Complementary Metal Oxide Semiconductor) sensors. Based on a control signal from the control unit 202, the cameras 201a to 201d image (receive light from) the optical image entering via the lens 203 within a predetermined imaging angle of view and generate frames by converting the image signal within the imaging angle of view into digital data. The cameras 201a to 201d perform imaging and frame generation continuously in time and output the consecutive frames to the image input unit 204 in the server 200.
The frame (digital data) output from the camera 201 based on the control signal from the control unit 202 is input to the image input unit 204.
The control unit 202 is constituted by, for example, a CPU (Central Processing Unit). The control unit 202 controls the various functions of the server 200 by executing software processing in accordance with a program stored in the memory 205 (for example, a program for realizing the operation of the server 200 shown in fig. 4 and fig. 5 described later).
The memory 205 is, for example, a RAM (Random Access Memory) or a ROM (Read Only Memory). The memory 205 stores various kinds of information (programs and the like) used for control and the like in the server 200.
The operation unit 206 is an interface for inputting the operation contents of a user and is composed of a keypad, function keys, and the like. The display unit 207 is configured by, for example, an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), or an EL (Electro Luminescence) display. The display unit 207 displays an image in accordance with the image signal output from the control unit 202. The communication unit 208 is, for example, a LAN (Local Area Network) card. The communication unit 208 communicates with external communication devices under the control of the communication control unit 242.
The control unit 202 includes an image processing unit 231, a luminance determination unit 232 corresponding to the determination means, an image position acquisition unit 234 corresponding to the acquisition means, an installation position acquisition unit 236, and a communication control unit 242.
The image processing unit 231 displays the frames (digital data) output from the cameras 201 and input to the image input unit 204 on the display unit 207 in real time, and performs peripheral dimming correction and distortion correction on the frames to adjust the image quality and the image size. The image processing unit 231 also has a function of encoding, when a control signal based on a recording instruction operation from the operation unit 206 is input, the optical image within the imaging angle of view of the camera 201 at the time of the recording instruction, or within the display range displayed on the display unit 207, and converting it into a file in a compression encoding format such as JPEG (Joint Photographic Experts Group).
The luminance determination unit 232 detects the position (two-dimensional coordinate information: image position) of the image of the light source 102 in each image captured by the cameras 201a to 201d. Here, the light sources 102a, 102b, and 102c in the space 500 emit light whose color changes cyclically among R (red), G (green), and B (blue) in a pattern modulated with an ID (identification) that uniquely identifies each light source. The luminance determination unit 232 detects, in each image captured by the cameras 201a to 201d, light whose color changes cyclically in a predetermined pattern, and then attempts to demodulate the detected emission pattern of the three colors into an ID.
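The modulation format itself is not specified in this excerpt; purely for illustration, the sketch below assumes that each frame contributes one color symbol in {R, G, B} and that an ID is a fixed-length base-3 value, with classify_colour and demodulate_id as hypothetical helper names.

# Illustrative sketch only: the actual modulation scheme is not given here.
# Assumption: one color symbol per frame, ID encoded as a fixed-length
# base-3 number over the symbols R=0, G=1, B=2.
import numpy as np

SYMBOL_VALUE = {"R": 0, "G": 1, "B": 2}
REFERENCE_RGB = {
    "R": np.array([255.0, 0.0, 0.0]),
    "G": np.array([0.0, 255.0, 0.0]),
    "B": np.array([0.0, 0.0, 255.0]),
}

def classify_colour(mean_rgb):
    """Map the mean RGB of a light-emitting region to its nearest pure color."""
    return min(REFERENCE_RGB, key=lambda c: np.linalg.norm(mean_rgb - REFERENCE_RGB[c]))

def demodulate_id(frame_colours, length=6):
    """Decode a base-3 ID from `length` consecutive per-frame mean RGB values."""
    if len(frame_colours) < length:
        return None                      # not enough frames observed yet
    digits = [SYMBOL_VALUE[classify_colour(rgb)] for rgb in frame_colours[:length]]
    return sum(d * 3 ** i for i, d in enumerate(reversed(digits)))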
Here, when a member or the like that reflects light is present around the installation position of the light source 102, reflected light or the like may be detected for a single ID in addition to the light from the light source 102. Fig. 3 is a diagram showing an example of light-emitting regions on the light receiving surface of one camera 201. In fig. 3, the light-emitting regions 300a, 300b, and 300c on the light receiving surface 250a are the light-emitting regions (image positions) corresponding to one ID. The light-emitting region 300a corresponds to the light from the light source 102, and the light-emitting regions 300b and 300c correspond to reflected light. In this case, the luminance determination unit 232 detects an image position for each of the plurality of light-emitting regions corresponding to that one ID.
When a plurality of image positions are detected for one ID, the luminance determination unit 232 calculates a luminance value for each of them. The image position acquisition unit 234 then determines the image position whose luminance value is equal to or greater than the set value and is the maximum among them to be the light from the light source 102, and selects that image position.
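The selection rule just described can be sketched as follows; the Detection container and the way luminance values are supplied are assumptions made for illustration.

# Sketch of the selection rule above (data layout assumed): among the image
# positions detected for one ID, keep only the one whose luminance value is
# equal to or greater than the set value and is the maximum.
from dataclasses import dataclass

@dataclass
class Detection:
    position: tuple          # (x, y) image position on the light receiving surface
    luminance: float         # measured luminance value

def select_light_source_position(detections, set_value):
    """Return the brightest detection at or above set_value, or None if none qualify."""
    candidates = [d for d in detections if d.luminance >= set_value]
    if not candidates:
        return None
    return max(candidates, key=lambda d: d.luminance)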
The installation position acquisition unit 236 sets combinations (camera pairs) of 2 cameras 201 from the cameras 201a to 201d. There are 6 possible combinations (camera pairs) of 2 cameras 201 chosen from the 4 cameras 201.
For each camera pair, the installation position acquisition unit 236 attempts to acquire, from both of the images captured by the 2 cameras 201 of the pair, the region of light modulated with the same ID, using the image position determined to be the light from the light source 102, that is, the image position whose luminance value is equal to or greater than the set value and is the maximum. If the ID can be acquired, the installation position acquisition unit 236 determines that the light source 102 corresponding to that ID is available.
Next, for each camera pair, the installation position acquisition unit 236 acquires the position (Xga2, Yga2) of the image of the light source 102 on the light receiving surface of one of the 2 cameras 201 of the pair and the position (Xgb2, Ygb2) of the image of the light source 102 on the light receiving surface of the other camera 201. The installation position acquisition unit 236 then calculates the installation position (Xk2, Yk2, Zk2) of the light source 102 in the space 500 using the combination of the two image positions (Xga2, Yga2) and (Xgb2, Ygb2) and the conversion matrix corresponding to the camera pair.
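The embodiment only states that the conversion matrix held for the camera pair turns the two image positions (Xga2, Yga2) and (Xgb2, Ygb2) into the installation position (Xk2, Yk2, Zk2); one conventional way to realize such a conversion is linear (DLT) triangulation from the two cameras' 3x4 projection matrices, sketched below as an assumption rather than as the disclosed method.

# Assumed realization of the per-pair conversion: linear (DLT) triangulation
# from two 3x4 projection matrices p_a and p_b.
import numpy as np

def triangulate(p_a, xy_a, p_b, xy_b):
    """Return (Xk2, Yk2, Zk2) from image positions (Xga2, Yga2) and (Xgb2, Ygb2)."""
    xa, ya = xy_a
    xb, yb = xy_b
    a = np.vstack([
        xa * p_a[2] - p_a[0],
        ya * p_a[2] - p_a[1],
        xb * p_b[2] - p_b[0],
        yb * p_b[2] - p_b[1],
    ])
    _, _, vt = np.linalg.svd(a)          # homogeneous least-squares solution
    x = vt[-1]
    return x[:3] / x[3]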
Next, the operation of the server 200 will be described with reference to a flowchart. Fig. 4 is a flowchart showing an example of image position selection performed by the server 200.
The plurality of cameras 201 take an image of the space 500 (step S101). The luminance determination unit 232 in the control unit 202 acquires the image position of the light emitting region in each image obtained by the imaging by the plurality of cameras 201. Further, the luminance determination unit 232 tries to acquire an ID corresponding to the image position (step S102).
Next, the luminance determination unit 232 selects 1 of the plurality of cameras 201 (step S103).
Next, the luminance determination unit 232 identifies the image positions corresponding to 1 ID in the image captured by the camera 201 selected in step S103. The luminance determination unit 232 then measures the luminance value of each image position corresponding to that ID and determines whether there is only 1 image position whose luminance value is equal to or greater than a preset threshold value, that is, the set value (step S104). The luminance values may be stored in the memory 205 together with the ID, identification information of the camera 201 that captured the image, and the imaging date and time.
When it is determined that there is only 1 image position having a luminance value equal to or greater than the set value among the image positions corresponding to 1 ID (yes in step S104), the image position acquisition unit 234 selects that 1 image position (step S105).
On the other hand, when it is determined that there is not only 1 image position having a luminance value equal to or greater than the set value among the image positions corresponding to 1 ID (no in step S104), the luminance determination unit 232 determines whether a plurality of image positions having luminance values equal to or greater than the set value exist within a predetermined range in the image captured by the camera 201 selected in step S103 (step S106).
When it is determined that a plurality of image positions having luminance values equal to or greater than the set value exist within the predetermined range (yes in step S106), the image position acquisition unit 234 selects the image position having the maximum luminance value among them (step S107).
Next, the image position acquisition unit 234 excludes the 1 or more image positions not selected in step S107 from the targets of the installation position calculation (step S108).
Next, the image position acquisition unit 234 notifies the user, by a display on the display unit 207 or the like, that 1 or more image positions corresponding to the 1 ID have been excluded (step S109). This enables the user to remove the object causing the reflection or the like, or to set, within the space 500, an area to be excluded from the installation position calculation.
After 1 image position having a luminance value equal to or greater than the set value is selected in step S105, after it is determined in step S106 that a plurality of such image positions do not exist within the predetermined range (no in step S106), or after the exclusion from the installation position calculation is notified in step S109, the luminance determination unit 232 determines whether the selection in step S103 has been completed for all the cameras 201 (step S110). If it is determined that the selection in step S103 has been completed for all the cameras 201 (yes in step S110), the series of operations ends. On the other hand, if it is determined that there is a camera 201 for which the selection in step S103 has not been completed (no in step S110), the operations from step S103 onward are repeated.
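As an illustrative sketch of the flow of fig. 4 (steps S103 to S110), assuming the Detection container from the earlier sketch and an images[camera_id][id_value] layout holding the detections per camera and per ID; the max_distance range test and the notify callback are likewise assumptions.

# Sketch of the selection flow of fig. 4 under assumed data structures:
# images[camera_id][id_value] is the list of Detection objects found for one
# ID in that camera's image (see the earlier Detection sketch).
def within_range(detections, max_distance):
    """True if all detected positions lie within the predetermined range."""
    xs = [d.position[0] for d in detections]
    ys = [d.position[1] for d in detections]
    return (max(xs) - min(xs)) <= max_distance and (max(ys) - min(ys)) <= max_distance

def select_image_positions(images, set_value, max_distance, notify=print):
    selected = {}                                    # (camera_id, id_value) -> Detection
    for camera_id, per_id in images.items():         # S103 / S110: one camera at a time
        for id_value, detections in per_id.items():
            bright = [d for d in detections if d.luminance >= set_value]
            if len(bright) == 1:                     # S104 yes -> S105
                selected[(camera_id, id_value)] = bright[0]
            elif len(bright) > 1 and within_range(bright, max_distance):
                best = max(bright, key=lambda d: d.luminance)        # S107
                selected[(camera_id, id_value)] = best
                excluded = [d for d in bright if d is not best]      # S108
                notify(f"camera {camera_id}, ID {id_value}: "
                       f"{len(excluded)} image position(s) excluded")  # S109
    return selected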
After the above-described image position selection operation, the installation position of the light source 102 is calculated. Fig. 5 is a flowchart showing an example of calculation of the installation position by the server 200.
The installation position acquisition unit 236 selects 1 ID corresponding to the image positions selected in step S105 or step S107 in fig. 4 (step S201).
Next, the installation position acquisition unit 236 selects a camera pair that captured the light source 102 corresponding to the 1 ID selected in step S201 (step S202). When a plurality of such camera pairs are set, the installation position acquisition unit 236 may select 1 of them. For example, the installation position acquisition unit 236 may select, as the camera pair, the 2 cameras 201 corresponding to the 2 image positions having the highest luminance values.
Next, the installation position acquisition unit 236 calculates the installation position of the light source 102 based on the image positions of the image of the light source 102 corresponding to the 1 ID selected in step S201, among the image positions included in the images captured by the 2 cameras 201 of the camera pair selected in step S202 (step S203). Here, the installation position acquisition unit 236 acquires the position of the image of the light source 102 on the light receiving surface of one of the 2 cameras 201 included in the camera pair and the position of the image of the light source 102 on the light receiving surface of the other camera 201. The installation position acquisition unit 236 then calculates the installation position of the light source 102 in the space 500 using the combination of the two image positions and the conversion matrix corresponding to the camera pair.
Thereafter, the installation position acquisition unit 236 determines whether all IDs have been selected in step S201 (step S204). If it is determined that all IDs have been selected (yes in step S204), the series of operations ends. On the other hand, if it is determined that all IDs have not been selected in step S201, that is, if it is determined that there are unselected IDs (no in step S204), the operations from step S201 onward are repeated.
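A corresponding sketch of the flow of fig. 5 (steps S201 to S204), reusing triangulate and PAIR_CONVERSIONS from the earlier sketches; selecting the pair whose two image positions have the highest luminance values follows the example given for step S202, and the data layout is again an assumption.

# Sketch of the installation-position flow of fig. 5. `selected` is the output
# of select_image_positions: (camera_id, id_value) -> Detection.
def calculate_installation_positions(selected, pair_conversions):
    positions = {}                                           # id_value -> (Xk2, Yk2, Zk2)
    for id_value in {idv for (_, idv) in selected}:          # S201: take each ID once
        observers = sorted(
            ((cam, det) for (cam, idv), det in selected.items() if idv == id_value),
            key=lambda item: item[1].luminance, reverse=True)
        if len(observers) < 2:
            continue                                         # a camera pair is required
        (cam_a, det_a), (cam_b, det_b) = observers[:2]       # S202: two brightest views
        pair = tuple(sorted((cam_a, cam_b)))
        p_first, p_second = pair_conversions[pair]
        det_for = {cam_a: det_a, cam_b: det_b}
        positions[id_value] = triangulate(                   # S203
            p_first, det_for[pair[0]].position,
            p_second, det_for[pair[1]].position)
    return positions                                         # S204: all IDs processed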
As described above, in the present embodiment, when a member or the like that reflects light is present in the vicinity of the installation position of the light source 102 and reflected light is detected for 1 ID in addition to the light from the light source 102, the server 200 determines, among the plurality of image positions corresponding to that ID, the image position whose luminance value is equal to or greater than the set value and is the maximum to be the light from the light source 102, selects that image position, and uses it to calculate the installation position of the light source 102. This makes it possible to exclude the image positions of reflected light or the like, which have lower luminance values, from the calculation of the installation position of the light source 102, thereby improving the accuracy of the calculation.
The present invention is not limited to the description of the above embodiments and the drawings, and modifications and the like can be added to the above embodiments and the drawings as appropriate.
For example, in the above-described embodiment, among a plurality of image positions corresponding to 1 ID, the image position whose luminance value is equal to or greater than the set value and is the maximum is determined to be the light from the light source 102. However, when there are a plurality of image positions having luminance values equal to or greater than the set value, the image position closest to the center of the image may instead be determined to be the light from the light source 102, since distortion is smaller the closer an image position is to the center.
Alternatively, the image position having the maximum luminance value may simply be determined to be the light from the light source 102.
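A minimal sketch of the closest-to-the-center variant described above (the image width and height are assumed inputs):

# Sketch of the closest-to-the-center variant: among the positions at or above
# the set value, choose the one nearest the center of the light receiving surface.
def select_closest_to_center(detections, set_value, width, height):
    cx, cy = width / 2.0, height / 2.0
    candidates = [d for d in detections if d.luminance >= set_value]
    if not candidates:
        return None
    return min(candidates,
               key=lambda d: (d.position[0] - cx) ** 2 + (d.position[1] - cy) ** 2)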
In the above-described embodiment, the server 200 selects the camera pair with the highest reliability for the light source 102 and calculates the installation position of the light source 102 based on the images captured by that camera pair. However, the method of calculating the installation position is not limited to this.
The light source 102 is not limited to an LED. For example, the light source may be formed by a part of an LCD, a PDP, an EL display, or the like constituting a display device.
The server 200 may be any device as long as a camera is mounted.
In the above-described embodiment, the program to be executed may be stored in and distributed on a computer-readable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read-Only Memory), a DVD (Digital Versatile Disc), or an MO (Magneto-Optical Disc), and the program may be installed to configure a system that executes the above-described processing.
The program may be stored in a disk device or the like of a predetermined server on a network such as the internet, and may be downloaded by being superimposed on a carrier wave, for example.
When the above-described functions are realized by being shared with the OS (Operating System) or by cooperation between the OS and an application, only the portion other than the OS may be stored in a medium and distributed, or may be downloaded.
While the preferred embodiments of the present invention have been described above, the present invention is not limited to the specific embodiments, and the present invention includes the inventions described in the claims and their equivalent ranges.

Claims (7)

1. A position acquisition device is characterized by comprising:
a light receiving unit having a light receiving surface for receiving light;
a determination unit that determines whether or not a plurality of regions having brightness equal to or greater than a set value are present in the light-receiving surface within a set range; and
an acquisition unit that acquires a position of a brightest region on the light receiving surface when the determination unit determines that there are a plurality of regions having brightness equal to or greater than the set value.
2. The position acquisition apparatus according to claim 1,
the acquisition unit acquires the position of the brightest region on the light receiving surface when the determination unit determines that there are a plurality of regions having brightness equal to or greater than the set value and further determines that the plurality of regions having brightness equal to or greater than the set value are regions brightened by light based on the same identification information.
3. The position acquisition apparatus according to claim 1 or 2,
the light receiving unit includes an image pickup unit,
the light receiving surface is a light receiving surface for imaging a predetermined space,
the acquisition means acquires a position of an image having the brightest light in the space as the position of the brightest region.
4. The position acquisition apparatus according to claim 3,
the acquisition means acquires positions of a plurality of dimensions of an image having the brightest light from a plurality of images obtained by imaging the space from different angles by the imaging means.
5. The position acquisition apparatus according to claim 3,
the acquisition means acquires positions of a plurality of dimensions of an image having the brightest light from a plurality of images obtained by imaging the space from different angles by the plurality of imaging means.
6. A method for position acquisition, comprising:
a determination step of determining whether or not a plurality of regions having brightness equal to or higher than a set value exist within a set range on a light receiving surface of the light receiving portion that receives light; and
an obtaining step of obtaining a position of a brightest region on the light receiving surface when it is determined in the determining step that there are a plurality of regions having brightness equal to or higher than the set value.
7. A recording medium having a program recorded thereon that is readable by a computer provided in a light receiving device, the program causing the computer to function as:
a determination unit that determines whether or not a plurality of regions having brightness equal to or greater than a set value are present in a set range on a light receiving surface of the light receiving device; and
an acquisition unit that acquires a position of a brightest region on the light receiving surface when the determination unit determines that there are a plurality of regions having brightness equal to or greater than the set value.

Applications Claiming Priority (2)

  • JP2019115140A (JP7052775B2): priority date 2019-06-21, filing date 2019-06-21, Position acquisition device, position acquisition method and program
  • JP2019-115140: priority date 2019-06-21

Publications (2)

  • CN112118045A: 2020-12-22
  • CN112118045B (en): 2023-10-27

Family

ID=73799294

Family Applications (1)

  • CN202010422144.1A (CN112118045B, Active): priority date 2019-06-21, filing date 2020-05-18, Position acquisition device, position acquisition method, and recording medium

Country Status (2)

  • JP: JP7052775B2 (en)
  • CN: CN112118045B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007147642A (en) * 2002-11-25 2007-06-14 Nippon Telegr & Teleph Corp <Ntt> Real world object recognition method and real world object recognition system
JP2007267850A (en) * 2006-03-30 2007-10-18 Namco Bandai Games Inc Program, information storage medium, and image generating system
CN107370538A (en) * 2017-07-20 2017-11-21 西安电子科技大学 Radio data transmission method, camera and system
CN108668089A (en) * 2017-03-28 2018-10-16 卡西欧计算机株式会社 Information processing unit, information processing method and recording medium
US20190036646A1 (en) * 2017-07-25 2019-01-31 Fujitsu Limited Discrimination method and communication system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4186915B2 (en) 2003-12-18 2008-11-26 株式会社デンソーウェーブ Optical information reader
JP2009288154A (en) 2008-05-30 2009-12-10 Mitsubishi Electric Corp Position detecting device
JP6654019B2 (en) 2015-11-09 2020-02-26 任天堂株式会社 Information processing system information processing apparatus, information processing method, information processing program, and handheld information processing apparatus

Also Published As

  • CN112118045B (en): 2023-10-27
  • JP2021001788A (en): 2021-01-07
  • JP7052775B2 (en): 2022-04-12


Legal Events

  • PB01: Publication
  • SE01: Entry into force of request for substantive examination
  • GR01: Patent grant