CN112118045B - Position acquisition device, position acquisition method, and recording medium
- Publication number
- CN112118045B (application number CN202010422144.1A)
- Authority
- CN
- China
- Prior art keywords
- light
- image
- light receiving
- unit
- acquisition
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B10/00—Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
- H04B10/11—Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
- H04B10/114—Indoor or close-range type systems
- H04B10/116—Visible light communication
Abstract
The application provides a position acquisition device, a position acquisition method, and a recording medium that improve the accuracy with which a sensor receiving light acquires the position from which the light is emitted. When a member or the like that reflects light exists around the installation position of a light source (102a) or the like and, for one ID, reflected light is detected in addition to the light from the light source (102a) or the like, the server (200) determines, among the plurality of image positions corresponding to that ID, that the image position whose brightness value is equal to or greater than a set value and is the maximum is the light from the light source (102a) or the like. The server (200) then selects the image position with the maximum brightness value and uses it to calculate the installation position of the light source (102a) or the like.
Description
Technical Field
The application relates to a position acquisition device, a position acquisition method, and a recording medium.
Background
In recent years, techniques have been studied for transmitting information by modulating it onto brightness and color in the visible-light wavelength region.
As described in Japanese Patent Laid-Open Publication No. 2003-179556, there is also a technique in which an image sensor receives the modulated visible light from multiple sources simultaneously and in parallel, and associates the position at which each light is received on the image sensor with the demodulated information.
However, with the above technique, when a member or the like that reflects light is present around the installation position of the communication indicator that transmits the information, the device on the receiving side may find it difficult to distinguish whether the light received by the image sensor originates from the light emission of the communication indicator or from reflected light.
Disclosure of Invention
The present application has been made in view of the above problem, and an object of the present application is to improve the accuracy of acquiring the position from which light is emitted.
In order to achieve the above object, a position acquisition device according to the present application includes: a light receiving unit having a light receiving surface for receiving light; a determination unit that determines whether or not a plurality of regions having brightness equal to or higher than a set value exist within a set range in the light receiving surface; and an acquisition unit that acquires a position of a brightest region on the light receiving surface when the determination unit determines that there are a plurality of regions having brightness equal to or higher than the set value.
In order to achieve the above object, a position acquisition method according to the present application includes: a determination step of determining whether or not a plurality of regions having brightness equal to or higher than a set value exist within a set range on a light receiving surface, of a light receiving unit, that receives light; and an acquisition step of acquiring a position of a brightest region on the light receiving surface when it is determined in the determination step that there are a plurality of regions having brightness equal to or higher than the set value.
In order to achieve the above object, a recording medium according to the present application stores a program readable by a computer provided in a light receiving device, the program causing the computer to function as: a determination unit that determines whether or not a plurality of regions having brightness equal to or higher than a set value exist within a set range on a light receiving surface of the light receiving device; and an acquisition unit that acquires a position of a brightest region on the light receiving surface when the determination unit determines that there are a plurality of regions having brightness equal to or higher than the set value.
Advantageous Effects of Invention
According to the present application, the accuracy of obtaining the position where the light is emitted can be improved.
Drawings
Fig. 1 is a diagram showing an example of a visible light communication system according to an embodiment of the present application.
Fig. 2 is a diagram showing an example of the structure of the server according to this embodiment.
Fig. 3 is a view showing an example of the light-emitting region according to this embodiment.
Fig. 4 is a flowchart showing an example of image position selection by the server according to this embodiment.
Fig. 5 is a flowchart showing an example of the installation position calculation performed by the server according to the embodiment.
Detailed Description
A visible light communication system according to an embodiment of the present application is described below with reference to the drawings.
Fig. 1 is a diagram showing the configuration of the visible light communication system. As shown in fig. 1, the visible light communication system 1 includes devices 100a, 100b, and 100c (hereinafter referred to collectively as the "devices 100" when they need not be distinguished) provided in a space 500, and a server 200 corresponding to the position acquisition device.
The devices 100a, 100b, and 100c are disposed within the space 500. The device 100a is equipped with a light source 102a as a communication indicator, the device 100b with a light source 102b as a communication indicator, and the device 100c with a light source 102c as a communication indicator (hereinafter referred to collectively as the "light sources 102" when they need not be distinguished). The server 200 is equipped with cameras 201a, 201b, 201c, and 201d (hereinafter referred to collectively as the "cameras 201" when they need not be distinguished), which correspond to the light receiving unit and the imaging unit. Each light source 102 includes an LED (Light Emitting Diode), not shown. The light source 102 corresponds to the object whose position information is to be acquired.
In the present embodiment, the light source 102 attached to each device 100 emits light corresponding to information to be transmitted, such as the state of the device 100. The server 200, in turn, demodulates the change in emission color in the images of the light obtained by continuous, time-series imaging with the cameras 201, and thereby acquires the various pieces of transmitted information.
In the present embodiment, the positions and imaging directions of the cameras 201a to 201d are known. The server 200 sets combinations (camera pairs) of 2 cameras 201 from among the cameras 201a to 201d, and holds, for each camera pair, a transformation matrix for converting the position of an image obtained by imaging (two-dimensional coordinate information: image position) into a position in the space 500 (installation position).
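As a rough illustration of how such per-pair data might be held, the following sketch enumerates the camera pairs and stores a transformation entry for each. The names (CAMERA_IDS, PROJECTIONS, PAIR_TRANSFORMS) and the use of 3x4 projection matrices to stand in for the pair "transformation matrix" are assumptions made for illustration, not the patent's prescribed representation.

```python
# Minimal sketch, assuming each camera's known position and direction is summarized
# as a 3x4 projection matrix; the per-pair transformation is then simply the two
# matrices kept together. All names here are illustrative placeholders.
from itertools import combinations
import numpy as np

CAMERA_IDS = ["201a", "201b", "201c", "201d"]

# Placeholder projection matrices; in practice these would come from calibration.
PROJECTIONS = {cid: np.eye(3, 4) for cid in CAMERA_IDS}

# One entry per camera pair, held in advance by the server 200.
PAIR_TRANSFORMS = {
    frozenset(pair): (PROJECTIONS[pair[0]], PROJECTIONS[pair[1]])
    for pair in combinations(CAMERA_IDS, 2)
}
```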
Fig. 2 is a diagram showing an example of the structure of the server 200. As shown in fig. 2, the server 200 includes a control unit 202, an image input unit 204, a memory 205, an operation unit 206, a display unit 207, and a communication unit 208. Further, the cameras 201a to 201d are connected to the server 200 by wiring.
The camera 201a includes a lens 203a, the camera 201b a lens 203b, the camera 201c a lens 203c, and the camera 201d a lens 203d (hereinafter referred to collectively as the "lenses 203" when they need not be distinguished). Each lens 203 is constituted by a zoom lens or the like. The lens 203 is moved by zoom control operations from the operation unit 206 and by focus control performed by the control unit 202. The movement of the lens 203 determines the imaging angle of view and the optical image captured by the camera 201, and the transformation matrix varies in accordance with the movement of the lens 203.
The cameras 201a to 201d each comprise a plurality of light receiving elements arranged regularly in two dimensions on a light receiving surface. Each light receiving element is an image pickup device such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor. Based on control signals from the control unit 202, the cameras 201a to 201d capture (receive) optical images through the lenses 203 over a predetermined range of imaging angles of view, and generate frames by converting the image signals within that angle of view into digital data. The cameras 201a to 201d perform imaging and frame generation continuously in time, and output the successive frames to the image input unit 204 of the server 200.
The frames (digital data) output from the cameras 201 in accordance with control signals from the control unit 202 are input to the image input unit 204.
The control unit 202 is constituted by, for example, a CPU (Central Processing Unit). The control unit 202 executes software processing in accordance with a program stored in the memory 205 (for example, a program for realizing the operation of the server 200 shown in figs. 4 and 5, described later) to control the various functions provided in the server 200.
The memory 205 is constituted by, for example, RAM (Random Access Memory) and ROM (Read Only Memory). The memory 205 stores various information (programs and the like) used for control and other processing in the server 200.
The operation unit 206 is constituted by a keypad, function keys, and the like, and is an interface for inputting the user's operations. The display unit 207 is constituted by, for example, an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), an EL (Electro Luminescence) display, or the like. The display unit 207 displays an image in accordance with the image signal output from the control unit 202. The communication unit 208 is, for example, a LAN (Local Area Network) card. The communication unit 208 communicates with external communication devices under the control of the communication control unit 242.
The control unit 202 includes an image processing unit 231, a luminance determination unit 232 corresponding to the determination unit, an image position acquisition unit 234 corresponding to the acquisition unit, an installation position acquisition unit 236, and a communication control unit 242.
The image processing unit 231 performs peripheral light falloff correction and distortion correction on the frames (digital data) output from the cameras 201 and input to the image input unit 204, and adjusts their image quality and image size so that they can be displayed on the display unit 207 as a real-time view. The image processing unit 231 also has a function of, when a control signal based on a recording instruction operation from the operation unit 206 is input, encoding the optical image within the imaging angle of view of the camera 201, or within the display range of the display unit 207, at the time of the recording instruction, and saving it as a file in a compression encoding format such as JPEG (Joint Photographic Experts Group).
The luminance determination unit 232 detects the position (two-dimensional coordinate information: image position) of the image of each light source 102 in each of the images captured by the cameras 201a to 201d. Here, the light sources 102a, 102b, and 102c in the space 500 emit light whose color varies cyclically among R (red), G (green), and B (blue), modulated with an ID (identification) that uniquely identifies each light source. The luminance determination unit 232 detects, in each image captured by the cameras 201a to 201d, light whose color varies cyclically in a predetermined pattern, and then attempts to demodulate the ID corresponding to the detected three-color emission pattern.
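A minimal sketch of such ID demodulation is given below, assuming each ID is transmitted as a fixed-length, cyclically repeated sequence of R/G/B symbols. The codebook ID_PATTERNS, the function names, and the dominant-channel color classification are illustrative assumptions; a real scheme would add synchronization and error handling.

```python
# Minimal sketch of ID demodulation from a per-frame colour sequence (assumptions noted above).
from typing import Optional, Sequence

ID_PATTERNS = {  # hypothetical codebook: ID -> colour symbol sequence
    1: "RGBR",
    2: "RGGB",
    3: "RBGB",
}

def classify_color(rgb: Sequence[float]) -> str:
    """Label a light-emitting region's colour in one frame by its dominant channel."""
    return "RGB"[max(range(3), key=lambda i: rgb[i])]

def demodulate_id(frames_rgb: Sequence[Sequence[float]]) -> Optional[int]:
    """Match the observed colour sequence against the codebook, allowing any cyclic shift."""
    symbols = "".join(classify_color(rgb) for rgb in frames_rgb)
    for ident, pattern in ID_PATTERNS.items():
        if len(symbols) >= len(pattern) and symbols[: len(pattern)] in pattern * 2:
            return ident
    return None
```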
Here, when a member or the like that reflects light is present around the installation position of a light source 102, reflected light or the like may be detected for one ID in addition to the light from the light source 102. Fig. 3 is a diagram showing an example of the light emitting regions on the light receiving surface of one camera 201. In fig. 3, the light emitting regions 300a, 300b, and 300c on the light receiving surface 250a are light emitting regions (image positions) corresponding to one ID. The light emitting region 300a corresponds to the light from the light source 102, while the light emitting regions 300b and 300c correspond to reflected light. In this case, the luminance determination unit 232 detects an image position for each of the plurality of light emitting regions corresponding to the one ID.
When a plurality of image positions corresponding to one ID are detected, the luminance determination unit 232 calculates a luminance value for each of those image positions. The image position acquisition unit 234 then determines that, among the image positions corresponding to the one ID, the image position whose luminance value is equal to or greater than the set value and is the maximum is the light from the light source 102, and selects that image position.
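For illustration, a minimal sketch of this selection is shown below; it assumes the candidates for one ID are given as (x, y, luminance) tuples and that THRESHOLD stands in for the set value, both of which are illustrative choices rather than details fixed by the embodiment.

```python
# Minimal sketch: pick the brightest image position at or above the set value.
THRESHOLD = 200  # illustrative "set value" for the luminance

def select_light_source_position(candidates):
    """candidates: list of (x, y, luminance); returns (x, y) of the brightest one, or None."""
    bright = [c for c in candidates if c[2] >= THRESHOLD]
    if not bright:
        return None
    x, y, _ = max(bright, key=lambda c: c[2])
    return (x, y)
```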
The installation position acquisition unit 236 sets combinations (camera pairs) of 2 cameras 201 from among the cameras 201a to 201d. There are 6 possible camera pairs that can be formed from the 4 cameras 201.
Using the image positions obtained by the luminance determination unit 232, that is, the image positions of light from the light sources 102 whose luminance values are equal to or greater than the set value and maximal, the installation position acquisition unit 236 attempts, for each camera pair, to find the region of light modulated with the same ID in both of the images captured by the 2 cameras 201 of the pair. If such a region can be found in both images, the installation position acquisition unit 236 regards the light source 102 corresponding to that ID as having been captured.
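The pair matching described above might look like the following sketch, which assumes the already-selected detections are available as a mapping from camera to ID to image position; the data layout and function name are illustrative assumptions.

```python
# Minimal sketch: find every camera pair in which both cameras detected light with a given ID.
from itertools import combinations

def pairs_observing(ident, detections, camera_ids=("201a", "201b", "201c", "201d")):
    """detections: camera id -> {ID: (x, y)}; returns the camera pairs that both saw the ID."""
    return [
        (a, b)
        for a, b in combinations(camera_ids, 2)  # 6 pairs for 4 cameras
        if ident in detections.get(a, {}) and ident in detections.get(b, {})
    ]
```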
Next, for each camera pair, the installation position acquisition unit 236 acquires the position (Xga2, Yga2) of the image of the light source 102 on the light receiving surface of one of the 2 cameras 201 of the pair, and the position (Xgb2, Ygb2) of the image of the light source 102 on the light receiving surface of the other camera 201. The installation position acquisition unit 236 then calculates the installation position (Xk2, Yk2, Zk2) of the light source 102 in the space 500 using the combination of the image positions (Xga2, Yga2) and (Xgb2, Ygb2) and the transformation matrix corresponding to the camera pair.
Next, the operation of the server 200 will be described with reference to a flowchart. Fig. 4 is a flowchart showing an example of image position selection by the server 200.
The plurality of cameras 201 capture images of the space 500 (step S101). The luminance determination unit 232 in the control unit 202 obtains the image positions of the light emitting regions in each of the captured images, and attempts to acquire the ID corresponding to each image position (step S102).
Next, the luminance determination unit 232 selects one camera 201 from among the plurality of cameras 201 (step S103).
Next, the luminance determination unit 232 identifies the image positions corresponding to one ID in the image captured by the camera 201 selected in step S103. The luminance determination unit 232 then measures the luminance value of each of these image positions and determines whether there is only one image position whose luminance value is equal to or greater than a preset threshold, that is, the set value (step S104). The luminance values may be stored in the memory 205 together with the ID, identification information of the camera 201 that performed the imaging, and the imaging date and time.
When it is determined that there is only one image position whose luminance value is equal to or greater than the set value among the image positions corresponding to the ID (YES in step S104), the image position acquisition unit 234 selects that one image position (step S105).
On the other hand, when it is determined that there is not only one such image position (NO in step S104), the luminance determination unit 232 determines whether, among the image positions corresponding to the ID in the image captured by the camera 201 selected in step S103, a plurality of image positions whose luminance values are equal to or greater than the set value exist within a predetermined range (step S106).
When it is determined that a plurality of image positions whose luminance values are equal to or greater than the set value exist for the image positions corresponding to the ID (YES in step S106), the image position acquisition unit 234 selects, from among them, the image position with the largest luminance value (step S107).
Next, the image position acquisition unit 234 excludes the one or more image positions not selected in step S107 from the targets of the installation position calculation (step S108).
Next, the image position acquisition unit 234 notifies the user, by display on the display unit 207 or the like, that one or more image positions corresponding to the ID have been excluded (step S109). The user can then remove the object causing the reflection or the like, or designate a region in the space 500 to be excluded from the installation position calculation.
After one image position whose luminance value is equal to or greater than the set value has been selected in step S105, when it is determined in step S106 that a plurality of such image positions do not exist (NO in step S106), or after the notification of the exclusion from the installation position calculation in step S109, the luminance determination unit 232 determines whether the selection in step S103 has been completed for all cameras 201 (step S110). If the selection has been completed for all cameras 201 (YES in step S110), the series of operations ends. On the other hand, if the selection has not yet been completed for all cameras 201 (NO in step S110), the operations from step S103 onward are repeated.
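The per-camera selection flow of fig. 4 (steps S103 to S110) could be sketched as follows; the data layout, the SET_VALUE constant, and the notify_user placeholder are assumptions, and the check that the candidate positions lie within the set range (step S106) is omitted for brevity.

```python
# Minimal sketch of the image position selection flow of Fig. 4 (assumptions noted above).
SET_VALUE = 200  # illustrative luminance threshold

def notify_user(camera_id, ident, excluded):
    # Stand-in for the display-based notification of step S109.
    print(f"camera {camera_id}, ID {ident}: excluded image positions {excluded}")

def select_image_positions(detections_by_camera):
    """detections_by_camera: camera id -> {ID: [(x, y, luminance), ...]}; returns selections."""
    selected = {}                                                    # (camera id, ID) -> (x, y)
    for camera_id, by_id in detections_by_camera.items():            # S103
        for ident, candidates in by_id.items():
            bright = [c for c in candidates if c[2] >= SET_VALUE]
            if not bright:
                continue
            if len(bright) == 1:                                     # S104 -> S105
                selected[(camera_id, ident)] = bright[0][:2]
                continue
            best = max(bright, key=lambda c: c[2])                   # S106 -> S107
            selected[(camera_id, ident)] = best[:2]
            excluded = [c[:2] for c in bright if c[:2] != best[:2]]  # S108
            notify_user(camera_id, ident, excluded)                  # S109
    return selected                                                  # S110: all cameras processed
```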
After the image position selection described above, the installation position of the light source 102 is calculated. Fig. 5 is a flowchart showing an example of the installation position calculation performed by the server 200.
The installation position acquisition unit 236 selects one ID corresponding to an image position selected in step S105 or step S107 of fig. 4 (step S201).
Next, the installation position acquisition unit 236 selects a camera pair that captured images of the light source 102 corresponding to the ID selected in step S201 (step S202). When a plurality of such camera pairs can be set, the installation position acquisition unit 236 selects one of them; for example, it may select as the camera pair the 2 cameras 201 corresponding to the 2 image positions with the highest luminance values.
Next, the installation position acquisition unit 236 calculates the installation position of the light source 102 based on the image positions of the images of the light source 102 corresponding to the ID selected in step S201 in the images captured by the 2 cameras 201 of the camera pair selected in step S202 (step S203). Specifically, the installation position acquisition unit 236 acquires the position of the image of the light source 102 on the light receiving surface of one camera 201 of the pair and the corresponding position on the light receiving surface of the other camera 201, and calculates the installation position of the light source 102 in the space 500 using the combination of the two image positions and the transformation matrix corresponding to the camera pair.
Thereafter, the installation position acquisition unit 236 determines whether all IDs have been selected in step S201 (step S204). When all IDs have been selected (YES in step S204), the series of operations ends. On the other hand, when not all IDs have been selected, that is, when an unselected ID remains (NO in step S204), the operations from step S201 onward are repeated.
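As one concrete way to carry out the computation of step S203, the sketch below recovers the installation position by linear triangulation from the two image positions, assuming the per-pair transformation is represented by the two cameras' 3x4 projection matrices; the patent does not fix this particular algebra, so treat it as an illustrative assumption.

```python
# Minimal sketch of step S203: least-squares (DLT) triangulation of the installation position.
import numpy as np

def triangulate(P_a, P_b, xy_a, xy_b):
    """P_a, P_b: 3x4 projection matrices of the camera pair; xy_a, xy_b: image positions."""
    def rows(P, xy):
        x, y = xy
        return [x * P[2] - P[0], y * P[2] - P[1]]
    A = np.array(rows(P_a, xy_a) + rows(P_b, xy_b))  # 4x4 linear system
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # installation position (Xk2, Yk2, Zk2) in the space 500
```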
As described above, in the present embodiment, when a member or the like that reflects light exists around the installation position of a light source 102 and, for one ID, reflected light is detected in addition to the light from the light source 102, the server 200 determines, among the plurality of image positions corresponding to that ID, that the image position whose luminance value is equal to or greater than the set value and is the maximum is the light from the light source 102, and selects that image position for use in calculating the installation position of the light source 102. Image positions of reflected light and the like, which have lower luminance values, are thereby excluded from the installation position calculation, and the accuracy of the calculation is improved.
The present application is not limited to the above embodiment and drawings, which can be modified as appropriate.
For example, in the above embodiment, among the plurality of image positions corresponding to one ID, the image position whose luminance value is equal to or greater than the set value and is the maximum is determined to be the light from the light source 102. However, when there are a plurality of image positions whose luminance values are equal to or greater than the set value, the image position closest to the center of the image may instead be determined to be the light from the light source 102, since distortion is smaller the closer an image position is to the center.
Alternatively, an image position may be determined to be the light from the light source 102 simply because its luminance value is the maximum.
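A minimal sketch of the closest-to-center variant described above is shown below; it assumes the image center (cx, cy) is known and that the candidates at or above the set value are given as (x, y, luminance) tuples, both illustrative assumptions.

```python
# Minimal sketch: among candidates at or above the set value, pick the one closest to the image center.
def select_closest_to_center(candidates, cx, cy):
    """candidates: list of (x, y, luminance); returns the (x, y) nearest to (cx, cy)."""
    x, y, _ = min(candidates, key=lambda c: (c[0] - cx) ** 2 + (c[1] - cy) ** 2)
    return (x, y)
```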
In the above embodiment, the server 200 selects the camera pair that captures the light source 102 most reliably and calculates the installation position of the light source 102 based on the images captured by that camera pair. However, the method of calculating the installation position is not limited to this.
The light source 102 is not limited to an LED. For example, the light source may be formed by part of an LCD, PDP, EL display, or the like that constitutes a display device.
The server 200 may be any device as long as it has a camera mounted thereon.
In the above embodiment, the program to be executed may be stored and distributed on a computer-readable recording medium such as a floppy disk, a CD-ROM (Compact Disc Read-Only Memory), a DVD (Digital Versatile Disc), or an MO (Magneto-Optical Disc), and a system that executes the above processing can be configured by installing the program.
Alternatively, the program may be stored on a disk device or the like of a predetermined server on a network such as the Internet and downloaded, for example, superimposed on a carrier wave.
In addition, when the above functions are realized in part by an OS (Operating System), or realized through cooperation between the OS and an application, only the portion other than the OS may be stored and distributed on a medium, or downloaded, and so on.
The preferred embodiments of the present application have been described above, but the present application is not limited to those specific embodiments and includes the inventions described in the claims and their equivalents.
Claims (6)
1. A position acquisition device is characterized by comprising:
a light receiving unit having a light receiving surface for receiving light;
a determination unit that determines whether or not a plurality of regions having brightness equal to or higher than a set value exist within a set range, for image positions corresponding to the same identification information on the light receiving surface;
an acquisition unit that acquires a position of a brightest region on the light receiving surface when the determination unit determines that there are a plurality of regions having brightness equal to or higher than the set value for the image positions corresponding to the same identification information; and
a notification unit that notifies that a position of a light emitting region different from the position of the brightest light emitting region is excluded from the acquisition targets of the acquisition unit.
2. The position acquisition device according to claim 1, wherein
the light receiving unit includes an imaging unit,
the light receiving surface is used to image a predetermined space, and
the acquisition unit acquires, as the position of the brightest region, the position of the image of the brightest light in the imaged space.
3. The position acquisition device according to claim 2, wherein
the acquisition unit acquires a position, having a plurality of dimensions, of the image of the brightest light from a plurality of images obtained by the imaging unit imaging the space from different angles.
4. The position acquisition device according to claim 2, wherein
the acquisition unit acquires a position, having a plurality of dimensions, of the image of the brightest light from a plurality of images obtained by a plurality of imaging units imaging the space from different angles.
5. A position acquisition method, comprising:
a determination step of determining whether or not a plurality of regions having brightness equal to or higher than a set value exist within a set range, for image positions corresponding to the same identification information on a light receiving surface of a light receiving unit that receives light;
an acquisition step of acquiring a position of a brightest region on the light receiving surface when it is determined in the determination step that there are a plurality of regions having brightness equal to or higher than the set value for the image positions corresponding to the same identification information; and
a notification step of notifying that a position of a light emitting region different from the position of the brightest light emitting region is excluded from the acquisition targets of the acquisition step.
6. A recording medium having recorded thereon a program readable by a computer provided in a light receiving device, the program causing the computer to function as:
a determination unit that determines whether or not a plurality of regions having brightness equal to or higher than a set value exist within a set range, for image positions corresponding to the same identification information on a light receiving surface of the light receiving device;
an acquisition unit that acquires a position of a brightest region on the light receiving surface when the determination unit determines that there are a plurality of regions having brightness equal to or higher than the set value for the image positions corresponding to the same identification information; and
a notification unit that notifies that a position of a light emitting region different from the position of the brightest light emitting region is excluded from the acquisition targets of the acquisition unit.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019115140A (JP7052775B2) | 2019-06-21 | 2019-06-21 | Position acquisition device, position acquisition method and program |
JP2019-115140 | 2019-06-21 | | |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112118045A (en) | 2020-12-22 |
CN112118045B (en) | 2023-10-27 |
Family
ID=73799294
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010422144.1A (CN112118045B, Active) | Position acquisition device, position acquisition method, and recording medium | 2019-06-21 | 2020-05-18 |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7052775B2 (en) |
CN (1) | CN112118045B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007147642A (en) * | 2002-11-25 | 2007-06-14 | Nippon Telegr & Teleph Corp <Ntt> | Real world object recognition method and real world object recognition system |
JP2007267850A (en) * | 2006-03-30 | 2007-10-18 | Namco Bandai Games Inc | Program, information storage medium, and image generating system |
CN107370538A (en) * | 2017-07-20 | 2017-11-21 | 西安电子科技大学 | Radio data transmission method, camera and system |
CN108668089A (en) * | 2017-03-28 | 2018-10-16 | 卡西欧计算机株式会社 | Information processing unit, information processing method and recording medium |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4186915B2 (en) | 2003-12-18 | 2008-11-26 | 株式会社デンソーウェーブ | Optical information reader |
JP2009288154A (en) | 2008-05-30 | 2009-12-10 | Mitsubishi Electric Corp | Position detecting device |
JP6654019B2 (en) | 2015-11-09 | 2020-02-26 | 任天堂株式会社 | Information processing system information processing apparatus, information processing method, information processing program, and handheld information processing apparatus |
JP6897389B2 (en) * | 2017-07-25 | 2021-06-30 | 富士通株式会社 | Discrimination computer program, discriminating device and discriminating method, and communication system |
- 2019-06-21: JP application JP2019115140A filed (patent JP7052775B2, active)
- 2020-05-18: CN application CN202010422144.1A filed (patent CN112118045B, active)
Also Published As
Publication number | Publication date |
---|---|
JP7052775B2 (en) | 2022-04-12 |
JP2021001788A (en) | 2021-01-07 |
CN112118045A (en) | 2020-12-22 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |