US20150254486A1 - Communication system, image pickup device, program, and communication method - Google Patents
- Publication number
- US20150254486A1 (application US 14/582,722)
- Authority
- US
- United States
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/14—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
- G06K7/1404—Methods for optical code recognition
- G06K7/1408—Methods for optical code recognition the method being specifically adapted for the type of code
- G06K7/1417—2D bar codes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K19/00—Record carriers for use with machines and with at least a part designed to carry digital markings
- G06K19/06—Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
- G06K19/06009—Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking
- G06K19/06037—Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking multi-dimensional coding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K19/00—Record carriers for use with machines and with at least a part designed to carry digital markings
- G06K19/06—Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
- G06K19/06009—Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking
- G06K19/06046—Constructional details
- G06K19/06112—Constructional details the marking being simulated using a light source, e.g. a barcode shown on a display or a laser beam with time-varying intensity profile
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/14—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
- G06K7/1404—Methods for optical code recognition
- G06K7/1439—Methods for optical code recognition including a method step for retrieval of the optical code
- G06K7/1447—Methods for optical code recognition including a method step for retrieval of the optical code extracting optical codes from image or text carrying said optical code
Definitions
- the present invention relates to a communication system, an image pickup device, a program, and a communication method.
- JP-A-2008-077380 discloses the use of a three-dimensional code formed by changing a two-dimensional code in time series, for wireless LAN (local area network) connection between a projector and an information terminal.
- JP-A-2008-077380 does not describe the relation between the frame rate at which the projector projects a three-dimensional code and the frame rate at which the information terminal picks up an image of the three-dimensional code, and therefore data cannot be communicated via a three-dimensional code.
- An advantage of some aspects of the invention is that a greater volume of data can be communicated via a code image.
- An aspect of the invention provides a communication system including a display device and an image pickup device.
- the display device includes: a storage unit which stores data; a dividing unit which divides the data; an encoding unit which encodes each of the data divided by the dividing unit and order information indicating an order of the data and thus generates a plurality of code images; and a display unit which displays the plurality of code images generated by the encoding unit, at a first frame rate according to the order.
- the image pickup device includes: an image pickup unit which sequentially picks up images of the plurality of code images displayed by the display unit, at a second frame rate that is higher than the first frame rate; a decoding unit which extracts and decodes the code images from pickup images picked up by the image pickup unit; and a combining unit which combines the data decoded by the decoding unit according to the order indicated by the order information.
- the display unit repeatedly displays the plurality of code images, and the image pickup unit continues image pickup until images of all the plurality of code images are picked up. According to this communication system, a loss of the plurality of code images at the time of image pickup is prevented.
- the display unit displays information indicating a timing when a first code image of the plurality of code images is displayed.
- the image pickup device can specify the timing when the first code image of the plurality of code images is displayed.
- the display device includes a sound emitting unit which emits a sound indicating a timing when a first code image of the plurality of code images is displayed.
- the image pickup device can specify the timing when the first code image of the plurality of code images is displayed.
- the display unit displays an order image indicating the order beside at least one code image of the plurality of code images.
- the image pickup device includes a determination unit which determines whether images of all the plurality of code images are picked up or not, on the basis of the order image extracted from the pickup image. The image pickup unit continues image pickup until the determination unit determines that images of all the plurality of code images are picked up. According to this communication system, the image pickup device can determine whether images of all the code images are picked up or not, without decoding the code images.
- if the decoding unit is successful in decoding a certain pickup image, the decoding unit next decodes a pickup image that is two or more frames after that pickup image. According to this communication system, the time required for decoding is shortened, compared with the case where all of the code images are decoded.
- an image pickup device including: an image pickup unit which sequentially picks up images of a plurality of code images sequentially displayed at a first frame rate and representing data and order information indicating an order of the data, at a second frame rate that is higher than the first frame rate; a decoding unit which extracts and decodes the code images from pickup images picked up by the image pickup unit; and a combining unit which combines the data decoded by the decoding unit according to the order indicated by the order information.
- Still another aspect of the invention provides a program for executing: sequentially picking up images of a plurality of code images that are sequentially displayed at a first frame rate and represent data and order information indicating an order of the data, at a second frame rate that is higher than the first frame rate; extracting and decoding the code images from the pickup images that are picked up; and combining the decoded data according to the order indicated by the order information.
- according to this program, a greater volume of data can be communicated via the code images, compared with the case where an image of a single code image is picked up.
- Yet another aspect of the invention provides a communication method including: dividing data; encoding each of the divided data and order information indicating an order of the data and thus generating a plurality of code images; displaying the plurality of code images at a first frame rate according to the order; sequentially picking up images of the plurality of code images that are displayed, at a second frame rate that is higher than the first frame rate; extracting and decoding the code images from the pickup images that are picked up; and combining the decoded data according to the order indicated by the order information.
- according to this communication method, a greater volume of data can be communicated via the code images, compared with the case where a single code image is displayed.
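Read together, the aspects above describe one sender loop and one receiver loop. The sketch below is a minimal, hypothetical Python illustration of that pairing, not the claimed implementation: `make_code`, `read_code`, `show_frame`, and `grab_frame` are assumed stand-ins for a QR encoder/decoder, the display, and the camera, and the receiver simply relies on the capture rate F2 being higher than the display rate F1 (desirably at least twice as high).

```python
from typing import Callable, Optional

def display_side(data: bytes, chunk_size: int, f1: float,
                 make_code: Callable, show_frame: Callable) -> None:
    """Divide, encode with order information, display at frame rate F1, repeat."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    codes = [make_code(i + 1, len(chunks), f1, c) for i, c in enumerate(chunks)]
    while True:                              # repeat so the pickup side cannot miss a code
        for code in codes:                   # display in order at the first frame rate F1
            show_frame(code, 1.0 / f1)

def pickup_side(read_code: Callable, grab_frame: Callable) -> bytes:
    """Capture at F2 > F1, decode, and combine by the order information."""
    parts = {}                               # frame number -> divided data
    total: Optional[int] = None
    while total is None or len(parts) < total:
        decoded = read_code(grab_frame())    # may return None for a blurred frame
        if decoded is None:
            continue
        frame_no, total, _f1, chunk = decoded
        parts[frame_no] = chunk              # order information carried inside each code
    return b"".join(parts[i] for i in range(1, total + 1))
```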
- FIG. 1 shows the overall configuration of a projection system.
- FIG. 2 is a block diagram showing the functional configuration of the projection system.
- FIG. 3 is a block diagram showing the hardware configuration of a projector.
- FIG. 4 is a block diagram showing the hardware configuration of a tablet terminal.
- FIG. 5 is a flowchart showing processing to project a QR code.
- FIG. 6 illustrates the data format of a QR code.
- FIG. 7 illustrates transition of a projected image.
- FIG. 8 is a flowchart showing processing to pick up an image of a QR code.
- FIG. 9 is a flowchart showing processing to decode a QR code.
- FIG. 10 illustrates transition of a projected image in Modification 2.
- FIG. 1 shows the overall configuration of a projection system PS according to an embodiment of the invention.
- the projection system PS includes a projector 1 , a tablet terminal 2 , a controller R, and a screen SC.
- the projection system PS is a system for transmitting data from the projector 1 to the tablet terminal 2 via a two-dimensional code.
- the projector 1 is a display device which encodes data, thus generates a two-dimensional code, and projects the two-dimensional code.
- the projector 1 projects a QR code (an example of a code image) on the screen SC.
- the tablet terminal 2 is an image pickup device which picks up an image of the QR code projected on the screen SC and decodes the QR code.
- the controller R is a device for controlling the projector 1 wirelessly via infrared communication or the like, that is, a so-called remote controller.
- the screen SC is a surface on which an image projected by the projector 1 (hereinafter referred to as “projected image”) is shown.
- there is an upper limit to the volume of data that can be transmitted via a single two-dimensional code.
- as for QR codes, a plurality of versions (versions 1 to 40) are prescribed according to the number of cells in the codes, and the maximum volume of data that can be transmitted via a QR code of version 40 is about 3 kilobytes (2953 bytes, to be more precise) per QR code. Therefore, data exceeding the maximum volume cannot be transmitted via a single two-dimensional code.
- in the projection system PS, as the projector 1 sequentially projects a plurality of QR codes and the tablet terminal 2 reads (picks up images of and decodes) the plurality of QR codes, data exceeding the maximum volume is communicated.
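As a concrete check on the capacity argument, the number of codes and the transfer time scale linearly with the payload. A small calculation, under the simplifying assumption that the full 2953-byte version-40 capacity is available for data:

```python
import math

MAX_BYTES_PER_CODE = 2953          # version-40 QR capacity quoted above (simplification)

def codes_needed(data_len: int) -> int:
    return math.ceil(data_len / MAX_BYTES_PER_CODE)

def transfer_seconds(data_len: int, f1: float) -> float:
    return codes_needed(data_len) / f1   # one code per displayed frame at rate F1

# e.g. a 1 MB payload: 339 codes, about 17 seconds at 20 frames per second
print(codes_needed(1_000_000), transfer_seconds(1_000_000, 20.0))
```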
- FIG. 2 is a block diagram showing the functional configuration of the projection system PS.
- the projector 1 includes a storage unit 11 , a dividing unit 12 , an encoding unit 13 , a projection unit 14 , and a sound emitting unit 15 .
- the storage unit 11 stores data that is a target of communication (hereinafter referred to as “target data”).
- the dividing unit 12 divides the target data into a predetermined size.
- the encoding unit 13 encodes each of the data divided by the dividing unit 12 (hereinafter referred to as “divided data”) and information indicating an order of the divided data (hereinafter referred to as “order information”) and thus generates a plurality of QR codes.
- the projection unit 14 projects the plurality of QR codes generated by the encoding unit 13 , at a predetermined interval according to the order of the divided data.
- the frame rate at which the projection unit 14 sequentially projects the plurality of QR codes is referred to as “frame rate F 1 ”.
- the sound emitting unit 15 emits a sound indicating the timing when the first QR code of the plurality of QR codes is projected.
- the tablet terminal 2 includes an image pickup unit 21 , a decoding unit 22 , a combining unit 23 , an output unit 24 , and a determination unit 25 .
- the image pickup unit 21 sequentially picks up images of the plurality of QR codes projected by the projection unit 14 , at a predetermined interval.
- the frame rate at which the image pickup unit 21 sequentially picks up images of the plurality of QR codes is referred to as “frame rate F 2 ”.
- the frame rate F 2 is a higher rate than the frame rate F 1 .
- the decoding unit 22 extracts and decodes the QR codes from the images picked up by the image pickup unit 21 (hereinafter referred to as “pickup images”).
- the combining unit 23 combines the divided data decoded by the decoding unit 22 .
- the combining unit 23 combines the divided data according to the order indicated by the order information.
- the output unit 24 outputs the data combined by the combining unit (that is, target data).
- the determination unit 25 determines whether images of all the plurality of QR codes are picked up or not.
- FIG. 3 is a block diagram showing the hardware configuration of the projector 1 .
- the projector 1 includes an MCU (micro control unit) 100 , a ROM (read only memory) 101 , a RAM (random access memory) 102 , a storage unit 103 , an IF (interface) unit 104 , an image processing circuit 105 , a projection unit 106 , a light receiving unit 107 , an operation panel 108 , an input processing unit 109 , and a speaker 110 .
- the MCU 100 is a control unit which executes a program and thus controls each part of the projector 1 .
- the ROM 101 is a non-volatile storage device in which various programs and data are stored.
- the ROM 101 stores programs that are executed by the MCU 100 .
- the RAM 102 is a volatile storage device which stores data.
- the RAM 102 has a frame memory which stores images on a frame basis.
- the storage unit 103 is a storage device which stores data and programs.
- the IF unit 104 communicates with an electronic device or the like that is a video source.
- the IF unit 104 has various terminals (for example, a VGA terminal, USB terminal, wired LAN interface, S terminal, RCA terminal, HDMI (High-Definition Multimedia Interface, trademark registered) terminal, and the like) and a wireless LAN interface, to connect to an external device.
- the image processing circuit 105 performs predetermined image processing on an image represented by a video signal inputted from the electronic device as a video source (hereinafter referred to as “input image”).
- the image processing circuit 105 writes the image-processed input image to the frame memory.
- the projection unit 106 has a light source 116 , a liquid crystal panel 126 , an optical system 136 , a light source driving circuit 146 , a panel driving circuit 156 , and an optical system driving circuit 166 .
- the light source 116 has a lamp such as a high-pressure mercury lamp, halogen lamp or metal halide lamp, or a solid light source such as an LED (light emitting diode) or laser diode, and casts light on the liquid crystal panel 126 .
- the liquid crystal panel 126 is a light modulator which modulates the light cast from the light source 116 , according to image data.
- the liquid crystal panel 126 is a transmission-type liquid crystal panel, in which the transmittance of each pixel is controlled according to image data.
- the projector 1 has three liquid crystal panels 126 corresponding to the primary colors of RGB.
- the light from the light source 116 is separated into three color lights of RGB and each color light becomes incident on the corresponding liquid crystal panel 126 .
- the color lights transmitted through and modulated by the respective liquid crystal panels are combined by a cross dichroic prism or the like and then emitted to the optical system 136 .
- the optical system 136 has a lens which enlarges and projects the light modulated into image light by the liquid crystal panels 126 onto the screen SC, and a zoom lens which carries out enlargement and reduction of the image to be projected, and adjustment of the focal point.
- the light source driving circuit 146 drives the light source 116 under the control of the MCU 100 .
- the panel driving circuit 156 drives the liquid crystal panels 126 according to the image data outputted from the image processing circuit 105 .
- the optical system driving circuit 166 drives the optical system 136 under the control of the MCU 100 and carries out adjustment of the degree of zooming and adjustment of focusing.
- the light receiving unit 107 receives an infrared signal transmitted from the controller R, decodes the received infrared signal, and outputs the decoded signal to the input processing unit 109 .
- the operation panel 108 has buttons and switches to turn on/off the power of the projector 1 and to carry out various operations.
- the input processing unit 109 generates information indicating the content of an operation by the controller R and the operation panel 108 , and outputs the information to the MCU 100 .
- the speaker 110 outputs a sound according to an electrical signal outputted from the MCU 100 .
- the MCU 100 executing a program is an example of the dividing unit 12 and the encoding unit 13 .
- the projection unit 106 controlled by the MCU 100 executing a program is an example of the projection unit 14 .
- the speaker 110 controlled by the MCU 100 executing a program is an example of the sound emitting unit 15 .
- the storage unit 103 is an example of the storage unit 11 .
- FIG. 4 is a block diagram showing the hardware configuration of the tablet terminal 2 .
- the tablet terminal 2 includes a CPU (central processing unit) 200 , a ROM 201 , a RAM 202 , an IF unit 203 , a touch panel 204 , an image pickup unit 205 , a storage unit 206 , and a microphone 207 .
- the CPU 200 is a control device which executes a program and thus controls each part of the tablet terminal 2 .
- the ROM 201 is a non-volatile storage device in which various programs and data are stored.
- the RAM 202 is a volatile storage device which stores data.
- the IF unit 203 has various terminals and a wireless LAN interface to communicate with an external device.
- the touch panel 204 is an input device in which a panel that detects coordinates is superimposed on a display surface of a liquid crystal display or the like.
- as the touch panel 204 , for example, an optical-type, resistive membrane-type, electrostatic capacitance-type, or ultrasonic-type touch panel is used.
- the image pickup unit 205 is a digital camera which picks up an image of a subject and generates image data, and includes a lens, a CCD (charge coupled device) image sensor, and a signal processing circuit (these components are not shown).
- the storage unit 206 is a storage device which stores data and programs.
- the microphone 207 converts a sound to an electrical signal and outputs the electrical signal to the CPU 200 .
- the image pickup unit 205 controlled by the CPU 200 executing a program is an example of the image pickup unit 21 .
- the CPU 200 executing a program is an example of the decoding unit 22 , the combining unit 23 and the determination unit 25 .
- the touch panel 204 controlled by the CPU 200 executing a program is an example of the output unit 24 .
- FIG. 5 is a flowchart showing processing in which the projector 1 sequentially projects a plurality of QR codes.
- target data is dynamic image data that represents a dynamic image.
- the dynamic image data is stored in the storage unit 103 in advance.
- the following processing is started, triggered by the fact that an instruction to project the QR codes (hereinafter referred to as “QR code projection instruction”) is inputted to the projector 1 .
- the QR code projection instruction is inputted, for example, by the user operating the controller R or the operation panel 108 .
- Step SA 1 the MCU 100 divides the target data. Specifically, the MCU 100 reads out the target data from the storage unit 103 and divides the target data. The size of the divided data is predetermined according to the version of the QR code to be generated, in such a way as to be a size equal to or below a maximum volume of data that can be transmitted via a single QR code.
- the MCU 100 stores a plurality of divided data in the RAM 102 .
- the MCU 100 also adds order information to each of the divided data stored in the RAM 102 . In this example, the MCU 100 adds numbers (hereinafter referred to as “frame numbers”) as order information.
- the MCU 100 adds frame numbers in ascending order from the divided data corresponding to the leading edge of the target data. As the MCU 100 finishes dividing all the target data, the MCU 100 stores the total number of divided data (hereinafter referred to as “total number of frames”) in the RAM 102 .
- Step SA 2 the MCU 100 encodes each of the divided data and thus generates a plurality of QR codes. Specifically, the MCU 100 reads out the divided data, the frame numbers added to the divided data and the total number of frames from the RAM 102 and also reads out information indicating the frame rate F 1 from the ROM 101 , and then encodes these pieces of information and thus generates QR codes of a predetermined version. The MCU 100 encodes the divided data, for example, in order from the smallest frame number and thus generates QR codes, and stores the QR codes in the RAM 102 . The MCU 100 separately adds a frame number that is the same as the frame number indicated by each QR code, to each of the QR codes stored in the RAM 102 .
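A minimal sketch of Steps SA1 and SA2, under the assumptions that frame numbers start at 1 and that the actual QR encoding is delegated to a hypothetical `to_qr` callable (the publication does not specify the encoder):

```python
from typing import Callable, List, Tuple

def divide_with_frame_numbers(target: bytes, chunk_size: int) -> Tuple[List[Tuple[int, bytes]], int]:
    """Step SA1: split the target data and attach ascending frame numbers."""
    chunks = [target[i:i + chunk_size] for i in range(0, len(target), chunk_size)]
    numbered = list(enumerate(chunks, start=1))     # (frame number, divided data)
    return numbered, len(chunks)                    # second value: total number of frames

def encode_all(numbered: List[Tuple[int, bytes]], total_frames: int,
               f1: int, to_qr: Callable) -> list:
    """Step SA2: encode each divided chunk together with its order information."""
    return [to_qr(frame_no, f1, total_frames, chunk) for frame_no, chunk in numbered]
```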
- FIG. 6 illustrates the data format of a QR code generated by the projector 1 .
- FIG. 6 shows an example in which a QR code D of version 40 (2953 bytes) is generated by the projector 1 .
- the QR code D includes a 2-byte area A 1 indicating the frame number, a 1-byte area A 2 indicating the frame rate F 1 , a 2-byte area A 3 indicating the total number of frames, and a 2948-byte area A 4 representing the divided data.
- Each area of the QR code D includes information that enables the device on the reading side to restore the encoded information even if a part of the QR code cannot be read because of a loss.
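The layout of FIG. 6 (a 2-byte frame number, a 1-byte frame rate F1, a 2-byte total number of frames, and 2948 bytes of divided data) maps onto a fixed-size binary record. The sketch below shows one plausible packing; the big-endian byte order and the zero padding of the last chunk are assumptions, since the publication does not specify them:

```python
import struct

HEADER = struct.Struct(">HBH")    # frame number (2 B), frame rate F1 (1 B), total frames (2 B)
PAYLOAD_BYTES = 2948              # area A4 of a version-40 code
RECORD_BYTES = HEADER.size + PAYLOAD_BYTES   # 2953 bytes in total

def pack_record(frame_no: int, f1: int, total: int, divided: bytes) -> bytes:
    assert len(divided) <= PAYLOAD_BYTES
    return HEADER.pack(frame_no, f1, total) + divided.ljust(PAYLOAD_BYTES, b"\x00")

def unpack_record(record: bytes):
    frame_no, f1, total = HEADER.unpack_from(record)
    return frame_no, f1, total, record[HEADER.size:]
```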
- Step SA 3 the MCU 100 determines whether all the divided data are encoded or not. Whether all the divided data are encoded or not is determined on the basis of whether divided data that is not encoded is stored in the RAM 102 or not. If it is determined that all the divided data are encoded (SA 3 : YES), the MCU 100 shifts the processing to Step SA 4 . If it is determined that there is divided data that is not encoded (SA 3 : NO), the MCU 100 shifts the processing to Step SA 2 and continues to generate QR codes.
- Step SA 4 the MCU 100 projects an image showing the beginning of the series of QR codes, for example, an entirely white image (hereinafter referred to as “white image”).
- the white image is displayed in order to allow the tablet terminal 2 to specify the timing when the first QR code of the plurality of QR codes is projected.
- the MCU 100 reads out a white image from the ROM 101 and projects the white image on the screen SC for a predetermined time.
- the MCU 100 sequentially projects the plurality of QR codes at the frame rate F 1 .
- the MCU 100 reads out the plurality of QR codes in order from the QR code with the smallest frame number and sequentially projects the QR codes.
- the MCU 100 projects the plurality of QR codes, for example, at 20 FPS (frames per second). As the MCU 100 finishes projecting all the QR codes, the MCU 100 shifts the processing to Step SA 4 and repeats the projection of a white image and the projection of QR codes.
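Steps SA4 and SA5 amount to the loop sketched below: project a white image, then the QR codes in ascending frame-number order at the frame rate F1, and repeat. Here `project` is a hypothetical stand-in for driving the projection unit 106, the one-second white period is an assumed value for the "predetermined time", and simple sleeps ignore rendering latency:

```python
import time
from typing import Callable, Sequence

def projection_loop(white_image, qr_codes: Sequence, f1: float,
                    project: Callable, white_seconds: float = 1.0) -> None:
    frame_time = 1.0 / f1                   # e.g. 0.05 s per code at 20 FPS
    while True:                             # SA4 and SA5 are repeated (see FIG. 5)
        project(white_image)                # SA4: mark the start of the code sequence
        time.sleep(white_seconds)
        for code in qr_codes:               # SA5: smallest frame number first
            project(code)
            time.sleep(frame_time)
```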
- FIG. 7 illustrates transition of a projected image Im in the case where the processing shown in FIG. 5 is carried out.
- a projected image Imw represents a white image.
- QR codes D 1 , D 2 , D 3 , and D 100 represent the QR codes of the first frame, second frame, third frame, and 100 th frame, respectively.
- the version of each QR code D is version 5, and character data is encoded in each QR code D.
- the projector 1 projects each QR code D at 20 FPS for five seconds (100 frames). For example, if the version of each QR code D is version 40, the QR codes corresponding to 100 frames represent data of 295.3 kilobytes. After the projector 1 projects the QR codes to the QR code D 100 , the projector 1 projects the projected image Imw again.
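The figures quoted for FIG. 7 follow directly from the frame rate and the per-code capacity (treating the full 2953-byte code capacity as carried data, as the paragraph above does):

```python
frames, fps, bytes_per_code = 100, 20, 2953
print(frames / fps)                 # 5.0 seconds of projection
print(frames * bytes_per_code)      # 295300 bytes, i.e. about 295.3 kilobytes
```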
- FIG. 8 is a flowchart showing processing in which the tablet terminal 2 sequentially picks up images of a plurality of QR codes.
- the following processing is started as the user operates the touch panel 204 to start up a program for reading the plurality of QR codes in the state where the plurality of QR codes is sequentially projected on the screen SC.
- the user holds the tablet terminal 2 in such a way that images of the QR codes projected on the screen SC are picked up by the image pickup unit 205 .
- Step SB 1 the CPU 200 sequentially picks up images of the plurality of QR codes projected on the screen, at the frame rate F 2 .
- the frame rate F 2 is a higher frame rate than the frame rate F 1 .
- desirably, the frame rate F 2 is twice the frame rate F 1 or higher.
- the CPU 200 picks up images of the QR codes, for example, at 60 FPS.
- the CPU 200 stores image data representing the pickup images, in the RAM 202 .
- Step SB 2 the CPU 200 determines whether images of all the plurality of QR codes are picked up or not. Specifically, the CPU 200 analyzes the histogram of gradation levels of the pickup images and thus specifies the number of times a white image is projected while the image pickup is carried out (that is, the number of times the processing of Step SA 4 is carried out). If the number of times a white image is projected is specified as twice or more, the CPU 200 determines that images of all the plurality of QR codes are picked up. If it is determined that images of all the plurality of QR codes are picked up (SB 2 : YES), the CPU 200 shifts the processing to Step SB 3 . If it is determined that there is a QR code whose image is not picked up (SB 2 : NO), the CPU 200 shifts the processing to SB 1 and continues to pick up images of the QR codes.
- Step SB 3 the CPU 200 displays a message indicating that the image pickup of the QR codes is ended (hereinafter referred to as “end message”) on the touch panel 204 .
- the end message is stored in the storage unit 206 .
- the end message includes, for example, a message such as “Image pickup of all the QR codes is complete”.
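A sketch of the image-pickup loop of FIG. 8. The white-image test is reduced here to a mean-brightness threshold over a grayscale frame (given as nested lists of 0-255 values), which is one simple reading of "analyzes the histogram of gradation levels"; `grab_frame` and `show_message` are hypothetical stand-ins for the image pickup unit 205 and the touch panel 204, and the threshold value is an assumption:

```python
from typing import Callable, List

WHITE_LEVEL = 230          # assumed threshold on the 0-255 gradation scale

def is_white_frame(gray_frame: List[List[int]]) -> bool:
    pixels = [p for row in gray_frame for p in row]
    return sum(pixels) / len(pixels) >= WHITE_LEVEL

def pickup_loop(grab_frame: Callable, show_message: Callable) -> list:
    """SB1-SB3: capture at F2 until the white image has been seen at least twice."""
    captured, white_count, previous_was_white = [], 0, False
    while white_count < 2:                       # two white periods bracket one full pass
        frame = grab_frame()                     # SB1: pick up at frame rate F2 (e.g. 60 FPS)
        captured.append(frame)
        white_now = is_white_frame(frame)
        if white_now and not previous_was_white:
            white_count += 1                     # SB2: count white-image periods
        previous_was_white = white_now
    show_message("Image pickup of all the QR codes is complete")   # SB3: end message
    return captured
```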
- FIG. 9 is a flowchart showing processing in which the tablet terminal 2 decodes a plurality of QR codes. The following processing is started, for example, triggered by the fact that the processing shown in FIG. 8 is ended.
- Step SC 1 the CPU 200 extracts a QR code from an image that is a decoding target (hereinafter referred to as “target image”) of a plurality of pickup images, and starts decoding the QR code.
- target image is specified by the processing of step SC 6 or step SC 12 , described below.
- the CPU 200 specifies a pickup image in which the QR code of the first frame is picked up, as the first target image.
- the CPU 200 specifies the pickup image in which the QR code of the first frame is picked up, on the basis of the timing when a white image is picked up.
- the CPU 200 analyzes the histogram of gradation levels of pickup images and thereby specifies a pickup image in which a white image is picked up first, and specifies a pickup image that is picked up a predetermined number of frames after the pickup image, as a pickup image in which the QR code of the first frame is picked up.
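The start of Step SC1 can be expressed as a small index calculation over the stored pickup images: find the first white frame, then step forward a fixed number of frames. The offset is an assumption; in practice it would follow from the white-image duration and the frame rate F2:

```python
from typing import Callable, Sequence

def first_code_frame_index(captured: Sequence, is_white: Callable, offset_frames: int) -> int:
    """Return the index of the pickup image holding the QR code of the first frame."""
    first_white = next(i for i, frame in enumerate(captured) if is_white(frame))
    return first_white + offset_frames   # predetermined number of frames after the white image
```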
- Step SC 2 the CPU 200 determines whether the decoding of the QR code is successful or not. Specifically, if decoded information (that is, the frame number, the frame rate F 1 , the total number of frames, and the divided data) is obtained within a predetermined time following the start of the decoding, the CPU 200 determines that the decoding of the QR code is successful. If decoded information is not obtained within the predetermined time, or if an error occurs during the decoding, the CPU 200 determines that the decoding of the QR code is not successful. If the decoding of the QR code is determined as successful (SC 2 : YES), the CPU 200 stores the divided data in the RAM 202 and shifts the processing to Step SC 3 . If the decoding of the QR code is determined as not successful (SC 2 : NO), the CPU 200 shifts the processing to Step SC 8 .
- Step SC 3 the CPU 200 determines whether the frame rate F 1 and the total number of frames are stored in the RAM 202 or not. Initially, when the processing of FIG. 9 is started, the frame rate F 1 and the total number of frames are not stored in the RAM 202 . If it is determined that the frame rate F 1 and the total number of frames are not stored in the RAM 202 (SC 3 : NO), the CPU 200 shifts the processing to Step SC 4 . If it is determined that the frame rate F 1 and the total number of frames are stored in the RAM 202 (SC 3 : YES), the CPU 200 shifts the processing to Step SC 5 .
- Step SC 4 the CPU 200 stores the frame rate F 1 and the total number of frames included in the decoded information, in the RAM 202 .
- Step SC 5 the CPU 200 combines the newly decoded divided data with the previously decoded divided data. Specifically, the CPU 200 combines the divided data according to the frame number included in the decoded information.
- Step SC 6 the CPU 200 specifies the next target image. Specifically, the CPU 200 specifies a pickup image that is after the current target image by the number of frames equal to the frame rate F 2 divided by the frame rate F 1 , as the next target image.
- the CPU 200 specifies a pickup image that is three frames after the current target image, as the next target image. As described above, it is desirable that the frame rate F 2 is twice the frame rate F 1 or higher. Therefore, in Step SC 6 , the CPU 200 specifies a pickup image that is two frames or more after the current target image, as the next target image.
- Step SC 7 the CPU 200 determines whether target data is restored or not. Specifically, the CPU 200 determines whether target data is restored or not, on the basis of whether the frame number included in the information that is decoded last coincides with the total number of frames stored in the RAM 202 or not. If it is determined that target data is restored (SC 7 : YES), the CPU 200 shifts the processing to Step SC 14 . If it is determined that target data is not restored (SC 7 : NO), the CPU 200 shifts the processing to Step SC 1 .
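Steps SC1 through SC7 reduce to the loop below: decode the current target image, store its divided data under its frame number, jump ahead by F2/F1 pickup frames (three frames in the 60 FPS / 20 FPS example), and stop once the decoded frame number equals the total number of frames. `read_code` is again a hypothetical decoder returning `(frame number, F1, total frames, divided data)` or `None`; the failure path of Steps SC8 to SC13 is sketched separately after those steps:

```python
from typing import Callable, Optional, Sequence

def decode_sequence(captured: Sequence, start: int, f2: int,
                    read_code: Callable) -> bytes:
    parts = {}                                           # frame number -> divided data
    target = start                                       # SC1: first target image
    while True:
        decoded = read_code(captured[target])
        if decoded is None:                              # SC2: NO -> fall back (SC8-SC13)
            raise ValueError("decoding failed; try the neighbouring frames")
        frame_no, f1, total, chunk = decoded             # SC3/SC4: keep F1 and total frames
        parts[frame_no] = chunk                          # SC5: combine by frame number
        if frame_no == total:                            # SC7: target data restored
            return b"".join(parts[i] for i in range(1, total + 1))
        target += f2 // f1                               # SC6: skip F2/F1 frames ahead
```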
- Step SC 8 the CPU 200 extracts a QR code from a pickup image that is picked up one frame before the target image (hereinafter referred to as “pickup image Ib”) and starts decoding the QR code.
- the CPU 200 decodes the QR code included in a different frame from the target image.
- Step SC 9 the CPU 200 determines whether the decoding of the QR code included in the pickup image Ib is successful or not.
- the CPU 200 determines whether the decoding of the QR code is successful or not, by a method similar to that of Step SC 2 . If the decoding of the QR code is determined as successful (SC 9 : YES), the CPU 200 stores the divided data in the RAM 202 and shifts the processing to Step SC 3 . If the decoding of the QR code is determined as not successful (SC 9 : NO), the CPU 200 shifts the processing to Step SC 10 .
- Step SC 10 the CPU 200 extracts a QR code from a pickup image that is picked up one frame after the target image (hereinafter referred to as “pickup image Ia”) and starts decoding the QR code.
- the CPU 200 decodes the QR code included in a different frame from the target image, for a reason similar to that of Step SC 8 .
- Step SC 11 the CPU 200 determines whether the decoding of the QR code included in the pickup image Ia is successful or not.
- the CPU 200 determines whether the decoding of the QR code is successful or not, by a method similar to that of Step SC 2 . If the decoding of the QR code is determined as successful (SC 11 : YES), the CPU 200 stores the divided data in the RAM 202 and shifts the processing to Step SC 3 . If the decoding of the QR code is determined as not successful (SC 11 : NO), the CPU 200 shifts the processing to Step SC 12 .
- Step SC 12 the CPU 200 specifies the next target image. Specifically, the CPU 200 specifies a pickup image that is a predetermined number of frames (for example, several frames) before the current target image, as the next target image. The CPU 200 may also specify a pickup image that is a predetermined number of frames after the current target image, as the next target image.
- Step SC 13 the CPU 200 erases the frame rate F 1 and the total number of frames stored in the RAM 202 . As the CPU 200 finishes the processing of Step SC 13 , the CPU 200 shifts the processing to Step SC 1 .
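The recovery path of Steps SC8 through SC13 tries the frames immediately before and after the failed target and, if both also fail, moves the target back by a few frames and clears the stored frame rate F1 and total number of frames so that they are read again. A compact sketch, with `read_code` as above and the back-off size an assumed value:

```python
from typing import Callable, Sequence

def decode_with_fallback(captured: Sequence, target: int, read_code: Callable,
                         backoff_frames: int = 3):
    """Return (decoded record or None, new target index, reset_stored_state flag)."""
    for candidate in (target, target - 1, target + 1):       # SC1 / SC8 / SC10
        if 0 <= candidate < len(captured):
            decoded = read_code(captured[candidate])
            if decoded is not None:                          # SC2 / SC9 / SC11: success
                return decoded, target, False
    # SC12: move the target a predetermined number of frames, and
    # SC13: forget the stored frame rate F1 and total number of frames.
    return None, max(target - backoff_frames, 0), True
```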
- Step SC 14 the CPU 200 outputs the restored target data. Specifically, the CPU 200 displays the target data on the touch panel 204 .
- data is communicated between the projector 1 and the tablet terminal 2 via a plurality of QR codes that is sequentially projected. Therefore, data can be communicated via QR codes without being restricted by the maximum volume of a single QR code. Also, since only some of the pickup images are decoded, the processing burden on the tablet terminal 2 is reduced, compared with the case where all of the pickup images are decoded.
- the method in which the projector 1 shows information indicating the timing when the first QR code of a plurality of QR codes is projected is not limited to the projection of a white image.
- the projector 1 may project a different image from a white image to show the timing when the first QR code is projected.
- the projector 1 may display a predetermined character or graphic pattern near the QR code of the first frame to show the timing when the first QR code is projected.
- the projector 1 may project the QR code of the first frame for a longer time than the QR codes of the other frames to show the timing when the first QR code is projected.
- the projector 1 may output a sound indicating the timing when the first QR code of a plurality of QR codes is projected (hereinafter referred to as “start sound”) from the speaker 110 .
- the tablet terminal 2 determines that images of all the plurality of QR codes are picked up, when the start sound is detected twice or more via the microphone 207 .
- the projector 1 may display an image showing the order of divided data (hereinafter referred to as “order image”) beside at least one QR code of a plurality of QR codes, in a projected image.
- the projector 1 may display an order image near each QR code.
- the tablet terminal 2 may extract the order image from a pickup image, carry out pattern matching to specify the order indicated by the order image, and determine whether images of all the plurality of QR codes are picked up or not, according to the specified order.
- frame numbers need not be included in the QR codes, and the tablet terminal 2 may combine the divided data according to the order indicated by the order image.
- FIG. 10 illustrates transition of a projected image Im in Modification 2.
- the order images are numbers indicating the order of the divided data.
- numbers 1 , 2 , 3 , and 100 are displayed as the order of the divided data represented by each QR code D, below QR codes D 1 , D 2 , D 3 , and D 100 .
- the relation between the frame rate F 1 and the frame rate F 2 is not limited to the relation described in the embodiment.
- the frame rate F 2 may be equal to or below the frame rate F 1 .
- the tablet terminal 2 continues picking up images of the projection surface until it is determined that images of all the plurality of QR codes are picked up, for example, using the method described in the above Modification 2.
- the specific values of the frame rate F 1 and the frame rate F 2 are not limited to those described in the embodiment, either.
- the processing in which the tablet terminal 2 picks up images of QR codes is not limited to the processing shown in FIG. 8 .
- the tablet terminal 2 may decode a plurality of QR codes whose images are picked up, and determine whether images of all the plurality of QR codes are picked up or not. In this case, the tablet terminal 2 determines whether images of all the plurality of QR codes are picked up or not, according to the total number of frames and the frame number of each QR code.
- the processing in which the tablet terminal 2 decodes a plurality of QR codes is not limited to the case where this processing is started, triggered by the end of the processing of picking up images of the QR codes.
- the processing in which the tablet terminal 2 decodes QR codes may be carried out in parallel with the processing of picking up images of the QR codes.
- the processing in which the tablet terminal 2 decodes QR codes is not limited to the processing shown in FIG. 9 .
- the tablet terminal 2 may regard all the plurality of QR codes as decoding targets.
- the processing of combining divided data (Step SC 5 ) may be carried out after the decoding of all the QR codes is finished.
- QR codes are not limited to the format shown in FIG. 6 .
- only the QR code that is projected first among the plurality of QR codes may have an area indicating the total number of frames.
- the QR code that is projected first among the plurality of QR codes need not have an area representing divided data.
- the frame rate F 1 may not be included in QR codes.
- the projector may not include information indicating the frame rate F 1 in QR codes.
- the projector 1 may transmit information indicating the frame rate F 1 to the tablet terminal 2 , not via QR codes.
- the projector 1 may display a character or graphic pattern indicating the frame rate F 1 near at least one QR code of a plurality of QR codes. In this case, the tablet terminal 2 carries out pattern matching with respect to the pickup image and thus specifies the frame rate F 1 .
- the QR codes used in the projection system PS may be of lower versions than version 40.
- the code image is not limited to a QR code.
- the code image may be other two-dimensional codes, for example, PDF 417, CP (communication platform) code, HCCB (high capacity color barcode) or the like, as long as these are images representing encoded information.
- the code image may also be a one-dimensional code.
- the number of QR codes included in a projected image is not limited to one.
- a plurality of QR codes may be displayed next to each other.
- the tablet terminal 2 may simultaneously read the plurality of QR codes displayed next to each other. In this case, data can be communicated in a shorter time than in the case where the number of QR codes included in a projected image is one.
- the plurality of QR codes that is sequentially projected may be provided with a function that enables the device on the reading side to restore the information of a frame even if the QR code of the frame cannot be read because of camera shake or an obstacle (the case where a person passes in front of the QR code during image pickup, or the like).
- the plurality of QR codes that is sequentially projected may include a QR code that represents information to enable restoration of the information of a frame that cannot be read.
- the plurality of QR codes may not be projected entirely continuously.
- a projected image that does not include a QR code may be projected every time a predetermined number of QR codes are projected.
- the projected image that does not include a QR code may also be used, for example, for the tablet terminal 2 to specify the order of the projected QR code.
- the projected image that does not include a QR code may be used to synchronize the timing when the plurality of QR codes is projected and the timing when images of the plurality of QR codes are picked up.
- the target data is not limited to dynamic image data.
- the target data may be, for example, data representing a character, graphic pattern, musical sound, or image. If the target data is data representing a musical sound, the tablet terminal 2 may reproduce the musical sound from a speaker, not shown, in Step SC 14 .
- the target data is not limited to data stored in the projector 1 .
- the target data may be data inputted to the projector 1 from an electronic device such as a personal computer.
- the projector 1 may not repeat the projection of a white image and the projection of a QR code.
- the projector 1 may output information indicating that the last QR code of a plurality of QR codes is projected (for example, a predetermined image or sound, or the like) after Step SA 5 .
- the display device is not limited to the projector 1 .
- the display device may be any electronic device that can sequentially display a plurality of QR codes, for example, a television set, personal computer, smartphone, mobile phone, digital camera, car navigation system, POS terminal, vending machine, printer, game machine, or the like.
- the image pickup device is not limited to the tablet terminal 2 , either.
- the image pickup device may be any electronic device that can sequentially pick up images of a plurality of QR codes, for example, a personal computer, smartphone, mobile phone, digital camera, game machine, or the like.
- a plurality of QR codes may be displayed on a display for electronic advertisement or a display on a vending machine, so that data of a standby screen or ringtone for smartphone can be communicated.
- the hardware configurations of the projector 1 and the tablet terminal 2 are not limited to the configurations shown in FIGS. 3 and 4 .
- Each device may have any hardware configuration that can execute the processing shown in FIGS. 5 , 8 and 9 .
- the liquid crystal panel 126 may be a reflection-type liquid crystal panel.
- an electro-optical element such as an organic EL (electroluminescence) element or digital mirror device (DMD) may be used instead of the liquid crystal panel 126 .
- the liquid crystal panel 126 may not be provided for each color component.
- the projector 1 may have a single liquid crystal panel 126 . In this case, images of the respective color components may be displayed time-divisionally.
Abstract
Description
- The entire disclosure of Japanese Patent Application No. 2014-041991, filed Mar. 4, 2014 is expressly incorporated by reference herein.
- 1. Technical Field
- The present invention relates to a communication system, an image pickup device, a program, and a communication method.
- 2. Related Art
- A communication system in which a QR code (trademark registered) displayed on a display unit of an electronic device is picked up as an image and read by another electronic device is known. JP-A-2008-077380 discloses the use of a three-dimensional code formed by changing a two-dimensional code in time series, for wireless LAN (local area network) connection between a projector and an information terminal.
- In the traditional communication via a two-dimensional code, there is an upper limit to the volume of data that can be communicated. Also, JP-A-2008-077380 does not describe the relation between the frame rate at which the projector projects a three-dimensional code and the frame rate at which the information terminal picks up an image of the three-dimensional code, and therefore data cannot be communicated via a three-dimensional code.
- An advantage of some aspects of the invention is that a greater volume of data can be communicated via a code image.
- An aspect of the invention provides a communication system including a display device and an image pickup device. The display device includes: a storage unit which stores data; a dividing unit which divides the data; an encoding unit which encodes each of the data divided by the dividing unit and order information indicating an order of the data and thus generates a plurality of code images; and a display unit which displays the plurality of code images generated by the encoding unit, at a first frame rate according to the order. The image pickup device includes: an image pickup unit which sequentially picks up images of the plurality of code images displayed by the display unit, at a second frame rate that is higher than the first frame rate; a decoding unit which extracts and decodes the code images from pickup images picked up by the image pickup unit; and a combining unit which combines the data decoded by the decoding unit according to the order indicated by the order information. According to this communication system, a greater volume of data can be communicated via the code images, compared with the case where a single code image is displayed.
- In a preferred aspect, the display unit repeatedly displays the plurality of code images, and the image pickup unit continues image pickup until images of all the plurality of code images are picked up. According to this communication system, a loss of the plurality of code images at the time of image pickup is prevented.
- In another preferred aspect, the display unit displays information indicating a timing when a first code image of the plurality of code images is displayed. According to this communication system, the image pickup device can specify the timing when the first code image of the plurality of code images is displayed.
- In another preferred aspect, the display device includes a sound emitting unit which emits a sound indicating a timing when a first code image of the plurality of code images is displayed. According to this communication system, the image pickup device can specify the timing when the first code image of the plurality of code images is displayed.
- In another preferred aspect, the display unit displays an order image indicating the order beside at least one code image of the plurality of code images. The image pickup device includes a determination unit which determines whether images of all the plurality of code images are picked up or not, on the basis of the order image extracted from the pickup image. The image pickup unit continues image pickup until the determination unit determines that images of all the plurality of code images are picked up. According to this communication system, the image pickup device can determine whether images of all the code images are picked up or not, without decoding the code images.
- In another preferred aspect, if the decoding unit is successful in decoding a certain pickup image, the decoding unit next decodes a pickup image that is two or more frames after that pickup image. According to this communication system, the time required for decoding is shortened, compared with the case where all of the code images are decoded.
- Another aspect of the invention provides an image pickup device including: an image pickup unit which sequentially picks up images of a plurality of code images sequentially displayed at a first frame rate and representing data and order information indicating an order of the data, at a second frame rate that is higher than the first frame rate; a decoding unit which extracts and decodes the code images from pickup images picked up by the image pickup unit; and a combining unit which combines the data decoded by the decoding unit according to the order indicated by the order information. According to this image pickup device, a greater volume of data can be communicated via the code images, compared with the case where an image of a single code image is picked up.
- Still another aspect of the invention provides a program for executing: sequentially picking up images of a plurality of code images that are sequentially displayed at a first frame rate and represent data and order information indicating an order of the data, at a second frame rate that is higher than the first frame rate; extracting and decoding the code images from the pickup images that are picked up; and combining the decoded data according to the order indicated by the order information. According to this program, a greater volume of data can be communicated via the code images, compared with the case where an image of a single code image is picked up.
- Yet another aspect of the invention provides a communication method including: dividing data; encoding each of the divided data and order information indicating an order of the data and thus generating a plurality of code images; displaying the plurality of code images at a first frame rate according to the order; sequentially picking up images of the plurality of code images that are displayed, at a second frame rate that is higher than the first frame rate; extracting and decoding the code images from the pickup images that are picked up; and combining the decoded data according to the order indicated by the order information. According to this communication method, a greater volume of data can be communicated via the code images, compared with the case where a single code image is displayed.
- The invention will be described with reference to the accompanying drawings, wherein like numbers reference like elements.
- FIG. 1 shows the overall configuration of a projection system.
- FIG. 2 is a block diagram showing the functional configuration of the projection system.
- FIG. 3 is a block diagram showing the hardware configuration of a projector.
- FIG. 4 is a block diagram showing the hardware configuration of a tablet terminal.
- FIG. 5 is a flowchart showing processing to project a QR code.
- FIG. 6 illustrates the data format of a QR code.
- FIG. 7 illustrates transition of a projected image.
- FIG. 8 is a flowchart showing processing to pick up an image of a QR code.
- FIG. 9 is a flowchart showing processing to decode a QR code.
- FIG. 10 illustrates transition of a projected image in Modification 2.
FIG. 1 shows the overall configuration of a projection system PS according to an embodiment of the invention. The projection system PS includes a projector 1, a tablet terminal 2, a controller R, and a screen SC. The projection system PS is a system for transmitting data from the projector 1 to the tablet terminal 2 via a two-dimensional code. The projector 1 is a display device which encodes data, thus generates a two-dimensional code, and projects the two-dimensional code. In the example of FIG. 1, the projector 1 projects a QR code (an example of a code image) on the screen SC. The tablet terminal 2 is an image pickup device which picks up an image of the QR code projected on the screen SC and decodes the QR code. The controller R is a device for controlling the projector 1 wirelessly via infrared communication or the like, that is, a so-called remote controller. The screen SC is a surface on which an image projected by the projector 1 (hereinafter referred to as “projected image”) is shown.

In communication via a two-dimensional code, there is an upper limit to the volume of data that can be transmitted via a single two-dimensional code. For example, as for QR codes, a plurality of versions (versions 1 to 40) are prescribed according to the number of cells in the codes, and the maximum volume of data that can be transmitted via a QR code of version 40 is about 3 kilobytes (2953 bytes, to be more precise) per QR code. Therefore, data exceeding the maximum volume cannot be transmitted via a single two-dimensional code. In the projection system PS, as the projector 1 sequentially projects a plurality of QR codes and the tablet terminal 2 reads (picks up images of and decodes) the plurality of QR codes, data exceeding the maximum volume is communicated.
FIG. 2 is a block diagram showing the functional configuration of the projection system PS. The projector 1 includes a storage unit 11, a dividing unit 12, an encoding unit 13, a projection unit 14, and a sound emitting unit 15. The storage unit 11 stores data that is a target of communication (hereinafter referred to as “target data”). The dividing unit 12 divides the target data into a predetermined size. The encoding unit 13 encodes each of the data divided by the dividing unit 12 (hereinafter referred to as “divided data”) and information indicating an order of the divided data (hereinafter referred to as “order information”) and thus generates a plurality of QR codes. The projection unit 14 (an example of a display unit) projects the plurality of QR codes generated by the encoding unit 13, at a predetermined interval according to the order of the divided data. Hereinafter, the frame rate at which the projection unit 14 sequentially projects the plurality of QR codes is referred to as “frame rate F1”. The sound emitting unit 15 emits a sound indicating the timing when the first QR code of the plurality of QR codes is projected.

The tablet terminal 2 includes an image pickup unit 21, a decoding unit 22, a combining unit 23, an output unit 24, and a determination unit 25. The image pickup unit 21 sequentially picks up images of the plurality of QR codes projected by the projection unit 14, at a predetermined interval. Hereinafter, the frame rate at which the image pickup unit 21 sequentially picks up images of the plurality of QR codes is referred to as “frame rate F2”. The frame rate F2 is a higher rate than the frame rate F1. The decoding unit 22 extracts and decodes the QR codes from the images picked up by the image pickup unit 21 (hereinafter referred to as “pickup images”). The combining unit 23 combines the divided data decoded by the decoding unit 22. The combining unit 23 combines the divided data according to the order indicated by the order information. The output unit 24 outputs the data combined by the combining unit 23 (that is, the target data). The determination unit 25 determines whether images of all the plurality of QR codes are picked up or not.
- FIG. 3 is a block diagram showing the hardware configuration of the projector 1. The projector 1 includes an MCU (micro control unit) 100, a ROM (read only memory) 101, a RAM (random access memory) 102, a storage unit 103, an IF (interface) unit 104, an image processing circuit 105, a projection unit 106, a light receiving unit 107, an operation panel 108, an input processing unit 109, and a speaker 110. The MCU 100 is a control unit which executes a program and thus controls each part of the projector 1. The ROM 101 is a non-volatile storage device in which various programs and data are stored. The ROM 101 stores the programs that are executed by the MCU 100. The RAM 102 is a volatile storage device which stores data. The RAM 102 has a frame memory which stores images on a frame basis. The storage unit 103 is a storage device which stores data and programs.
- The IF unit 104 communicates with an electronic device or the like that serves as a video source. The IF unit 104 has various terminals (for example, a VGA terminal, a USB terminal, a wired LAN interface, an S terminal, an RCA terminal, an HDMI (High-Definition Multimedia Interface, registered trademark) terminal, and the like) and a wireless LAN interface to connect to an external device. The image processing circuit 105 performs predetermined image processing on an image represented by a video signal inputted from the electronic device serving as the video source (hereinafter referred to as “input image”). The image processing circuit 105 writes the image-processed input image to the frame memory.
- The projection unit 106 has a light source 116, a liquid crystal panel 126, an optical system 136, a light source driving circuit 146, a panel driving circuit 156, and an optical system driving circuit 166. The light source 116 has a lamp such as a high-pressure mercury lamp, halogen lamp, or metal halide lamp, or a solid-state light source such as an LED (light emitting diode) or laser diode, and casts light on the liquid crystal panel 126. The liquid crystal panel 126 is a light modulator which modulates the light cast from the light source 116 according to image data. In this example, the liquid crystal panel 126 is a transmission-type liquid crystal panel, in which the transmittance of each pixel is controlled according to image data. The projector 1 has three liquid crystal panels 126 corresponding to the primary colors of RGB. The light from the light source 116 is separated into the three color lights of RGB, and each color light becomes incident on the corresponding liquid crystal panel 126. The color lights transmitted through and modulated by the respective liquid crystal panels are combined by a cross dichroic prism or the like and then emitted to the optical system 136. The optical system 136 has a lens which enlarges and projects the light modulated into image light by the liquid crystal panels 126 onto the screen SC, and a zoom lens which carries out enlargement and reduction of the image to be projected and adjustment of the focal point. The light source driving circuit 146 drives the light source 116 under the control of the MCU 100. The panel driving circuit 156 drives the liquid crystal panels 126 according to the image data outputted from the image processing circuit 105. The optical system driving circuit 166 drives the optical system 136 under the control of the MCU 100 and carries out adjustment of the degree of zooming and adjustment of focusing.
- The light receiving unit 107 receives an infrared signal transmitted from the controller R, decodes the received infrared signal, and outputs the decoded signal to the input processing unit 109. The operation panel 108 has buttons and switches to turn on/off the power of the projector 1 and to carry out various operations. The input processing unit 109 generates information indicating the content of an operation on the controller R or the operation panel 108 and outputs the information to the MCU 100. The speaker 110 outputs a sound according to an electrical signal outputted from the MCU 100.
- In the projector 1, the MCU 100 executing a program is an example of the dividing unit 12 and the encoding unit 13. The projection unit 106 controlled by the MCU 100 executing a program is an example of the projection unit 14. The speaker 110 controlled by the MCU 100 executing a program is an example of the sound emitting unit 15. The storage unit 103 is an example of the storage unit 11.
- FIG. 4 is a block diagram showing the hardware configuration of the tablet terminal 2. The tablet terminal 2 includes a CPU (central processing unit) 200, a ROM 201, a RAM 202, an IF unit 203, a touch panel 204, an image pickup unit 205, a storage unit 206, and a microphone 207. The CPU 200 is a control device which executes a program and thus controls each part of the tablet terminal 2. The ROM 201 is a non-volatile storage device in which various programs and data are stored. The RAM 202 is a volatile storage device which stores data. The IF unit 203 has various terminals and a wireless LAN interface to communicate with an external device. The touch panel 204 is an input device in which a panel that detects coordinates is superimposed on a display surface of a liquid crystal display or the like. As the touch panel 204, for example, an optical-type, resistive membrane-type, electrostatic capacitance-type, or ultrasonic-type touch panel is used. The image pickup unit 205 is a digital camera which picks up an image of a subject and generates image data, and includes a lens, a CCD (charge coupled device) image sensor, and a signal processing circuit (these components are not shown). The storage unit 206 is a storage device which stores data and programs. The microphone 207 converts a sound to an electrical signal and outputs the electrical signal to the CPU 200.
- In the tablet terminal 2, the image pickup unit 205 controlled by the CPU 200 executing a program is an example of the image pickup unit 21. The CPU 200 executing a program is an example of the decoding unit 22, the combining unit 23, and the determination unit 25. The touch panel 204 controlled by the CPU 200 executing a program is an example of the output unit 24.
- FIG. 5 is a flowchart showing processing in which the projector 1 sequentially projects a plurality of QR codes. In this example, the target data is dynamic image data that represents a dynamic image. The dynamic image data is stored in the storage unit 103 in advance. The following processing is started, triggered by the input to the projector 1 of an instruction to project the QR codes (hereinafter referred to as “QR code projection instruction”). The QR code projection instruction is inputted, for example, by the user operating the controller R or the operation panel 108.
- In Step SA1, the MCU 100 divides the target data. Specifically, the MCU 100 reads out the target data from the storage unit 103 and divides the target data. The size of the divided data is predetermined according to the version of the QR code to be generated, in such a way as to be equal to or below the maximum volume of data that can be transmitted via a single QR code. The MCU 100 stores the plurality of divided data in the RAM 102. The MCU 100 also adds order information to each of the divided data stored in the RAM 102. In this example, the MCU 100 adds numbers (hereinafter referred to as “frame numbers”) as the order information. The MCU 100 adds the frame numbers in ascending order from the divided data corresponding to the leading edge of the target data. When the MCU 100 finishes dividing all the target data, the MCU 100 stores the total number of divided data (hereinafter referred to as “total number of frames”) in the RAM 102.
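As a minimal sketch of Step SA1 (illustrative Python, not the patent's implementation), the division and frame numbering can be expressed as follows; the 2948-byte payload size is the FIG. 6 value, and counting frame numbers from 1 is an assumption.

```python
# Split the target data into chunks no larger than the per-code payload and
# tag each chunk with a frame number, counted from the head of the data.
def divide(target_data: bytes, payload_bytes: int = 2948):
    chunks = [target_data[i:i + payload_bytes]
              for i in range(0, len(target_data), payload_bytes)]
    total_frames = len(chunks)
    # (frame number, divided data) pairs, frame numbers ascending from 1
    return [(n + 1, chunk) for n, chunk in enumerate(chunks)], total_frames
```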
- In Step SA2, the MCU 100 encodes each of the divided data and thus generates a plurality of QR codes. Specifically, the MCU 100 reads out the divided data, the frame numbers added to the divided data, and the total number of frames from the RAM 102, reads out information indicating the frame rate F1 from the ROM 101, and then encodes these pieces of information to generate QR codes of a predetermined version. The MCU 100 encodes the divided data, for example, in order from the smallest frame number, and stores the generated QR codes in the RAM 102. The MCU 100 also attaches, to each QR code stored in the RAM 102, the same frame number as the frame number encoded in that QR code.
- FIG. 6 illustrates the data format of a QR code generated by the projector 1. FIG. 6 shows an example in which a QR code D of version 40 (2953 bytes) is generated by the projector 1. The QR code D includes a 2-byte area A1 indicating the frame number, a 1-byte area A2 indicating the frame rate F1, a 2-byte area A3 indicating the total number of frames, and a 2948-byte area A4 representing the divided data. Each area of the QR code D includes information that enables the device on the reading side to restore the encoded information even if a part of the QR code cannot be read because of a partial loss.
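A sketch of this layout is shown below; treating the header fields as unsigned big-endian integers is an assumption (the description fixes only the field widths), and the calls to an actual QR code library are omitted.

```python
import struct

def build_frame(frame_number: int, frame_rate_f1: int,
                total_frames: int, divided_data: bytes) -> bytes:
    """Pack the FIG. 6 areas A1-A4 into the byte string encoded as one code."""
    assert len(divided_data) <= 2948
    header = struct.pack(">HBH", frame_number, frame_rate_f1, total_frames)  # A1, A2, A3
    return header + divided_data                                             # A4; 2953 bytes at most

def parse_frame(payload: bytes):
    """Inverse operation on the reading side."""
    frame_number, frame_rate_f1, total_frames = struct.unpack(">HBH", payload[:5])
    return frame_number, frame_rate_f1, total_frames, payload[5:]
```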
- Returning to FIG. 5, in Step SA3, the MCU 100 determines whether all the divided data are encoded or not. This is determined on the basis of whether divided data that is not yet encoded remains in the RAM 102. If it is determined that all the divided data are encoded (SA3: YES), the MCU 100 shifts the processing to Step SA4. If it is determined that there is divided data that is not encoded (SA3: NO), the MCU 100 shifts the processing to Step SA2 and continues to generate QR codes.
- In Step SA4, the MCU 100 projects an image showing the beginning of the series of QR codes, for example, an entirely white image (hereinafter referred to as “white image”). The white image is displayed in order to allow the tablet terminal 2 to specify the timing when the first QR code of the plurality of QR codes is projected. Specifically, the MCU 100 reads out a white image from the ROM 101 and projects the white image on the screen SC for a predetermined time. In Step SA5, the MCU 100 sequentially projects the plurality of QR codes at the frame rate F1. Specifically, the MCU 100 reads out the plurality of QR codes in order from the QR code with the smallest frame number and sequentially projects them. The MCU 100 projects the plurality of QR codes, for example, at 20 FPS (frames per second). When the MCU 100 finishes projecting all the QR codes, the MCU 100 shifts the processing back to Step SA4 and repeats the projection of a white image and the projection of the QR codes.
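The projection loop of Steps SA4 and SA5 can be pictured with the timing sketch below; project_white() and project_code() are hypothetical stand-ins for driving the panel, and the one-second white-image duration and two repetitions are illustrative values only.

```python
import time

def project_white():
    print("white image")                       # placeholder for Step SA4

def project_code(frame_number: int):
    print(f"QR code of frame {frame_number}")  # placeholder for Step SA5

def project_sequence(frame_numbers, frame_rate_f1: int = 20,
                     white_seconds: float = 1.0, repeats: int = 2):
    period = 1.0 / frame_rate_f1               # 20 FPS -> 50 ms per code
    for _ in range(repeats):                   # SA4 and SA5 are repeated
        project_white()
        time.sleep(white_seconds)              # "predetermined time" for the white image
        for frame_number in frame_numbers:     # ascending frame-number order
            project_code(frame_number)
            time.sleep(period)

project_sequence(range(1, 101))                # 100 frames, as in the FIG. 7 example
```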
- FIG. 7 illustrates the transition of the projected image Im in the case where the processing shown in FIG. 5 is carried out. In FIG. 7, a projected image Imw represents a white image. QR codes D1, D2, D3, and D100 represent the QR codes of the first frame, second frame, third frame, and 100th frame, respectively. In FIG. 7, for convenience of explanation, the version of each QR code D is version 5, and character data is encoded in each QR code D. In the example of FIG. 7, the projector 1 projects the QR codes D at 20 FPS for five seconds (100 frames). For example, if the version of each QR code D is version 40, the QR codes corresponding to 100 frames represent data of 295.3 kilobytes. After the projector 1 projects the QR codes up to the QR code D100, the projector 1 projects the projected image Imw again.
- FIG. 8 is a flowchart showing processing in which the tablet terminal 2 sequentially picks up images of a plurality of QR codes. The following processing is started when the user operates the touch panel 204 to start up a program for reading the plurality of QR codes in the state where the plurality of QR codes is sequentially projected on the screen SC. The user holds the tablet terminal 2 in such a way that images of the QR codes projected on the screen SC are picked up by the image pickup unit 205.
- In Step SB1, the CPU 200 sequentially picks up images of the plurality of QR codes projected on the screen SC, at the frame rate F2. As described above, the frame rate F2 is higher than the frame rate F1. In order to pick up images of all of the plurality of QR codes that are projected, it is desirable that the frame rate F2 be twice the frame rate F1 or higher. The CPU 200 picks up images of the QR codes, for example, at 60 FPS. The CPU 200 stores image data representing the pickup images in the RAM 202.
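Step SB1 can be sketched with OpenCV as below; the library choice, the device index, and the fixed capture duration are assumptions made only for illustration, and buffering every frame in memory is a simplification.

```python
import cv2  # OpenCV is used here only as an example capture API

def capture_frames(frame_rate_f2: int = 60, seconds: int = 12, device: int = 0):
    """Grab pickup images at rate F2 (assumed to be at least twice F1)."""
    cap = cv2.VideoCapture(device)
    cap.set(cv2.CAP_PROP_FPS, frame_rate_f2)
    frames = []
    for _ in range(frame_rate_f2 * seconds):
        ok, frame = cap.read()
        if ok:
            frames.append(frame)
    cap.release()
    return frames
```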
- In Step SB2, the CPU 200 determines whether images of all the plurality of QR codes are picked up or not. Specifically, the CPU 200 analyzes the histogram of gradation levels of the pickup images and thus specifies the number of times a white image is projected while the image pickup is carried out (that is, the number of times the processing of Step SA4 is carried out). If the white image is found to have been projected twice or more, the CPU 200 determines that images of all the plurality of QR codes are picked up. If it is determined that images of all the plurality of QR codes are picked up (SB2: YES), the CPU 200 shifts the processing to Step SB3. If it is determined that there is a QR code whose image is not picked up (SB2: NO), the CPU 200 shifts the processing to Step SB1 and continues to pick up images of the QR codes.
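A simplified stand-in for this histogram check is sketched below; the brightness threshold of 200 and the 95% pixel ratio are illustrative values that the patent does not specify.

```python
import numpy as np

def is_white_frame(gray_frame: np.ndarray, level: int = 200, ratio: float = 0.95) -> bool:
    """Treat a pickup image as the white image when almost all pixels are bright."""
    hist, _ = np.histogram(gray_frame, bins=256, range=(0, 256))
    return hist[level:].sum() >= ratio * gray_frame.size

def white_images_seen(gray_frames) -> int:
    """Count transitions into the white image; two or more means every QR code
    of the repeated sequence has been picked up at least once (Step SB2)."""
    count, inside = 0, False
    for frame in gray_frames:
        white = is_white_frame(frame)
        if white and not inside:
            count += 1
        inside = white
    return count
```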
- In Step SB3, the CPU 200 displays on the touch panel 204 a message indicating that the image pickup of the QR codes has ended (hereinafter referred to as “end message”). The end message is stored in the storage unit 206. The end message includes, for example, a message such as “Image pickup of all the QR codes is complete”.
- FIG. 9 is a flowchart showing processing in which the tablet terminal 2 decodes a plurality of QR codes. The following processing is started, for example, triggered by the end of the processing shown in FIG. 8.
- In Step SC1, the CPU 200 extracts a QR code from the image that is the decoding target (hereinafter referred to as “target image”) among the plurality of pickup images, and starts decoding the QR code. The target image is specified by the processing of Step SC6 or Step SC12, described below. When the processing of Step SC1 is carried out for the first time, the CPU 200 specifies the pickup image in which the QR code of the first frame is picked up as the first target image. The CPU 200 specifies that pickup image on the basis of the timing when a white image is picked up. Specifically, the CPU 200 analyzes the histogram of gradation levels of the pickup images and thereby specifies the pickup image in which a white image is picked up first, and specifies the pickup image that is picked up a predetermined number of frames after that image as the pickup image in which the QR code of the first frame is picked up.
- In Step SC2, the CPU 200 determines whether the decoding of the QR code is successful or not. Specifically, if the decoded information (that is, the frame number, the frame rate F1, the total number of frames, and the divided data) is obtained within a predetermined time following the start of the decoding, the CPU 200 determines that the decoding of the QR code is successful. If the decoded information is not obtained within the predetermined time, or if an error occurs during the decoding, the CPU 200 determines that the decoding of the QR code is not successful. If the decoding of the QR code is determined as successful (SC2: YES), the CPU 200 stores the divided data in the RAM 202 and shifts the processing to Step SC3. If the decoding of the QR code is determined as not successful (SC2: NO), the CPU 200 shifts the processing to Step SC8.
- In Step SC3, the CPU 200 determines whether the frame rate F1 and the total number of frames are stored in the RAM 202 or not. When the processing of FIG. 9 is first started, the frame rate F1 and the total number of frames are not yet stored in the RAM 202. If it is determined that the frame rate F1 and the total number of frames are not stored in the RAM 202 (SC3: NO), the CPU 200 shifts the processing to Step SC4. If it is determined that the frame rate F1 and the total number of frames are stored in the RAM 202 (SC3: YES), the CPU 200 shifts the processing to Step SC5.
- In Step SC4, the CPU 200 stores the frame rate F1 and the total number of frames included in the decoded information in the RAM 202. In Step SC5, the CPU 200 combines the newly decoded divided data with the previously decoded divided data. Specifically, the CPU 200 combines the divided data according to the frame number included in the decoded information. In Step SC6, the CPU 200 specifies the next target image. Specifically, the CPU 200 specifies, as the next target image, the pickup image that is after the current target image by the number of frames equal to the frame rate F2 divided by the frame rate F1. For example, if the frame rate F1 is 20 FPS and the frame rate F2 is 60 FPS, the CPU 200 specifies the pickup image that is three frames after the current target image as the next target image. As described above, it is desirable that the frame rate F2 be twice the frame rate F1 or higher. Therefore, in Step SC6, the CPU 200 specifies a pickup image that is two or more frames after the current target image as the next target image.
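Steps SC5 and SC6 reduce to simple arithmetic on frame indices and frame numbers, sketched below under the assumption that F2 is an integer multiple of F1 (as in the 60 FPS / 20 FPS example).

```python
def next_target_index(current_index: int, frame_rate_f2: int, frame_rate_f1: int) -> int:
    """Step SC6: jump ahead by F2 / F1 pickup images (e.g. 60 / 20 -> 3 frames)."""
    return current_index + frame_rate_f2 // frame_rate_f1

def combine(decoded: dict, total_frames: int) -> bytes:
    """Step SC5 carried to completion: paste divided data together in frame-number
    order; `decoded` maps frame number -> divided data and must cover every frame."""
    return b"".join(decoded[n] for n in range(1, total_frames + 1))
```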
- In Step SC7, the CPU 200 determines whether the target data is restored or not. Specifically, the CPU 200 determines whether the target data is restored on the basis of whether the frame number included in the information that was decoded last coincides with the total number of frames stored in the RAM 202. If it is determined that the target data is restored (SC7: YES), the CPU 200 shifts the processing to Step SC14. If it is determined that the target data is not restored (SC7: NO), the CPU 200 shifts the processing to Step SC1.
- In Step SC8, the CPU 200 extracts a QR code from the pickup image that was picked up one frame before the target image (hereinafter referred to as “pickup image Ib”) and starts decoding the QR code. As a cause of the unsuccessful decoding of the QR code in Step SC2, it can be considered that there is a problem in the pickup image, for example the QR code being out of focus or a stain on the screen SC. The CPU 200 therefore decodes the QR code included in a different frame from the target image. In Step SC9, the CPU 200 determines whether the decoding of the QR code included in the pickup image Ib is successful or not, by a method similar to that of Step SC2. If the decoding of the QR code is determined as successful (SC9: YES), the CPU 200 stores the divided data in the RAM 202 and shifts the processing to Step SC3. If the decoding of the QR code is determined as not successful (SC9: NO), the CPU 200 shifts the processing to Step SC10.
- In Step SC10, the CPU 200 extracts a QR code from the pickup image that was picked up one frame after the target image (hereinafter referred to as “pickup image Ia”) and starts decoding the QR code. The CPU 200 decodes the QR code included in a different frame from the target image for a reason similar to that of Step SC8. In Step SC11, the CPU 200 determines whether the decoding of the QR code included in the pickup image Ia is successful or not, by a method similar to that of Step SC2. If the decoding of the QR code is determined as successful (SC11: YES), the CPU 200 stores the divided data in the RAM 202 and shifts the processing to Step SC3. If the decoding of the QR code is determined as not successful (SC11: NO), the CPU 200 shifts the processing to Step SC12.
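The fallback of Steps SC2 and SC8 to SC11 amounts to trying the target pickup image and then its immediate neighbors, as in the sketch below; decode_qr() is an assumed helper that returns the decoded payload or None.

```python
def decode_with_neighbors(frames, index: int, decode_qr):
    """Try the target image, then the frame just before it, then the frame after it."""
    for candidate in (index, index - 1, index + 1):
        if 0 <= candidate < len(frames):
            payload = decode_qr(frames[candidate])
            if payload is not None:
                return payload
    return None  # caller falls through to Steps SC12-SC13 (pick a new target and retry)
```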
- In Step SC12, the CPU 200 specifies the next target image. Specifically, the CPU 200 specifies a pickup image that is a predetermined number of frames (for example, several frames) before the current target image as the next target image. The CPU 200 may also specify a pickup image that is a predetermined number of frames after the current target image as the next target image. In Step SC13, the CPU 200 erases the frame rate F1 and the total number of frames stored in the RAM 202. When the CPU 200 finishes the processing of Step SC13, the CPU 200 shifts the processing to Step SC1.
- In Step SC14, the CPU 200 outputs the restored target data. Specifically, the CPU 200 displays the target data on the touch panel 204.
- By the above processing, data is communicated between the projector 1 and the tablet terminal 2 via a plurality of QR codes that is sequentially projected. Therefore, data can be communicated via QR codes without being restricted by the maximum volume of a QR code. Also, since only a part of the plurality of pickup images is decoded, the processing burden on the tablet terminal 2 is reduced compared with the case where all the pickup images are decoded.
- The invention is not limited to the above embodiment and can be modified in various manners. Hereinafter, some modifications will be described. Of the modifications below, two or more may be combined.
- The method by which the projector 1 shows information indicating the timing when the first QR code of a plurality of QR codes is projected is not limited to the projection of a white image. The projector 1 may project a different image from a white image to show the timing when the first QR code is projected. In another example, the projector 1 may display a predetermined character or graphic pattern near the QR code of the first frame to show the timing when the first QR code is projected. In still another example, the projector 1 may project the QR code of the first frame for a longer time than the QR codes of the other frames to show the timing when the first QR code is projected. In still another example, the projector 1 may output, from the speaker 110, a sound indicating the timing when the first QR code of a plurality of QR codes is projected (hereinafter referred to as “start sound”). In this case, in Step SB2, the tablet terminal 2 determines that images of all the plurality of QR codes are picked up when the start sound has been detected twice or more via the microphone 207.
- The projector 1 may display an image showing the order of the divided data (hereinafter referred to as “order image”) beside at least one QR code of the plurality of QR codes in a projected image. For example, when sequentially projecting a plurality of QR codes, the projector 1 may display an order image near each QR code. In this case, the tablet terminal 2 may extract the order image from a pickup image, carry out pattern matching to specify the order indicated by the order image, and determine whether images of all the plurality of QR codes are picked up or not according to the specified order. Also, in this case, frame numbers need not be included in the QR codes, and the tablet terminal 2 may combine the divided data according to the order indicated by the order images.
- FIG. 10 illustrates the transition of the projected image Im in Modification 2. In the example of FIG. 10, the order images are numbers indicating the order of the divided data, displayed beside the respective QR codes D.
- The relation between the frame rate F1 and the frame rate F2 is not limited to the relation described in the embodiment. The frame rate F2 may be equal to or below the frame rate F1. In this case, the tablet terminal 2 continues picking up images of the projection surface until it is determined that images of all the plurality of QR codes are picked up, for example, using the method described in the above Modification 2. The specific values of the frame rate F1 and the frame rate F2 are not limited to those described in the embodiment, either.
- The processing in which the tablet terminal 2 picks up images of QR codes is not limited to the processing shown in FIG. 8. For example, the tablet terminal 2 may decode the plurality of QR codes whose images are picked up and determine whether images of all the plurality of QR codes are picked up or not. In this case, the tablet terminal 2 makes this determination according to the total number of frames and the frame number of each QR code.
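This completeness check is essentially a set comparison between the frame numbers decoded so far and the advertised total, as in the sketch below (illustrative only).

```python
def all_frames_picked_up(decoded_frame_numbers, total_frames: int) -> bool:
    """True when every frame number from 1 to the total number of frames has
    been decoded at least once from the pickup images."""
    return set(decoded_frame_numbers) >= set(range(1, total_frames + 1))

# Example: frames 1-99 decoded, frame 100 still missing.
print(all_frames_picked_up(range(1, 100), 100))  # False
```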
- The processing in which the tablet terminal 2 decodes a plurality of QR codes is not limited to the case where it is started, triggered by the end of the processing of picking up images of the QR codes. The processing in which the tablet terminal 2 decodes the QR codes may be carried out in parallel with the processing of picking up images of the QR codes.
- The processing in which the tablet terminal 2 decodes QR codes is not limited to the processing shown in FIG. 9. For example, the tablet terminal 2 may regard all the plurality of QR codes as decoding targets. In another example, the processing of combining the divided data (Step SC5) may be carried out after the decoding of all the QR codes is finished.
- The data format of the QR codes is not limited to the format shown in FIG. 6. For example, only the QR code that is projected first among the plurality of QR codes may have an area indicating the total number of frames. In another example, the QR code that is projected first among the plurality of QR codes may have no area representing divided data.
- The frame rate F1 need not be included in the QR codes. For example, if the frame rate F1 and the frame rate F2 are predetermined values, the projector 1 need not include information indicating the frame rate F1 in the QR codes. Also, the projector 1 may transmit information indicating the frame rate F1 to the tablet terminal 2 by means other than the QR codes. For example, the projector 1 may display a character or graphic pattern indicating the frame rate F1 near at least one QR code of the plurality of QR codes. In this case, the tablet terminal 2 carries out pattern matching on the pickup image and thus specifies the frame rate F1.
- The QR codes used in the projection system PS may be of lower versions than version 40. For example, if the tablet terminal 2 cannot read QR codes of version 40, the projector 1 may project QR codes of a lower version. Also, the code image is not limited to a QR code. The code image may be another two-dimensional code, for example a PDF 417 code, CP (communication platform) code, HCCB (high capacity color barcode), or the like, as long as it is an image representing encoded information. The code image may also be a one-dimensional code.
- The number of QR codes included in a projected image is not limited to one. In a projected image, a plurality of QR codes may be displayed next to each other, and the tablet terminal 2 may simultaneously read the plurality of QR codes displayed next to each other. In this case, data can be communicated in a shorter time than in the case where the number of QR codes included in a projected image is one.
- The plurality of QR codes that is sequentially projected may be provided with a function that enables the device on the reading side to restore the information of a frame even if the QR code of that frame cannot be read because of camera shake or an obstacle (for example, when a person passes in front of the QR code during image pickup). For example, the plurality of QR codes that is sequentially projected may include a QR code that represents information enabling restoration of the information of a frame that cannot be read.
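One simple way to provide such a restoration function, shown only as an illustration and not prescribed by the patent, is to append a parity frame that is the XOR of all data frames; a single lost frame can then be recovered from the remaining frames.

```python
from functools import reduce

def parity_frame(divided_data_list):
    """XOR of all divided data (padded to equal length) - one extra repair frame."""
    size = max(len(d) for d in divided_data_list)
    padded = [d.ljust(size, b"\x00") for d in divided_data_list]
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), padded)

def recover_missing(received, parity):
    """Recover one missing frame: XOR the parity frame with every received frame."""
    size = len(parity)
    result = parity
    for d in received:
        result = bytes(x ^ y for x, y in zip(result, d.ljust(size, b"\x00")))
    return result

# Example: three data frames, the second one lost during image pickup.
frames = [b"AAAA", b"BBBB", b"CC"]
p = parity_frame(frames)
print(recover_missing([frames[0], frames[2]], p))  # -> b'BBBB'
```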
- The plurality of QR codes need not be projected entirely continuously. For example, a projected image that does not include a QR code may be projected every time a predetermined number of QR codes are projected. The projected image that does not include a QR code may also be used, for example, for the tablet terminal 2 to specify the order of the projected QR codes. In another example, the projected image that does not include a QR code may be used to synchronize the timing when the plurality of QR codes is projected with the timing when images of the plurality of QR codes are picked up.
- The target data is not limited to dynamic image data. The target data may be, for example, data representing a character, graphic pattern, musical sound, or image. If the target data is data representing a musical sound, the tablet terminal 2 may reproduce the musical sound in Step SC14 from a speaker, not shown.
- The target data is not limited to data stored in the projector 1. For example, the target data may be data inputted to the projector 1 from an electronic device such as a personal computer.
- The projector 1 need not repeat the projection of a white image and the projection of the QR codes. In this case, the projector 1 may output, after Step SA5, information indicating that the last QR code of the plurality of QR codes has been projected (for example, a predetermined image or sound).
- The display device is not limited to the projector 1. The display device may be any electronic device that can sequentially display a plurality of QR codes, for example, a television set, personal computer, smartphone, mobile phone, digital camera, car navigation system, POS terminal, vending machine, printer, game machine, or the like. The image pickup device is not limited to the tablet terminal 2, either. The image pickup device may be any electronic device that can sequentially pick up images of a plurality of QR codes, for example, a personal computer, smartphone, mobile phone, digital camera, game machine, or the like. For example, a plurality of QR codes may be displayed on a display for electronic advertisement or a display on a vending machine, so that data of a standby screen or a ringtone for a smartphone can be communicated.
- The hardware configurations of the projector 1 and the tablet terminal 2 are not limited to the configurations shown in FIGS. 3 and 4. Each device may have any hardware configuration that can execute the processing shown in FIGS. 5, 8, and 9. For example, the liquid crystal panel 126 may be a reflection-type liquid crystal panel. Also, an electro-optical element such as an organic EL (electroluminescence) element or digital mirror device (DMD) may be used instead of the liquid crystal panel 126. Moreover, the liquid crystal panel 126 need not be provided for each color component; the projector 1 may have a single liquid crystal panel 126. In this case, images of the respective color components may be displayed time-divisionally.
Claims (8)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014041991A JP2015169952A (en) | 2014-03-04 | 2014-03-04 | Communication system, imaging apparatus, program, and communication method |
JP2014-041991 | 2014-03-04 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20150254486A1 true US20150254486A1 (en) | 2015-09-10 |
US9135490B1 US9135490B1 (en) | 2015-09-15 |
Family
ID=54017647
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/582,722 Active US9135490B1 (en) | 2014-03-04 | 2014-12-24 | Communication system, image pickup device, program, and communication method |
Country Status (2)
Country | Link |
---|---|
US (1) | US9135490B1 (en) |
JP (1) | JP2015169952A (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160132760A1 (en) * | 2014-11-06 | 2016-05-12 | Kuei-Sheng Tsou | Dynamic recognizable two-dimensional code for interaction between electronic devices |
JP2018028807A (en) * | 2016-08-18 | 2018-02-22 | 望 樋口 | Information acquisition system |
KR20210122541A (en) * | 2020-04-01 | 2021-10-12 | 삼성전자주식회사 | Method of providing data and electronic device supporting same |
JP7447876B2 (en) * | 2021-07-27 | 2024-03-12 | カシオ計算機株式会社 | Electronic equipment, display methods and programs |
JP7461510B2 (en) | 2022-03-03 | 2024-04-03 | 三菱電機株式会社 | Diagnostic system, control device, diagnostic method and diagnostic program |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4274217B2 (en) | 2006-09-21 | 2009-06-03 | セイコーエプソン株式会社 | Image display device, image display system, and network connection method |
-
2014
- 2014-03-04 JP JP2014041991A patent/JP2015169952A/en active Pending
- 2014-12-24 US US14/582,722 patent/US9135490B1/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7090136B2 (en) * | 2002-07-24 | 2006-08-15 | Sharp Kabushiki Kaisha | Portable terminal device, program for reading information, and recording medium having the same recorded thereon |
US20140217264A1 (en) * | 2011-10-31 | 2014-08-07 | The Trustees Of Columbia University In The City Of New York | Systems and methods for imaging using single photon avalanche diodes |
US20140097251A1 (en) * | 2012-10-04 | 2014-04-10 | Cognex Corporation | Systems and methods for operating symbology reader with multi-core processor |
US20140098220A1 (en) * | 2012-10-04 | 2014-04-10 | Cognex Corporation | Symbology reader with multi-core processor |
US20150050027A1 (en) * | 2012-12-27 | 2015-02-19 | Panasonic Intellectual Property Corporation Of America | Information communication method |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9984354B1 (en) * | 2014-09-30 | 2018-05-29 | Amazon Technologies, Inc. | Camera time synchronization system |
US11388226B1 (en) * | 2015-01-13 | 2022-07-12 | Snap Inc. | Guided personal identity based actions |
US11962645B2 (en) | 2015-01-13 | 2024-04-16 | Snap Inc. | Guided personal identity based actions |
US10104346B2 (en) * | 2015-03-24 | 2018-10-16 | Seiko Epson Corporation | Projector and control method for projector |
US10614540B2 (en) * | 2015-05-01 | 2020-04-07 | Graphiclead LLC | System and method for embedding a two dimensional code in video images |
US20180225488A1 (en) * | 2015-05-11 | 2018-08-09 | Veridos Gmbh | Method for checking an identity of a person |
US20170277501A1 (en) * | 2016-03-24 | 2017-09-28 | Casio Computer Co., Ltd. | Information display device, information display method and storage medium |
US20190258839A1 (en) * | 2018-02-20 | 2019-08-22 | Fuji Xerox Co., Ltd. | Information processing apparatus and non-transitory computer readable medium |
US10621400B2 (en) * | 2018-02-20 | 2020-04-14 | Fuji Xerox Co., Ltd. | Information processing apparatus and non-transitory computer readable medium |
US20200401139A1 (en) * | 2018-02-20 | 2020-12-24 | Sony Corporation | Flying vehicle and method of controlling flying vehicle |
CN110198463A (en) * | 2018-03-07 | 2019-09-03 | 腾讯科技(深圳)有限公司 | A kind of mobile projector method, apparatus, computer-readable medium and electronic equipment |
US20220191392A1 (en) * | 2020-12-10 | 2022-06-16 | Seiko Epson Corporation | Display method, detection apparatus, and non-transitory computer-readable storage medium storing a program |
Also Published As
Publication number | Publication date |
---|---|
US9135490B1 (en) | 2015-09-15 |
JP2015169952A (en) | 2015-09-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9135490B1 (en) | Communication system, image pickup device, program, and communication method | |
JP3867205B2 (en) | Pointed position detection device, pointed position detection system, and pointed position detection method | |
US11159739B2 (en) | Apparatus and method for generating moving image data including multiple section images in electronic device | |
CN103677444A (en) | Interactive system, control method for interactive system, and projector | |
US20190265847A1 (en) | Display apparatus and method for controlling display apparatus | |
US20210063861A1 (en) | Image processing apparatus and image processing method | |
JP2015166893A (en) | Position detector and control method of position detector | |
US10171781B2 (en) | Projection apparatus, method for controlling the same, and projection system | |
US10839482B2 (en) | Information processing apparatus, image display method, display system, and computer readable storage medium | |
CN101441393A (en) | Projection device for image projection with document camera device connected thereto, and projection method | |
JP2017116689A (en) | Image projection system, control method for the same, terminal device, and program | |
JP2012181264A (en) | Projection device, projection method, and program | |
JP2015148964A (en) | Projection device and projection method | |
JP2015177285A (en) | communication system and communication method | |
US9438873B2 (en) | Projector | |
US20190065130A1 (en) | Display device and method for controlling display device | |
JP6464834B2 (en) | Display device and display method | |
US9761200B2 (en) | Content output system, content output apparatus, content output method, and computer-readable medium | |
JP2015179935A (en) | Communication system, communication method, and display device | |
JP2009008752A (en) | Projection device, method and program | |
US10503322B2 (en) | Projector and method of controlling projector | |
JP6653084B2 (en) | Image display device | |
JP6772045B2 (en) | Display system, control device and method of projection type display device | |
JP2015184972A (en) | Motion detection device, image display device, and motion detection method | |
JP2023034635A (en) | Method for controlling display system and display system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SEIKO EPSON CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SHIGEMITSU, MAKOTO;REEL/FRAME:034584/0434 Effective date: 20141121 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |