WO2015115294A1 - Transmission device, reception device, and information processing system - Google Patents

Transmission device, reception device, and information processing system

Info

Publication number
WO2015115294A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
data
code
timing
Prior art date
Application number
PCT/JP2015/051642
Other languages
English (en)
Japanese (ja)
Inventor
孝介 八木 (Kosuke Yagi)
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to JP2015559897A (patent JP5976240B2)
Publication of WO2015115294A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K7/10821Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices
    • G06K7/10861Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices sensing of data fields affixed to objects or articles, e.g. coded labels
    • G06K7/10871Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices sensing of data fields affixed to objects or articles, e.g. coded labels randomly oriented data-fields, code-marks therefore, e.g. concentric circles-code
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B10/00Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/11Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
    • H04B10/114Indoor or close-range type systems
    • H04B10/116Visible light communication

Definitions

  • the present invention relates to a transmission device, a reception device, and an information processing system, and more particularly to transmission of information performed by imaging a code image displayed on the transmission device with the reception device.
  • QR Code is a registered trademark.
  • In Patent Documents 1 to 3, methods of increasing the amount of transferred data by color multiplexing or time multiplexing are considered.
  • A data transmission device converts data to be transmitted into a code, generates a frame including cells in which the code is two-dimensionally arranged, and displays the frame on a display screen.
  • The data receiving apparatus captures the displayed frame and converts the two-dimensionally arranged code corresponding to the cells of the captured frame back into the data before conversion.
  • A method of inserting correction cells into the cells constituting a frame, or of inserting a correction frame made up of correction cells, reduces the influence of the characteristics of background light and their variations, or of the characteristics of the optical device and their variations, making it possible to decode the data accurately.
  • binary data that the data transmission side wants to transfer is converted into a concentric color component code whose color changes with time, and is displayed on a monitor as a moving image.
  • the data receiving side continuously photographs concentric color component codes with the camera of the camera-equipped mobile terminal device, and extracts binary data from the color information of each track.
  • the first information processing apparatus generates and sequentially displays a plurality of first graphic images (for example, two-dimensional codes).
  • the first information processing apparatus detects second graphic images that are sequentially displayed in other information processing apparatuses, and acquires input data represented by the second graphic image.
  • the detection area for detecting the second graphic image is formed in a part of the display area for displaying the first graphic image.
  • Patent Document 1: Japanese Patent No. 4749856 (paragraphs 0033 to 0036, FIGS. 1 and 2); Patent Document 2: JP 2007-13786 A (paragraphs 0032 to 0044, FIG. 8); Patent Document 3: Japanese Patent No. 4400355 (paragraphs 0088 to 0099, FIG. 7)
  • The display screen refresh rate and the camera scan rate may vary depending on the device. Even when the rates are the same, their phases do not necessarily match during actual use. As a result, a screen in the middle of being rewritten may be captured, and the code may not be read accurately. Therefore, with the methods shown in Patent Documents 1 to 3, a transfer may fail, and a more reliable transfer method is desired.
  • In the methods of Patent Documents 1 to 3, the user needs to hold the camera over the screen for a certain period of time. If the user does not know the shooting period or the shooting timing, the camera may be held over the screen at an inappropriate time, and the data may not be read accurately.
  • In view of the above, an object of the present invention is to transfer information efficiently from a transmission device to a reception device and to enable a user to read the data easily.
  • In order to solve the above problems, a transmission device according to the present invention includes: a transmission data storage unit for storing a series of data; a code generation unit that divides the series of data stored in the transmission data storage unit into a plurality of data portions, converts each of the data portions obtained by the division into a code image, and sequentially outputs the code images corresponding to the data portions; an identification image generation unit that generates and outputs an identification image; a combining unit that combines each code image generated by the code generation unit with the identification image output from the identification image generation unit to generate a composite image; an image connecting unit that connects the composite images generated by the combining unit to create a moving image corresponding to the series of data; a moving image reproduction unit that reproduces and outputs the moving image created by the image connecting unit; and a display unit that displays the reproduced moving image on a screen. In the moving image output from the moving image reproduction unit, the code images are switched sequentially, and the identification image appears at timings other than the switching timings.
  • Another transmission device according to the present invention includes: a transmission data storage unit for storing a series of data; a code generation unit that divides the series of data stored in the transmission data storage unit into a plurality of data portions, converts each data portion into a code image, and sequentially outputs the corresponding code images; a first image connecting unit that connects the code images generated by the code generation unit to create a moving image; a first moving image reproduction unit that reproduces and outputs the moving image created by the first image connecting unit; an identification image generation unit that generates and outputs identification images; a second image connecting unit that connects the identification images generated by the identification image generation unit to create a moving image; a second moving image reproduction unit that reproduces and outputs the moving image created by the second image connecting unit; a synthesis unit that synthesizes the moving image reproduced by the first moving image reproduction unit and the moving image reproduced by the second moving image reproduction unit to generate and output a composite moving image; and a display unit that displays the composite moving image generated by the synthesis unit on a screen.
  • A further transmission device according to the present invention includes: a transmission data storage unit for storing a series of data; a code generation unit that divides the series of data stored in the transmission data storage unit into a plurality of data portions, converts each data portion into a code image, and sequentially outputs the corresponding code images; an image connecting unit that connects the code images generated by the code generation unit to create a moving image; a moving image reproduction unit that reproduces and outputs the moving image created by the image connecting unit; a display unit that displays the reproduced moving image; and a light emitting unit that emits light at timings other than the timings at which the code images in the moving image displayed on the display unit are switched.
  • A receiving apparatus according to the present invention includes: an imaging unit that captures a screen displaying a moving image in which code images are switched sequentially and an identification image appears at timings other than the switching timings, and sequentially acquires images of the screen; a video buffer for storing the images sequentially acquired by the imaging unit; a reading unit for sequentially reading the images stored in the video buffer; an identification image detection unit for sequentially detecting the identification image in the images read by the reading unit; a code decoding unit that decodes the code image appearing at the same timing as the identification image detected by the identification image detection unit and sequentially acquires the corresponding data portions; and a reception data storage unit that sequentially writes the data portions acquired by the code decoding unit and connects the sequentially written data portions to form a series of data.
  • Another receiving apparatus according to the present invention includes: an imaging unit that captures a screen displaying sequentially switched code images together with the light from a light emitting unit that is arranged adjacent to the screen and emits light at timings other than the switching timings, and sequentially acquires the image displayed on the screen and the light image from the light emitting unit; a video buffer for accumulating the images sequentially acquired by the imaging unit; a reading unit for sequentially reading the images stored in the video buffer; a detection unit for detecting, in the images read by the reading unit, the image portion produced by the light from the light emitting unit; a code decoding unit that decodes the code image appearing at the same timing as the detected light image portion and sequentially acquires the corresponding data portions; and a reception data storage unit that sequentially writes the data portions acquired by the code decoding unit and connects the sequentially written data portions to form a series of data.
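The receiver-side flow described above (detect the identification image, then decode only the code image appearing at the same timing) can be sketched as a simple loop. This is a hedged illustration, not the patent's implementation: the frame representation and all names here are invented for the example.

```python
# Hypothetical receiver loop: a captured frame is modeled as a pair
# (mark_present, code_payload). Only frames whose identification image
# (timing mark) is detected are decoded; frames captured mid-refresh,
# which lack the mark, are skipped.
def receive(frames):
    data = b""
    for mark_present, code_payload in frames:
        if mark_present:          # identification image detected in this frame
            data += code_payload  # decode the code image, append the data portion
    return data

# The middle frame simulates a capture during a display rewrite.
captured = [(True, b"part1"), (False, b"????"), (True, b"part2")]
assert receive(captured) == b"part1part2"
```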
  • According to the present invention, it is possible to improve transfer efficiency when transferring information from the transmission device to the reception device, while keeping the handling of the reception device simple.
  • FIG. 1 is an overview diagram schematically showing a configuration of an information processing system according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram schematically showing a configuration of the transmission apparatus in the first embodiment.
  • FIG. 3 is a schematic diagram for explaining the synthesis of a code image and a timing mark according to Embodiment 1.
  • (A) to (C) are timing charts showing the display timing of the timing mark assigned to the green channel, the first code assigned to the red channel, and the second code assigned to the blue channel in the first embodiment.
  • A block diagram schematically showing a configuration of the receiving apparatus in the first embodiment.
  • A diagram showing an example of the dimensions of each part of the timing mark image.
  • A flowchart illustrating the operation of displaying a moving image based on transfer data in the transmission device according to the first embodiment.
  • A flowchart showing the details of step S12 of that operation.
  • A flowchart illustrating the operation of acquiring and storing transfer data from captured images in the receiving device according to the first embodiment.
  • (A) to (M) are schematic views showing modifications of the timing mark in the first embodiment.
  • (A) and (B) are diagrams showing examples of a display image and a captured image when the display frame period of the display unit and the imaging frame period of the imaging unit are shifted from each other.
  • (A) to (D) are diagrams showing the change of the display image within a display frame period.
  • (A) to (C) are diagrams showing the change of the display image within a display frame period.
  • (A) to (C) are diagrams showing the change of the display image within a display frame period.
  • A block diagram schematically showing a configuration of the receiving apparatus in the third embodiment.
  • A block diagram schematically showing a configuration of the transmission device in Embodiment 4 of the present invention.
  • A block diagram schematically showing a configuration of the receiving apparatus in the fourth embodiment.
  • A block diagram schematically showing a configuration of the transmission device in Embodiment 5 of the present invention.
  • A block diagram schematically showing a configuration of the receiving apparatus in the fifth embodiment.
  • A schematic diagram showing the layout of the timing mark and the code images used in Embodiments 5 and 6 of the present invention.
  • (A) to (C) are schematic diagrams illustrating temporal changes in the moving image displayed on the display unit of the transmission device in the fifth and sixth embodiments.
  • A block diagram schematically showing a configuration of the receiving apparatus in the sixth embodiment.
  • A block diagram schematically showing a configuration of the transmission device in Embodiment 7 of the present invention.
  • A block diagram schematically showing a configuration of the receiving apparatus in the seventh embodiment.
  • A diagram showing an example of a sequence of QR Codes.
  • A block diagram schematically showing a configuration of the transmission apparatus in the eighth embodiment.
  • A block diagram schematically showing a configuration of the receiving apparatus in the eighth embodiment.
  • FIG. 1 shows an example of an information processing system 1 according to the present invention.
  • An information processing system 1 illustrated in FIG. 1 includes a transmission device 100 and a reception device 200.
  • the information processing system 1 transfers the data file D 0 from the transmission device 100 to the reception device 200.
  • the transmission device 100 displays an image on the display screen 131.
  • the receiving device 200 captures the image.
  • The data file D 0 to be transferred may be a document file, an image file, a video file, an audio file, or an executable file.
  • Any of these files, including audio files and executable files, is converted into image data such as a QR code.
  • In the transmission device 100, the data file D 0 to be transferred is divided into a plurality of data portions, each data portion is converted into one or more code images D 3, and the code images are sequentially displayed on the display screen 131.
  • The receiving apparatus 200 sequentially captures the code images D 3 displayed on the display screen 131 of the transmission device 100, restores each data portion by decoding the captured code image D 3, and restores the original data file D 0 by connecting the recovered data portions.
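This divide-and-reconnect behavior can be illustrated with a minimal sketch. `SHOT_SIZE` and the helper names are invented for illustration; a real system would size each portion to the capacity of the code image used.

```python
# Hypothetical sketch: split a byte payload into fixed-size data portions on
# the transmitter side and reassemble them on the receiver side.
SHOT_SIZE = 16  # bytes per portion; illustrative, not from the patent

def split_into_shots(payload: bytes, shot_size: int = SHOT_SIZE) -> list[bytes]:
    """Divide a data file D 0 into a sequence of data portions."""
    return [payload[i:i + shot_size] for i in range(0, len(payload), shot_size)]

def reassemble(shots: list[bytes]) -> bytes:
    """Receiver side: connect the decoded data portions back into D 0."""
    return b"".join(shots)

payload = b"example data file contents to transfer"
assert reassemble(split_into_shots(payload)) == payload
```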
  • The code image D 3 is, for example, a two-dimensional bar code such as a QR code, or a so-called one-dimensional bar code.
  • the display frame rate of the transmission apparatus 100 and the imaging frame rate of the reception apparatus 200 may be the same or different from each other.
  • FIG. 2 is a block diagram schematically showing a configuration of transmitting apparatus 100 in the first embodiment.
  • the transmission apparatus 100 illustrated in FIG. 2 includes a mark generation unit 106, a data processing unit 110, a synthesis unit 122, and a display unit 130. Further, the transmission device 100 can include a transmission data storage unit 102, a reference clock generation unit 104, a video buffer 126, and a moving image reproduction unit 128.
  • The transmission data storage unit 102 stores a plurality of data files D 0.
  • A "data file" is, for example, one unit of data. When the capacity of a data file D 0 is large, the data of the data file D 0 is sent from the transmission device 100 to the reception device 200 using a plurality of QR codes or the like.
  • the transmission data storage unit 102 gives the stored transmission data D 1 to the data processing unit 110.
  • The transmission data D 1 is, for example, a data file D 0 stored in the transmission data storage unit 102. When the capacity of the stored data file D 0 is large, the transmission data D 1 may instead be a portion of the data file D 0 divided into a volume suitable for delivery to the data processing unit 110.
  • the reference clock generation unit 104 generates a clock signal CL. Then, the reference clock generation unit 104 provides the clock signal CL to the mark generation unit 106 and the data processing unit 110.
  • The mark generation unit 106 generates a mark indicating timing (timing mark) D 2 in synchronization with the clock signal CL generated by the reference clock generation unit 104. The mark generation unit 106 also includes an image storage unit 107 for storing the timing mark D 2.
  • The timing mark D 2 is an identification image.
  • When the timing mark D 2 is visible in a captured image, this indicates that the image displayed at the same time, such as a QR code, contains the appropriate information.
  • When the timing mark D 2 is not visible in a captured image, the image displayed at the same time, such as a QR code, does not contain the appropriate information. "Not containing appropriate information" means that the data cannot be decoded correctly.
  • The image storage unit 107 stores an image plane including the timing mark D 2.
  • For frames that do not include the timing mark D 2, the image storage unit 107 stores an image plane that does not include the timing mark D 2.
  • The timing mark D 2 is combined with the code image D 3 as described later, and is used as an identification image for determining whether the code image D 3 combined by the combining unit 122 is suitable for decoding. For this reason, the mark generation unit 106 is also referred to as an "identification image generation unit".
  • "Compositing" refers to handling different data as a single image; for example, representing the code image D 3 and the timing mark D 2 as one frame image.
  • The timing mark D 2 is, for example, an image composed of pixels whose gradation values (pixel values) take either the maximum or the minimum of the possible values. When the gradation value is expressed in 8 bits, the maximum value is 255 and the minimum value is 0.
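A minimal sketch of such a two-level image, assuming an 8x8 plane and an invented rectangular mark region (the dimensions are illustrative, not the patent's):

```python
# Build a timing-mark plane holding only the extreme 8-bit gradation
# values 255 (inside the mark) and 0 (outside), per the description above.
WIDTH, HEIGHT = 8, 8
MARK = {(x, y) for x in range(2, 6) for y in range(2, 6)}  # invented mark region

plane = [[255 if (x, y) in MARK else 0 for x in range(WIDTH)]
         for y in range(HEIGHT)]

assert plane[3][3] == 255  # inside the mark: maximum gradation value
assert plane[0][0] == 0    # outside the mark: minimum gradation value
```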
  • Figure 3 is a schematic view for explaining a combination of the code image D 3 and the timing mark D 2.
  • The code image D 3 is, for example, an image representing a QR code or the like; that is, an encoded form of the data file D 0.
  • Code image D 3 is produced from the transmission data D 1.
  • In FIG. 3, the timing mark D 2 is formed by a rectangular region (shown by cross-hatching) composed of pixels whose gradation values take the above maximum value.
  • That is, the timing mark D 2 shown in FIG. 3 is a rectangular image at the maximum gradation value, and the pixels outside this region take the minimum value.
  • The timing mark D 2 is composed only of green (G) data.
  • The code image D 3 is composed of red (R) and blue (B) data.
  • The green (G) timing mark D 2 and the red (R) and blue (B) code images D 3 are synthesized as a one-frame image.
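A toy illustration of this per-channel packing follows; the 4x4 pixel patterns are invented stand-ins, not the patent's actual layout.

```python
# Pack the green timing-mark plane and the red/blue code-image planes
# into one RGB frame, one (R, G, B) tuple per pixel.
W = H = 4
timing_mark = [[255] * W for _ in range(H)]            # G channel: mark region
first_code  = [[255, 0, 255, 0] for _ in range(H)]     # R channel: first code image
second_code = [[0, 255, 0, 255] for _ in range(H)]     # B channel: second code image

frame = [[(first_code[y][x], timing_mark[y][x], second_code[y][x])
          for x in range(W)] for y in range(H)]

assert frame[0][0] == (255, 255, 0)  # (R, G, B) at the top-left pixel
```

Because each image occupies its own color channel, the receiver can recover the timing mark and the two code images independently by separating the channels again.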
  • the data processing unit 110 selects one of the plurality of data files D 0 stored in the transmission data storage unit 102.
  • The data processing unit 110 divides the selected data file D 0 into a plurality of data portions. That is, the data processing unit 110 divides the selected data file D 0 into volumes that are easy to handle, and receives the divided data (transmission data D 1 ).
  • the data processing unit 110 converts each divided data portion into two graphic codes.
  • each divided data portion is converted into a graphic code displayed in red (R) and a graphic code displayed in blue (B).
  • The data processing unit 110 generates a code image D 3 from each graphic code. The code image D 3 is the image obtained by drawing the graphic code.
  • The code image D 3 is, for example, an image composed of pixels whose gradation values (pixel values) take either the maximum or the minimum of the possible values.
  • the data processing unit 110 includes a code generation unit 116.
  • the code generation unit 116 may include a first code generation unit 116A and a second code generation unit 116B.
  • the data processing unit 110 can include a reading unit 112 and a data dividing unit 114.
  • the reading unit 112 divides the data file D 0 stored in the transmission data storage unit 102 into a plurality of data portions (herein referred to as “shots”).
  • the divided data portion (shot) is obtained by dividing the data file D 0 into a capacity suitable for transmission to the data processing unit 110 as described above. That is, one data portion divided into a capacity suitable for transmission is called a “shot”.
  • The transmission data D 1 corresponds to one shot.
  • The data for the image displayed during the one-shot period Td shown in FIG. 4A is referred to as "one shot". That is, when two QR codes or the like are displayed in separate color channels as described for FIG. 3, one shot is displayed as an image containing two QR codes or the like.
  • the reading unit 112 sequentially sends a plurality of shots (shot data D 4 ) generated by dividing the data file D 0 to the data dividing unit 114. Further, the reading unit 112 sends a synchronization signal SY described later to the mark generation unit 106.
  • the data dividing unit 114 further divides the data D 4 of each shot read by the reading unit 112 into two data parts (herein referred to as “data unit D 5 ”).
  • The "two data portions" are different data portions displayed in different color channels. In the first embodiment, these are the first data unit D 51 displayed in the red channel and the second data unit D 52 displayed in the blue channel.
  • the data dividing unit 114 gives the first data portion (first data unit D 51 ) to the first code generating unit 116A.
  • the data dividing unit 114 gives the second data portion (second data unit D 52 ) to the second code generating unit 116B.
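As a hedged sketch, the data dividing unit's split of one shot into two color-channel data units might look like this; the even halving is an assumption made for the example.

```python
# Split one shot into two data units, one per color channel, mirroring
# the role of the data dividing unit 114. Names are illustrative.
def divide_shot(shot: bytes) -> tuple[bytes, bytes]:
    half = len(shot) // 2
    return shot[:half], shot[half:]  # D 51 -> red channel, D 52 -> blue channel

d51, d52 = divide_shot(b"ABCDEFGH")
assert d51 == b"ABCD" and d52 == b"EFGH"
assert d51 + d52 == b"ABCDEFGH"  # the receiver can restore the shot
```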
  • The first code generation unit 116A converts the first data unit D 51 supplied from the data dividing unit 114 into the corresponding code image (first code image) D 31. That is, the first data unit D 51 is converted into a graphic code, and this graphic code is further converted into the corresponding code image (first code image) D 31.
  • The first code generation unit 116A outputs the code image (first code image) D 31. Strictly speaking, it outputs an image plane including the first code image D 31; in FIG. 2, however, the output of the first code generation unit 116A is described, for simplicity, as the first code image D 31.
  • The second code generation unit 116B converts the second data unit D 52 supplied from the data dividing unit 114 into the corresponding code image (second code image) D 32. That is, the second data unit D 52 is converted into a graphic code, and this graphic code is further converted into the corresponding code image (second code image) D 32.
  • The second code generation unit 116B outputs the code image (second code image) D 32. Strictly speaking, it outputs an image plane including the second code image D 32; in FIG. 2, however, the output of the second code generation unit 116B is described, for simplicity, as the second code image D 32.
  • The data unit D 5 is a chunk of numerical data.
  • the code image D 3 is in the form of data that can be displayed as an image.
  • The code image D 3 is represented by, for example, bitmap data. Converting a data unit D 5 into a code image is referred to as "drawing".
  • the rules for conversion of each data unit D 51 , D 52 into graphic codes are determined in advance.
  • the rules for converting the data units D 51 and D 52 into graphic codes are shared by the transmission device 100 and the reception device 200.
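A toy stand-in for such a shared rule follows: an invented bit-to-cell mapping, far simpler than a real graphic code such as QR, but showing why both ends must agree on the rule.

```python
# Invented conversion rule shared by transmitter and receiver: each bit of
# the data unit is drawn as one black (1) or white (0) cell.
def encode(data: bytes) -> list[int]:
    """Transmitter: draw the data unit as a list of cells."""
    return [(byte >> bit) & 1 for byte in data for bit in range(8)]

def decode(cells: list[int]) -> bytes:
    """Receiver: apply the same rule in reverse to recover the data unit."""
    out = bytearray()
    for i in range(0, len(cells), 8):
        out.append(sum(cells[i + bit] << bit for bit in range(8)))
    return bytes(out)

unit = b"D51"
assert decode(encode(unit)) == unit  # shared rule, so the round trip holds
```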
  • Examples of the first code image D 31 and the second code image D 32 are shown in FIG. 3.
  • The red (R) image is the first code image D 31.
  • The blue (B) image is the second code image D 32.
  • the first code image D 31 and the second code image D 32 are formed in the same region as the timing mark D 2 in the frame. Accordingly, the first code image D 31 , the second code image D 32 and the timing mark D 2 are formed so as to overlap each other.
  • a plurality of shot data D 4 read by the reading unit 112 are sequentially input to the data dividing unit 114.
  • The first data unit D 51 supplied from the data dividing unit 114 to the first code generation unit 116A switches as the shot data D 4 input to the data dividing unit 114 switches. Therefore, the code image D 31 output from the first code generation unit 116A switches each time the shot data D 4 read by the reading unit 112 switches. Similarly, the code image D 32 output from the second code generation unit 116B switches each time the shot data D 4 read by the reading unit 112 switches.
  • Both the output of the first code image D 31 by the first code generation unit 116A and the output of the second code image D 32 by the second code generation unit 116B are performed in synchronization with the clock signal CL from the reference clock generation unit 104.
  • the output of the timing marks D 2 by the mark generation unit 106 is performed in synchronization with the clock signal CL.
  • The mark generation unit 106 receives the above synchronization signal SY (a signal indicating that the data D 4 of a shot has been read) from the reading unit 112 and determines the output timing of the timing mark D 2.
  • the combining unit 122 assigns the timing mark D 2 , the first code image D 31, and the second code image D 32 to independent color channels. Then, the synthesizing unit 122 synthesizes the timing mark D 2 , the first code image D 31, and the second code image D 32 to generate a synthesized image (color image) D 61 .
  • the timing mark D2 is output from the mark generation unit 106.
  • the first code image D 31 is outputted from the first code generating unit 116A.
  • Second code image D 32 is outputted from the second code generating unit 116B.
  • Synthesis by the combining unit 122 is performed by superimposing the image plane including the timing mark D 2, the image plane including the first code image D 31, and the image plane including the second code image D 32.
  • A "plane" is like a transparent sheet on which an image is drawn. In general, an image is composed of a plurality of such screens, called planes; here there are three planes expressing red, green, and blue, and colors are expressed by combining them. An image expressed by a plurality of planes is called a frame.
  • As the color channels, for example, the three primary color channels of red (R), green (G), and blue (B) light are employed.
  • a “color channel” is information obtained by separating colors by components. Specifically, there are a red channel, a green channel, a blue channel, and the like.
  • The combining unit 122 assigns the timing mark D 2 of FIG. 3 to the green (G) channel, the first code image D 31 to the red (R) channel, and the second code image D 32 to the blue (B) channel, and composes them.
  • the synthesizing unit 122 synthesizes the tone values of the timing mark D 2 , the first code image D 31, and the second code image D 32 as the tone values of the respective color channels.
  • The maximum-value portion of the timing mark D 2 takes the maximum gradation value of the green channel (green component value).
  • In FIG. 3, the timing mark D 2 is shown as the portion indicated by cross-hatching.
  • The maximum-value portion of the first code image D 31 takes the maximum gradation value of the red channel (red component value).
  • The maximum-value portion of the first code image D 31 is the portion shown in black.
  • The maximum-value portion of the second code image D 32 takes the maximum gradation value of the blue channel (blue component value).
  • The maximum-value portion of the second code image D 32 is the portion shown in black.
  • the synthesizer 122 synthesizes the timing mark D 2 , the first code image D 31, and the second code image D 32 to generate an image for one frame displayed on the display unit 130.
  • the video buffer 126 stores the composite image D 61 generated by the combining unit 122.
  • The video buffer 126 connects the accumulated composite images D 61.
  • That is, the video buffer 126 connects a plurality of synthesized one-frame images so that the display unit 130 can display them as a "moving image". Connecting a plurality of one-frame images is called "connection".
  • In the video buffer 126, the composite images are sequentially connected to generate a composite video D 62.
  • An “image” is a still image displayed in one frame.
  • a “video” is an image in which several frames of images are connected and displayed continuously. That is, when the “image” of each frame is different, the “video” is a moving image.
  • Within one composite video D 62, the one-frame images combined by the combining unit 122 are the same image. Therefore, even when the display unit 130 displays the composite video D 62, the viewer recognizes it as a single unchanging image.
  • the video buffer 126 connects the accumulated composite video D62 to each other.
  • the composite video D62 is image data obtained by combining several frames of the same image.
  • the composite videos D 62 connected by the video buffer 126 are of different images. For this reason, as will be described later, when the video displayed on the display unit 130 is captured by the receiving device 200, the receiving device 200 may capture two different images.
  • the composite video D 63 obtained as a result of the connection of the composite video D 62 is a video in which the content of the image is switched.
  • the composite video D 63 obtained as a result of the connection of the composite videos D 62 is a video whose content switches at a cycle of four frames.
  • accumulation of the images in the video buffer 126 continues until the conversion of all data (data portions) of the data file D 0 to be transferred into code images D 3 is completed. That is, when the capacity of the data file D 0 is large, a composite video D 63 formed by connecting, for example, a plurality of QR codes is created.
  • a composite video D 63 obtained by connecting two types of composite video D 62 can transmit four types of QR codes.
  • the video buffer 126 functions as a connecting unit that connects the composite image D 61 to generate the composite video D 62 .
  • the video buffer 126 functions as a connecting unit that generates a composite video D 63 (moving image) by connecting the composite video D 62 .
  • the step of connecting the composite images D 61 to generate the composite video D 62 may instead be performed by the combining unit 122.
  • the moving image reproducing unit 128 reads the data stored in the video buffer 126 (the images including the code image D 3 and the timing mark D 2 ).
  • the sequence of frames is sequentially read out frame by frame and reproduced as a moving image.
  • the “sequence of image frames” refers to the composite video D 63 .
  • the display unit 130 displays a color image according to the output of the moving image playback unit 128.
  • the first code image D 31 is displayed in red
  • the second code image D 32 is displayed in blue
  • the display unit 130 is configured by a liquid crystal display device that performs display by a dot sequential scanning method or a line sequential scanning method and a progressive display method.
  • the timing mark D 2 is generated in synchronization with the clock signal CL.
  • the output of the first code image D 31 and the output of the second code image D 32 are both performed in synchronization with the clock signal CL from the reference clock generation unit 104. Therefore, the first code image D 31 and the second code image D 32 are switched simultaneously. The timing mark D 2 is made to appear at timings other than immediately before and immediately after the switching. For example, the timing mark D 2 appears in frames that are neither immediately before nor immediately after the switching. That is, the timing mark D 2 is not synthesized into the frames immediately before and immediately after the frame switching.
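The rule above (the mark is absent in the frame right after a code-image switch and the frame right before the next one, present in between) can be expressed as a small sketch; the function name and the four-frame shot length are illustrative assumptions.

```python
def mark_schedule(frames_per_shot=4):
    """Return, per frame of one shot, whether the timing mark is synthesized.

    Sketch of the rule described above: the mark is omitted in the first
    frame of the shot (immediately after the switch) and the last frame
    (immediately before the next switch), and appears in all frames
    between them.
    """
    return [0 < i < frames_per_shot - 1 for i in range(frames_per_shot)]

print(mark_schedule())  # [False, True, True, False]
```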
  • FIG. 4(A) represents the timing mark D 2 .
  • FIG. 4(B) represents the first code image D 31 .
  • FIG. 4(C) represents the second code image D 32 .
  • FIGS. 4(A) to 4(C) show, in an example of the composite video D 63 , the switching of the code image D 3 and the periods in which the timing mark D 2 appears.
  • the vertical axis indicates whether or not each signal is output. A signal is output at “H”, and no signal is output at “L”.
  • the horizontal axis represents time.
  • a dotted line in the vertical direction indicates a break of a frame period of display by the display unit 130.
  • the symbol T t represents one frame period at the display frame rate.
  • the period (one-shot period) T d in which the composite video D 62 generated from one-shot data D 4 is displayed is four frame periods T t .
  • the content of the code image D 3 to be displayed is switched every four-frame period.
  • the timing mark D 2 is not displayed in the first frame period T t1 .
  • the code images D 31 and D 32 are displayed in the first one-frame period T t1 of the one-shot period T d .
  • the timing mark D 2 does not appear in the frame immediately after the switching of the code image D 3 (the first frame period of each shot) or in the frame immediately before the switching (the last frame of each shot).
  • the timing mark D 2 appears in the other frames (the second and third frame periods of each shot).
  • immediately after the switching, the content of the code image D 3 may be partially disturbed. Meanwhile, in the second and third frames, a certain time after the switching, the content of the code image D 3 is estimated to be stable. Therefore, in the invention according to the first embodiment, the code images D 3 of the second and third frames are used for decoding in the receiving device 200. Then, in order to distinguish the code images D 3 of the second and third frames from those of the first and fourth frames, the timing mark D 2 is attached in the transmitting apparatus 100. The receiving apparatus 200 then determines, based on the timing mark D 2 , whether a code image D 3 belongs to the first or fourth frame or to the second or third frame.
  • the second frame and the third frame are used for decoding by the receiving apparatus 200. Therefore, these frames are called “body frames”.
  • the frames before and after the body frames (the first frame and the fourth frame) are referred to as the front and rear “guard frames”, respectively.
  • the period in which the body frame appears is called the body period.
  • a period in which the guard frame appears is called a guard period.
  • the body period is the two-frame period T e .
  • the guard periods are the first one-frame period T t1 and the last one-frame period T t2 .
  • FIG. 5 is a block diagram schematically showing a configuration of receiving apparatus 200 in the first embodiment.
  • the receiving apparatus 200 illustrated in FIG. 5 includes an imaging unit 202, a mark recognition unit 210, an image cutout unit 220, and a code decoding unit 230.
  • the receiving device 200 can include a video buffer 204, a reading unit 206, a separating unit 208, a combining unit 242, and a received data storage unit 244.
  • the imaging unit 202 images the display screen 131 of the display unit 130 of the transmission device 100. Then, the imaging unit 202 generates video data D 7 representing red, green, and blue component values of the video displayed on the display screen 131. That is, the imaging unit 202 generates red component video data, green component video data, and blue component video data of the video (composite video D 63 ) displayed on the display screen 131.
  • the video data D 7 generated by the imaging unit 202 is a series of frame data corresponding to the composite video D 63 .
  • the video data D 7 of each frame generated by the imaging unit 202 includes data representing a red component value, a green component value, and a blue component value. Data representing the red component value, the green component value, and the blue component value is generated by, for example, a photoelectric conversion element including a red color filter, a green color filter, and a blue color filter, respectively.
  • an image displayed on the display unit 130 of the transmission device 100 is formed on the imaging surface of the imaging unit 202.
  • the distance to the display screen 131 of the display unit 130 is known, and the display screen 131 is in focus.
  • the imaging unit 202 has an autofocus function.
  • the video buffer 204 stores video data D 7 composed of a series of frames output from the imaging unit 202.
  • the accumulation of the video data D 7 continues until imaging of all the frames constituting the video corresponding to one data file D 0 displayed on the display unit 130 of the transmission device 100 (all frames of the composite video D 63 ) is finished. When the storage of the video data D 7 is finished, the data of the series of frames constituting the data file D 0 (all frames of the composite video D 63 ) has been stored.
  • the reading unit 206 sequentially reads out and outputs the series of frames of data D 8 constituting the video corresponding to one data file D 0 stored in the video buffer 204. That is, the reading unit 206 reads the data of the data file D 0 from the video buffer 204 frame by frame and outputs it as frame data D 8 .
  • the separation unit 208 separates the data D 8 of each frame read by the reading unit 206 into green component value data D 9G (green channel data), red component value data D 9R (red channel data), and blue component value data D 9B (blue channel data).
  • the separation unit 208 gives the green component value data D 9G to the mark recognition unit 210.
  • the separation unit 208 supplies the image cutout unit 220 with the red component value data D 9R and the blue component value data D 9B .
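The routing performed by the separation unit 208 can be sketched as follows, assuming captured frames arrive as H x W x 3 NumPy arrays in R, G, B order; the function name and frame layout are illustrative assumptions.

```python
import numpy as np

def separate_channels(frame_rgb):
    """Split one captured RGB frame into per-channel planes.

    Illustrative sketch of the separating unit 208: the green plane
    D9G goes to mark recognition, the red plane D9R and blue plane
    D9B go to the image cutout units.
    """
    red = frame_rgb[:, :, 0].copy()    # D9R -> first cutout unit 222A
    green = frame_rgb[:, :, 1].copy()  # D9G -> mark recognition unit 210
    blue = frame_rgb[:, :, 2].copy()   # D9B -> second cutout unit 222B
    return red, green, blue

# Toy frame: only the green (mark) channel is lit
frame = np.zeros((2, 2, 3), dtype=np.uint8)
frame[:, :, 1] = 255
r, g, b = separate_channels(frame)
```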
  • the mark recognizing unit 210 detects the timing mark D 2 from the green component image (green component data D 9G ) given from the separating unit 208.
  • the mark recognition unit 210 functions as a detection unit for detecting the timing mark D 2 .
  • the mark recognition unit 210 recognizes the timing at which the timing mark D 2 is detected as a timing at which the code image D 3 can be read correctly. That is, when the mark recognition unit 210 detects the timing mark D 2 , it recognizes that the code image D 3 can be read correctly.
  • the “timing” here may be, for example, a timing for each frame. Alternatively, it may be a timing for each line of the display screen.
  • “line” means a column of pixels extending in the main scanning line direction in the dot sequential scanning method, and means a line in the line sequential scanning method.
  • at the timing when the timing mark D 2 is detected, the mark recognition unit 210 instructs the image cutout unit 220 to cut out an image.
  • the mark recognition unit 210 provides the image cutout signal D 10 to the image cutout unit 220.
  • in response to the instruction from the mark recognition unit 210 (image cutout signal D 10 ), the image cutout unit 220 cuts out the code image D 3 from the video of the predetermined color components.
  • the code image D 3 is represented in the red component data D 9R and the blue component value data D 9B .
  • the red component data D 9R and the blue component value data D 9B representing the code image D 3 are the images of the predetermined color components. That is, at the timing when the mark recognition unit 210 detects the timing mark D 2 , the image cutout unit 220 cuts out the code image D 3 from the video of the color components determined in advance.
  • the image cutout unit 220 includes a first cutout unit 222A and a second cutout unit 222B.
  • the first cutout unit 222A receives the red component data D 9R from the separation unit 208.
  • the first cutout unit 222A cuts out the first code image from the red component data (video) D 9R given from the separation unit 208 at the timing instructed by the mark recognition unit 210.
  • the second cutout unit 222B receives the blue component data D 9B from the separation unit 208.
  • the second cutout unit 222B cuts out the second code image from the blue component data (video) D 9B given from the separation unit 208 at the timing instructed by the mark recognition unit 210.
  • the code decoding unit 230 decodes the code image D 3 cut out by the image cutout unit 220 to restore the original data. That is, the code decoding unit 230 generates, from the code image D 3 , the graphic code corresponding to it. The code decoding unit 230 further converts the graphic code into the data corresponding to it. As a result, the code decoding unit 230 acquires the data (data file D 0 ) represented by the captured code image D 3 .
  • the code decoding unit 230 includes a first decoding unit 232A and a second decoding unit 232B.
  • the first decoding unit 232A receives the first code image D 11R from the first cutout unit 222A.
  • the first decoding unit 232A decodes the first code image D 11R cut out by the first cutout unit 222A and restores the corresponding color-unit decoded data D 12R . That is, the first decoding unit 232A generates the first graphic code corresponding to the first code image D 11R .
  • the color-unit decoded data D 12R is data for one frame.
  • the first decoding unit 232A converts the generated first graphic code into the corresponding color-unit decoded data D 12R .
  • the second decoding unit 232B receives the second code image D 11B from the second cutout unit 222B. The second decoding unit 232B decodes the second code image D 11B cut out by the second cutout unit 222B and restores the corresponding color-unit decoded data D 12B . That is, the second decoding unit 232B generates the second graphic code corresponding to the second code image D 11B . Here, the color-unit decoded data D 12B is data for one frame. Then, the second decoding unit 232B converts the generated second graphic code into the corresponding color-unit decoded data D 12B .
  • the first decoding unit 232A and the second decoding unit 232B apply, to the decoding into the decoded data D 12R and D 12B , the conversion rules used by the first code generation unit 116A and the second code generation unit 116B of the transmission device 100 to generate the code images D 31 and D 32 . As a result, the same data as the first data unit D 51 input to the first code generation unit 116A and the second data unit D 52 input to the second code generation unit 116B are reproduced.
  • the synthesizing unit 242 combines the decoded data D 12R and D 12B in units of colors output from the first decoding unit 232A and the second decoding unit 232B.
  • the synthesizing unit 242 synthesizes the decoded data D 12R and D 12B in units of colors and restores the transfer data D 13 for one shot.
  • the reception data storage unit 244 stores the transfer data D 13 for one shot (one frame) restored by the combining unit 242. The transfer data D 13 to be stored is connected to the previously stored transfer data D 13 of the same data file D 0 . By repeating the above process, all the data of one data file D 0 is stored in the received data storage unit 244.
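The accumulation behavior of the reception data storage unit 244 (each restored one-shot block is appended after the previously stored data of the same file) can be sketched as a simple byte store; the class and method names are assumptions for illustration.

```python
class ReceivedDataStore:
    """Sketch of the received-data storage unit 244.

    Each restored one-shot transfer block is appended after the
    previously stored data, so that after all shots arrive the
    complete data file is reassembled.
    """
    def __init__(self):
        self._buf = bytearray()

    def append_shot(self, shot_bytes):
        # connect the new shot after the previously stored data
        self._buf.extend(shot_bytes)

    def data(self):
        return bytes(self._buf)

store = ReceivedDataStore()
store.append_shot(b"hello ")
store.append_shot(b"world")
print(store.data())  # b'hello world'
```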
  • FIG. 6 is a diagram showing an example of the dimensions of each part of the image of the timing mark D 2 .
  • while FIG. 3 shows the image of the timing mark D 2 as an image of one color component, FIG. 6 represents the image of the timing mark D 2 as an image of all color components. It also represents one code image D 11 as an image of all color components. Therefore, the image of the timing mark D 2 and the code image D 11 are displayed on the display unit 130 as, for example, black images.
  • the transmission apparatus 100 may have only one code generation unit 116. Further, the image cutout unit 220 of the receiving apparatus 200 only needs to have one cutout unit 222.
  • the timing mark D 2 is shown by two mutually parallel black belt-shaped areas St a and St b located on both sides of the code area Acd.
  • the widths of the regions St a and St b are equal to each other and have the dimension Wst.
  • the interval Sep between the regions St a and St b is 6 times the width Wst of the region.
  • the height direction of the band-shaped regions St a and St b coincides with the sub-scanning direction of the dot sequential scanning method or the scanning direction (direction orthogonal to the line) of the line sequential scanning method.
  • the height Hgh of the regions St a and St b is four times the width Wst.
  • the mark recognition unit 210 determines whether the timing mark D 2 in the image obtained by capturing has the same dimensional relationships (or whether there are differences).
  • thereby, the receiving apparatus 200 can determine whether the whole of the timing mark D 2 has been correctly acquired. For example, if, in the image obtained by capturing with the receiving device 200, the height of the belt-shaped areas St a and St b is shorter than four times their width, the mark recognition unit 210 can determine that the whole of the timing mark D 2 has not been correctly captured.
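A sketch of the dimensional check described above, assuming the measured band widths, separation, and height are available in pixels; the tolerance parameter is an assumption, since the text does not specify one.

```python
def mark_dimensions_valid(wst_a, wst_b, sep, hgh, tol=0.1):
    """Check whether a detected timing mark matches the expected ratios.

    Per the description: the two band widths are equal (Wst), their
    separation Sep is 6x Wst, and their height Hgh is 4x Wst. `tol` is
    an assumed relative tolerance for measurement noise.
    """
    if wst_a <= 0 or wst_b <= 0:
        return False
    wst = (wst_a + wst_b) / 2.0
    return (abs(wst_a - wst_b) <= tol * wst
            and abs(sep - 6 * wst) <= tol * 6 * wst
            and abs(hgh - 4 * wst) <= tol * 4 * wst)

print(mark_dimensions_valid(10, 10, 60, 40))  # True
print(mark_dimensions_valid(10, 10, 60, 30))  # False: bands cut short
```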
  • FIG. 7(A) shows the same thing as FIG. 4(A). FIG. 7(B) corresponds to FIG. 4(B) and FIG. 4(C). That is, FIG. 7(A) represents the timing mark D 2 .
  • FIG. 7(B) represents the code image D 3 . That is, in the description of FIGS. 7(A) to 7(C), the image of FIG. 6 is used. Therefore, the code image D 3 (code area Acd) and the timing mark D 2 (areas St a , St b ) are displayed in the data of each color component.
  • FIG. 7C shows a frame shot by the imaging unit 202.
  • a reference symbol P n is attached to each frame captured by the imaging unit 202.
  • “n” represents a frame number, which is indicated by 1 to 14 in FIG. 7(C).
  • the dotted line in the vertical direction in FIG. 7A indicates a break of the frame period T t of the display by the display unit 130 as in FIGS. 4A to 4C.
  • a vertical dotted line in FIGS. 7B and 7C indicates a break of the frame period Tr of the imaging by the imaging unit 202. Accordingly, the interval Tr between the vertical lines in FIGS. 7B and 7C represents the length of the imaging frame period (cycle).
  • it is assumed that the frame period of capturing by the imaging unit 202 is 0.9 times as long as the frame period of display by the display unit 130 (display frame period T t ).
  • the timing mark D 201 is displayed over the entire display period. Therefore, the code image D 301 in the period of the corresponding frame P 3 is cut out.
  • the timing mark D 202 is displayed over the entire display period. For this reason, the code image D 302 in the period of the corresponding frames P 7 and P 8 is cut out.
  • a timing mark D 203 is displayed. For this reason, the first code image D 313 in the period of the corresponding frame P 12 is cut out.
  • the composite video D 62 is created by the combining unit 122 connecting a plurality of composite images D 61 .
  • the composite video D 62 is a video for displaying a single code image D 3 .
  • for the color-unit decoded data D 12 obtained by decoding the code image D 11 , when a plurality of data representing the pixel value of each pixel are obtained, processing such as selecting one of them (for example, the first one) is performed.
  • FIG. 8 is a flowchart showing an operation of displaying video based on transfer data in the transmission apparatus 100.
  • FIG. 9 is a flowchart showing details of step S12.
  • the process shown in FIG. 8 is started in the transmission apparatus 100 based on a predetermined program. For example, the process shown in FIG. 8 starts when a transmission program for transmitting data to the receiving device 200 is activated and one of the plurality of data files D 0 stored in the transmission data storage unit 102 is selected.
  • in step S11, the reading unit 112 divides the selected data file D 0 into a plurality of data portions (shots).
  • in step S12, one of the plurality of data portions generated by the division in step S11 is selected, and the selected data portion (data for one shot) is processed.
  • “Processing” here refers to processing performed by the data dividing unit 114, the mark generation unit 106, the code generation unit 116, and the synthesis unit 122 described above. The processing performed by the data dividing unit 114, the mark generating unit 106, the code generating unit 116, and the combining unit 122 will be described with reference to FIG.
  • in step S13, the reading unit 112 determines whether reading of all the data portions of the selected data file D 0 has finished. The determination that reading of all the data portions of the data file D 0 has finished is made, for example, when the reading unit 112, sequentially reading out one-shot data from the data file D 0 , reaches the end of the data file D 0 . Alternatively, the determination is made by dividing the data file D 0 and temporarily storing it in a memory or the like: the data that has been read out is erased, and the determination is made when all the data in the temporary storage location is gone.
  • erasing data can be replaced by moving the data or adding a processed mark to the data. If it is determined in step S13 that reading of all data portions has not been completed (NO in S13), the process returns to step S11 to select the next data portion, and thereafter the processing in steps S12 and S13 is repeated. By repeating such processing, a series of frame data constituting the video is accumulated in the video buffer 126.
  • when it is determined in step S13 that the processing for all the data portions has ended, the process proceeds to step S14, and the moving image reproducing unit 128 outputs the data D 63 stored in the video buffer 126.
  • the “data portion” is data for one shot. That is, the moving image reproducing unit 128 sequentially reads the series of frame data D 63 constituting the video from the video buffer 126 and displays it on the display unit 130 as a moving image.
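Step S11, the division of the selected data file into shot-sized data portions, can be sketched as follows; the shot size parameter is an illustrative assumption.

```python
def split_into_shots(data, shot_size):
    """Sketch of step S11: divide the selected data file into shots.

    Each shot is a data portion small enough to be represented by one
    graphic code per color channel; `shot_size` is an assumed parameter,
    not a value from the patent.
    """
    return [data[i:i + shot_size] for i in range(0, len(data), shot_size)]

shots = split_into_shots(b"abcdefgh", 3)
print(shots)  # [b'abc', b'def', b'gh']
```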
  • FIG. 9 shows details of step S12 of FIG.
  • the data dividing unit 114 divides one (one-shot data) of the plurality of data portions generated by the division of the data file D 0 in step S11 into data of the first color channel (first data unit D 51 ) and data of the second color channel (second data unit D 52 ).
  • the data units D 51 and D 52 of each color channel become data having a data amount that can be expressed by one graphic code.
  • as the plurality of color channels, for example, a red (R) channel and a blue (B) channel are used.
  • the data dividing unit 114 provides the first data unit D 51 to the first code generation unit 116A. Then, the data dividing unit 114 provides the second data unit D 52 to the second code generation unit 116B.
  • in step S22, the first code generation unit 116A converts the first data unit D 51 given from the data dividing unit 114 into the first code image D 31 and outputs an image plane including the first code image D 31 .
  • in step S23, the second code generation unit 116B converts the second data unit D 52 supplied from the data dividing unit 114 into the second code image D 32 and outputs an image plane including the second code image D 32 .
  • in step S24, the combining unit 122 assigns the image plane including the first code image D 31 generated in step S22 and the image plane including the second code image D 32 generated in step S23 to the red channel and the blue channel, respectively, and combines them to generate a color image. That is, in step S24, the combining unit 122 assigns the image plane including the first code image D 31 generated in step S22 to the red channel. Then, the combining unit 122 assigns the image plane including the second code image D 32 generated in step S23 to the blue channel. Further, the combining unit 122 assigns the image plane not including the timing mark D 2 stored in the image storage unit 107 to the green channel.
  • the combining unit 122 combines the image plane including the first code image D 31 assigned to the red channel, the image plane including the second code image D 32 assigned to the blue channel, and the image plane not including the timing mark D 2 assigned to the green channel, to generate a color image.
  • in step S25, the color image created in step S24 is repeated a predetermined number of times, for example, once, to generate an image for one frame, which is sent to the video buffer 126 and written into it.
  • the color image obtained by combining the image plane of the first code image D 31 , the image plane of the second code image D 32 , and the image plane not including the timing mark D 2 , written in the video buffer 126, corresponds to the image of the first one-frame period T t1 shown in FIG. 4(A).
  • the “predetermined number of times” is one in the example of FIG. 4.
  • in step S26, image generation of the timing mark D 2 is performed.
  • the mark generation unit 106 generates the image plane including the timing mark D 2 in advance.
  • the mark generation unit 106 stores the image plane including the timing mark D 2 in the image storage unit 107.
  • in step S26, the mark generation unit 106 reads out the image plane including the timing mark D 2 stored in the image storage unit 107 and gives it to the combining unit 122.
  • the mark generation unit 106 also stores the image plane not including the timing mark D 2 in the image storage unit 107.
  • in step S27, the combining unit 122 assigns the image plane including the first code image D 31 generated in step S22, the image plane including the second code image D 32 generated in step S23, and the image plane including the timing mark D 2 generated in step S26 to the red channel, the blue channel, and the green channel, respectively, and combines them to generate a color image (composite image D 61 ) containing the code image D 3 together with the timing mark D 2 . That is, the combining unit 122 assigns the image plane including the first code image D 31 to the red channel, the image plane including the second code image D 32 to the blue channel, and the image plane including the timing mark D 2 to the green channel. The combining unit 122 combines the color channels. Then, the combining unit 122 generates the composite image D 61 .
  • in step S28, the image created in step S27 is repeated a predetermined number of times, for example, twice, and two frames of video are generated, sent to the video buffer 126, and written into it. These two frames are written as frames subsequent to the one frame written in step S25. That is, the color image obtained by combining the image plane of the first code image D 31 , the image plane of the second code image D 32 , and the image plane of the timing mark D 2 , written in the video buffer 126, corresponds to the image of the two-frame period T e shown in FIG. 4(A).
  • the “predetermined number of times” is two in the example of FIG. 4.
  • in step S29, the timing mark D 2 is erased. For this process, the mark generation unit 106 generates in advance an image plane that does not include the timing mark D 2 . For example, the mark generation unit 106 stores the image plane that does not include the timing mark D 2 in the image storage unit 107. In step S29, the mark generation unit 106 reads the image plane that does not include the timing mark D 2 stored in the image storage unit 107 and provides it to the combining unit 122.
  • in step S30, the combining unit 122 assigns the image plane including the first code image D 31 generated in step S22, the image plane including the second code image D 32 generated in step S23, and the image plane in which the timing mark D 2 has been erased, generated in step S29, to the red channel, the blue channel, and the green channel, respectively, and combines them to generate a color image that does not include the timing mark D 2 but includes the first code image D 31 and the second code image D 32 . That is, in step S30, the combining unit 122 combines the image plane of the first code image D 31 , the image plane of the second code image D 32 , and the image plane in which the timing mark D 2 has been erased.
  • in step S24 as well, it is possible to combine an image plane in which the timing mark D 2 has been erased. That is, it is possible to combine the image plane not including the timing mark D 2 stored in the image storage unit 107.
  • in step S31, the image created in step S30 is repeated a predetermined number of times, for example, once, to generate an image for one frame. The generated image for one frame is sent to the video buffer 126 and written into it. The image for one frame is written as a frame following the two frames of video written in step S28. In other words, the color image obtained by combining the image plane of the first code image D 31 , the image plane of the second code image D 32 , and the image plane from which the timing mark D 2 has been erased, written in the video buffer 126 here, corresponds to the image of the last one-frame period T t2 shown in FIG. 4(A).
  • the “predetermined number of times” is one in the example of FIG. 4.
  • the process in step S30 is the same as the process in step S25.
  • the video buffer 126 has, for example, a management area including the number of frames or reference information to actual image data, and a buffer area in which image data is actually stored.
  • the end address of the image stored in the current buffer area is read from the management area information. Then, the data of the frame to be added is copied after the stored image. Thereafter, the data in the management area is updated by adding one frame.
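The buffer layout described above (a management area holding the frame count, plus a buffer area holding the frame data) can be sketched as a minimal class; storing frames in a Python list stands in for copying data after the stored image, and all names are illustrative assumptions.

```python
class VideoBuffer:
    """Sketch of the video buffer described above.

    A management area holds the frame count; the buffer area holds the
    frame data. Appending a frame copies its data after the stored
    frames and then updates the management area by one frame.
    """
    def __init__(self):
        self.frame_count = 0   # management area
        self.frames = []       # buffer area (one entry per frame)

    def append_frame(self, frame_data):
        self.frames.append(frame_data)  # copy after the stored image
        self.frame_count += 1           # update the management area

    def read_frame(self, n):
        return self.frames[n]

buf = VideoBuffer()
buf.append_frame("frame-0")
buf.append_frame("frame-1")
print(buf.frame_count)    # 2
print(buf.read_frame(1))  # frame-1
```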
  • when the composite video D 63 written in the video buffer 126 is read and displayed as described above, the plurality of shots constituting the data file D 0 are each displayed over a period of four frames.
  • one shot is, for example, image data of one QR code.
  • when the data file D 0 is large, the data is sent as image data of a plurality of QR codes. That is, when the capacity of the data file D 0 is large, the data of the data file D 0 is sent as the image data of a plurality of QR codes, and each QR code is displayed over a four-frame period.
  • the display content changes as follows during the display of one shot. In the first frame period T t1 , only the first code image D 31 and the second code image D 32 are displayed.
  • FIG. 10 is a flowchart showing the operation in which the receiving apparatus 200 acquires the transfer data D 13 from the captured video data D 7 and stores it.
  • the process shown in FIG. 10 is started in receiving apparatus 200 based on a predetermined program. For example, the process illustrated in FIG. 10 is started when a reception program for receiving data (synthesized video D 63 ) transmitted from the transmission device 100 is activated.
  • step S41 the imaging unit 202 captures a moving image. That is, the image (synthesized image D 63 ) displayed on the display unit 130 is imaged by the imaging unit 202.
  • in step S42, the series of frames constituting the video data D 7 obtained by the imaging is sequentially stored in the video buffer 204. After the end of recording, the frames constituting the video data D 7 stored in the video buffer 204 are sequentially read, and the processing described later is performed.
  • in step S43, the reading unit 206 selects one of the frames from the video buffer 204 and reads the image data of the selected frame.
  • in step S44, the separation unit 208 separates the frame data D 8 read in step S43 into data D 9 of color component values.
  • the mark recognition unit 210 detects the timing mark D 2 (step S45). When the timing mark D 2 is not detected and frames remain in the video buffer 204, the reading unit 206 repeats the reading of one frame from the video buffer 204 (step S43). The reading unit 206 reads frames from the video buffer 204 (step S43) until the timing mark D 2 is detected.
  • in step S44, the separation unit 208 separates the data D 8 of one frame read in step S43 into data for each color channel.
  • three primary colors of red (R), blue (B), and green (G) are adopted as color channels.
  • the separation unit 208 provides the red component image data to the first cutout unit 222A, the blue component image data to the second cutout unit 222B, and the green component image data to the mark recognition unit 210.
  • in step S45, the mark recognition unit 210 determines whether the timing mark D 2 is detected. That is, the mark recognition unit 210 determines whether the image represented by the green component value data D 9G separated in step S44 contains the timing mark D 2 . When the image represented by the green component value data D 9G separated in step S44 does not include the timing mark D 2 (NO in step S45), the process proceeds to step S57.
  • step S47, S49 the image clipping unit 220 is included in each color channel code image (color component values of the data D 9R, D 9B) is Cut out the area.
  • step S48 and S50 the code decoding unit 230 executes code decoding.
  • step S47 the first cutout unit 222A cuts out the first code image D11R from the image represented by the red component value data D9R separated in step S44.
  • step S48 the first decoding unit 232A decodes the first code image D11R included in the image cut out in step S47. As a result, the decoded decoded data D 12R in color units corresponding to the first data unit D 51 is extracted.
  • step S49 the second cutout portion 222B from the image represented by the data D 9B of the separated blue component value at step S44, cut a second code image D 11B.
  • step S50 the second decoding unit 232B decodes the second code image D11B included in the image cut out in step S49. As a result, the decoded decoded data D 12B in color units corresponding to the second data unit D 52 is extracted.
  • In step S51, the combining unit 242 combines the color-unit decoded data D 12R and D 12B .
  • The received data storage unit 244 stores the transfer data D 13 obtained by combining the color-unit decoded data D 12R and D 12B .
  • In step S51, the combining unit 242 combines the color-unit decoded data D 12R (first data unit D 51 ) extracted in step S48 with the color-unit decoded data D 12B (second data unit D 52 ) extracted in step S50 to restore the transfer data D 13 .
  • In step S52, the combining unit 242 sends the restored transfer data D 13 to the received data storage unit 244.
  • The received data storage unit 244 stores the received transfer data D 13 .
  • The red channel carries the preceding data D 12R , and the blue channel carries the succeeding data D 12B .
  • The red channel data D 12R acquired this time is appended first, and then the blue channel data D 12B is appended and stored.
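The append order above (red-channel data first, then blue-channel data) can be sketched as follows; `append_shot` is a hypothetical helper name, not a name from the patent.

```python
def append_shot(received, d12r, d12b):
    """Append one shot's decoded data to the received byte stream:
    the red channel carries the preceding data, the blue channel the
    succeeding data, so red is appended before blue."""
    return received + d12r + d12b

buf = b""
buf = append_shot(buf, b"AB", b"CD")  # first shot
buf = append_shot(buf, b"EF", b"GH")  # second shot
print(buf)  # b'ABCDEFGH'
```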
  • After step S52, the process proceeds to step S53, where it is determined whether any frames remain in the video buffer 204. If no frames remain (NO in step S53), the process is terminated. If frames remain (YES in step S53), in step S54 the reading unit 206 reads a further one frame of data D 8 from the video buffer 204.
  • In step S55, the separation unit 208 separates the frame data D 8 read in step S54 into the image of each color channel (color component value data D 9G , D 9R , D 9B ).
  • In step S56, the mark recognition unit 210 determines whether the timing mark D 2 is still detected (or has disappeared).
  • In step S56, it is confirmed that the timing mark D 2 that had been detected is no longer detected.
  • In step S57, it is determined whether any frames remain in the video buffer 204. If frames remain, the process returns to step S43. If no frames remain, the process ends. By performing the processing in this way until the video data D 7 in the video buffer 204 is exhausted, the receiving device 200 can restore the transferred data (transfer data D 13 ).
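Steps S43 to S57 as a whole amount to the following loop. This is a deliberately simplified sketch: it omits the mark-disappearance check of step S56 and the per-channel cutout, and all names (`detect_mark`, `decode_shot`) are hypothetical stand-ins for the mark recognition unit 210 and the decoding path.

```python
def restore_transfer_data(frames, detect_mark, decode_shot):
    """Walk the recorded frames of video data D7. Whenever the timing mark
    is detected in a frame (step S45), decode that frame's code images and
    append the result (steps S47-S52). Stop when the buffer is exhausted
    (step S57)."""
    restored = b""
    for frame in frames:
        if detect_mark(frame):
            restored += decode_shot(frame)
    return restored

# Dummy frames: (mark_detected, decoded_payload)
frames = [(False, b""), (True, b"AB"), (False, b""), (True, b"CD")]
data = restore_transfer_data(frames, lambda f: f[0], lambda f: f[1])
print(data)  # b'ABCD'
```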
  • The amount of information represented by each code is reduced, so each dot constituting the code image can be displayed at a larger size.
  • A high-density QR code is obtained.
  • Here, a "cell" denotes the smallest black-or-white pattern element.
  • When the cells are small, defocus blur or motion blur is likely to occur when photographing the QR code.
  • If the cells are large, the occurrence of defocus blur or motion blur when photographing the QR code is suppressed, and the probability of successfully decoding the QR code increases. For this reason, resistance to defocus and to camera shake during shooting can be improved. In other words, the tolerance for defocus and camera shake at the time of shooting is improved, so that when the transmission device 100 is held up toward the reception device 200, the user's workability can be improved.
  • The timing mark D 2 can be represented as an image that occupies at least the entire drawing range (code region) of the code image D 3 .
  • The timing mark D 2 can be expressed as a uniform image. A "uniform image" is, for example, an image represented by the maximum gradation value over the entire rectangular area (the area shown by cross-hatching) shown in FIG. In the example shown in FIG. 3, the drawing range of the code images D 31 and D 32 and the range occupied by the timing mark D 2 are the same.
  • When the timing mark D 2 is represented as a uniform image covering the entire code images D 31 and D 32 , and the entire timing mark D 2 is read without any part missing, the code images D 31 and D 32 can be regarded as having been read correctly.
  • The period (body period) in which the code images D 31 and D 32 are displayed continuously together with the timing mark D 2 is a two-frame period.
  • The period (guard period) during which the code images D 31 and D 32 are displayed without the timing mark D 2 , before and after the body period, is a one-frame period.
  • The length of the body period (number of frames) and the length of the guard period (number of frames) should be determined according to various conditions, and the invention of Embodiment 1 is not limited to the above example.
  • The timing mark D 2 is not limited to those described above. Further, if there is sufficient timing margin, checking the timing mark D 2 for missing parts (confirming whether the entire timing mark D 2 has been captured) becomes unnecessary. In that case, the degree of freedom in determining the shape of the timing mark D 2 is even higher.
  • The timing margin increases as the imaging frame rate increases relative to the display frame rate. It also increases as the number of guard frames and the number of body frames increase. Further, for example, the timing mark D 2 may be a mark that fills the entire screen with a certain color, a mark filled with a uniform pattern, a circular mark, a mark of multiple circles, a square mark with a further square around it, or a square mark with an enclosing edge.
  • When a plurality of timing marks D 2 are used, different timing marks D 2 can carry different meanings. In other words, a different timing mark D 2 can be attached depending on the data portion of the data file D 0 . For example, a plurality of timing marks D 2 can indicate delimiters within a series of data, the start of the data, the end of the data, and so on.
  • FIG. 11 (A) to FIG. 11 (M) show examples of combinations of different timing marks D 2 having different meanings.
  • The circular timing mark D 2A shown in FIG. 11 (A) can indicate that the data portion is the head of the data file D 0 .
  • The circles depicted in FIG. 11 (A) to FIG. 11 (C) are preferably displayed at positions corresponding to the code image D 3 . That is, the circles depicted in FIG. 11 (A) to FIG. 11 (C) are preferably displayed in the area where the code image D 3 is displayed.
  • The rectangular timing mark D 2D shown in FIG. 11 (D) can indicate that the data portion is the head of the data file D 0 .
  • The rectangles depicted in FIG. 11 (D) to FIG. 11 (F) are preferably drawn with a size corresponding to the code image D 3 and displayed at the position corresponding to the code image. That is, the rectangles depicted in FIG. 11 (D) to FIG. 11 (F) are preferably displayed in the area where the code image D 3 is displayed.
  • A "segment" is a set of one or more "shots". The file transfer is divided so that even if an error occurs in one shot, the transfer does not fail as a whole. That is, the division into "segments" makes it possible for a file transfer to be "partially successful". When applied to signage or the like, the same data is displayed repeatedly; even if some shots are in error on the first round, the file transfer can succeed as a whole if the transfer including those shots succeeds on the second round. Considering camera shake at the time of photographing, it is difficult to photograph all the shots required for one transfer without error.
  • The "segment" is a mechanism for recovering from such errors.
  • The rectangular timing mark D 2G shown in FIG. 11 (G) can indicate that the data portion is the head of the data file D 0 .
  • The timing mark D 2H , in which one rectangle smaller than the main rectangle is arranged, can indicate that the data portion is an intermediate portion (neither head nor tail) of the data file D 0 .
  • The timing mark D 2I , in which two rectangles smaller than the main rectangle are arranged, can indicate that the data portion is the last data portion of the data file D 0 .
  • The timing mark D 2J , in which three rectangles smaller than the main rectangle are arranged on the left side of the rectangle shown in FIG. 11 (J), can indicate the data portion at the beginning or end of each segment. "The data portion at the beginning or end of each segment" means, for example, the first shot or the tenth shot of a segment when one segment consists of ten shots.
  • The large rectangles depicted in FIG. 11 (G) to FIG. 11 (J) are preferably displayed in the area where the code image D 3 is displayed.
  • The rectangular timing mark D 2K shown in FIG. 11 (K) can indicate that the data portion is the head of the data file D 0 .
  • The timing mark D 2M , in which two rectangles are drawn on each side of the rectangle as shown in FIG. 11 (M), can indicate that the data portion is the last data portion of the data file D 0 .
  • The large rectangles depicted in FIG. 11 (K) to FIG. 11 (M) are preferably drawn with a size corresponding to the code image D 3 and displayed at a position corresponding to the code image D 3 .
  • That is, the large rectangles depicted in FIG. 11 (K) to FIG. 11 (M) are preferably displayed in the area where the code image D 3 is displayed.
  • The mark generation unit 106 may generate the corresponding timing mark D 2 .
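Following the Fig. 11 examples, the association between timing-mark shape and meaning could be represented as a simple lookup. The shape labels below are illustrative inventions for this sketch, not identifiers from the patent.

```python
# Hypothetical mapping from timing-mark shape to the data portion it marks,
# following the Fig. 11 (G)-(J) examples: a large rectangle plus 0-3 small
# rectangles arranged beside it.
MARK_MEANING = {
    "rect": "head of data file",        # D 2G style
    "rect+1": "intermediate data part",  # D 2H style
    "rect+2": "tail of data file",       # D 2I style
    "rect+3": "head or tail of segment", # D 2J style
}

def classify(mark_shape):
    """Return the meaning carried by a recognized mark shape."""
    return MARK_MEANING.get(mark_shape, "unknown")

print(classify("rect+2"))  # tail of data file
```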
  • The video imaged by the imaging unit 202 is stored in the video buffer 204 as moving image data. After shooting, the decoding process is performed by taking the video data D 7 out of the video buffer 204 frame by frame. Therefore, the receiving apparatus 200 can also be configured with a terminal of low computing capability.
  • The code image D 3 has been described as being switched between one frame and the next frame.
  • In a liquid crystal display device, which is a hold-type display device,
  • the "timing immediately before and after the switching" is set to "the frame in which the switching is performed, the immediately preceding frame, and the immediately following frame",
  • and in some cases the timing mark D 2 is not added for a period of 3 frames per shot, so that those frames are not subject to decoding.
  • FIGS. 12A, 12B, 13A to 13D, 14A to 14C, and 15A to 15C will be referred to below. FIG. 12A and FIG. 12B are diagrams illustrating changes in the display image within the display frame period.
  • FIG. 13A to FIG. 13D are diagrams showing changes in the display image within the display frame period.
  • FIGS. 14A to 14C are diagrams showing changes in the display image within the display frame period.
  • FIGS. 15A to 15C are diagrams showing changes in the display image within the display frame period.
  • FIG. 12A shows a change in the display image displayed on the display unit 130.
  • Upper symbols Fd 1 to Fd 7 represent display frame numbers.
  • FIG. 12B shows a change in an image (captured image) obtained by imaging by the imaging unit 202.
  • Upper symbols Fp 1 to Fp 7 represent imaging frame numbers.
  • FIGS. 13A to 15C show changes in the display image within the display frame period.
  • the display frame rate (display frame period T t ) and the imaging frame rate (imaging frame period T r ) are the same. However, it is assumed that the imaging frame is shifted from the display frame by a half frame period. In addition, the imaging screen is not inclined with respect to the display screen. That is, it is assumed that the horizontal direction of the imaging screen matches the horizontal direction of the display screen.
  • the update of the image within each frame is indicated by an oblique line SC that is sequentially performed from the top to the bottom of the screen over one frame period (display frame period T t ).
  • the code image D 3 is assumed to be located in the center of the screen.
  • The code image D 3 is switched sequentially through the code D 3A , the code D 3B , and the code D 3C .
  • The upper halves of the code images D 3A , D 3B , D 3C are indicated by D 3AU , D 3BU , D 3CU , respectively.
  • The lower halves of the code images D 3A , D 3B , D 3C are indicated by D 3AL , D 3BL , D 3CL .
  • The lower half of the timing mark D 2 is indicated by D 2L .
  • the image update will be described with reference to the display frame Fd 1 .
  • the time point at which the screen is updated is indicated by a time point t ns (n is a natural number representing a frame number).
  • a time point t 1b indicates a time point when the update of the screen of the display frame Fd 1 proceeds to the upper end of the code image.
  • a time point when the screen update of the display frame Fd 1 has advanced to the center of the screen is indicated by a time point t 1m .
  • a time point t 1e indicates a time point when the update of the screen of the display frame Fd 1 proceeds to the lower end of the code image.
  • the code image D 3A is displayed in the display frame Fd 1 .
  • D 3BU is displayed on the upper side of the code image D 3 of the display frame Fd 1 .
  • D 3AL is displayed below the code image D 3 of the display frame Fd 1 .
  • the code image D 3B is displayed in the display frame Fd 1 .
  • In the example shown in FIG. 12 (A) and FIG. 12 (B), the timing mark D 2 is assumed to be a strip located only on the left side of the code image D 3 . It is assumed that the imaging unit 202 is open (exposing) for the entire imaging period of each imaging frame.
  • The code image D 3 is switched in a cycle of 6 frame periods. Within that cycle, the switching of the code image D 3 is performed in the first frame Fd 1 . The image switching due to the addition of the timing mark D 2 is performed in the third frame Fd 3 . The image switching due to the erasure of the timing mark D 2 is performed in the fifth frame Fd 5 .
  • the update of the screen of the display frame Fd 1 will be described with reference to FIG.
  • In the lower half of the code image D 3 being displayed, the lower half D 3AL of the code image D 3A before switching is displayed.
  • In the upper half of the code image D 3 being displayed, the upper half D 3BU of the code image D 3B after switching is displayed.
  • The lower half of the displayed code image D 3 is also switched as shown in FIG. 13B, becoming the lower half D 3BL of the code image D 3B . From time t 1m to time t 1e , the lower half of the displayed code image D 3 gradually changes from the image D 3A before switching to the image D 3B after switching.
  • The timing mark D 2 is not displayed at all.
  • When the image update has advanced to the center of the screen at time t 3m as shown in FIG. 14 (B), the upper half D 2U of the timing mark D 2 is displayed, while the lower half D 2L of the timing mark D 2 is not displayed.
  • When the image update has advanced to the lower end of the timing mark D 2 at time t 3e as shown in FIG. 14 (C), the entire timing mark D 2 is displayed. In other words, the timing mark D 2 gradually appears from the upper side of the screen toward the lower side over the period from time t 3b to time t 3e .
  • The timing mark D 2 is displayed in its entirety.
  • When the image update has advanced to the center of the screen at time t 5m as shown in FIG. 15 (B), the lower half D 2L of the timing mark D 2 is still displayed, while the upper half D 2U of the timing mark D 2 has been erased.
  • The entire timing mark D 2 is erased. In other words, the timing mark D 2 is gradually erased from the upper side toward the lower side over the period from time t 5b to time t 5e .
  • The code image D 3 displayed is entirely the pre-switching image D 3B .
  • When the image update has advanced to the center of the code image D 3 at time t 7m as shown in FIG. 13 (D),
  • the upper half of the code image D 3 is the upper half D 3CU of the code image D 3C after switching.
  • The upper half of the code image D 3 is the same as the upper half D 3BU of the image D 3B after switching.
  • The lower half of the code image D 3 is an image D 3ABL in which the lower half D 3AL of the pre-switching image D 3A and the lower half D 3BL of the post-switching image D 3B are mixed.
  • The mixing ratio varies depending on the vertical position in the imaging screen, and is equal to the ratio of the time during which the image D 3A is displayed to the time during which the image D 3B is displayed at each position of the corresponding display screen.
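The mixing ratio just described can be modeled numerically. The sketch below assumes the exposure spans exactly one display frame and the scan line SC passes a point at relative height y (0 = top, 1 = bottom) at time y·T t into the frame; the function name and this simplified timing model are assumptions, not the patent's formulation.

```python
def mixed_value(y, old_val, new_val):
    """Captured pixel value at relative height y when the exposure spans one
    display frame in which the image switches as the scan passes y.
    The old image is shown for a fraction y of the frame (before the scan
    reaches y), the new image for the remaining fraction (1 - y)."""
    return y * old_val + (1.0 - y) * new_val

# A quarter of the way down the screen, the old image is shown 25% of the
# exposure and the new image 75%.
print(mixed_value(0.25, 0, 100))  # 75.0
```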
  • the entire code image D 3 is the same as the code image D 3B.
  • While the timing mark D 2 has not been added up to that point, in the first half of the display frame Fd 3 (the period until the update reaches the center of the screen) the timing mark D 2 gradually appears from the top. Therefore, in the image obtained by imaging in the imaging frame Fp 2 , the lower half D 2L of the timing mark D 2 does not appear. The upper half D 2U of the timing mark D 2 becomes an image D 2U0 in which the image in which the upper half D 2U appears and the image in which it does not appear are mixed.
  • The mixing ratio at each position of the corresponding display screen is equal to the ratio of the time during which the timing mark D 2 appears to the time during which it does not appear.
  • Therefore, the timing mark D 2 in the captured image is darker (the gradation value is larger) toward the upper side and fainter (the gradation value is smaller) toward the lower side.
  • the entire code image D 3 is the same as the code image D 3B.
  • In the second half (the period of the update after reaching the center of the screen),
  • the timing mark D 2 of the display frame Fd 3 gradually extends further downward from the state of having appeared down to the center. That is, the lower half D 2L of the timing mark D 2 also gradually appears.
  • In the display frame Fd 4 , the timing mark D 2 has appeared in its entirety. Therefore, the entire timing mark D 2 appears in the image obtained by imaging in the imaging frame Fp 3 .
  • The lower half of the image obtained by imaging in the imaging frame Fp 3 is an image D 2L0 in which the image in which the lower half D 2L of the timing mark D 2 appears and the image in which it does not appear are mixed.
  • The mixing ratio at each position is equal to the ratio of the time during which the timing mark D 2 appears to the time during which it does not appear. Therefore, the lower half D 2L of the timing mark D 2 in the captured image is fainter toward the lower side and darker toward the upper side.
  • The timing mark D 2 is displayed in the second half of the display frame Fd 4 (the period after the update reaches the center of the screen).
  • The timing mark D 2 then gradually disappears from the upper side in order. Nevertheless, the entire timing mark D 2 appears in the image obtained by imaging in the imaging frame Fp 4 .
  • In the image obtained by imaging in the imaging frame Fp 4 , the upper half becomes an image D 2U0 in which the image in which the upper half D 2U of the timing mark D 2 appears and the image in which it does not appear are mixed.
  • The mixing ratio at each position is equal to the ratio of the time during which the timing mark D 2 appears to the time during which it does not appear. Accordingly, the upper half D 2U of the timing mark D 2 in the captured image is fainter toward the upper side and darker toward the lower side.
  • the entire code image D 3 is the same as the code image D 3B.
  • In the second half of the display frame Fd 5 (the period after the update reaches the center of the screen), the timing mark D 2 gradually fades from the upper side in order.
  • The upper half D 2U of the timing mark D 2 does not appear in the image obtained by imaging in the imaging frame Fp 5 .
  • The lower half becomes an image D 2L0 in which the image in which the lower half D 2L of the timing mark D 2 appears and the image in which it does not appear are mixed.
  • Therefore, the timing mark D 2 in the captured image is darker toward the lower side and fainter toward the upper side.
  • The lower half of the image of the imaging frame Fp 6 is the same as the lower half D 3BL of the image D 3B before switching.
  • It can be seen that the code image D 3 obtained in the imaging frame Fp 4 is suitable for decoding.
  • The density of the timing mark D 2 varies depending on the vertical position. In this case, it is determined for each line of the display screen whether the timing mark D 2 appears. For example, if the density is equal to or greater than a predetermined threshold, the code image D 3 of that line may be determined to be suitable for decoding.
  • When the threshold value is set to the medium value, the lower half D 3L of the code image D 3 of the imaging frame Fp 3 and a part of the upper half D 3U of the code image D 3 are determined to be suitable for decoding.
  • The medium value of density is the value intermediate between the density in the case where the timing mark D 2 is displayed over the entire imaging period of the imaging frame and the density in the case where the timing mark D 2 is not displayed over the entire imaging period of the imaging frame.
  • All of the upper half D 3U of the code image D 3 of the imaging frame Fp 5 and a part of the lower half D 3L of the code image D 3 are determined to be suitable for decoding.
  • When the number of lines in which the density of the timing mark D 2 is equal to or greater than the threshold is determined to be a predetermined number or more, the code image D 3 of that frame may be determined to be suitable for decoding in its entirety.
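The per-line decision above can be sketched as follows. The threshold of 0.5 corresponds to the "medium value" (midway between the mark being shown for the whole exposure and being absent for the whole exposure); the names and the `min_lines` whole-frame rule are illustrative assumptions.

```python
def decodable_lines(mark_density, threshold=0.5):
    """For each line, judge whether the timing-mark density (0.0-1.0,
    normalized) is at or above the threshold, i.e. the line's portion of the
    code image was stable during the exposure."""
    return [d >= threshold for d in mark_density]

def frame_decodable(mark_density, threshold=0.5, min_lines=3):
    """Judge the whole code image decodable when at least min_lines lines
    pass the per-line density test."""
    return sum(decodable_lines(mark_density, threshold)) >= min_lines

densities = [0.9, 0.8, 0.6, 0.4, 0.2]  # mark fading toward the bottom
print(decodable_lines(densities))  # [True, True, True, False, False]
print(frame_decodable(densities))  # True
```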
  • In the imaging frame Fp 2 , the lower half D 2L of the timing mark D 2 does not appear, while the upper half D 2U of the timing mark D 2 appears. Therefore, in the imaging frame Fp 2 , the density of the timing mark D 2 depends on the vertical position. In such a case, the code image D 3B as a whole may be determined to be unsuitable for decoding. Alternatively, by lowering the threshold, a part of the upper half D 3BU of the code image D 3B may be determined to be suitable for decoding. Furthermore, when the density of the timing mark D 2 is determined to be equal to or greater than the threshold value in a predetermined number or more of lines, the entire code image D 3B may be determined to be suitable for decoding.
  • the upper half D 2U of the timing mark D 2 does not appear in the imaging frame Fp 5 .
  • The lower half D 2L of the timing mark D 2 appears. Therefore, the density of the timing mark D 2 depends on the vertical position. In such a case, the code image D 3B as a whole may be determined to be unsuitable for decoding. Alternatively, by lowering the threshold value, a part of the lower half D 3BL of the code image D 3B may be determined to be suitable for decoding. Furthermore, when the density of the timing mark D 2 is determined to be equal to or greater than the threshold value in a predetermined number or more of lines, the entire code image D 3B may be determined to be suitable for decoding.
  • The timing mark D 2 does not appear at all.
  • the upper half D 3BU of the code image D 3B is equal to the image D 3BU after switching.
  • The lower half of the code image D 3B is an image D 3ABL in which the pre-switching image D 3AL and the post-switching image D 3BL are mixed. Therefore, the code image D 3B of the imaging frame Fp 1 is not suitable for decoding.
  • The timing mark D 2 does not appear.
  • the lower half of the code image D 3B is equal to the image D 3BL before switching.
  • The upper half of the code image D 3B is an image D 3BCU in which the pre-switching image D 3BU and the post-switching image D 3CU are mixed. Therefore, the code image D 3B of the imaging frame Fp 6 is not suitable for decoding.
  • That the code image D 3 is not suitable for decoding can be determined from the fact that the timing mark D 2 does not appear at all.
  • The case where the shutter opening time (exposure time) of the imaging unit 202 extends over the entire imaging frame period Tr has been described above. However, when the shutter opening time occupies only a part of each imaging frame period Tr, the above mixing ratio becomes the ratio of the period during which the pre-switching image is displayed within the shutter opening time to the period during which the post-switching image is displayed.
  • The display screen of the display unit 130 may be tilted in the image obtained by imaging by the imaging unit 202. Therefore, when determining for each line of the display screen whether the code image D 3 is suitable for decoding, it is necessary to specify the direction of the lines of the display unit 130 in the image obtained by the imaging (image data D 7 ).
  • Instead of each line of the display screen, it may be determined for each range of vertical positions comprising a plurality of mutually successive lines whether the code image D 3 in that range is suitable for decoding.
  • The determination of whether the code image D 3 is suitable for decoding can be performed for each band-shaped area composed of a plurality of lines.
  • the code image and the timing mark are composed of data of the three primary colors of red (R), blue (B), and green (G).
  • R red
  • B blue
  • G green
  • The present invention is not limited to this; the code image and the timing mark may be configured by data of colors different from the above. However, configuring them with the data of the three primary colors is easier and more efficient when multiplexing by combining in the combining unit 122 as shown in FIG. 2, for example, and when separating in the separating unit 208 as shown in FIG. 5. The same applies to the following embodiments.
  • Embodiment 2.
  • In Embodiment 1, the combining unit 122 combines the timing mark D 2 and the code image D 3 by assigning them to the respective color channels. The combining unit 122 then generates a color synthesized image D 61 .
  • the present invention is not limited to such an example.
  • The images D 2 , D 31 , and D 32 output from the mark generation unit 106, the first code generation unit 116A, and the second code generation unit 116B can each be concatenated separately to generate separate videos. Then, the synthesis unit 122 may synthesize these videos by assigning them to the respective color channels.
  • FIG. 16 shows the configuration in that case.
  • FIG. 16 is a block diagram schematically showing a configuration of transmitting apparatus 100a in the second embodiment.
  • The transmitting apparatus 100a illustrated in FIG. 16 includes, instead of the video buffer 126, the video playback unit 128, and the synthesis unit 122 illustrated in FIG. 2, a first video buffer 126A, a second video buffer 126B, a third video buffer 126C, a first video playback unit 128A, a second video playback unit 128B, a third video playback unit 128C, and a synthesis unit 122b.
  • Constituent elements similar to those of the transmitting apparatus 100 described in Embodiment 1 are assigned the same reference numerals, and descriptions thereof are omitted.
  • Constituent elements similar to those of the transmission device 100 are a transmission data storage unit 102, a data processing unit 110, a reference clock generation unit 104, a mark generation unit 106, and a display unit 130.
  • the reading unit 112, the data division unit 114, the first code generation unit 116A, and the second code generation unit 116B of the data processing unit 110 are the same components as those in the first embodiment.
  • the image storage unit 107 of the mark generation unit 106 is a component similar to that of the first embodiment.
  • The first video buffer 126A stores the first code images D 31 generated by the first code generation unit 116A in order.
  • The first video buffer 126A stores the first code images D 31 in sequence.
  • The first video buffer 126A concatenates them to generate a first code video D 64 . That is, the first video buffer 126A concatenates the accumulated first code images into the video D 64 .
  • The first video buffer 126A generates the first code video D 64 .
  • The second video buffer 126B stores the second code images D 32 generated by the second code generation unit 116B in order.
  • The second video buffer 126B stores the second code images D 32 in sequence.
  • The second video buffer 126B concatenates them to generate a second code video D 65 . In other words, the second video buffer 126B concatenates the accumulated second code images into the video D 65 .
  • The second video buffer 126B generates the second code video D 65 .
  • The third video buffer 126C stores the timing marks D 2 generated by the mark generation unit 106 in order.
  • The third video buffer 126C stores the timing marks D 2 in order.
  • The third video buffer 126C concatenates them to generate a timing mark video D 21 . That is, the third video buffer 126C concatenates the accumulated timing marks D 2 .
  • The third video buffer 126C generates the timing mark video D 21 .
  • When the concatenation of the code images D 31 of the data file D 0 in the first video buffer 126A is complete, the first video playback unit 128A sequentially reads out the data stored in the video buffer 126A (first code video D 64 ) frame by frame and reproduces it as a moving image. That is, the first video playback unit 128A sequentially reads out the first code video D 64 and reproduces it as a moving image. The first video playback unit 128A reads out the first code video D 64 frame by frame. The timing at which the first video playback unit 128A reads out the first code video D 64 is the time when the concatenation of the code images D 31 in the first video buffer 126A is complete.
  • When the concatenation of the code images D 32 of the data file D 0 in the second video buffer 126B is complete, the second video playback unit 128B sequentially reads out the data stored in the video buffer 126B (second code video D 65 ) frame by frame and reproduces it as a moving image. That is, the second video playback unit 128B sequentially reads out the second code video D 65 and reproduces it as a moving image. The second video playback unit 128B reads out the second code video D 65 frame by frame. The timing at which the second video playback unit 128B reads out the second code video D 65 is the time when the concatenation of the code images D 32 in the second video buffer 126B is complete.
  • When the conversion of all the data of the data file D 0 into the code videos D 64 and D 65 is complete, the third video playback unit 128C sequentially reads out the data stored in the video buffer 126C (timing mark video D 21 ) frame by frame and reproduces it as a moving image. That is, the third video playback unit 128C sequentially reads out the timing mark video D 21 and reproduces it as a moving image.
  • The third video playback unit 128C reads out the timing mark video D 21 frame by frame. The timing at which the third video playback unit 128C reads out the timing mark video D 21 is the time when the concatenation of the code images D 31 and D 32 in the video buffers 126A and 126B is complete.
  • The videos D 64 , D 65 , and D 21 are synthesized by being assigned to mutually independent color channels.
  • the synthesis unit 122b generates a synthesized video D 66 (color video) for one data file D 0 .
  • Data (synthesized video D 66 ) transmitted from the transmission device 100a in FIG. 16 can be received by the reception device 200 in FIG. 5 described in the first embodiment.
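The transmit-side channel multiplexing of Embodiments 1 and 2 (first code video on the red channel, second code video on the blue channel, timing-mark video on the green channel) can be sketched per frame as follows. Frames are modeled as lists of rows of 0-255 values, and the function name is illustrative, not from the patent.

```python
def combine_frames(code1, code2, mark):
    """Merge three same-sized monochrome frames into one RGB frame:
    code1 -> red channel, mark -> green channel, code2 -> blue channel."""
    return [
        [(r, m, b) for r, b, m in zip(row_r, row_b, row_m)]
        for row_r, row_b, row_m in zip(code1, code2, mark)
    ]

# A 1x1 example: red carries 255, blue carries 128, green (mark) carries 0.
rgb = combine_frames([[255]], [[128]], [[0]])
print(rgb)  # [[(255, 0, 128)]]
```

On the receiving side, the separation unit 208 performs the inverse of this operation per channel.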
  • FIG. 17 shows a transmitting apparatus 100c according to the third embodiment of the present invention.
  • FIG. 18 illustrates a receiving device 200c according to the third embodiment.
  • the transmission device 100 c includes a light emitting unit 156 instead of the mark generation unit 106. Further, the transmission device 100 c includes a combining unit 122 c instead of the combining unit 122.
  • the receiving device 200 c includes a light emission detection unit 260 instead of the mark recognition unit 210.
  • Constituent elements similar to those of the transmitting apparatus 100 described in Embodiment 1 are assigned the same reference numerals, and descriptions thereof are omitted.
  • the same components as the transmission apparatus 100 are a transmission data storage unit 102, a data processing unit 110, a reference clock generation unit 104, a video buffer 126, a moving image reproduction unit 128, and a display unit 130.
  • the reading unit 112, the data division unit 114, the first code generation unit 116A, and the second code generation unit 116B of the data processing unit 110 are the same components as those in the first embodiment.
  • Constituent elements similar to those of receiving apparatus 200 described in Embodiment 1 are assigned the same reference numerals, and descriptions thereof are omitted.
  • Constituent elements similar to those of the receiving apparatus 200 are an imaging unit 202, a video buffer 204, a reading unit 206, a separating unit 208, an image clipping unit 220, a code decoding unit 230, a combining unit 242, and a received data storage unit 244.
  • the first cutout part 222A and the second cutout part 222B of the image cutout part 220 are the same constituent elements as those in the first embodiment.
  • the first decoding unit 232A and the second decoding unit 232B of the code decoding unit 230 are the same constituent elements as those in the first embodiment.
• The transmission device 100c in FIG. 17 is generally the same as the transmission device 100 in FIG. 2, but includes a light emitting unit 156 instead of the mark generation unit 106. Further, a combining unit 122c is provided instead of the combining unit 122.
• The combining unit 122c combines the first code image D31 from the first code generation unit 116A with the second code image D32 from the second code generation unit 116B. Unlike the combining unit 122 of FIG. 2, it does not combine the timing mark D2.
  • the transmission data (transfer data)
• As a result, a composite image D67 in which the first and second code images are assigned to the red channel and the blue channel is obtained.
• The video buffer 126 sequentially links the composite images D67. In the synthesized video D68 obtained by the linking, the content of the code image D3 is switched every shot period Td (four frame periods), as in the first embodiment.
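The linking performed by the video buffer 126 can be modeled as holding each composite image for one shot period; a minimal sketch, assuming a four-frame shot period and treating each composite image as an opaque value:

```python
FRAMES_PER_SHOT = 4  # one shot period T_d spans four display frames

def link_composite_images(composite_images):
    """Hold each composite image for one shot period of display frames,
    so the resulting video switches its code-image content every T_d."""
    video = []
    for image in composite_images:
        video.extend([image] * FRAMES_PER_SHOT)
    return video

video = link_composite_images(["shot0", "shot1"])  # 8 display frames
```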
• The light emitting unit 156 is disposed adjacent to the display screen 131 of the display unit 130. That is, the light emitting unit 156 is arranged so that the imaging unit 202 of the receiving device 200c can capture it in the same field of view as the display screen 131 of the display unit 130.
• The light emitting unit 156 is configured by an LED or the like. The light emitting unit 156, in synchronization with the clock signal CL generated by the reference clock generation unit 104, emits light only during periods other than the periods immediately before and immediately after the timing at which the code image D3 in the synthesized video D68 is switched. For example, the period during which the light emitting unit 156 emits light is the period (body period) other than the frame periods immediately before and immediately after the switching of the code image D3 (guard periods). The light emitting unit 156 emits green light, for example.
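The light emission schedule described above can be sketched as a function of the display frame index; the four-frame shot period follows the text, while the function name and phase convention are assumptions for illustration:

```python
FRAMES_PER_SHOT = 4  # one shot period T_d (four frame periods)

def led_is_on(frame_index):
    """True while the LED should emit for this display frame.

    The code image switches at every shot boundary; the frame just
    after a switch (phase 0) and the frame just before the next one
    (phase 3) are guard frames with the LED dark, while the middle
    body frames have the LED lit.
    """
    phase = frame_index % FRAMES_PER_SHOT
    return 0 < phase < FRAMES_PER_SHOT - 1

schedule = [led_is_on(i) for i in range(8)]  # two shot periods
```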
  • a monitor lamp having another role in the transmission device 100c may be used as the light emitting unit 156. For example, a monitor lamp used to indicate power on / off may be used as the light emitting unit 156.
• The reading unit 112 outputs a signal (synchronization signal SY) each time it reads one shot of data (the data of the code image D3 displayed in one shot period Td).
  • the light emitting unit 156 receives the signal (synchronization signal SY) from the reading unit 112 and determines the timing of light emission.
• The receiving apparatus 200c shown in FIG. 18 is generally the same as the receiving apparatus 200 of FIG. 5, but differs in that a light emission detection unit 260 is provided instead of the mark recognition unit 210.
• The imaging unit 202 captures, in the same field of view, the display screen 131 of the display unit 130 on which the video (the synthesized video D68) is displayed and the light emitting unit 156 of the transmission device 100c. By this imaging, the imaging unit 202 acquires a video including the video (synthesized video D68) displayed on the display screen 131 and the light from the light emitting unit 156.
  • data (video data D 7 ) obtained by imaging by the imaging unit 202 is accumulated in the video buffer 204.
  • the reading unit 206 reads the data one frame (frame data D 8 ) at a time.
  • the frame data D 8 is separated by the separation unit 208.
  • the green component value data is supplied to the light emission detection unit 260.
  • the red component value data is supplied to the first cutout unit 222A.
  • the blue component value data is supplied to the second cutout unit 222B.
• The light emission detection unit 260 determines whether or not the light emitting unit 156 appears in each frame. When the light emission detection unit 260 determines that the light emitting unit 156 appears in a light-emitting state (when light emission is detected), it determines that the code image D3 in that frame can be correctly decoded. The light emission detection unit 260 transmits this determination result to the image cutout unit 220. That is, the light emission detection unit 260 instructs the image cutout unit 220 to cut out the code image D3. The image cutout unit 220 cuts out the code image D3 in accordance with the instruction (image cutout signal D10). Thus, the code image D3 is cut out from the frames in which the light emission is detected. Except for the above, the third embodiment is the same as the first embodiment.
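A receiver-side sketch of this gating, assuming grayscale green-channel planes stored as 2-D lists and an arbitrary brightness threshold (both assumptions for illustration):

```python
LED_THRESHOLD = 128  # assumed brightness level that counts as "emitting"

def emission_detected(green_plane, led_region):
    """led_region = (y0, y1, x0, x1); True if its mean brightness in the
    green channel exceeds the threshold, i.e. the LED appears lit."""
    y0, y1, x0, x1 = led_region
    values = [green_plane[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    return sum(values) / len(values) > LED_THRESHOLD

def frames_to_decode(green_planes, led_region):
    """Indices of frames whose code image may be cut out and decoded."""
    return [i for i, plane in enumerate(green_planes)
            if emission_detected(plane, led_region)]

# Three 2x2 green planes; the LED region covers the whole toy frame.
planes = [[[200, 210], [190, 205]],  # LED lit  -> decodable frame
          [[10, 5], [0, 15]],        # LED dark -> guard frame, skipped
          [[180, 200], [220, 240]]]  # LED lit  -> decodable frame
picked = frames_to_decode(planes, (0, 2, 0, 2))
```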
• Embodiment 4.
• In the third embodiment, the light emitting unit 156 emits light of a color different from that of the code image D3.
• In the third embodiment, the light emission from the light emitting unit 156 is separated by the color of the corresponding image portion in the captured video.
• However, the light from the light emitting unit 156 may instead be separated based on the position of the light emitting unit 156 or on its blinking timing.
  • FIG. 19 shows a transmitting apparatus 100d according to the fourth embodiment of the present invention.
  • FIG. 20 illustrates a receiving device 200d according to the fourth embodiment.
• The transmission device 100d in FIG. 19 is generally the same as the transmission device 100 in FIG. 2. However, the transmission device 100d includes a light emitting unit 158 instead of the mark generation unit 106, similarly to the transmission device 100c of FIG. 17.
  • the transmission device 100d includes a data processing unit 110d instead of the data processing unit 110.
  • the transmission device 100d includes a combining unit 122d instead of the combining unit 122.
  • the data processing unit 110d is generally the same as the data processing unit 110 of FIG. 2, but includes a data dividing unit 114d instead of the data dividing unit 114.
  • the data processing unit 110d includes a third code generation unit 116C in addition to the first code generation unit 116A and the second code generation unit 116B.
  • Constituent elements similar to those of the transmitting apparatus 100 described in Embodiment 1 are assigned the same reference numerals, and descriptions thereof are omitted.
  • Constituent elements similar to those of the transmission device 100 are a transmission data storage unit 102, a reference clock generation unit 104, a video buffer 126, a moving image reproduction unit 128, and a display unit 130.
  • the reading unit 112, the first code generation unit 116A, and the second code generation unit 116B of the data processing unit 110 are the same components as those in the first embodiment.
  • Constituent elements similar to those of receiving apparatus 200 described in Embodiment 1 are assigned the same reference numerals, and descriptions thereof are omitted.
  • the same components as the receiving apparatus 200 are an imaging unit 202, a video buffer 204, a reading unit 206, a separation unit 208, and a reception data storage unit 244. Further, the first cutout part 222A and the second cutout part 222B of the image cutout part 220d are the same constituent elements as those in the first embodiment. Further, the first decoding unit 232A and the second decoding unit 232B of the code decoding unit 230d are the same constituent elements as those in the first embodiment.
  • the data dividing unit 114d divides the data of each shot read by the reading unit 112 into three data parts. Then, the data dividing unit 114d gives the first data portion (first data unit D 51 ) to the first code generating unit 116A. The data dividing unit 114d gives the second data portion (second data unit D 52 ) to the second code generating unit 116B. The data dividing unit 114d gives the third data portion (third data unit D 53 ) to the third code generating unit 116C.
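The three-way division and its inverse can be sketched as follows; the round-robin byte interleave is an assumed partitioning scheme for illustration, since the patent does not specify how the shot data is split:

```python
def divide_into_three(shot_bytes):
    """Split one shot of data D4 into three data units D51, D52, D53
    by round-robin byte interleave (an assumed partitioning scheme)."""
    return shot_bytes[0::3], shot_bytes[1::3], shot_bytes[2::3]

def merge_three(d51, d52, d53):
    """Receiver-side inverse: restore the original shot of data."""
    out = bytearray()
    for i in range(len(d51)):
        out.append(d51[i])
        if i < len(d52):
            out.append(d52[i])
        if i < len(d53):
            out.append(d53[i])
    return bytes(out)

shot = b"ABCDEFG"
parts = divide_into_three(shot)     # three data units for 116A/116B/116C
restored = merge_three(*parts)      # round-trips to the original shot
```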
• The third code generation unit 116C converts the third data unit D53 given from the data dividing unit 114d into a corresponding code image (third code image D33). That is, the third code generation unit 116C converts the third data unit D53 into a graphic code. The third code generation unit 116C further converts the graphic code into a code image (third code image D33) corresponding to the graphic code. The third code generation unit 116C outputs the third code image D33.
• In the fourth embodiment, one shot of data D4 is divided into three channels. Therefore, the amount of data D4 in one shot can be made larger than that in the first embodiment.
• The combining unit 122d synthesizes the first code image D31 output from the first code generation unit 116A, the second code image D32 output from the second code generation unit 116B, and the third code image D33 output from the third code generation unit 116C by assigning each to an independent color channel. Then, the combining unit 122d generates a composite image D61 (color image). For example, the combining unit 122d performs the combining by assigning the first code image to the red (R) channel, the second code image to the blue (B) channel, and the third code image to the green (G) channel. As a result of such processing, a composite image D61 assigned to the red, blue, and green channels is obtained.
  • the contents of the code image D 3 are switched every one shot period T d (4 frame periods) as in the first embodiment.
• The light emitting unit 158 is disposed adjacent to the display screen 131 of the display unit 130, similarly to the light emitting unit 156 of the third embodiment. The light emitting unit 158 is arranged so that the imaging unit 202 of the receiving device 200d can capture it in the same field of view as the display screen 131 of the display unit 130.
  • the light emitting unit 158 is configured by an LED or the like, similar to the light emitting unit 156 of the third embodiment.
• The light emitting unit 158, in synchronization with the clock signal CL generated by the reference clock generation unit 104, emits light only during periods other than the periods immediately before and immediately after the timing at which the code image D3 in the composite video is switched.
  • a monitor lamp having another role in the transmission device 100d may be used as the light emitting unit 158.
  • the transmission device 100d may use a monitor lamp used to indicate power on / off as the light emitting unit 158.
• The light emitting unit 156 of the third embodiment emits light of a color different from the colors assigned to the first code image D31 and the second code image D32. That is, the light emitting unit 156 emits green light in the above example. However, the light emitting unit 158 of the fourth embodiment does not have this restriction on the color of the emitted light.
• The reading unit 112 outputs, each time it reads one shot of data D4, a signal (synchronization signal SY) indicating that fact.
  • the light emitting unit 158 receives the signal (synchronization signal SY) from the reading unit 112 and determines the timing of light emission.
  • a receiving device 200d shown in FIG. 20 is generally the same as the receiving device 200 of FIG. 5, but includes a light emission detection unit 262 instead of the mark recognition unit 210.
  • the receiving device 200d includes an image cutout unit 220d instead of the image cutout unit 220.
  • the receiving apparatus 200d includes a code decoding unit 230d instead of the code decoding unit 230.
  • the image cutout unit 220d is different from the image cutout unit 220 in that it includes a third cutout unit 222C in addition to the first cutout unit 222A and the second cutout unit 222B.
  • the code decoding unit 230d is different from the code decoding unit 230 in that it includes a third decoding unit 232C in addition to the first decoding unit 232A and the second decoding unit 232B.
• The imaging unit 202 captures, in the same field of view, the display screen 131 of the display unit 130 on which the video is displayed and the light emitting unit 158 of the transmission device 100d.
  • the imaging unit 202 acquires a video including light from the composite video D 63 and the light emitting unit 158 displayed on the display screen 131 by imaging.
  • data obtained by imaging by the imaging unit 202 is accumulated in the video buffer 204.
  • the reading unit 206 reads the data one frame at a time.
  • the frame data D 8 read by the reading unit 206 is supplied to the separation unit 208.
  • the frame data D 8 read by the reading unit 206 is also supplied to the light emission detection unit 262.
• The separation unit 208 separates the frame data D8 from the reading unit 206 into red component data D9R, blue component data D9B, and green component data D9G. These are supplied to the first cutout unit 222A, the second cutout unit 222B, and the third cutout unit 222C, respectively. That is, the red component data D9R is supplied to the first cutout unit 222A.
• The blue component data D9B is supplied to the second cutout unit 222B.
• The green component data D9G is supplied to the third cutout unit 222C.
• The light emission detection unit 262 sequentially receives the data D8 of each frame supplied from the reading unit 206. The light emission detection unit 262 analyzes the video composed of the sequentially received frame data D8. Then, the light emission detection unit 262 identifies the code image D3 included in the image of each frame and the position of the light from the light emitting unit 158 in the video.
  • the “light position” is an image portion corresponding to the light from the light emitting unit 158.
  • the light emission detection unit 262 identifies the position of the light from the light emitting unit 158 in the video from the positional relationship of the light from the light emitting unit 158 and the blinking timing of the light emitting unit 158 in the video.
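A toy sketch of identification by blinking timing: the region whose per-frame brightness follows the LED's known on/off pattern is taken to be the light emitting unit, while a steadily lit region is rejected. The pattern, threshold, and data layout are assumptions for illustration:

```python
EXPECTED_BLINK = [False, True, True, False]  # one shot: guard frames dark

def matches_blink(brightness_series, threshold=128):
    """Binarize a region's per-frame brightness and compare it with the
    LED's expected on/off pattern over one shot period."""
    return [b > threshold for b in brightness_series] == EXPECTED_BLINK

def find_led_region(candidates):
    """candidates maps a region id to its per-frame brightness series;
    return the first region whose series follows the blink pattern."""
    for region_id, series in candidates.items():
        if matches_blink(series):
            return region_id
    return None

# A steadily lit lamp ("a") is rejected; the blinking LED ("b") matches.
led = find_led_region({"a": [200, 200, 200, 200],
                       "b": [10, 220, 230, 15]})
```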
• The light emission detection unit 262 determines whether or not the light emitting unit 158 appears in the data D8 of each frame. When it is determined that the light emitting unit 158 appears in a light-emitting state (when light emission is detected), the light emission detection unit 262 determines that the code image in that frame can be correctly decoded. The light emission detection unit 262 transmits the determination result to the image cutout unit 220d. Then, the light emission detection unit 262 instructs the image cutout unit 220d to cut out the code image D3 (image cutout signal D10).
• The image cutout unit 220d cuts out the code image D3 in accordance with the instruction (image cutout signal D10).
  • the first cutout unit 222A cuts out the first code image D11R from the red component image of the frame in which the light emission is detected.
  • the second cutout unit 222B cuts out the second code image D11B from the blue component image of the frame in which the light emission is detected.
  • the third cutout unit 222C cuts out the third code image D11G from the green component image of the frame in which the light emission is detected.
• The code decoding unit 230d decodes the code images D11 contained in the images cut out by the image cutout unit 220d.
• By the decoding, the code decoding unit 230d obtains decoded data D12 in units of colors corresponding to the decoded code images D11.
  • the first decoding unit 232A and the second decoding unit 232B are the same as the first decoding unit 232A and the second decoding unit 232B shown in FIG.
• The third decoding unit 232C decodes the third code image D11G contained in the image cut out by the third cutout unit 222C. Then, the third decoding unit 232C generates decoded data D12C that has been decoded in units of colors.
• The synthesizing unit 242d receives the color-unit decoded data D12R, D12B, and D12C output from the first decoding unit 232A, the second decoding unit 232B, and the third decoding unit 232C. Then, the synthesizing unit 242d synthesizes these color-unit decoded data D12R, D12B, and D12C to restore the transfer data D13 for one shot. Except for the above, the fourth embodiment is the same as the first embodiment.
• Embodiment 5. In the above embodiments, the data D4 of each shot is divided into a plurality of data portions (data units D5), each of which is assigned to a different color channel, and code images D3 of different colors are generated and displayed.
• However, the number of color channels used to generate the code image D3 may be one.
  • FIG. 21 shows a transmitting apparatus 100e according to the fifth embodiment of the present invention.
  • FIG. 22 shows a receiving apparatus 200e of the fifth embodiment.
• The transmission device 100e of FIG. 21 is generally the same as the transmission device 100 of FIG. 2, but includes a data processing unit 110e instead of the data processing unit 110.
  • the transmission device 100e includes a combining unit 122e instead of the combining unit 122.
  • the receiving device 200e includes an image cutout unit 220e instead of the image cutout unit 220.
  • the receiving device 200e includes a code decoding unit 230e instead of the code decoding unit 230.
  • the receiving device 200e is not provided with the combining unit 242.
  • Constituent elements similar to those of the transmitting apparatus 100 described in the first embodiment are denoted by the same reference numerals and description thereof is omitted.
  • Constituent elements similar to those of the transmission device 100 are a transmission data storage unit 102, a reference clock generation unit 104, a mark generation unit 106, a video buffer 126, a moving image reproduction unit 128, and a display unit 130.
  • the image storage unit 107 of the mark generation unit 106 is a component similar to that of the first embodiment.
  • Constituent elements similar to those of the receiving apparatus 200 are an imaging unit 202, a video buffer 204, a reading unit 206, a mark recognition unit 210, a combining unit 242, and a received data storage unit 244.
• The data processing unit 110e includes a single code generation unit 116. Further, the data processing unit 110e is not provided with the data dividing unit 114. The data D4 of each shot from the reading unit 112e is supplied to the code generation unit 116 without being divided into a plurality of channels.
  • the code generator 116 is the same as the first code generator 116A or the second code generator 116B in FIG. That is, the code generation unit 116 has the same function as the first code generation unit 116A or the second code generation unit 116B in FIG.
• The combining unit 122e combines the timing mark D2 from the mark generation unit 106 and the code image D3 from the code generation unit 116 by assigning them to the green (G) channel and the red (R) channel, respectively.
• The output of the combining unit 122e (the composite image D61) is supplied to the video buffer 126. Except for the above, the operation of the transmission device 100e in FIG. 21 is the same as that of the transmission device 100 in FIG. 2.
• The separation unit 208e separates the frame data D8 read by the reading unit 206 into the red channel and the green channel.
  • the image cutout unit 220e has a single image cutout unit 222.
• The image cutout unit 222 cuts out a code image D11R from the red component value data D9R of the red channel from the separation unit 208e.
  • the code decoding unit 230e has a decoding unit 232.
• The decoding unit 232 decodes the code image D11R cut out by the image cutout unit 222.
• The output of the decoding unit 232 (decoded data D12R in color units) is supplied to the received data storage unit 244. Except for the above, the operation of the receiving device 200e in FIG. 22 is the same as that of the receiving device 200 in FIG. 5.
• Embodiment 6. In the first embodiment, the timing mark D2 and the code image D3 are transmitted using different color channels.
• However, there are cases where such a method is not suitable. For example, such a method cannot be used when either the display unit 130 or the imaging unit 202 is monochrome.
• Such a method also cannot be used in a system in which colors are separated using a rotary color filter and read sequentially for each color. Further, even when more color channels are available, there are cases where it is desirable to assign all the color channels to the code image D3.
  • FIGS. 23, 24A, 24B, and 24C are schematic diagrams illustrating temporal changes in video displayed on the display unit 130 of the transmission device 100.
• FIG. 24(A) represents the timing mark D2.
• FIG. 24(B) represents the code image D3.
• FIG. 24(C) represents the layout of the timing mark D2 and the code image D3 in each period.
  • the vertical axis indicates whether or not each signal is output. A signal is output at “H” and no signal is output at “L”.
  • the horizontal axis represents time.
• In a frame captured at the moment the code image D3 is switched, the code image D3 is sometimes incomplete. However, if the entire timing mark D2 is captured, the code image located inside the timing mark D2 can be regarded as having been read correctly. By using such a timing mark D2, the code images D3 that are continuously and sequentially displayed can be read automatically and efficiently.
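The completeness test for a frame-shaped timing mark can be sketched on a binary image: the mark is considered fully captured only if every pixel along the border rectangle is set. The image representation and rectangle convention are assumptions for illustration:

```python
def mark_is_complete(binary_image, rect):
    """rect = (y0, y1, x0, x1): bounding rectangle of the frame-shaped
    timing mark. The mark counts as fully captured only when every
    pixel along all four edges of the rectangle is set."""
    y0, y1, x0, x1 = rect
    top = all(binary_image[y0][x] for x in range(x0, x1 + 1))
    bottom = all(binary_image[y1][x] for x in range(x0, x1 + 1))
    left = all(binary_image[y][x0] for y in range(y0, y1 + 1))
    right = all(binary_image[y][x1] for y in range(y0, y1 + 1))
    return top and bottom and left and right

complete = [[1, 1, 1],
            [1, 0, 1],   # the code image would sit inside the border
            [1, 1, 1]]
broken = [[1, 0, 1],     # gap in the top edge: mark only partly captured
          [1, 0, 1],
          [1, 1, 1]]
ok = mark_is_complete(complete, (0, 2, 0, 2))
ng = mark_is_complete(broken, (0, 2, 0, 2))
```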
  • FIG. 25 shows a transmission device 100f according to the sixth embodiment.
  • FIG. 26 illustrates a receiving device 200f according to the sixth embodiment.
  • a transmission apparatus 100f shown in FIG. 25 is generally the same as the transmission apparatus 100e of FIG. 21, but includes a mark generation unit 106f instead of the mark generation unit 106.
  • the transmission device 100f includes a combining unit 122f instead of the combining unit 122e.
  • the receiving device 200f shown in FIG. 26 is generally the same as the receiving device 200e in FIG. 22, but includes a separating unit 208f instead of the separating unit 208e.
  • the receiving device 200f includes a mark recognition unit 210f instead of the mark recognition unit 210.
  • Constituent elements similar to those of the transmitting apparatus 100 described in the first embodiment are denoted by the same reference numerals and description thereof is omitted.
  • Constituent elements similar to those of the transmission device 100 are a transmission data storage unit 102, a reference clock generation unit 104, a video buffer 126, a moving image reproduction unit 128, and a display unit 130.
  • the image storage unit 107 of the mark generation unit 106f is the same component as that in the first embodiment.
  • the same components as those of the receiving apparatus 200 described in Embodiment 1 are denoted by the same reference numerals, and the description thereof is omitted.
  • the same components as the receiving apparatus 200 are an imaging unit 202, a video buffer 204, a reading unit 206, and a received data storage unit 244.
• The mark generation unit 106f generates, as a timing mark D2, a shape surrounding the code image D3 generated by the code generation unit 116. For example, as shown in FIG. 23, it generates a frame-shaped timing mark D2 formed so as to surround the code image D3.
• The combining unit 122f combines the frame-shaped timing mark D2 and the code image D3.
• In the sixth embodiment, the code image D3 and the timing mark D2 are not assigned to different color channels.
• The code image D3 and the timing mark D2 are represented by pixel values of the same color.
  • both are shown as representing the luminance value Y.
• In other respects, the operation of the transmission device 100f in FIG. 25 is the same as that of the transmission device 100e in FIG. 21.
• The separation unit 208f analyzes the composite image (frame data D8) and separates the code image D3 located at the center and the timing mark D2 located around it. The separation unit 208f separates the code image D3 and the timing mark D2 based on their shapes and relative positions. The separation unit 208f supplies the timing mark D2 generated by the separation to the mark recognition unit 210f (data D91Y). The separation unit 208f supplies the code image D3 to the cutout unit 222 (data D92Y). The image cutout unit 222 cuts out a code image D93Y from the code image D92Y from the separation unit 208f. The code decoding unit 232 decodes the code image D93Y cut out by the image cutout unit 222. The output (decoded data D94Y) of the code decoding unit 232 is supplied to the received data storage unit 244.
• The mark recognition unit 210f receives the data D91Y (timing mark D2) from the separation unit 208f. The mark recognition unit 210f confirms that the timing mark D2 supplied from the separation unit 208f is not missing. The mark recognition unit 210f determines that the code image D3 in the same frame that contained the complete timing mark D2 is suitable for decoding. The mark recognition unit 210f instructs the cutout unit 222 to cut out the code image D3 (image cutout signal D10). In other respects, the operation of the receiving device 200f in FIG. 26 is the same as that of the receiving device 200e in FIG. 22.
• Embodiment 7. FIG. 27 shows a configuration of a transmitting apparatus 100g according to the seventh embodiment.
  • FIG. 28 shows a configuration of receiving apparatus 200g according to the seventh embodiment.
• The transmission device 100g of FIG. 27 is generally the same as the transmission device 100 of FIG. 2, but includes a mark generation unit 106A and a mark generation unit 106B instead of the mark generation unit 106.
  • the transmission device 100g includes a combining unit 122g instead of the combining unit 122.
  • Constituent elements similar to those of the transmitting apparatus 100 described in the first embodiment are denoted by the same reference numerals and description thereof is omitted.
  • Constituent elements similar to those of the transmission device 100 include a transmission data storage unit 102, a reading unit 112, a data division unit 114, a first code generation unit 116A, a second code generation unit 116B, a reference clock generation unit 104, a video buffer 126, a moving image A playback unit 128 and a display unit 130.
  • the image storage units 107A and 107B of the mark generation units 106A and 106B are the same components as the image storage unit 107 of the first embodiment.
• In the first embodiment, the timing mark D2 is assigned to a color channel different from those of the code images D31 and D32.
• In the seventh embodiment, the timing mark D24 is assigned to the same color channel as the code image D31.
• Likewise, the timing mark D25 is assigned to the same color channel as the code image D32.
• In other respects, the seventh embodiment is the same as the sixth embodiment.
• The mark generation unit 106A generates, as a timing mark D24, a shape surrounding the code image D31 generated by the first code generation unit 116A. For example, when the code image D31 generated by the first code generation unit 116A is indicated by the code D3 (code image D3) in FIG. 23, the mark generation unit 106A generates a frame-shaped timing mark D24 formed so as to surround the code image D3.
• The mark generation unit 106B generates, as a timing mark D25, a shape surrounding the code image D32 generated by the second code generation unit 116B. For example, when the code image D32 generated by the second code generation unit 116B is indicated by the code D3 (code image D3) in FIG. 23, the mark generation unit 106B generates a frame-shaped timing mark D25 formed so as to surround the code image D3.
• The timing mark D24 generated by the mark generation unit 106A and the timing mark D25 generated by the mark generation unit 106B may have the same shape and size, or they may have different shapes or sizes.
• The combining unit 122g generates a first composite image D69 by combining the frame-shaped timing mark D24 from the mark generation unit 106A and the code image D31 from the first code generation unit 116A. This first composite image D69 is assigned to a first color channel. For example, the first composite image D69 is assigned to the red channel.
• The combining unit 122g generates a second composite image D70 by combining the frame-shaped timing mark D25 from the mark generation unit 106B and the code image D32 from the second code generation unit 116B. The second composite image D70 is assigned to a second color channel. For example, the second composite image D70 is assigned to the blue channel. Then, the combining unit 122g synthesizes the composite image D69 and the composite image D70, and outputs the resulting composite image D71 as a color image.
• The video buffer 126 stores the composite images D71 generated by the combining unit 122g. Then, the video buffer 126 links the accumulated composite images D71 to generate a composite video D73 (moving image).
• In other respects, the operation of the transmission device 100g in FIG. 27 is the same as that of the transmission device 100 in FIG. 2.
• The receiving device 200g shown in FIG. 28 is generally the same as the receiving device 200 in FIG. 5. However, the receiving device 200g includes a separation unit 208g instead of the separation unit 208.
  • the receiving device 200g includes mark recognition units 210A and 210B instead of the mark recognition unit 210.
  • the same components as those of the receiving apparatus 200 described in Embodiment 1 are denoted by the same reference numerals, and the description thereof is omitted.
  • Constituent elements similar to those of the receiving apparatus 200 are an imaging unit 202, a video buffer 204, a reading unit 206, an image cutout unit 220, a code decoding unit 230, a combining unit 242, and a received data storage unit 244.
  • the first cutout part 222A and the second cutout part 222B of the image cutout part 220 are the same constituent elements as those in the first embodiment.
  • the first decoding unit 232A and the second decoding unit 232B of the code decoding unit 230 are the same constituent elements as those in the first embodiment.
• The separation unit 208g separates the data D8 of each frame read by the reading unit 206 into red component value data D9R (red channel data) and blue component value data D9B (blue channel data).
• The separation unit 208g further analyzes the red channel data (red component value data D9R). Then, the separation unit 208g separates the code image D3 located at the center and the timing mark D2 located around it. The separation of the code image D3 and the timing mark D2 is performed based on their shapes and relative positions. The separation unit 208g supplies the timing mark D2 generated by the separation to the mark recognition unit 210A (color component value data D91R). The separation unit 208g supplies the code image D3 generated by the separation to the first cutout unit 222A (color component value data D92R).
  • The separation unit 208g likewise analyzes the blue channel data (blue component value data D 9B ), separating the code image D 3 located at the center from the timing mark D 2 positioned around it. The code image D 3 and the timing mark D 2 are separated on the basis of their shapes and relative positions. The separation unit 208g supplies the timing mark D 2 obtained by the separation (color component value data D 91B ) to the mark recognition unit 210B, and supplies the code image D 3 obtained by the separation (color component value data D 92B ) to the second cutout unit 222B.
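As an illustrative sketch (not taken from the patent itself), the channel separation performed by the separation unit 208g, and the position-based split between the central code image and the surrounding timing-mark region, might look as follows. The frame layout, the border width, and all function names here are assumptions for illustration only.

```python
# Hypothetical sketch of separation unit 208g: split an RGB frame into
# red/blue channel planes, then split each plane into the central code
# region and the surrounding timing-mark region by relative position.

def separate_channels(frame):
    """frame: H x W list of (R, G, B) pixels -> (red plane, blue plane),
    corresponding to data D 9R and D 9B."""
    red = [[px[0] for px in row] for row in frame]
    blue = [[px[2] for px in row] for row in frame]
    return red, blue

def split_code_and_mark(plane, border):
    """Return (code region, mark region): the outer `border` rows are
    treated as the timing-mark area, the interior as the code image.
    Left/right mark strips are omitted for brevity in this sketch."""
    h, w = len(plane), len(plane[0])
    code = [row[border:w - border] for row in plane[border:h - border]]
    mark = plane[:border] + plane[h - border:]  # top and bottom strips
    return code, mark
```

In this sketch the code region would be forwarded to a cutout unit and the mark region to a mark recognition unit, mirroring the data paths described above.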
  • Upon receiving the color component value data D 91R (timing mark D 2 ) from the separation unit 208g, the mark recognition unit 210A confirms that the timing mark D 2 is not partially missing. When no part of the timing mark D 2 is missing, the mark recognition unit 210A instructs the first cutout unit 222A to cut out the image (image cutout signal D 10R ).
  • Likewise, the mark recognition unit 210B confirms that the timing mark D 2 supplied from the separation unit 208g is not partially missing. When no part of the timing mark D 2 is missing, the mark recognition unit 210B instructs the second cutout unit 222B to cut out the image (image cutout signal D 10B ).
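A minimal sketch of the "not missing" check performed by the mark recognition units 210A/210B could be written as below. The binary row-profile representation and the intensity threshold are assumptions for illustration, not details from the patent.

```python
def mark_is_complete(mark_rows, threshold=128):
    """Return True only if every row of the captured timing-mark region
    contains at least one pixel at or above `threshold`, i.e. no part
    of the mark was lost to scan/capture timing."""
    return bool(mark_rows) and all(
        any(px >= threshold for px in row) for row in mark_rows
    )
```

Only when a check of this kind passes would the cutout instruction (image cutout signal D 10R or D 10B ) be issued for the corresponding frame.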
  • The first cutout unit 222A receives the image cutout signal D 10R for the red component from the mark recognition unit 210A.
  • At the instructed timing, the first cutout unit 222A cuts out the first code image from the red component data (video) D 92R supplied from the separation unit 208g.
  • The second cutout unit 222B receives the image cutout signal D 10B for the blue component from the mark recognition unit 210B.
  • At the instructed timing, the second cutout unit 222B cuts out the second code image from the blue component data (video) D 92B supplied from the separation unit 208g.
  • the operation of the receiving device 200g in FIG. 28 is the same as that of the receiving device 200 in FIG.
  • a separate mark generator (106A, 106B) is provided for each color channel.
  • Alternatively, a single mark generation unit may be provided for the plurality of color channels, and the timing mark D 2 generated by that single mark generation unit may be added to each of the plurality of code images D 3 .
  • The transmission device 100g may also transmit the timing mark D 2 added to the code image D 3 of only one of the plurality of color channels. The receiving device 200g can then detect the timing mark D 2 attached to the code image D 3 of that one color channel and use it for cutting out the code images D 3 of the other color channels as well.
  • When a timing mark D 2 is generated for each color channel, communication using the plurality of color channels need not necessarily be performed in synchronization. The switching timing of the transmitted code images D 3 may differ between the color channels.
  • FIG. 29 shows an example for enabling such use.
  • FIG. 29 is a diagram showing an example of a sequence of QR codes marked with different kinds of timing marks D 2 .
  • The first mark is a timing mark D 2h serving as a start mark, used at the start of the transfer of a series of data (data file D 0 ). That is, it is an identification image representing the head portion of the series of data (data file D 0 ) to be transferred.
  • the second mark is a timing mark D 2m used in the middle part.
  • The third mark is a timing mark D 2t serving as an end mark, used at the end of the transfer of the series of data. That is, it is an identification image representing the end portion of the transferred series of data (data file D 0 ).
  • the transmission device 100 divides the data file D 0 stored in the transmission data storage unit 102 and repeatedly displays it.
  • Combining unit 242 of the receiving apparatus 200 combines the divided data to reconstruct the original data file D 0.
  • To do so, the start point and end point of the data file D 0 are required. By combining, out of the repeatedly displayed divided data, the data from the start point to the end point, the original data file D 0 can be reproduced.
  • The “divided data” are represented as QR codes, as described in the first embodiment. With this scheme, the user's operation can be simplified.
  • Suppose, for example, that the time (transfer time) required to transfer a series of data is 2 seconds.
  • A time at least twice the 2-second transfer time is determined in advance.
  • For example, a time of 5 seconds is determined in advance.
  • the "transfer time” is that time that the display unit 130 all of the code image D 3 representing the data file D 0 is displayed.
  • the transmission device 100h is made to repeat the transfer of a series of data (data file D 0 ). Then, a message “Please shoot for 5 seconds” is displayed to the user.
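The arithmetic behind the predetermined shooting time can be sketched as below. The exact margin is an assumption; the text only states that 5 seconds, which is at least twice the 2-second transfer time, is determined in advance.

```python
import math

def required_capture_seconds(transfer_time, margin=1.0):
    """Recording for at least twice the transfer time (plus a small
    margin, an assumed parameter) guarantees that one complete
    head-to-tail cycle of the repeatedly displayed code images falls
    entirely inside the capture window."""
    return math.ceil(2 * transfer_time + margin)
```

With `transfer_time=2.0` this returns 5, matching the “Please shoot for 5 seconds” message above.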
  • the receiving apparatus 200 starts photographing by a user's operation
  • From the repeatedly displayed video, the portion from the start mark to the end mark is stored as a series of data. Decoding is started after the series of data has been stored in the receiving apparatus 200.
  • storage can be started immediately in response to user operation.
  • Alternatively, the receiving apparatus 200 may end the storage when the end mark is detected after the start mark has been detected, and then decode the stored data from the start mark to the end mark.
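The start-mark/end-mark storage control described above can be sketched as a small loop over (mark, payload) frames. The string labels "head", "mid", and "tail" stand in for the actual timing-mark images D 2h , D 2m , and D 2t of FIG. 29 and are assumptions for illustration.

```python
def collect_one_cycle(frames):
    """frames: iterable of (mark, payload) tuples, with mark in
    {"head", "mid", "tail"}. Store payloads from a head mark through
    the next tail mark and return the reassembled bytes, or None if
    no complete head-to-tail cycle was observed."""
    parts = None
    for mark, payload in frames:
        if mark == "head":
            parts = [payload]            # storage start (cf. signal D 14h)
        elif parts is not None:
            parts.append(payload)
            if mark == "tail":           # storage end (cf. signal D 14t)
                return b"".join(parts)
    return None
```

Because the divided data repeat, it does not matter where in the cycle capture begins: collection simply waits for the next head mark.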
  • In addition to the timing marks D 2 indicating the start and end of each individual data item, timing marks D 2 indicating the start and end of the whole series may be used.
  • It is desirable to use a timing mark D 2 that, in the sub-scanning direction, begins before the start point of the coding region and ends after the end point of the coding region.
  • It is also desirable to use a timing mark D 2 that, in the scanning direction (the direction perpendicular to the lines), begins before the start point of the coding region and ends after the end point of the coding region. The timing mark D 2 may be drawn in accordance with the scanning mode of the display unit 130.
  • The timing mark D 2 shown in FIG. 29 is drawn as a figure that starts above the coding region and ends below it. In other words, the timing mark D 2 is a figure drawn from above the coding region to below the coding region.
  • “drawing” means drawing a picture.
  • FIG. 30 is a block diagram schematically showing the configuration of the transmission device 100h.
  • FIG. 31 is a block diagram schematically showing the configuration of the receiving device 200h.
  • FIG. 30 shows an example of the transmission device 100h, which generates different timing marks D 2 for the head portion, the end portion, and the other portions of a series of data and combines them with the code images D 3 .
  • FIG. 31 shows an example of the receiving device 200h, which receives the video generated by concatenating the code images D 3 to which these different timing marks D 2 have been added, and controls recording accordingly.
  • The transmission device 100h is generally the same as the transmission device 100 of FIG. 2, but includes a data processing unit 110h instead of the data processing unit 110.
  • The data processing unit 110h is generally the same as the data processing unit 110 in FIG. 2, but includes a reading unit 112h instead of the reading unit 112.
  • the data processing unit 110h includes a mark generation unit 106h instead of the mark generation unit 106.
  • Constituent elements similar to those of the transmitting apparatus 100 described in the first embodiment are denoted by the same reference numerals and description thereof is omitted.
  • Constituent elements similar to those of the transmission device 100 are a transmission data storage unit 102, a reference clock generation unit 104, a synthesis unit 122, a video buffer 126, a moving image reproduction unit 128, and a display unit 130.
  • the data dividing unit 114, the first code generation unit 116A, and the second code generation unit 116B of the data processing unit 110h are the same constituent elements as those in the first embodiment.
  • the image storage unit 107 of the mark generation unit 106 is a component similar to that of the first embodiment.
  • The reading unit 112h supplies a signal (synchronization signal SY h ) to the mark generation unit 106h indicating when it reads the head portion of the series of data (data file D 0 ) from the transmission data storage unit 102. Similarly, the reading unit 112h supplies a signal (synchronization signal SY t ) to the mark generation unit 106h indicating when it reads the end portion of the series of data (data file D 0 ) from the transmission data storage unit 102.
  • The mark generation unit 106h generates different timing marks D 2 for the head portion, the tail portion, and the other portions of the series of data (data file D 0 ).
  • the mark generating unit 106h generates a first timing mark D 2h corresponding to the beginning of the data portion of the series of data (data file D 0).
  • The first timing mark D 2h is combined with the first code image D 31 and the second code image D 32 at the head of the series of data (data file D 0 ).
  • An example of the first timing mark D 2h is indicated by the symbol D 2h in FIG. 29.
  • Mark generating unit 106h generates a second timing mark D 2t corresponding to the end of the data part of the series of data (data file D 0).
  • the second timing mark D 2t is combined with the first code image D 31 and the second code image D 32 at the end of the series of data (data file D 0 ).
  • the second timing mark D 2t is a mark different from the first timing mark D 2h .
  • An example of the second timing mark D 2t is indicated by the symbol D 2t in FIG. 29.
  • The mark generation unit 106h generates a third timing mark D 2m corresponding to the data portions other than the head and the end of the series of data (data file D 0 ).
  • The third timing mark D 2m is combined with the first code images D 31 other than at the head and the end of the series of data (data file D 0 ).
  • The third timing mark D 2m is likewise combined with the second code images D 32 other than at the head and the end of the series of data (data file D 0 ).
  • the third timing mark D 2m is a mark different from the first timing mark D 2h .
  • the third timing mark D2m is a mark different from the second timing mark D2t .
  • An example of the third timing mark D 2m is indicated by a symbol D 2m in FIG.
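The mark generation unit 106h's choice among the three timing marks can be summarized as a simple function of chunk position. The string labels here stand in for the actual mark images D 2h , D 2m , and D 2t and are assumptions for illustration.

```python
def timing_mark_for(index, total):
    """Select the timing-mark variant for the code image at position
    `index` out of `total` divided data parts: the head gets D2h, the
    end gets D2t, and everything in between gets D2m. (In this sketch
    a single-chunk file would get the head mark.)"""
    if total < 1 or not 0 <= index < total:
        raise ValueError("index out of range")
    if index == 0:
        return "D2h"
    if index == total - 1:
        return "D2t"
    return "D2m"
```

The selected mark would then be combined with the first and second code images D 31 and D 32 of the corresponding chunk, as described above.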
  • The receiving device 200h of FIG. 31 is generally the same as the receiving apparatus 200 of FIG. 5, but includes a mark recognition unit 210h instead of the mark recognition unit 210.
  • Constituent elements similar to those of receiving apparatus 200 described in Embodiment 1 are assigned the same reference numerals, and descriptions thereof are omitted.
  • the same components as the receiving apparatus 200 are an imaging unit 202, a reading unit 206, a separation unit 208, an image clipping unit 220, a code decoding unit 230, a combining unit 242, and a received data storage unit 244.
  • the first cutout part 222A and the second cutout part 222B of the image cutout part 220 are the same constituent elements as those in the first embodiment.
  • first decoding unit 232A and the second decoding unit 232B of the code decoding unit 230 are the same constituent elements as those in the first embodiment.
  • Mark recognition unit 210h receives the video data D 7 which are sequentially output from the imaging unit 202.
  • Upon detecting the first timing mark D 2h indicating the head, the mark recognition unit 210h starts storage in the video buffer 204 (storage start signal D 14h ).
  • Upon detecting the second timing mark D 2t indicating the end, the mark recognition unit 210h ends the storage in the video buffer 204 (storage end signal D 14t ).
  • the video buffer 204 stores a series of data (data file D 0 ) from the beginning to the end.
  • The storage start signal D 14h and the storage end signal D 14t are collectively called the storage timing signal D 14 .
  • Although the second embodiment has been described as a modification of the first embodiment, similar modifications can be made to the third to eighth embodiments.
  • the same modifications can be applied to the second to eighth embodiments.
  • It was stated above that, when the display unit 130 uses the dot sequential scanning method, it is desirable to use a timing mark D 2 that, in the sub-scanning direction, begins before the start point of the coding region and ends after the end point of the coding region.
  • It was also stated that, when the display unit 130 uses the line sequential scanning method, it is desirable to use a timing mark D 2 that, in the scanning direction (the direction perpendicular to the lines), begins before the start point of the coding region and ends after the end point of the coding region. The same applies to the first, second, fifth, and seventh embodiments.
  • The display unit 130 of the transmission devices 100, 100a, 100c, 100d, 100e, 100f, 100g, and 100h has been described as a liquid crystal display device that performs display by the dot sequential scanning method or the line sequential scanning method, with the progressive display method. However, the present invention is also applicable when the display unit 130 uses the frame sequential scanning method, or when the interlaced display method is used.
  • Each of the transmission devices 100, 100a, 100c, 100d, 100e, 100f, 100g, and 100h and the reception devices 200, 200c, 200d, 200e, 200f, 200g, and 200h can be partially realized by software. That is, each of them can be realized by a programmed computer.
  • A program for causing a computer to execute the processing performed by parts of the transmission devices 100, 100a, 100c, 100d, 100e, 100f, 100g, 100h and the reception devices 200, 200c, 200d, 200e, 200f, 200g, 200h also forms part of the present invention.
  • a computer-readable recording medium that records such a program also forms part of the present invention.

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Optical Communication System (AREA)

Abstract

The invention relates to a system for transferring data by having a receiving device receive code images displayed by a transmitting device. At the transmitting device, a series of data is divided into multiple data parts, and each data part obtained by the division is converted into a code image and output. Identification images are added at a timing other than the code-image switching timing. Strings of the generated code images are concatenated and transmitted as a moving picture. At the receiving device, images obtained by image capture are accumulated, the accumulated images are read in sequence, and the code images that appear simultaneously with the identification images are decoded. Transfer failures due to inappropriate image-switching timing and inappropriate image-capture timing can thereby be reduced.
PCT/JP2015/051642 2014-01-31 2015-01-22 Transmission device, reception device, and information processing system WO2015115294A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015559897A JP5976240B2 (ja) 2014-01-31 2015-01-22 送信装置、受信装置及び情報処理システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-017559 2014-01-31
JP2014017559 2014-01-31

Publications (1)

Publication Number Publication Date
WO2015115294A1 true WO2015115294A1 (fr) 2015-08-06

Family

ID=53756870

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/051642 WO2015115294A1 (fr) 2014-01-31 2015-01-22 Dispositif de transmission, dispositif de réception et système de traitement d'informations

Country Status (2)

Country Link
JP (1) JP5976240B2 (fr)
WO (1) WO2015115294A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210122541A (ko) * 2020-04-01 2021-10-12 삼성전자주식회사 데이터의 제공 방법 및 이를 지원하는 전자 장치

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005309756A (ja) * 2004-04-21 2005-11-04 Mitsubishi Electric Corp 情報伝達システム及び情報変換装置及び情報復元装置及び表示装置
JP2006067085A (ja) * 2004-08-25 2006-03-09 Nec Viewtechnology Ltd 空間光通信システム用の送信装置および受信装置
JP2007156969A (ja) * 2005-12-07 2007-06-21 Sharp Corp 情報転送システム及び情報転送方法
JP2010117871A (ja) * 2008-11-13 2010-05-27 Sony Ericsson Mobile Communications Ab パターン画像の読み取り方法、パターン画像の読み取り装置、情報処理方法およびパターン画像の読み取りプログラム
JP2011119820A (ja) * 2009-12-01 2011-06-16 Konica Minolta Business Technologies Inc 画像形成装置及び画像読取装置
JP2013045293A (ja) * 2011-08-24 2013-03-04 Fujitsu Mobile Communications Ltd データ伝送方法及び端末装置及びプログラム


Also Published As

Publication number Publication date
JP5976240B2 (ja) 2016-08-23
JPWO2015115294A1 (ja) 2017-03-23

Similar Documents

Publication Publication Date Title
US7679616B2 (en) Image data generation apparatus for adding attribute information regarding image pickup conditions to image data, image data reproduction apparatus for reproducing image data according to added attribute information, and image data recording medium related thereto
JP3746506B2 (ja) 立体視化パラメータ埋込装置及び立体視画像再生装置
JP4877852B2 (ja) 画像符号化装置、および画像送信装置
US10313009B2 (en) Coded light
US10075236B2 (en) Coded light
JP2004357156A (ja) 映像受信装置および映像再生装置
CN102771109A (zh) 通过盖写视频数据进行视频传递和控制
JP2005519534A (ja) ダイナミックレンジビデオ記録および再生システムならびに方法
JP6206559B2 (ja) 復号装置、復号方法、プログラム、および記録媒体
JP2010245618A (ja) デジタルテレビ放送受信装置及びその制御方法
JP5156196B2 (ja) 撮像装置
JP5976240B2 (ja) 送信装置、受信装置及び情報処理システム
JP2006295835A (ja) 画像伝送システム
WO2011134373A1 (fr) Procédé, dispositif et système pour la transmission synchrone de vidéos multicanaux
JP5808485B2 (ja) 移動端末の録画方法、関連装置及びシステム
JP2007201862A (ja) 通信端末装置
JP7209195B2 (ja) 表示装置および表示方法
JP2006054550A (ja) 伝送システム
EP1463333A1 (fr) Appareil et méthode de traitement de signaux d'image vidéo et audio
JP3938019B2 (ja) 記録装置及び記録方法
JP6147161B2 (ja) 撮像装置、情報表示装置及び情報処理システム
JP2008211491A (ja) 字幕付き映像表示システム、サーバ装置及び端末装置
JP2007174207A (ja) 動画像処理装置
JP2006129463A (ja) 立体視化パラメータ記憶媒体
EP3249912A1 (fr) Procédé et dispositif pour émettre et recevoir un signal de diffusion pour restaurer un signal abaissé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15743709

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015559897

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15743709

Country of ref document: EP

Kind code of ref document: A1