CN104346613A - Image processing apparatus and image processing method

Image processing apparatus and image processing method

Info

Publication number
CN104346613A
Authority
CN
China
Legal status
Granted
Application number
CN201410387805.6A
Other languages
Chinese (zh)
Other versions
CN104346613B (en)
Inventor
宫本直知
Current Assignee
Casio Computer Co Ltd
Original Assignee
Casio Computer Co Ltd
Priority claimed from JP2013164741A external-priority patent/JP5858012B2/en
Priority claimed from JP2013164747A external-priority patent/JP5862623B2/en
Application filed by Casio Computer Co Ltd
Publication of CN104346613A
Application granted
Publication of CN104346613B
Status
Active


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 - Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K7/10712 - Fixed beam scanning
    • G06K7/10722 - Photodetector array or CCD scanning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 - Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 - Methods for optical code recognition
    • G06K7/1439 - Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1443 - Methods for optical code recognition including a method step for retrieval of the optical code locating of the code in an image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 - Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 - Methods for optical code recognition
    • G06K7/1439 - Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1452 - Methods for optical code recognition including a method step for retrieval of the optical code detecting bar code edges
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules

Abstract

The image processing apparatus of the present invention includes: an acquiring unit that acquires a print image obtained by imaging an imprint whose frame portion carries added code information; a determining unit that determines, within the print image acquired by the acquiring unit, the frame image region corresponding to the frame portion of the imprint as the reading region from which the code information is to be read; and a reading unit that reads the code information from the reading region determined by the determining unit.

Description

Image processing apparatus and image processing method
Technical field
The present invention relates to an image processing apparatus, an image processing method, and a program.
Background art
A device is known (US 8,186,594 B2) that generates code information in which predetermined information is encoded as a regular arrangement of pixel sets.
The code information is formed on a recording medium such as paper and is captured by an imaging device such as a mobile phone or smartphone. The imaging device applies predetermined decoding processing to the captured image of the code information to obtain the original predetermined information that the code information represents.
Summary of the invention
According to the present invention, a frame part can be suitably detected from an image obtained by capturing an imprint.
An embodiment of the present invention is an image processing apparatus comprising: an acquisition unit that acquires a print image obtained by capturing an imprint whose frame part has code information added to it; a determination unit that determines the frame image region corresponding to the frame part of the imprint, within the print image acquired by the acquisition unit, as the reading region from which the code information is read; and a reading unit that reads the code information from the reading region determined by the determination unit.
Another embodiment of the present invention is an image processing method using an image processing apparatus, the method comprising: an acquisition step of acquiring a print image obtained by capturing an imprint whose frame part has code information added to it; a determination step of determining the frame image region corresponding to the frame part of the imprint in the acquired print image as the reading region from which the code information is read; and a reading step of reading the code information from the determined reading region.
Brief description of the drawings
Fig. 1 is a block diagram showing the schematic configuration of a mobile terminal to which a first embodiment of the present invention is applied.
Fig. 2 is a flowchart showing an example of the operation of the code reading process performed by the mobile terminal of Fig. 1.
Fig. 3 is a diagram for explaining the code reading process of Fig. 2.
Fig. 4 is a diagram schematically showing an example of an image used in the code reading process of Fig. 2.
Fig. 5 is a diagram schematically showing an example of an image used in the code reading process of Fig. 2.
Fig. 6 is a diagram schematically showing an example of an image used in the code reading process of Fig. 2.
Fig. 7 is a diagram schematically showing an example of an image used in the code reading process of Fig. 2.
Fig. 8 is a block diagram showing the schematic configuration of a mobile terminal to which a second embodiment of the present invention is applied.
Fig. 9 is a flowchart showing an example of the operation of the code reading process performed by the mobile terminal of Fig. 8.
Fig. 10 is a diagram schematically showing an example of an image used in the code reading process of Fig. 9.
Fig. 11 is a diagram schematically showing an example of an image used in the code reading process of Fig. 9.
Fig. 12 is a block diagram showing the schematic configuration of a mobile terminal to which a third embodiment of the present invention is applied.
Fig. 13 is a flowchart showing an example of the operation of the code reading process performed by the mobile terminal of Fig. 12.
Fig. 14 is a diagram for explaining the code reading process of Fig. 13.
Fig. 15 is a diagram schematically showing an example of an image used in the code reading process of Fig. 13.
Fig. 16 is a diagram schematically showing an example of an image used in the code reading process of Fig. 13.
Embodiments
Specific embodiments of the present invention are described below with reference to the drawings. The scope of the invention, however, is not limited to the illustrated examples.
Fig. 1 is a block diagram showing the schematic configuration of a mobile terminal 100 to which the first embodiment of the present invention is applied.
As shown in Fig. 1, the mobile terminal 100 includes a central control unit 1, a memory 2, an imaging unit 3, an imaging control unit 4, an image data generating unit 5, a code processing unit 6, an action processing unit 7, a display unit 8, a display control unit 9, a voice communication unit 10, a communication control unit 11, an operation input unit 12, and so on.
The central control unit 1, memory 2, imaging unit 3, imaging control unit 4, image data generating unit 5, code processing unit 6, action processing unit 7, display control unit 9, voice communication unit 10, and communication control unit 11 are connected via a bus 13.
The mobile terminal 100 is constituted, for example, by an imaging device, by a mobile station used on a mobile communication network such as a mobile phone or PHS (Personal Handyphone System), or by a PDA (Personal Digital Assistant).
The central control unit 1 controls each unit of the mobile terminal 100. Specifically, the central control unit 1 includes a CPU (central processing unit; not shown) that controls each unit of the mobile terminal 100, and performs various control operations in accordance with various processing programs (not shown) for the mobile terminal 100.
The memory 2 is composed of, for example, DRAM (dynamic random access memory). The memory 2 includes a buffer memory that temporarily stores data processed by the central control unit 1, the code processing unit 6 and the like, a working memory for the central control unit 1 and the like, and a program memory that stores various programs and data related to the functions of the mobile terminal 100 (all not shown).
The imaging unit 3 captures an imprint Si (see Fig. 3A) stamped on a recording medium P by a seal S.
The seal S has a polygonal (for example, square) frame formed around a predetermined design to be left on the recording medium P as the imprint Si, and code information Sc, obtained by encoding predetermined information as a regular arrangement of pixel sets, is added to this frame so that it appears on the frame part Sw of the imprint Si.
The imprint Si is left on the recording medium P by stamping the seal S onto the recording medium P, and the frame part Sw, corresponding to the polygonal frame formed on the stamp face, remains around the design image Sp.
Multiple pieces of the code information Sc, each obtained by encoding the predetermined information as a regular arrangement of pixel sets, are added to the frame part Sw. That is, the code information Sc is added to at least two of the sides Sa of the frame part Sw, which are of roughly equal length. Specifically, the same code information Sc is added to each of the four sides Sa of the roughly square frame part Sw, oriented so as to be line-symmetric about the diagonals. In other words, the code information Sc is embedded in the imprint Si multiple times. In addition, a marker Sm of a predetermined shape (for example, a square) for detecting a vertex C is added to each of the four corners of the square frame part Sw.
For example, the code information Sc is added in a predetermined direction, starting from a position separated by a predetermined interval from the marker Sm at a predetermined position, at approximately the center of each side Sa of the frame part Sw in the width direction, along the extension direction of that side Sa (the direction substantially perpendicular to the width direction).
Here, the code information Sc is obtained by encoding the original predetermined information (for example, a URL) according to a predetermined coding scheme (for example, a Reed-Solomon code or a Golay code). The code information Sc is, for example, a regular arrangement, at a predetermined size, of sets of white pixels with pixel value "1" and sets of black pixels with pixel value "0".
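As an illustration of this kind of coding, the sketch below turns a URL into a row of black and white cells protected by Reed-Solomon parity. It is only a rough analogue of the scheme described above, not the patent's actual format: the reedsolo package, the one-bit-per-cell layout, and the parameter values are all assumptions.

```python
# Hypothetical sketch: encode a URL into a sequence of black/white cells with
# Reed-Solomon error correction. Not the patent's actual encoding format.
import numpy as np
from reedsolo import RSCodec

def encode_to_cells(text: str, nsym: int = 10) -> np.ndarray:
    rs = RSCodec(nsym)                                 # nsym parity bytes
    payload = rs.encode(text.encode("utf-8"))          # data + parity
    bits = np.unpackbits(np.frombuffer(bytes(payload), dtype=np.uint8))
    return bits.astype(np.uint8)                       # 1 = white cell, 0 = black cell

cells = encode_to_cells("http://example.com")          # hypothetical URL
print(cells[:16])
```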
In the present embodiment, it is assumed that when the seal S is stamped onto the recording medium P, the stamping force is not applied uniformly over the entire stamp face. As a result, part of the frame part Sw of the imprint Si (for example, the lower part in Fig. 4A) spreads, and part of the imprint Si (for example, the upper-left part in Fig. 4A) has blank voids.
The imaging unit 3 includes a lens unit 3a and an electronic imaging unit 3b.
The lens unit 3a is composed of multiple lenses such as a zoom lens and a focus lens.
The electronic imaging unit 3b is composed of an image sensor such as a CCD (charge-coupled device) or CMOS (complementary metal-oxide semiconductor) sensor, and converts the optical image that has passed through the lenses of the lens unit 3a into a two-dimensional image signal.
Although not shown, the imaging unit 3 may also include an aperture that adjusts the amount of light passing through the lens unit 3a.
The imaging control unit 4 controls the imaging of a subject by the imaging unit 3. That is, the imaging control unit 4 includes a timing generator, a driver, and the like (not shown). The imaging control unit 4 drives the electronic imaging unit 3b to scan by means of the timing generator and the driver, causes the electronic imaging unit 3b to convert the optical image into a two-dimensional image signal at each predetermined period, reads frame images one screen at a time from the imaging region of the electronic imaging unit 3b, and outputs them to the image data generating unit 5.
The imaging control unit 4 also performs adjustment control of imaging conditions for the subject, such as AF (automatic focus processing), AE (automatic exposure processing), and AWB (automatic white balance).
The image data generating unit 5 appropriately adjusts the gain of each RGB color component of the analog-value signal of the frame image transferred from the electronic imaging unit 3b, samples and holds it with a sample-and-hold circuit (not shown), converts it into digital data with an A/D converter (not shown), performs color processing including pixel interpolation and gamma correction with a color processing circuit (not shown), and then generates a digital luminance signal Y and color-difference signals Cb and Cr (YUV data).
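For reference, the luminance/color-difference conversion mentioned here corresponds to a standard RGB-to-YCbCr transform. The sketch below uses BT.601 coefficients as an assumption; the patent does not name a particular standard.

```python
# Sketch of the RGB -> Y, Cb, Cr conversion (BT.601 coefficients assumed).
import numpy as np

def rgb_to_ycbcr(rgb: np.ndarray) -> np.ndarray:
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b           # luminance signal Y
    cb = -0.169 * r - 0.331 * g + 0.500 * b + 128.0   # color difference Cb
    cr =  0.500 * r - 0.419 * g - 0.081 * b + 128.0   # color difference Cr
    return np.stack([y, cb, cr], axis=-1)
```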
The image data generating unit 5 then sequentially outputs the generated YUV data of each frame image to the memory 2, where it is stored.
The code processing unit 6 includes an image acquisition unit 6a, a binarization unit 6b, a line estimation unit 6c, a frame detection unit 6d, and an information reading unit 6e.
Each unit of the code processing unit 6 is composed of, for example, predetermined logic circuits, but this configuration is an example and is not limiting.
The image acquisition unit 6a sequentially acquires captured images Ia (see Fig. 4) obtained by imaging the imprint Si stamped on the recording medium P.
That is, the image acquisition unit 6a acquires a captured image (print image) Ia obtained by imaging the imprint Si, in which code information Sc is added to the frame part Sw of predetermined width surrounding the predetermined design image Sp. Specifically, the image acquisition unit 6a acquires from the memory 2 a copy of the image data, at a predetermined resolution, of the captured image Ia that the image data generating unit 5 generated from the imaging of the imprint Si by the imaging unit 3.
The binarization unit 6b generates a first binary image Ib (see Fig. 4B).
That is, the binarization unit 6b applies binarization processing (for example, adaptive binarization) with a predetermined threshold to the luminance component Y of the image data (YUV data) of the captured image Ia acquired by the image acquisition unit 6a, and thereby generates the image data of the first binary image Ib.
The binarization unit 6b also generates a second binary image Id (see Fig. 6C). That is, the binarization unit 6b applies binarization processing (for example, adaptive binarization) with a predetermined threshold to the luminance component Y of the image data (YUV data) of the projective-transformed image Ic generated by the projective transformation unit d2 of the frame detection unit 6d, and thereby generates the image data of the second binary image Id.
The binarization processing itself is a known technique, so a detailed description is omitted here.
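A minimal sketch of this step, using OpenCV's adaptiveThreshold as an assumed stand-in for the adaptive binarization referred to above; the window size and offset constant are illustrative.

```python
# Sketch: adaptive binarization of the luminance component Y.
import cv2

def binarize(luma_y):
    # luma_y: single-channel 8-bit luminance image
    # 31 = local window size, 10 = offset subtracted from the local mean
    return cv2.adaptiveThreshold(luma_y, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                 cv2.THRESH_BINARY, 31, 10)
```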
The line estimation unit 6c estimates, in the captured image Ia acquired by the image acquisition unit 6a, a predetermined number of straight lines L, corresponding to the number of corners of the polygonal frame, that form the outer contour of the frame part Sw corresponding to the polygonal frame of the seal S. Specifically, the line estimation unit 6c includes a contour determination unit c1 and a line determination unit c2.
The contour determination unit c1 determines a polygonal convex hull region A1 corresponding to the outer contour of the frame part Sw of the imprint Si.
That is, the contour determination unit c1 determines the polygonal convex hull region A1 (see Fig. 4C) corresponding to the outer contour of the frame part Sw in the captured image Ia acquired by the image acquisition unit 6a.
Specifically, the contour determination unit c1 acquires, for example, the image data of the first binary image Ib corresponding to the captured image Ia generated by the binarization unit 6b, and applies convex hull processing to it: for each set of black pixels with pixel value "0" present within a predetermined range, it computes the line segments connecting the pixels that form the outermost contour. The black pixels with pixel value "0" within the predetermined range are thereby enclosed by these line segments, and the polygonal region formed by the segments becomes a convex hull region A having no concavities. At this time, the contour determination unit c1 varies the range to be processed within the binary image and thereby forms multiple convex hull regions A.
Then, among the multiple convex hull regions A so formed, the contour determination unit c1 determines the convex hull region A with the largest area to be the polygonal (for example, hexagonal) convex hull region A1 corresponding to the outer contour of the frame part Sw of the imprint Si.
The content of the convex hull processing described above is an example and is not limiting; it may be changed as appropriate. The convex hull region A itself is a known technique, so a detailed description is omitted here.
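In the same spirit, the convex hull step can be sketched as follows: form a convex hull around each connected set of black pixels and keep the hull with the largest area as the region A1. The use of OpenCV's findContours/convexHull (OpenCV 4 return signature) and the inversion step are assumptions about one possible implementation.

```python
# Sketch: determine the convex hull region A1 from the first binary image Ib.
import cv2

def largest_convex_hull(binary_ib):
    inverted = cv2.bitwise_not(binary_ib)             # black pixels -> foreground
    contours, _ = cv2.findContours(inverted, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hulls = [cv2.convexHull(c) for c in contours]     # convex hull regions A
    return max(hulls, key=cv2.contourArea)            # largest area -> A1
```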
The line determination unit c2 determines the predetermined number of straight lines L (see Fig. 6A) that form the outer contour of the frame image Wa corresponding to the frame part Sw of the imprint Si.
That is, based on the positions of the multiple (for example, six) vertices B forming the polygonal (for example, hexagonal) convex hull region A1 determined by the contour determination unit c1, the line determination unit c2 determines the predetermined number of straight lines L forming the outer contour of the frame image Wa (for example, the four straight lines forming the square outer contour). Specifically, among the multiple straight lines L passing through any two vertices B, B of the polygonal convex hull region A1, the line determination unit c2 determines the straight lines L forming the outer contour of the frame image Wa based on at least one of the number of pixels of the polygonal convex hull region A1 that each straight line L overlaps and the relative relationship between adjacent straight lines L.
That is, among the multiple straight lines L passing through any two vertices B, B of the polygonal convex hull region A1 determined by the contour determination unit c1, the line determination unit c2 identifies the straight lines L that overlap at least a predetermined number of the pixels forming the polygonal convex hull region A1, and determines those straight lines L to be the straight lines L forming the outer contour of the frame image Wa.
Specifically, the line determination unit c2 applies, for example, RANSAC-based line detection to the pixels forming the polygonal convex hull region A1 determined by the contour determination unit c1 in order to determine the straight lines L forming the outer contour of the frame image Wa. For example, the line determination unit c2 selects any two vertices B, B from the six vertices B forming the convex hull region A1 and takes the straight line L connecting those two vertices B, B as a candidate (candidate straight line L) for a straight line L forming the square outer contour of the frame image Wa. For every candidate straight line L so determined, the line determination unit c2 then counts the pixels that overlap the pixels forming the convex hull region A1, and determines the candidate straight lines L whose count is at least a predetermined value to be straight lines L forming the outer contour of the frame image Wa. As a result, for example, candidate straight lines Lc other than the candidate straight line La corresponding to a blank-void portion of the imprint Si and the relatively short candidate straight line Lb are determined to be straight lines L forming the outer contour of the frame image Wa (see Fig. 5A).
The RANSAC line detection process is a known technique, so a detailed description is omitted here.
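The candidate selection described above can be sketched as follows: every pair of hull vertices B defines a candidate line, which is scored by how many hull pixels lie within a small distance of it, and only candidates whose score reaches a threshold survive. The distance tolerance and inlier threshold are illustrative assumptions.

```python
# Sketch: keep candidate lines that overlap enough pixels of hull region A1.
import itertools
import numpy as np

def score_candidates(hull_vertices, hull_pixels, tol=2.0, min_inliers=50):
    pts = np.asarray(hull_pixels, dtype=np.float64)            # (N, 2) x, y
    survivors = []
    for b1, b2 in itertools.combinations(hull_vertices, 2):
        p, q = np.asarray(b1, float), np.asarray(b2, float)
        d = q - p
        n = np.array([-d[1], d[0]]) / np.linalg.norm(d)        # unit normal
        dist = np.abs((pts - p) @ n)                           # point-to-line distance
        inliers = int(np.sum(dist < tol))                      # overlapping pixels
        if inliers >= min_inliers:
            survivors.append(((tuple(b1), tuple(b2)), inliers))
    return survivors
```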
In addition, among the multiple straight lines L passing through any two vertices B, B of the polygonal convex hull region A1 determined by the contour determination unit c1, the line determination unit c2 identifies the straight lines L whose angle with an adjacent straight line L is roughly equal to the interior angle of the polygonal frame, computes an evaluation value for each such straight line L by weighting those of its pixels that overlap the polygonal convex hull region A1, and determines the straight lines L with high evaluation values to be straight lines L forming the outer contour of the frame image Wa.
Specifically, the line determination unit c2 identifies, for example, among all candidate straight lines L, the candidate straight lines Ld whose angle with an adjacent straight line L is roughly equal to the interior angle (90°) of the square frame of the seal S. The line determination unit c2 then weights the pixels of each identified candidate straight line Ld that overlap the polygonal convex hull region A1 and computes an evaluation value for each candidate straight line L according to a predetermined formula. For example, when a part of the frame image Wa of the imprint Si spreads (for example, the lower part in Fig. 5B), if the line determination unit c2 selects any two vertices B, B from the three vertices B of that part to determine candidate straight lines L, three candidate straight lines L are obtained. For each of these three candidate straight lines L, the line determination unit c2 computes the angle it makes with the candidate straight line L adjacent at each end (in Fig. 5B, the two left and right candidate straight lines Lc drawn with double-dashed chain lines), and identifies the candidate straight line L whose computed angle is roughly 90° (in Fig. 5B, the candidate straight line Ld drawn with a dash-dot line). Further, among the pixels forming the identified candidate straight line L, the line determination unit c2 identifies the pixels that overlap the polygonal convex hull region A1 (for example, in Fig. 5B, the pixels near both the left and right ends), weights the identified pixels, and computes the evaluation value according to the predetermined formula.
The line determination unit c2 then compares the computed evaluation values of the candidate straight lines L and determines the candidate straight line Ld with the highest evaluation value to be a straight line L forming the outer contour of the frame image Wa. For example, the candidate straight line Ld drawn with a dash-dot line does not follow the edge of the polygonal convex hull region A1, but its pixels near both ends overlap the polygonal convex hull region A1, so its evaluation value is higher than that of the other candidate straight lines drawn with dashed lines, and it is determined to be a straight line L forming the outer contour of the frame image Wa.
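The angle-based evaluation can be sketched in the same way: only candidates whose angles with their neighbours are close to the 90° interior angle qualify, and their hull-overlapping pixels are counted with an extra weight. The weighting factor and angle tolerance are assumptions; the patent speaks only of a predetermined formula.

```python
# Sketch: weighted evaluation of a candidate line against its neighbours.
import numpy as np

def angle_between(d1, d2):
    c = np.dot(d1, d2) / (np.linalg.norm(d1) * np.linalg.norm(d2))
    return np.degrees(np.arccos(np.clip(abs(c), 0.0, 1.0)))    # 0..90 degrees

def evaluate(candidate, neighbours, overlap_pixels, total_pixels,
             weight=2.0, tol_deg=10.0):
    p, q = np.asarray(candidate[0], float), np.asarray(candidate[1], float)
    d = q - p
    right_angled = all(
        abs(angle_between(d, np.asarray(n[1], float) - np.asarray(n[0], float)) - 90.0) < tol_deg
        for n in neighbours)
    if not right_angled:
        return 0.0
    # pixels overlapping hull region A1 count `weight` times, the rest once
    return weight * overlap_pixels + (total_pixels - overlap_pixels)
```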
In this way, the line estimation unit 6c estimates, in the captured image Ia (the first binary image Ib), the four straight lines L forming the outer contour of the frame image Wa corresponding to the square frame of the seal S. Specifically, the line estimation unit 6c estimates a line L1 corresponding to the upper side, a line L2 corresponding to the lower side, a line L3 corresponding to the left side, and a line L4 corresponding to the right side of the square (see Fig. 6A).
The frame detection unit 6d detects the frame image Wa of the captured image Ia of the imprint Si, formed by the predetermined number of straight lines L estimated by the line estimation unit 6c. Specifically, the frame detection unit 6d includes a vertex determination unit d1 and a projective transformation unit d2.
The vertex determination unit d1 determines the vertices C (see Fig. 6B) of the frame image Wa of the captured image Ia.
That is, the vertex determination unit d1 determines the predetermined number of points C at which the predetermined number of straight lines L determined by the line determination unit c2 intersect one another to be the vertices C of the frame part Sw (frame image Wa) of the imprint Si. Specifically, in the captured image Ia, the vertex determination unit d1 determines the four points formed by the intersections of adjacent straight lines L among the four straight lines L forming the outer contour determined by the line determination unit c2 to be the vertices C of the frame image Wa. At this time, for straight lines L among the four that do not intersect each other (for example, the line L1 corresponding to the upper side and the line L3 corresponding to the left side), the vertex determination unit d1 obtains the intersection point by extending at least one of the straight lines L in a predetermined direction.
The frame detection unit 6d then detects the frame image Wa of the captured image Ia based on the predetermined number of vertices C determined by the vertex determination unit d1. That is, the frame detection unit 6d detects the region whose vertices C are the four determined points as the frame part Sw (frame image Wa) of the imprint Si.
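A sketch of this vertex determination: each pair of adjacent estimated lines is intersected analytically, which implicitly extends lines that do not meet within the image, and the four intersection points become the vertices C. Representing each line as a pair of points is an assumption.

```python
# Sketch: obtain the four frame vertices C as intersections of adjacent lines.
def intersect(line_a, line_b):
    (x1, y1), (x2, y2) = line_a
    (x3, y3), (x4, y4) = line_b
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(denom) < 1e-9:
        return None                                   # parallel lines
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

def frame_vertices(top, bottom, left, right):
    # clockwise: top-left, top-right, bottom-right, bottom-left
    return [intersect(top, left), intersect(top, right),
            intersect(bottom, right), intersect(bottom, left)]
```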
The vertex determination unit d1 may also determine the vertices C of the frame part Sw (frame image Wa) based on the coordinate positions, in the captured image Ia, of the marker images Ma corresponding to the markers Sm of the imprint Si.
That is, as shown for example in Fig. 7A, when a part of the imprint Si (for example, the upper-right part) has bled, that part can become black pixels with pixel value "0" after binarization, and the straight lines L forming the outer contour of the frame image Wa may not be estimated properly from the captured image Ia of this imprint Si. Therefore, exploiting the fact that on the stamp face a vertex of the polygonal frame exists near the marker Sm (within a predetermined range), the vertex determination unit d1 determines the vertices C of the frame image Wa taking into account the coordinate positions of the marker images Ma in the captured image Ia.
Specifically, the vertex determination unit d1 prepares, for example, a pattern image Pa (see Fig. 7B) corresponding to the shape of the marker Sm, and uses feature information of this pattern image Pa (for example, SIFT (Scale-Invariant Feature Transform) feature values) to identify, in the captured image Ia, a region containing a marker image Ma similar to the pattern image Pa. The vertex determination unit d1 then determines a vertex C of the frame image Wa within a predetermined range referenced to the coordinate position of the marker image Ma in the captured image Ia. In this way, the vertex determination unit d1 can determine the corresponding vertex C of the frame image Wa even for a portion of the imprint Si where bleeding has occurred, and the frame detection unit 6d can properly detect the frame image Wa of the captured image Ia (see Fig. 7C).
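A simplified sketch of this marker-based fallback. It uses OpenCV template matching as a lighter stand-in for the SIFT feature matching described above, which is a deliberate substitution; the confidence threshold is illustrative.

```python
# Sketch: locate a marker image Ma similar to pattern Pa and return its centre,
# which then defines the neighbourhood in which a vertex C must lie.
import cv2

def locate_marker(captured_gray, pattern_pa):
    res = cv2.matchTemplate(captured_gray, pattern_pa, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(res)
    if max_val < 0.6:                                 # illustrative threshold
        return None
    h, w = pattern_pa.shape[:2]
    return (max_loc[0] + w // 2, max_loc[1] + h // 2)
```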
The projective transformation unit d2 performs projective transformation processing to generate a projective-transformed image Ic (see Fig. 6B).
That is, based on the predetermined number of vertices C determined by the vertex determination unit d1, the projective transformation unit d2 applies projective transformation processing to the captured image Ia acquired by the image acquisition unit 6a and generates a polygonal projective-transformed image Ic. Specifically, the projective transformation unit d2 computes a coordinate transformation formula that maps the coordinate positions of the four vertices C of the frame image Wa, whose contour is a distorted square as determined by the vertex determination unit d1, onto the coordinate positions of the four vertices C of a square. The projective transformation unit d2 then applies projective transformation processing to the captured image Ia of the imprint Si according to the computed coordinate transformation formula, and generates the projective-transformed image Ic in which the contour of the frame image Wa corresponding to the frame part Sw of the imprint Si has been transformed into a square.
The frame detection unit 6d then detects, in the projective-transformed image Ic (the polygonal captured image) generated by the projective transformation unit d2, the square frame part Sw (frame image Wa) corresponding to the frame of the seal S.
The projective transformation processing itself is a known technique, so a detailed description is omitted here.
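A minimal sketch of this step with OpenCV, assuming the four vertices C are ordered top-left, top-right, bottom-right, bottom-left and that the side length of the output square is arbitrary.

```python
# Sketch: map the detected vertices C onto an axis-aligned square (image Ic).
import cv2
import numpy as np

def rectify_frame(captured_ia, vertices_c, size=400):
    src = np.asarray(vertices_c, dtype=np.float32)    # TL, TR, BR, BL
    dst = np.float32([[0, 0], [size, 0], [size, size], [0, size]])
    m = cv2.getPerspectiveTransform(src, dst)         # coordinate transform
    return cv2.warpPerspective(captured_ia, m, (size, size))
```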
The information reading unit 6e performs reading processing, that is, it reads the original predetermined information from the code information Sc.
That is, the information reading unit 6e reads the predetermined information from the code information Sc in the frame part Sw (frame image Wa) of the captured image Ia of the imprint Si detected by the frame detection unit 6d. Specifically, the information reading unit 6e reads the predetermined information from the code information Sc in the frame image Wa of the second binary image Id corresponding to the projective-transformed image Ic generated by the projective transformation unit d2.
For example, in the second binary image Id, the information reading unit 6e detects the two roughly parallel edges forming the frame image Wa detected by the frame detection unit 6d, and determines the line of predetermined shape (for example, a square) connecting the midpoints of these two edges to be the reading region D of the code information Sc (see Fig. 6C). The information reading unit 6e then scans in a predetermined direction from a predetermined position of the reading region D (for example, the left end) and identifies the coordinate positions of the sets of white pixels with pixel value "1" and the sets of black pixels with pixel value "0". For the identified arrangement of white-pixel sets and black-pixel sets, the information reading unit 6e performs decoding processing corresponding to the coding scheme of the code information Sc and reads the original predetermined information (for example, a URL) that the code information Sc represents.
At this time, the information reading unit 6e performs the reading processing for each region of the reading region D corresponding to each side Sa of the frame part Sw. That is, the original predetermined information is read from the code information Sc a number of times corresponding to the number of sides Sa of the frame part Sw (for example, four times). Since the same code information Sc is added to each side Sa of the frame part Sw, the information reading unit 6e may, for example, be configured to judge that the predetermined information has been read properly when the same original predetermined information is detected at two or more places among the multiple reads.
The reading processing itself is a known technique, so a detailed description is omitted here.
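A rough sketch of the reading step for the horizontal sides of the rectified, binarized frame: the mid-line of each side is sampled cell by cell, and the payload is accepted only when at least two sides agree, mirroring the multiple-read check above. The cell size, the external decode() helper, and the row-wise scan (vertical sides would be scanned along columns analogously) are assumptions.

```python
# Sketch: scan the reading region D of each side Sa and take a majority vote.
from collections import Counter

def scan_side(binary_id, y_mid, x_start, x_end, cell_px=4):
    row = binary_id[y_mid, x_start:x_end]
    return tuple(int(v > 127) for v in row[::cell_px])   # 1 = white, 0 = black

def read_frame(binary_id, side_scans, decode):
    # side_scans: four (y_mid, x_start, x_end) tuples, one per side Sa
    payloads = [decode(scan_side(binary_id, *s)) for s in side_scans]
    counts = Counter(p for p in payloads if p is not None)
    if not counts:
        return None
    value, count = counts.most_common(1)[0]
    return value if count >= 2 else None              # need two matching reads
```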
The action processing unit 7 executes a predetermined action in accordance with the original predetermined information of the code information Sc read by the information reading unit 6e.
That is, when the predetermined information has been read at least a predetermined number of times by the information reading unit 6e, the action processing unit 7 controls the execution of processing corresponding to that predetermined information. Specifically, when a URL is read as the predetermined information, for example, the action processing unit 7 controls the communication control unit 11 to access the specific web page on the Internet specified by the acquired URL. The action processing unit 7 then controls the display control unit 9, the voice communication unit 10, and the like in accordance with preset processing execution instructions (for example, playback of a specific sound or image) and causes them to execute the various processes.
The display unit 8 is composed of, for example, a liquid crystal display panel, and displays in its display screen the images captured by the imaging unit 3 (for example, live-view images) in accordance with the video signal from the display control unit 9.
The display control unit 9 performs control to read the display image data temporarily stored in the memory 2 and display it on the display unit 8.
Specifically, the display control unit 9 includes a VRAM (Video Random Access Memory), a VRAM controller, a digital video encoder, and the like. Under the control of the central control unit 1, the digital video encoder reads the luminance signal Y and the color-difference signals Cb and Cr, which have been read from the memory 2 and stored in the VRAM (not shown), from the VRAM via the VRAM controller at a predetermined playback frame rate (for example, 30 fps), generates a video signal based on these data, and outputs it to the display unit 8.
For example, the display control unit 9 displays a live view on the display unit 8 by sequentially updating, at a predetermined display frame rate, the multiple frame images captured by the imaging unit 3 and the imaging control unit 4 and generated by the image data generating unit 5.
The voice communication unit 10 carries out calls with an external user of an external device connected via a communication network N.
Specifically, the voice communication unit 10 includes a microphone 10a, a speaker 10b, a data conversion unit 10c, and the like. The voice communication unit 10 A/D-converts, with the data conversion unit 10c, the outgoing voice of the user input from the microphone 10a and outputs the outgoing voice data to the central control unit 1, and, under the control of the central control unit 1, D/A-converts, with the data conversion unit 10c, voice data such as incoming voice data output from the communication control unit 11, and outputs it from the speaker 10b.
The communication control unit 11 transmits and receives data via the communication network N and a communication antenna 11a.
That is, the communication antenna 11a is an antenna capable of transmitting and receiving data in accordance with the predetermined communication scheme (for example, W-CDMA (Wideband Code Division Multiple Access) or GSM (Global System for Mobile Communications; registered trademark)) adopted by the mobile terminal 100 for communication with wireless base stations (not shown). The communication control unit 11 transmits and receives data to and from a wireless base station via the communication antenna 11a over a communication channel set in this communication scheme, in accordance with the communication protocol corresponding to the predetermined communication scheme. That is, based on instruction signals from the central control unit 1, the communication control unit 11 transmits and receives voice during a call with the external user of the counterpart external device, and transmits and receives e-mail data to and from that external device.
The configuration of the communication control unit 11 is an example and is not limiting; it may be changed as appropriate. For example, although not shown, a wireless LAN module may be installed so that the communication network N can be accessed via an access point.
The communication network N is, for example, a communication network that connects the mobile terminal 100 to external devices via wireless base stations, a gateway server (not shown), and the like.
The communication network N is, for example, a communication network constructed using dedicated lines and existing public lines, and various line forms such as a LAN (Local Area Network) and a WAN (Wide Area Network) can be applied to it. The communication network N includes, for example, various communication networks such as a telephone network, an ISDN network, dedicated lines, a mobile communication network, a communication satellite line, and a CATV network, as well as an IP network, a VoIP (Voice over Internet Protocol) gateway, Internet service providers, and the like.
The operation input unit 12 is used to input various instructions to the terminal body.
Specifically, the operation input unit 12 includes various buttons (all not shown), such as a shutter button for instructing imaging of a subject, up/down/left/right cursor keys and an enter button for selecting modes and functions, communication-related buttons for executing incoming and outgoing telephone calls and sending and receiving e-mail, and numeric keys and symbol keys for text input.
When a button is operated by the user, the operation input unit 12 outputs an operation instruction corresponding to the operated button to the central control unit 1. In accordance with the operation instruction input from the operation input unit 12, the central control unit 1 causes each unit to execute a predetermined action (for example, imaging of a subject, an incoming telephone call, or sending and receiving e-mail).
The operation input unit 12 may have a touch panel provided integrally with the display unit 8, and may output to the central control unit 1 an operation instruction corresponding to a predetermined touch operation by the user.
< code reading process >
Next, the code reading process of the mobile terminal 100 is described with reference to Figs. 2 to 7.
Fig. 2 is a flowchart showing an example of the operation of the code reading process.
It is assumed that the imprint Si to be imaged in the following code reading process has been stamped, for example, at a predetermined position on a recording medium P such as a postcard (see Fig. 3A). It is also assumed that the same code information Sc is added to each side Sa of the frame part Sw of the imprint Si.
As shown in Fig. 2, first, when an imaging instruction is input by a predetermined user operation on the operation input unit 12, the imaging control unit 4 causes the imaging unit 3 to image the imprint Si, and the image data generating unit 5 generates the image data of the captured image Ia transferred from the electronic imaging unit 3b (step S1; see Fig. 3B).
The image data generating unit 5 then outputs the generated YUV data of the captured image Ia to the memory 2, where it is stored.
In order to make it easier to image the imprint Si in a state closer to a square, the display control unit 9 may cause the display unit 8 to show a guide display corresponding to the contour of the imprint Si.
Next, the image acquisition unit 6a of the code processing unit 6 acquires from the memory 2 the image data (for example, luminance data) of the captured image Ia generated by the image data generating unit 5, at a predetermined resolution (step S2; see Fig. 4A). The binarization unit 6b then applies binarization processing with a predetermined threshold to the image data of the captured image Ia acquired by the image acquisition unit 6a, and generates the image data of the first binary image Ib (see Fig. 4B) (step S3).
Next, the contour determination unit c1 of the line estimation unit 6c applies convex hull processing to the image data of the first binary image Ib generated by the binarization unit 6b, and thereby determines the polygonal convex hull region A1 (see Fig. 4C) corresponding to the outer contour of the frame part Sw (step S4). Specifically, among the multiple convex hull regions A formed by the convex hull processing, the contour determination unit c1 determines the convex hull region A with the largest area to be the polygonal (for example, hexagonal) convex hull region A1 corresponding to the outer contour of the frame part Sw of the imprint Si.
Next, among the multiple straight lines L passing through any two vertices B of the polygonal convex hull region A1 determined by the contour determination unit c1, the line determination unit c2 determines the straight lines L that overlap at least a predetermined number of the pixels forming the polygonal convex hull region A1 to be straight lines L forming the outer contour of the frame image Wa corresponding to the frame part Sw (step S5). Specifically, the line determination unit c2 selects any two vertices B, B from the six vertices B forming the convex hull region A1 and takes the straight line L connecting those two vertices B, B as a candidate straight line L for the square outer contour of the frame image Wa. Among the determined candidate straight lines L, the line determination unit c2 then determines the candidate straight lines L whose count of pixels overlapping the pixels forming the convex hull region A1 is at least a predetermined value to be straight lines L forming the outer contour of the frame image Wa.
Next, among the multiple straight lines L passing through any two vertices B, B of the polygonal convex hull region A1, the line determination unit c2 determines straight lines L forming the outer contour of the frame image Wa in consideration of the angle made with adjacent straight lines L (step S6). Specifically, for all candidate straight lines L, the line determination unit c2 identifies the candidate straight lines L whose angle with an adjacent candidate straight line L is roughly equal to the interior angle (90°) of the square frame of the seal S, weights the pixels overlapping the polygonal convex hull region A1, and computes an evaluation value for each candidate straight line L according to a predetermined formula. The line determination unit c2 then compares the computed evaluation values of the candidate straight lines L and determines the candidate straight line L with the highest evaluation value to be a straight line L forming the outer contour of the frame image Wa.
The order of the line determination of step S5 and the line determination of step S6 is an example and is not limiting; it may, for example, be reversed.
Next, the vertex determination unit d1 of the frame detection unit 6d determines the predetermined number of points at which the predetermined number of straight lines L forming the outer contour determined by the line determination unit c2 intersect one another to be the vertices C of the frame part Sw (frame image Wa) of the imprint Si (step S7). At this time, the vertex determination unit d1 may determine the vertices C of the frame image Wa also in consideration of the coordinate positions of the marker images Ma in the captured image Ia (see Figs. 7A to 7C).
Next, the frame detection unit 6d judges, based on the determination result of the vertex determination unit d1, whether the four vertices C of the frame image Wa have been determined (step S8). If it is judged that the four vertices C of the frame image Wa have been determined (step S8; Yes), the projective transformation unit d2 performs projective transformation processing on the captured image Ia with reference to the coordinate positions of the four determined vertices C (step S9). Specifically, the projective transformation unit d2 computes a coordinate transformation formula that maps the coordinate positions of the four vertices C of the frame image Wa, whose contour is a distorted square, onto the coordinate positions of the four vertices C of a square, applies projective transformation processing to the captured image Ia of the imprint Si according to the computed formula, and generates the projective-transformed image Ic in which the contour of the frame image Wa of the imprint Si has been transformed into a square.
If it is judged in step S8 that the four vertices C of the frame image Wa have not been determined (step S8; No), the CPU of the central control unit 1 skips the subsequent processing and ends the code reading process.
Next, the frame detection unit 6d detects, in the projective-transformed image Ic generated by the projective transformation unit d2, the square frame part Sw (frame image Wa) corresponding to the frame of the seal S (step S10). The binarization unit 6b then applies binarization processing with a predetermined threshold to the image data of the projective-transformed image Ic generated by the projective transformation unit d2, and generates the image data of the second binary image Id (step S11).
Next, the information reading unit 6e performs reading processing, that is, it reads the predetermined information from the code information Sc in the frame part Sw (frame image Wa) of the second binary image Id corresponding to the projective-transformed image Ic generated by the projective transformation unit d2 (step S12). Specifically, the information reading unit 6e determines the square reading region D in the second binary image Id, scans in a predetermined direction from a predetermined position of this reading region D (for example, the left end), and identifies the coordinate positions of the sets of white pixels with pixel value "1" and the sets of black pixels with pixel value "0". The information reading unit 6e then performs decoding processing on the identified arrangement of white-pixel sets and black-pixel sets, and reads the original predetermined information (for example, a URL) that the code information Sc represents.
Next, the information reading unit 6e judges whether the original predetermined information has been read from the code information Sc multiple times by the reading processing (step S13).
If it is judged that the original predetermined information has been read multiple times (step S13; Yes), the information reading unit 6e judges that the predetermined information has been read properly, and the action processing unit 7 executes a predetermined action in accordance with the predetermined information (for example, a URL) read by the information reading unit 6e (for example, accessing the Internet and playing back a specified sound or image) (step S14).
On the other hand, if it is judged in step S13 that the original predetermined information has not been read multiple times (step S13; No), the CPU of the central control unit 1 skips the processing of step S14 and ends the code reading process.
As described above, with the mobile terminal 100 according to the present embodiment, in the captured image Ia obtained by imaging the imprint Si in which the code information Sc is added to the polygonal (for example, square) frame part Sw surrounding the predetermined design image Sp, a predetermined number of straight lines L, corresponding to the number of corners of the polygonal frame, that form the outer contour of the frame part Sw (frame image Wa) are estimated, and the frame image Wa of the captured image Ia formed by the estimated straight lines L is detected. By using the predetermined number of straight lines L forming the outer contour of the frame image Wa, the frame image Wa of the captured image Ia can therefore be detected properly. That is, even when, for example, bleeding or blank voids have occurred in the imprint Si and the vertices C of the frame image Wa cannot be detected properly, the frame part Sw (frame image Wa) can still be detected properly from the captured image Ia by estimating the predetermined number of straight lines L forming the outer contour of the frame image Wa and using those straight lines L.
Furthermore, the polygonal region (convex hull region A1) corresponding to the outer contour of the frame part Sw (frame image Wa) of the captured image Ia is determined, and the predetermined number of straight lines L forming the outer contour of the frame image Wa are determined based on the positions of the multiple vertices B forming the determined polygonal region, so the frame image Wa can be detected properly from the captured image Ia using the determined straight lines L. Specifically, among the multiple straight lines L passing through any two vertices B, B of the polygonal region, the predetermined number of straight lines L forming the outer contour of the frame image Wa can be determined based on at least one of the number of pixels of the polygonal region that each straight line L overlaps and the relative relationship between adjacent straight lines L.
That is, among the multiple straight lines L passing through any two vertices B, B of the polygonal region, the straight lines L that overlap at least a predetermined number of the pixels forming the polygonal region are identified and determined to be the straight lines L forming the outer contour of the frame image Wa. The straight lines L forming the outer contour of the frame image Wa can therefore be determined properly by taking into account the degree of overlap between the candidate straight lines L and the pixels forming the polygonal region.
In addition, among the multiple straight lines L passing through any two vertices B, B of the polygonal region, the straight lines L whose angle with an adjacent straight line L is roughly equal to the interior angle of the polygonal frame are identified, an evaluation value is computed for each straight line L by weighting those of its pixels that overlap the polygonal region, and the straight lines L with high evaluation values are determined to be the straight lines L forming the outer contour of the frame image Wa. The straight lines L forming the outer contour of the frame image Wa can therefore be determined properly by taking into account the relative relationship between candidate straight lines L.
Furthermore, the predetermined number of points at which the predetermined number of straight lines L forming the outer contour of the frame part Sw (frame image Wa) intersect one another are determined to be the vertices C of the frame image Wa, and the frame image Wa of the captured image Ia is detected based on the determined vertices C, so the frame image Wa of the captured image Ia can be detected properly using the points (vertices C) at which the straight lines L forming its outer contour intersect. At this time, the vertices C of the frame image Wa are determined based on the coordinate positions, in the captured image Ia, of the marker images Ma corresponding to the markers Sm of predetermined shape at the corners of the polygonal frame, so even in a captured image Ia obtained by imaging an imprint Si in which bleeding has occurred, the vertices C of the frame image Wa can be determined properly and the frame image Wa of the captured image Ia can be detected properly.
Moreover, projective transformation processing is applied to the captured image Ia based on the determined vertices C to generate the polygonal captured image (projective-transformed image Ic), and the frame image Wa corresponding to the frame is detected in the generated polygonal captured image. Therefore, even for a captured image Ia obtained by imaging an imprint Si in which bleeding or blank voids have occurred, the projective transformation processing can be carried out properly using the points (vertices C) at which the straight lines L forming the outer contour of the frame image Wa intersect, and as a result the polygonal frame image Wa can be detected properly.
In addition, the predetermined information can be read properly from the code information Sc in the frame part Sw (frame image Wa) of the captured image Ia. At this time, when the predetermined information has been read at least a predetermined number of times from the captured image Ia of the imprint Si in which the same code information Sc is added to the frame part Sw multiple times, the execution of the processing corresponding to that predetermined information is controlled. By embedding multiple pieces of code information Sc in the frame part Sw in advance, the original predetermined information can therefore be read stably from the code information Sc, and the processing corresponding to the read information can be executed properly.
In addition, the present invention is not limited to above-mentioned embodiment, can carry out the change of various improvement and design in the scope not departing from main contents of the present invention.
Below, the second embodiment of mobile terminal 100 is described.
< second embodiment >
Fig. 8 is a block diagram showing the schematic configuration of a mobile terminal 200 according to the second embodiment.
As shown in Fig. 8, the code processing section 206 of the mobile terminal 200 of the second embodiment includes a pixel count reduction section 6f in addition to the image acquisition section 6a, the binarization section 6b, the straight line estimation section 6c, the frame detection section 6d, and the information reading section 6e.
Except for the points described in detail below, the configuration of the mobile terminal 200 of the second embodiment is substantially the same as that of the mobile terminal 100 of the above embodiment, and a detailed description is therefore omitted.
The pixel count reduction section 6f carries out a pixel count reduction process that reduces the number of pixels present in the background of the photographed image Ia.
That is, the pixel count reduction section 6f carries out a process of relatively reducing the number of pixels present in the background of the first binary image Ib corresponding to the photographed image Ia acquired by the image acquisition section 6a. Specifically, when the trace Si is stamped not on a recording medium of uniform color and pattern but, for example, on the ruled lines N of a notebook or the like (see Fig. 10A), the straight line estimation section 6c may be unable to estimate appropriately the straight lines L forming the outer contour of the frame part Sw (frame image Wa).
Therefore, the pixel count reduction section 6f acquires the image data of the first binary image Ib generated by the binarization section 6b, performs a process of inverting the white pixels and black pixels (see Fig. 10B), and then carries out dilation and erosion processes for removing pixel clusters smaller than a predetermined value from the background of the first binary image Ib (see Fig. 10C etc.). For example, for each pixel to be processed in the black-and-white inverted image Ie of the first binary image Ib (see Fig. 10B), the pixel count reduction section 6f performs a dilation process that adds one ring of pixels (see Fig. 10C), then an erosion process that strips away two rings of pixels (see Fig. 11A), and then a dilation process that again adds one ring of pixels (see Fig. 11B). As a result, the number of pixels present in the background of the first binary image Ib is relatively reduced with respect to the number of pixels forming the frame image Wa, and the influence of the pixels in the background of the trace Si is reduced when the straight line estimation section 6c estimates the straight lines L forming the outer contour of the frame image Wa.
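A sketch of this inversion, dilation and erosion sequence using OpenCV morphology; the 3×3 structuring element is an assumption, while the iteration counts follow the one-pixel and two-pixel amounts given above:

```python
import cv2
import numpy as np

def reduce_background_pixels(binary):
    """binary: 0/255 uint8 first binary image Ib. Returns the inverted image
    after dilate(1) -> erode(2) -> dilate(1), which removes pixel clusters
    smaller than the frame part from the background."""
    kernel = np.ones((3, 3), np.uint8)                      # structuring element (assumed)
    inverted = cv2.bitwise_not(binary)                      # swap white and black pixels
    grown = cv2.dilate(inverted, kernel, iterations=1)      # add one ring of pixels
    shrunk = cv2.erode(grown, kernel, iterations=2)         # strip away two rings
    return cv2.dilate(shrunk, kernel, iterations=1)         # add one ring back
```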
Then, after the straight line estimation section 6c estimates, in the first binary image Ib processed by the pixel count reduction section 6f, the four straight lines L forming the outer contour of the frame part Sw (frame image Wa) corresponding to the square frame of the seal S, the frame detection section 6d detects the frame image Wa in the photographed image Ia (see Fig. 11C).
< code reading process >
Next, the code reading process of the mobile terminal 200 of the second embodiment will be described with reference to Fig. 9.
Fig. 9 is a flowchart showing an example of the operation of the code reading process.
Except for the points described in detail below, the code reading process of the mobile terminal 200 of the second embodiment is substantially the same as the code reading process of the mobile terminal 100 of the above embodiment, and a detailed description is therefore omitted.
As shown in Fig. 9, the code processing section 206 carries out the processes of steps S1 to S3 in the same manner as in the code reading process of the mobile terminal 100 of the above embodiment, and generates the image data of the first binary image Ib (see Fig. 4B).
The pixel count reduction section 6f then applies the pixel count reduction process to the generated image data of the first binary image Ib (step S21). Specifically, the pixel count reduction section 6f inverts the white pixels and black pixels of the image data of the first binary image Ib and then carries out the dilation and erosion processes, so that the number of pixels present in the background of the first binary image Ib is relatively reduced with respect to the number of pixels forming the frame part Sw (frame image Wa).
Next, the contour determination section c1 applies convex hull processing to the image data of the first binary image Ib after the pixel count reduction process, thereby determining the polygonal convex hull region A1 corresponding to the outer contour of the frame part Sw (see Fig. 4C) (step S4).
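For reference only (the embodiment does not specify a library), the convex hull region A1 of step S4 can be obtained from the foreground pixels of the binary image with a single OpenCV call; the function name below is an illustrative assumption:

```python
import cv2
import numpy as np

def convex_hull_region(binary):
    """Return the convex hull vertices (an (M, 1, 2) array) of all foreground
    pixels of a 0/255 binary image - the polygonal region A1."""
    ys, xs = np.nonzero(binary)
    points = np.column_stack((xs, ys)).astype(np.int32)
    return cv2.convexHull(points)
```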
The code processing section 206 then carries out the subsequent processes in the same manner as in the code reading process of the mobile terminal 100 of the above embodiment, namely the detection of the straight lines L forming the outer contour of the frame part Sw (frame image Wa) (steps S5, S6), the detection of the frame part Sw (frame image Wa) (step S10), the reading of the predetermined information from the code information (step S12), and so on.
Therefore, according to the mobile terminal 200 of the second embodiment, a process of relatively reducing the number of pixels present in the background of the photographed image Ia is carried out, and the predetermined number of straight lines L forming the outer contour of the frame part Sw (frame image Wa) are estimated in the photographed image Ia after this process. Even when the trace Si is stamped not on a recording medium of uniform color and pattern but, for example, on a notebook with ruled lines N, relatively reducing the number of pixels present in the background of the photographed image Ia reduces the influence of those background pixels, so that the straight lines L forming the outer contour of the frame image Wa can be estimated appropriately.
In addition, in the above embodiment, the straight line determination section c2 determines the predetermined number of straight lines L forming the outer contour of the frame image Wa from among the multiple straight lines L passing through any two vertices B, B of the polygonal convex hull region A1, using as criteria both the number of pixels of each straight line L overlapping the convex hull region A1 of the polygon and the relative relationship with the adjacent straight lines L; however, only one of these may be used as the criterion.
Furthermore, in the above embodiment the shape of the frame part Sw is square, but this is merely an example and is not limiting; the shape may be a polygon other than a square.
In addition, in the above embodiment the same code information Sc is added to each side Sa of the frame part Sw of the trace Si, but this is merely an example and is not limiting; for example, mutually different pieces of code information Sc may be added. In this case, the amount of code information Sc (original predetermined information) embedded in the frame part Sw can be increased.
Furthermore, in the above embodiment the mobile terminals 100 and 200 are exemplified as the image processing apparatus, but this is merely an example and is not limiting; any configuration capable of controlling the execution of the detection process of the frame part Sw (frame image Wa) may be adopted, and appropriate changes may be made.
In addition, in the above embodiment, the functions of the acquisition unit, estimation unit, and detection unit are realized by driving the image acquisition section 6a, the straight line estimation section 6c, and the frame detection section 6d under the control of the central control section 1 of the mobile terminal 100 (200); however, this is not limiting, and these functions may instead be realized by the CPU of the central control section 1 executing a predetermined program or the like.
That is, a program including an acquisition process routine, an estimation process routine, and a detection process routine is stored in a program memory that stores programs. The acquisition process routine may then cause the CPU of the central control section 1 to function as a unit that acquires a photographed image Ia obtained by imaging the trace Si of a seal S that forms a polygonal frame around a predetermined print, code information Sc in which predetermined information is encoded as a regular arrangement of pixel clusters being added to the trace Si. The estimation process routine may cause the CPU of the central control section 1 to function as a unit that estimates, in the acquired photographed image Ia, the predetermined number of straight lines L corresponding to the number of corners of the polygonal frame that form the outer contour of the frame part Sw corresponding to that frame. The detection process routine may cause the CPU of the central control section 1 to function as a unit that detects the frame part Sw of the photographed image Ia formed by the estimated predetermined number of straight lines L.
Similarly, the contour determination unit, straight line determination unit, vertex determination unit, generation unit, reading unit, processing unit, and pixel count reduction unit may also be realized by the CPU of the central control section 1 executing a predetermined program or the like.
Furthermore, as the computer-readable medium storing the program for executing each of the above processes, a portable recording medium such as a nonvolatile memory (for example, a flash memory) or a CD-ROM can be applied in addition to a ROM, a hard disk, and the like. A carrier wave is also applied as a medium for providing the program data via a predetermined communication line.
Next, a third embodiment of the present invention will be described in concrete terms with reference to the drawings. The scope of the invention, however, is not limited to the illustrated examples.
Fig. 12 is a block diagram showing the schematic configuration of a mobile terminal 300 to which an embodiment of the present invention is applied.
As shown in Fig. 12, the code processing section 206 of the mobile terminal 300 of the third embodiment includes an edge detection section 6g, a parallel edge extraction section 6h, and a reading area determination section 6i in addition to the image acquisition section 6a, the binarization section 6b, and the information reading section 6e.
Except for the points described in detail below, the configuration of the mobile terminal 300 of the third embodiment is substantially the same as that of the mobile terminal 100 of the above embodiment, and a detailed description is therefore omitted.
The edge detection section 6g detects edges E of the print image.
That is, the edge detection section 6g detects multiple edges E from the binary image Ib corresponding to the photographed image Ia acquired by the image acquisition section 6a. Specifically, the edge detection section 6g performs, for example, a differential calculation on the image data of the binary image Ib generated by the binarization section 6b using a predetermined differential filter (for example, a Laplacian filter), and detects, as edges E, places where the luminance value, color, or density changes sharply. The edge detection section 6g then generates the image data of an edge image Ic (see Fig. 15C) from the detected edges E.
The content of the above edge detection process is merely an example and is not limiting; appropriate changes may be made.
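A minimal Laplacian-based sketch of this edge detection (thresholding the absolute filter response, and the threshold value itself, are assumptions; the text says only that sharp changes are detected as edges E):

```python
import cv2
import numpy as np

def detect_edges(binary, thresh=32):
    """Apply a Laplacian (second-derivative) filter to the binary image Ib and
    keep pixels with a strong response as the edge image Ic."""
    lap = cv2.Laplacian(binary, cv2.CV_16S, ksize=3)
    return (np.abs(lap) > thresh).astype(np.uint8) * 255
```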
The parallel edge extraction section 6h extracts two approximately parallel edges E, E.
That is, from among the multiple edges E detected by the edge detection section 6g, the parallel edge extraction section 6h extracts two approximately parallel edges E, E whose interval is approximately equal to the width of the frame part Sw. Specifically, the parallel edge extraction section 6h applies a parallel edge filter F to the edge image Ic in sequence from a predetermined position (for example, the left end) in a predetermined direction (for example, downward), and extracts two approximately parallel edges E, E whose interval is approximately equal to the width of the frame part Sw (see Figs. 16A and 16B).
The parallel edge filter F consists of two edge inspection areas Fa, Fa of predetermined width (for example, 5 pixels) and predetermined length (for example, 20 pixels), arranged at a predetermined interval. By adjusting the interval between these two edge inspection areas Fa, Fa so that it is approximately equal to the width of the frame part Sw, the parallel edge extraction section 6h extracts from the edge image Ic two approximately parallel edges E, E whose interval is approximately equal to the width of the frame part Sw. At this time, the parallel edge extraction section 6h also rotates the two edge inspection areas Fa, Fa around the center Fc by a predetermined angle (for example, 90°) to extract the two parallel edges E, E corresponding to the frame part Sw.
In Figs. 16A and 16B, a part of the left end of the edge image Ic is shown enlarged.
The content of the process using the above parallel edge filter F is merely an example and is not limiting; appropriate changes may be made. For example, multiple filters in which the width and rotation angle of the two edge inspection areas Fa, Fa are varied may be prepared as parallel edge filters F, and all of them may be used in turn to extract the two parallel edges E, E corresponding to the frame part Sw.
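A brute-force sketch of such a filter in NumPy (the window dimensions follow the 20-pixel length and 5-pixel width given above; the hit threshold, the function name and the restriction to 0°/90° orientations are assumptions):

```python
import numpy as np

def find_parallel_edges(edge, frame_width, length=20, width=5, min_hits=15):
    """Scan a 0/255 edge image Ic with two parallel inspection areas Fa, Fa
    separated by frame_width pixels; return (y, x, orientation) positions where
    both areas contain at least min_hits edge pixels. Orientation 'h' pairs two
    horizontal strips; 'v' is the filter rotated by 90 degrees."""
    e = edge > 0
    h, w = e.shape
    hits = []
    for y in range(h - width - frame_width):
        for x in range(w - length):
            a = e[y:y + width, x:x + length].sum()
            b = e[y + frame_width:y + frame_width + width, x:x + length].sum()
            if a >= min_hits and b >= min_hits:
                hits.append((y, x, 'h'))
    for y in range(h - length):
        for x in range(w - width - frame_width):
            a = e[y:y + length, x:x + width].sum()
            b = e[y:y + length, x + frame_width:x + frame_width + width].sum()
            if a >= min_hits and b >= min_hits:
                hits.append((y, x, 'v'))
    return hits
```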
The reading area determination section 6i determines the reading area A of the code information Sc.
That is, the reading area determination section 6i determines the reading area A of the code information Sc in the photographed image Ia on the basis of the two approximately parallel edges E, E extracted by the parallel edge extraction section 6h. Specifically, the reading area determination section 6i determines, in the binary image Ib corresponding to the photographed image Ia, the area corresponding to the line connecting the intermediate points of the two approximately parallel edges E, E (an area corresponding to the region on the inner side of the two edges E, E) as the reading area A.
For example, the reading area determination section 6i determines a line of predetermined shape (for example, a square) by connecting the intermediate points of the two approximately parallel edges E, E extracted by the parallel edge extraction section 6h, and maps the determined line onto the binary image Ib, thereby determining the reading area A of the code information Sc in the binary image Ib (see Fig. 16C).
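Under the strip geometry of the sketch above, the midpoint scan position for one horizontal pair of edges reduces to a simple offset; this helper (an assumption, matching the filter dimensions used earlier) is reused in the reading sketch further below:

```python
def reading_row(y_top, frame_width, width=5):
    """Row index midway between the two horizontal edge inspection areas of an
    'h' hit from find_parallel_edges; the code information Sc would be read
    along this row (a discrete stand-in for the midpoint line of the text)."""
    return y_top + (frame_width + width) // 2
```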
The information reading section 6e carries out a reading process that reads the original predetermined information from the code information Sc.
That is, the information reading section 6e reads the predetermined information from the code information Sc within the reading area A determined by the reading area determination section 6i. Specifically, in the binary image Ib corresponding to the photographed image Ia, the information reading section 6e reads the predetermined information from the pixel values (code information Sc) of the pixels present in the reading area A. For example, the information reading section 6e scans in a predetermined direction from a predetermined position (for example, the left end) of the reading area A, and determines the coordinate positions of the clusters of white pixels with pixel value "1" and the clusters of black pixels with pixel value "0". The information reading section 6e then applies a decoding process corresponding to the coding scheme of the code information Sc to the arrangement of the determined white pixel clusters and black pixel clusters, and reads the original predetermined information (for example, a URL) represented by the code information Sc.
At this time, the information reading section 6e carries out the reading process for each area of the reading area A corresponding to each side Sa of the frame part Sw. That is, the original predetermined information is read from the code information Sc a number of times corresponding to the number of sides Sa of the frame part Sw (for example, four times). Here, since the same code information Sc is added to each side Sa of the frame part Sw, the information reading section 6e may be configured, for example, to judge that the predetermined information has been read appropriately when the same original predetermined information is detected at two or more places among the multiple readings.
The above reading process is a known technique, and a detailed description is therefore omitted here.
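Since the coding scheme of the code information Sc is not specified here, only the scan-and-group step can be illustrated; decode_runs, reading_row and binary_image below are placeholder assumptions:

```python
import numpy as np

def scan_reading_row(binary, row):
    """Scan one row of a 0/255 binary image Ib and return run-length pairs
    (value, length) for the alternating clusters of white (1) and black (0) pixels."""
    line = (binary[row] > 0).astype(np.uint8)
    runs, start = [], 0
    for i in range(1, len(line) + 1):
        if i == len(line) or line[i] != line[start]:
            runs.append((int(line[start]), i - start))
            start = i
    return runs

# The runs would then be handed to a decoder matched to the (unspecified) coding
# scheme of the code information Sc, e.g.:
#   info = decode_runs(scan_reading_row(binary_image, reading_row(y_top, frame_width)))
```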
The operation processing section 7 executes a predetermined operation in accordance with the original predetermined information of the code information Sc read by the information reading section 6e.
That is, when the predetermined information has been read a predetermined number of times or more by the information reading section 6e, the operation processing section 7 controls the execution of the processing corresponding to that predetermined information. Specifically, when a URL has been read as the predetermined information, for example, the operation processing section 7 controls the communication control section 11 so as to access the specific web page on the Internet specified by the acquired URL. The operation processing section 7 then controls the display control section 9, the voice communication section 10, and the like in accordance with preset process execution instructions (for example, playback of a specific sound or image) so as to execute the various processes.
The display section 8 is composed of, for example, an LCD panel, and displays images captured by the imaging section 3 (for example, live-view images) on its display screen in accordance with the video signal from the display control section 9.
The display control section 9 performs control to read out the display image data temporarily stored in the memory 2 and display it on the display section 8.
Specifically, the display control section 9 includes a VRAM (Video Random Access Memory), a VRAM controller, a digital video encoder, and the like. Under the control of the central control section 1, the digital video encoder reads out from the VRAM, via the VRAM controller and at a predetermined playback frame rate (for example, 30 fps), the luminance signal Y and the color difference signals Cb, Cr that have been read from the memory 2 and stored in the VRAM (not shown), generates a video signal based on these data, and outputs it to the display section 8.
For example, the display control section 9 sequentially updates the multiple frame images captured at a predetermined display frame rate by the imaging section 3 and the imaging control section 4 and generated by the image data generation section 5, and live-view displays them on the display section 8.
The voice communication section 10 carries out calls with external users of external devices connected via the communication network N.
Specifically, the voice communication section 10 includes a microphone 10a, a speaker 10b, a data conversion section 10c, and the like. The voice communication section 10 applies A/D conversion processing, in the data conversion section 10c, to the user's outgoing voice input from the microphone 10a and outputs the outgoing voice data to the central control section 1, and, under the control of the central control section 1, applies D/A conversion processing, in the data conversion section 10c, to voice data such as the incoming voice data output from and input via the communication control section 11, and outputs the result from the speaker 10b.
The communication control section 11 transmits and receives data via the communication network N and the communication antenna 11a.
That is, the communication antenna 11a is an antenna capable of transmitting and receiving data in a predetermined communication scheme (for example, the W-CDMA (Wideband Code Division Multiple Access) scheme or the GSM (Global System for Mobile Communications; registered trademark) scheme) adopted for communication between this mobile terminal 300 and a wireless base station (not shown). The communication control section 11 transmits and receives data to and from the wireless base station via the communication antenna 11a over a communication channel set in this communication scheme, in accordance with the communication protocol corresponding to the predetermined communication scheme. That is, in accordance with instruction signals input from the central control section 1, the communication control section 11 transmits and receives call voice with the external user of the counterpart external device and data for the transmission and reception of e-mail to and from that external device.
The configuration of the communication control section 11 is merely an example and is not limiting; appropriate changes may be made. For example, although not shown, a wireless LAN module may be installed so that the communication network N can be accessed via an access point.
The communication network N is, for example, a communication network that connects the mobile terminal 300 to external devices via wireless base stations, gateway servers (not shown), and the like.
The communication network N is, for example, a communication network constructed using dedicated lines and existing public lines, and various line forms such as a LAN (Local Area Network) and a WAN (Wide Area Network) can be applied. The communication network N also includes, for example, various communication networks such as a telephone network, an ISDN network, dedicated lines, a mobile communication network, a communication satellite line, and a CATV network, as well as an IP network, a VoIP (Voice over Internet Protocol) gateway, Internet service providers, and the like.
The operation input section 12 is used to input various instructions to the terminal body.
Specifically, the operation input section 12 includes various buttons (none of which are shown) such as a shutter button for instructing the imaging of a subject, up/down/left/right cursor keys and an enter button for instructing the selection of modes, functions, and the like, communication-related buttons for instructing the execution of incoming and outgoing telephone calls and the transmission and reception of e-mail, and numeric keys and symbol keys for instructing the input of text.
When a button is operated by the user, the operation input section 12 outputs an operation instruction corresponding to the operated button to the central control section 1. The central control section 1 causes each section to execute a predetermined operation (for example, imaging of a subject, making or receiving a telephone call, sending or receiving e-mail) in accordance with the operation instruction input from the operation input section 12.
The operation input section 12 may also have a touch panel provided integrally with the display section 8, and may output, to the central control section 1, an operation instruction corresponding to a predetermined operation performed by the user on the touch panel.
< code reading process >
Next, the code reading process of the mobile terminal 300 will be described with reference to Figs. 13 to 16.
Fig. 13 is a flowchart showing an example of the operation of the code reading process.
It is assumed that the trace Si imaged in the following code reading process has been stamped, for example, at a predetermined position on a recording medium P such as a postcard (see Fig. 14A). It is also assumed that the same code information Sc is added to each side Sa of the frame part Sw of the trace Si.
As shown in Fig. 13, first, when an imaging instruction is input by a predetermined user operation on the operation input section 12, the imaging control section 4 causes the imaging section 3 to image the trace Si, and the image data generation section 5 generates the image data of the photographed image Ia transferred from the electronic imaging section 3b (step S1; see Fig. 14B).
The image data generation section 5 then outputs the YUV data of the generated photographed image Ia to the memory 2, where it is stored.
To make it easier to image the trace Si in a state closer to a square, the display control section 9 may cause the display section 8 to show a guide display corresponding to the outline of the trace Si.
Next, the image acquisition section 6a of the code processing section 6 acquires from the memory 2 the image data of a predetermined resolution (for example, luminance data) of the photographed image Ia generated by the image data generation section 5 (step S2; see Fig. 15A). The binarization section 6b then applies binarization processing with a predetermined threshold value to the acquired image data of the photographed image Ia, and generates the image data of the binary image Ib (step S3).
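A one-call sketch of the fixed-threshold binarization of step S3 (the threshold value of 128 is an assumption; the text says only "a predetermined threshold value"):

```python
import cv2

def binarize(gray, thresh=128):
    """Binarize a grayscale (luminance) image with a predetermined threshold."""
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    return binary  # an automatic alternative: cv2.THRESH_BINARY | cv2.THRESH_OTSU with thresh=0
```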
Next, the edge detection section 6g detects multiple edges E in the binary image Ib generated by the binarization section 6b, and generates the image data of the edge image Ic (see Fig. 15C) (step S4).
The parallel edge extraction section 6h then applies the parallel edge filter F to the image data of the edge image Ic in sequence from a predetermined position (for example, the left end), and extracts two approximately parallel edges E, E (step S5; see Figs. 16A and 16B).
Next, the reading area determination section 6i determines, in the binary image Ib, the area corresponding to the line connecting the intermediate points of the two approximately parallel edges E, E extracted by the parallel edge extraction section 6h as the reading area A of the code information Sc (step S6).
The information reading section 6e then carries out the reading process of reading the predetermined information from the code information Sc in the reading area A of the binary image Ib determined by the reading area determination section 6i (step S7). Specifically, the information reading section 6e scans in a predetermined direction from a predetermined position (for example, the left end) of the reading area A, which takes the form of a line of predetermined shape, and determines the coordinate positions of the clusters of white pixels with pixel value "1" and the clusters of black pixels with pixel value "0". The information reading section 6e then applies decoding processing to the arrangement of the determined white pixel clusters and black pixel clusters, and reads the original predetermined information (for example, a URL) represented by the code information Sc.
To carry out the reading process of the information reading section 6e more efficiently, a mapping transformation process that shapes the outer contour of the frame part Sw into a square may be applied to the binary image Ib before the reading process.
The information reading section 6e then judges whether the original predetermined information has been read multiple times from the code information Sc by the reading process (step S8).
Here, if it is judged that the original predetermined information has been read multiple times (step S8; Yes), the information reading section 6e judges that the predetermined information has been read appropriately, and the operation processing section 7 executes a predetermined operation (for example, accessing the Internet, or playing back a specified sound or image) in accordance with the predetermined information (for example, a URL) read by the information reading section 6e (step S9).
On the other hand, if it is judged in step S8 that the original predetermined information has not been read multiple times (step S8; No), the CPU of the central control section 1 skips the process of step S9 and ends the code reading process.
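The consistency check of steps S8 and S9 can be sketched as follows; requiring at least two matching reads among the sides follows the description above, while the helper name and the handling of failed reads are assumptions:

```python
from collections import Counter

def accept_reading(decoded_values, min_count=2):
    """decoded_values: strings decoded from the individual sides Sa (None for a
    failed read). Return the value read min_count or more times, else None."""
    counts = Counter(v for v in decoded_values if v is not None)
    if not counts:
        return None
    value, count = counts.most_common(1)[0]
    return value if count >= min_count else None
```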
As described above, according to the mobile terminal 300 of the present embodiment, multiple edges E are detected from the binary image Ib of the photographed image Ia obtained by imaging a trace Si in which code information Sc is added to a frame part Sw of predetermined width surrounding a predetermined watermark image Sp; two approximately parallel edges E, E whose interval is approximately equal to the width of the frame part Sw are extracted from the multiple detected edges E; the reading area A of the code information Sc is determined in the binary image Ib on the basis of these edges; and the predetermined information is read from the code information Sc in this reading area A. By using the edge shape of the frame part Sw of the trace Si stamped on the recording medium P, the reading area A in which the code information Sc exists can therefore be determined appropriately in the binary image Ib. That is, since the code information Sc exists on the inner side of the frame part Sw, determining the two edges E, E of the frame part Sw also allows the reading area A of the code information Sc in the binary image Ib to be determined appropriately. Specifically, the area in the binary image Ib corresponding to the region on the inner side of the two edges E, E, and more specifically the area corresponding to the line connecting the intermediate points of the two edges E, E, is determined as the reading area A, so the reading area A can be determined more appropriately in the photographed image Ia.
Since the original predetermined information is then read from the code information Sc within the reading area A, the predetermined information can be read appropriately from the photographed image Ia.
In addition, when the predetermined information has been read a predetermined number of times or more from the photographed image Ia of a trace Si in which multiple identical pieces of code information Sc have been added to the frame part Sw, the execution of the processing corresponding to that predetermined information is controlled; by embedding multiple pieces of code information Sc in the frame part Sw in advance, the original predetermined information can therefore be read stably from the code information Sc, and the processing corresponding to the read predetermined information can be executed appropriately.
In addition, the present invention is not limited to the above embodiment; various improvements and design changes can be made without departing from the gist of the present invention.
That is, as long as the frame part Sw can be determined from the photographed image Ia and information can be read from the determined frame part, the method of determining the frame part Sw may be any method.
For example, in the above embodiment, a frame template image corresponding to a frame of a predetermined polygon may be prepared, and the frame part Sw may be determined by matching the frame template image against the photographed image Ia.
In addition, in the above embodiment, the area in the photographed image Ia (binary image Ib) corresponding to the line connecting the intermediate points of the two approximately parallel edges E, E is exemplified as the reading area A of the code information Sc; however, this is merely an example and is not limiting, and any area corresponding to the region on the inner side of the two approximately parallel edges E, E may be used, with appropriate changes.
In addition, in the above embodiment the shape of the frame part Sw is square, but this is merely an example and is not limiting; any shape having a predetermined width may be used, with appropriate changes. That is, the shape of the frame part Sw may be, for example, a polygon other than a square, or may be circular.
Furthermore, in the above embodiment the same code information Sc is added to each side Sa of the frame part Sw of the trace Si, but this is merely an example and is not limiting; for example, mutually different pieces of code information Sc may be added, in which case the amount of code information Sc (original predetermined information) embedded in the frame part Sw can be increased.
Furthermore, in the above embodiment the mobile terminal 300 is exemplified as the image processing apparatus, but this is merely an example and is not limiting; any configuration capable of controlling the execution of the reading process of the code information Sc may be adopted, and appropriate changes may be made.
In addition, in the above embodiment, the functions of the acquisition unit, detection unit, extraction unit, determination unit, and reading unit are realized by driving the image acquisition section 6a, the edge detection section 6g, the parallel edge extraction section 6h, the reading area determination section 6i, and the information reading section 6e under the control of the central control section 1 of the mobile terminal 300; however, this is not limiting, and these functions may instead be realized by the CPU of the central control section 1 executing a predetermined program or the like.
That is, a program including an acquisition process routine, a detection process routine, an extraction process routine, a determination process routine, and a reading process routine is stored in a program memory that stores programs. The acquisition process routine may then cause the CPU of the central control section 1 to function as a unit that acquires a print image obtained by imaging a trace Si in which code information Sc, in which predetermined information is encoded as a regular arrangement of pixel clusters, is added to a frame part Sw of predetermined width surrounding a predetermined watermark image Sp. The detection process routine may cause the CPU of the central control section 1 to function as a unit that detects multiple edges E from the acquired print image. The extraction process routine may cause the CPU of the central control section 1 to function as a unit that extracts, from among the multiple detected edges E, two approximately parallel edges E, E whose interval is approximately equal to the width of the frame part Sw. The determination process routine may cause the CPU of the central control section 1 to function as a unit that determines the reading area A of the code information Sc in the print image on the basis of the two extracted edges E, E. The reading process routine may cause the CPU of the central control section 1 to function as a unit that reads the predetermined information from the code information Sc within the determined reading area A.
Similarly, the processing unit may also be realized by the CPU of the central control section 1 executing a predetermined program or the like.
Furthermore, as the computer-readable medium storing the program for executing each of the above processes, a portable recording medium such as a nonvolatile memory (for example, a flash memory) or a CD-ROM can be applied in addition to a ROM, a hard disk, and the like. A carrier wave is also applied as a medium for providing the program data via a predetermined communication line.

Claims (19)

1. An image processing apparatus, characterized by comprising:
an acquisition unit that acquires a print image obtained by imaging a trace to which code information has been added in a frame part;
a determination unit that determines a frame image area in the print image acquired by the above-mentioned acquisition unit, corresponding to the trace of the above-mentioned frame part, as a reading area for reading the above-mentioned code information; and
a reading unit that reads the above-mentioned code information from the above-mentioned reading area determined by the above-mentioned determination unit.
2. The image processing apparatus according to claim 1, characterized in that
the above-mentioned determination unit comprises:
a detection unit that detects multiple edges from the print image acquired by the above-mentioned acquisition unit; and
an extraction unit that extracts, from among the multiple edges detected by the above-mentioned detection unit, two approximately parallel edges whose interval is approximately equal to the width of the above-mentioned frame part, wherein
the reading area of the above-mentioned code information is determined in the above-mentioned print image on the basis of the two edges extracted by the above-mentioned extraction unit, and
the above-mentioned reading unit reads the above-mentioned predetermined information from the above-mentioned code information in the above-mentioned reading area determined by the above-mentioned determination unit.
3. The image processing apparatus according to claim 2, characterized in that
the above-mentioned determination unit determines, as the above-mentioned reading area, an area in the above-mentioned print image corresponding to the region on the inner side sandwiched between the above-mentioned two edges.
4. The image processing apparatus according to claim 2, characterized in that
the above-mentioned code information is added at the approximate center of the above-mentioned frame part in its width direction, along a direction substantially perpendicular to this width direction, and
the above-mentioned determination unit determines, as the above-mentioned reading area, an area in the above-mentioned print image corresponding to a line connecting the intermediate points of the above-mentioned two edges.
5. The image processing apparatus according to claim 1, characterized in that
multiple identical pieces of the above-mentioned code information are added to the above-mentioned frame part, and
the image processing apparatus further comprises a processing unit that, when the above-mentioned predetermined information has been read a predetermined number of times or more by the above-mentioned reading unit, controls execution of processing corresponding to this information.
6. The image processing apparatus according to claim 1, characterized in that
the above-mentioned determination unit comprises:
an estimation unit that estimates, in the print image acquired by the above-mentioned acquisition unit, a predetermined number of straight lines corresponding to the number of corners of the frame of the above-mentioned polygon, the straight lines forming the outer contour of the above-mentioned frame part; and
a detection unit that detects the above-mentioned frame part of the above-mentioned print image formed by the predetermined number of straight lines estimated by the above-mentioned estimation unit.
7. The image processing apparatus according to claim 6, characterized in that
the above-mentioned estimation unit comprises:
a contour determination unit that determines a polygonal region corresponding to the outer contour of the above-mentioned frame part of the print image acquired by the above-mentioned acquisition unit; and
a straight line determination unit that determines the predetermined number of straight lines forming the outer contour of the above-mentioned frame part on the basis of the positions of the multiple vertices forming the polygonal region determined by the above-mentioned contour determination unit.
8. The image processing apparatus according to claim 6, characterized in that
the above-mentioned straight line determination unit determines the predetermined number of straight lines forming the outer contour of the above-mentioned frame part from among multiple straight lines passing through any two vertices of the polygonal region determined by the above-mentioned contour determination unit, on the basis of at least one of the number of pixels of each straight line overlapping the region of the above-mentioned polygon and the relative relationship between adjacent straight lines.
9. The image processing apparatus according to claim 8, characterized in that
the above-mentioned straight line determination unit further determines, from among the multiple straight lines passing through any two vertices of the polygonal region determined by the above-mentioned contour determination unit, a straight line for which the number of pixels overlapping the region forming the above-mentioned polygon is equal to or greater than a predetermined value as a straight line forming the outer contour of the above-mentioned frame part.
10. The image processing apparatus according to claim 8, characterized in that
the above-mentioned straight line determination unit further determines, from among the multiple straight lines passing through any two vertices of the polygonal region determined by the above-mentioned contour determination unit, straight lines whose angle with an adjacent straight line is approximately equal to an interior angle of the frame of the above-mentioned polygon, adds a weight to those pixels of each such straight line that overlap the region of the above-mentioned polygon, calculates an evaluation value for each straight line, and determines straight lines with high calculated evaluation values as straight lines forming the outer contour of the above-mentioned frame part.
11. The image processing apparatus according to claim 7, characterized in that
the above-mentioned detection unit comprises a vertex determination unit that determines, as vertices of the above-mentioned frame part, the predetermined number of points at which the predetermined number of straight lines determined by the above-mentioned straight line determination unit intersect, wherein
the above-mentioned frame part of the above-mentioned print image is detected on the basis of the predetermined number of vertices determined by the above-mentioned vertex determination unit.
12. The image processing apparatus according to claim 11, characterized in that
in the above-mentioned trace, a mark of predetermined shape is formed at a corner of the frame of the above-mentioned polygon, and
the above-mentioned vertex determination unit further determines the position of a mark image corresponding to the above-mentioned mark in the print image acquired by the above-mentioned acquisition unit, and determines the vertices of the above-mentioned frame part on the basis of the position of the determined mark image.
13. The image processing apparatus according to claim 12, characterized in that
the image processing apparatus further comprises a first generation unit that applies mapping transformation processing to the print image acquired by the above-mentioned acquisition unit on the basis of the predetermined number of vertices determined by the above-mentioned vertex determination unit, and generates a polygonal print image, and
the above-mentioned detection unit detects the frame part corresponding to the frame of the polygonal print image generated by the above-mentioned first generation unit.
14. The image processing apparatus according to claim 6, characterized in that
the image processing apparatus further comprises a reading unit that reads predetermined information from the above-mentioned code information in the above-mentioned frame part of the above-mentioned print image detected by the above-mentioned detection unit.
15. The image processing apparatus according to claim 1, characterized in that
multiple pieces of the above-mentioned code information are added to the frame of the polygon of the above-mentioned trace, and
the image processing apparatus comprises a processing unit that, when the above-mentioned predetermined information has been read a predetermined number of times or more by the above-mentioned reading unit, controls execution of processing corresponding to this predetermined information.
16. The image processing apparatus according to claim 6, characterized in that
the image processing apparatus further comprises a pixel count reduction unit that applies, to the print image acquired by the above-mentioned acquisition unit, processing that relatively reduces the number of pixels present in the background of this print image, and
the above-mentioned estimation unit estimates the above-mentioned predetermined number of straight lines in the print image after the processing by the above-mentioned pixel count reduction unit.
17. The image processing apparatus according to claim 1, characterized in that
the above-mentioned frame is approximately circular,
the image processing apparatus comprises a mark determination unit that determines a mark formed in the above-mentioned ring part in the image acquired by the above-mentioned acquisition unit, and
the above-mentioned reading unit reads the above-mentioned code information on the basis of the mark determined by the above-mentioned mark determination unit.
18. The image processing apparatus according to claim 17, characterized in that
the image processing apparatus further comprises a second generation unit that applies mapping transformation processing to the image acquired by the above-mentioned acquisition unit, with the position of the mark determined by the above-mentioned mark determination unit as a reference, to generate an image in which the above-mentioned ring part is transformed into a true circular shape.
19. An image processing method using an image processing apparatus, characterized by comprising:
an acquisition step of acquiring a print image obtained by imaging a trace to which code information has been added in a frame part;
a determination step of determining a frame image area in the acquired print image, corresponding to the trace of the above-mentioned frame part, as a reading area for reading the above-mentioned code information; and
a reading step of reading the above-mentioned code information from the determined above-mentioned reading area.
CN201410387805.6A 2013-08-08 2014-08-08 Image processing apparatus and image processing method Active CN104346613B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2013-164741 2013-08-08
JP2013164741A JP5858012B2 (en) 2013-08-08 2013-08-08 Image processing apparatus, image processing method, and program
JP2013164747A JP5862623B2 (en) 2013-08-08 2013-08-08 Image processing apparatus, image processing method, and program
JP2013-164747 2013-08-08

Publications (2)

Publication Number Publication Date
CN104346613A true CN104346613A (en) 2015-02-11
CN104346613B CN104346613B (en) 2018-06-15

Family

ID=52502184

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410387805.6A Active CN104346613B (en) 2013-08-08 2014-08-08 Image processing apparatus and image processing method

Country Status (1)

Country Link
CN (1) CN104346613B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107644183A (en) * 2017-09-01 2018-01-30 福建联迪商用设备有限公司 One-dimension code CMOS images the coding/decoding method and terminal of engine
CN108460381A (en) * 2018-03-13 2018-08-28 南京邮电大学 Invoice reimbursement Information locating based on image recognition and intercept method
CN110313183A (en) * 2017-02-23 2019-10-08 奈飞公司 Iterative technique for being encoded to video content
US11153585B2 (en) 2017-02-23 2021-10-19 Netflix, Inc. Optimizing encoding operations when generating encoded versions of a media title
US11166034B2 (en) 2017-02-23 2021-11-02 Netflix, Inc. Comparing video encoders/decoders using shot-based encoding and a perceptual visual quality metric
US11444999B2 (en) 2017-02-23 2022-09-13 Netflix, Inc. Iterative techniques for generating multiple encoded versions of a media title
US11910039B2 (en) 2017-07-18 2024-02-20 Netflix, Inc. Encoding technique for optimizing distortion and bitrate

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1334543A (en) * 2000-07-25 2002-02-06 佳能株式会社 Bills processing method and device
CN1521656A (en) * 2003-02-10 2004-08-18 吴建明 Novel electronic signature stamp technique
US20090221880A1 (en) * 2002-08-20 2009-09-03 Welch Allyn, Inc. Diagnostic instrument workstation

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1334543A (en) * 2000-07-25 2002-02-06 佳能株式会社 Bills processing method and device
US20090221880A1 (en) * 2002-08-20 2009-09-03 Welch Allyn, Inc. Diagnostic instrument workstation
CN1521656A (en) * 2003-02-10 2004-08-18 吴建明 Novel electronic signature stamp technique

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11184621B2 (en) 2017-02-23 2021-11-23 Netflix, Inc. Techniques for selecting resolutions for encoding different shot sequences
CN110313183A (en) * 2017-02-23 2019-10-08 奈飞公司 Iterative technique for being encoded to video content
US11153585B2 (en) 2017-02-23 2021-10-19 Netflix, Inc. Optimizing encoding operations when generating encoded versions of a media title
US11166034B2 (en) 2017-02-23 2021-11-02 Netflix, Inc. Comparing video encoders/decoders using shot-based encoding and a perceptual visual quality metric
CN110313183B (en) * 2017-02-23 2021-11-12 奈飞公司 Iterative techniques for encoding video content
US11444999B2 (en) 2017-02-23 2022-09-13 Netflix, Inc. Iterative techniques for generating multiple encoded versions of a media title
US11758146B2 (en) 2017-02-23 2023-09-12 Netflix, Inc. Techniques for positioning key frames within encoded video sequences
US11818375B2 (en) 2017-02-23 2023-11-14 Netflix, Inc. Optimizing encoding operations when generating encoded versions of a media title
US11870945B2 (en) 2017-02-23 2024-01-09 Netflix, Inc. Comparing video encoders/decoders using shot-based encoding and a perceptual visual quality metric
US11871002B2 (en) 2017-02-23 2024-01-09 Netflix, Inc. Iterative techniques for encoding video content
US11910039B2 (en) 2017-07-18 2024-02-20 Netflix, Inc. Encoding technique for optimizing distortion and bitrate
CN107644183A (en) * 2017-09-01 2018-01-30 福建联迪商用设备有限公司 One-dimension code CMOS images the coding/decoding method and terminal of engine
CN108460381A (en) * 2018-03-13 2018-08-28 南京邮电大学 Invoice reimbursement Information locating based on image recognition and intercept method

Also Published As

Publication number Publication date
CN104346613B (en) 2018-06-15

Similar Documents

Publication Publication Date Title
CN104346613A (en) Image processing apparatus and image processing method
US9357107B2 (en) Image-processing device, image-capturing device, image-processing method, and recording medium
US9445069B2 (en) Image-processing device, image-capturing device, image-processing method, and recording medium
US9633417B2 (en) Image processing device and image capture device performing restoration processing using a restoration filter based on a point spread function
US9407884B2 (en) Image pickup apparatus, control method therefore and storage medium employing phase difference pixels
CN105103534B (en) Photographic device and calibration method
US9866750B2 (en) Image processing device, imaging device, image processing method, and image processing program
US9196029B2 (en) Threshold setting device for setting threshold used in binarization process, object detection device, threshold setting method, and computer readable storage medium
US9892495B2 (en) Image processing device, imaging device, image processing method, and image processing program
US9619871B2 (en) Image processing device, imaging apparatus, image processing method, and program
US9829676B2 (en) Imaging device and focusing control method
US9881362B2 (en) Image processing device, image-capturing device, image processing method, and program
JP2008283649A (en) Image processing method, image region detecting method, image processing program, image region detection program, image processing apparatus, and image region detecting apparatus
US9633418B2 (en) Image processing device, imaging apparatus, image processing method, and program
US20160381285A1 (en) Imaging device and focusing control method
US9361500B2 (en) Image processing apparatus, image processing method, and recording medium
US10778903B2 (en) Imaging apparatus, imaging method, and program
US10567647B2 (en) Image processing apparatus and image processing method
JPWO2014098143A1 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP5702891B2 (en) Image processing apparatus, imaging apparatus, computer, image processing method and program
JP6217225B2 (en) Image collation device, image collation method and program
US20150103216A1 (en) Image processing device and method and imaging apparatus
CN104871532A (en) Image capture device and operation control method thereof
JP5858012B2 (en) Image processing apparatus, image processing method, and program
JP6286919B2 (en) Image processing apparatus, image processing method, and program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant