CN108109108A - A kind of image split-joint method and device based on cosine similarity adaptive algorithm - Google Patents
- Publication number: CN108109108A (application CN201611056684.2A)
- Authority
- CN
- China
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T3/4038 — Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images (G—PHYSICS; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL)
- G06F18/22 — Matching criteria, e.g. proximity measures (G06F—ELECTRIC DIGITAL DATA PROCESSING; G06F18/00—Pattern recognition; G06F18/20—Analysing)
- G06T2207/10004 — Still image; Photographic image (G06T2207/00—Indexing scheme for image analysis or image enhancement; G06T2207/10—Image acquisition modality)
- G06T2207/10024 — Color image (G06T2207/10—Image acquisition modality)
Abstract
An embodiment of the present invention provides an image stitching method and device based on a cosine similarity adaptive algorithm, relating to the field of Video Networking technology. The method includes: obtaining a digital image of each fragment of an object; performing grayscale processing on each digital image to obtain grayscale images; binarizing each grayscale image to obtain binary images; obtaining edge images from the binary images; taking each edge image in turn as a first edge image and obtaining second edge images whose pixel counts are consistent with the first edge image; for each first edge image, calculating the cosine similarity between the first edge image and each second edge image; and, for the first edge image and second edge image with the largest cosine similarity, stitching the original image containing the first edge image to the digital image containing the second edge image. The method improves the accuracy of image stitching and reduces the stitching time.
Description
Technical field
The present invention relates to the field of Video Networking technology, and in particular to an image stitching method and device based on a cosine similarity adaptive algorithm.
Background technology
Image stitching is the technique of combining several images that have overlapping parts into one seamless high-resolution image. It can be used to stitch together fragments of cultural relics, or fragments of precious documents and photographs.
The means currently used for image stitching mainly subdivide the image into a local mesh and, starting from the local boundary and based on a locally optimal solution, use local corner points to identify whether the effect of a best match has been achieved.
The inventors found that, in the prior art, a large amount of cumbersome calculation is generally required to stitch a single image, so that during image stitching the restoration speed is slow and the precision is low.
Summary of the invention
In view of the above problems, the embodiments of the present invention are proposed in order to provide an image stitching method based on a cosine similarity adaptive algorithm, and a corresponding image stitching device based on a cosine similarity adaptive algorithm, which overcome the above problems or at least partially solve them.
To solve the above problems, an embodiment of the invention discloses an image stitching method based on a cosine similarity adaptive algorithm, characterized by including:
obtaining a digital image of each fragment of an object;
performing grayscale processing on each digital image to obtain each grayscale image;
binarizing each grayscale image to obtain binary images;
obtaining each edge image from each binary image;
taking each edge image in turn as a first edge image, and obtaining, from the other edge images besides the first edge image, second edge images whose pixel counts are consistent with the first edge image;
for each first edge image, when at least one second edge image exists, calculating the cosine similarity between the first edge image and each second edge image;
for the first edge image and second edge image with the largest cosine similarity, stitching the original image containing the first edge image to the digital image containing the second edge image.
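The steps above can be sketched end to end. This is a minimal illustration, not the patented implementation: the grayscale weights, the binarization threshold, and the choice of the image border as the "edge image" are all assumptions made for the sake of a runnable example.

```python
import numpy as np

def to_gray(rgb):
    # Standard luminance weights; the patent does not fix a grayscale formula.
    return rgb @ np.array([0.299, 0.587, 0.114])

def binarize(gray, thresh=128):
    # Fixed threshold, an assumption; the patent only requires a binary image.
    return (gray >= thresh).astype(np.uint8)

def edge_vectors(binary):
    # The four border strips (left, right, top, bottom) of the binary fragment,
    # in the spirit of Formulas One to Four.
    return {
        "left":   binary[:, 0].astype(float),
        "right":  binary[:, -1].astype(float),
        "top":    binary[0, :].astype(float),
        "bottom": binary[-1, :].astype(float),
    }

def cosine_similarity(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom else 0.0

def best_match(first_edge, candidate_edges):
    # Only candidates with the same pixel count as the first edge are compared,
    # as required by the claimed method; returns (similarity, index) or None.
    scored = [(cosine_similarity(first_edge, c), i)
              for i, c in enumerate(candidate_edges) if len(c) == len(first_edge)]
    return max(scored) if scored else None
```

The fragment pair whose edges score the highest similarity would then be passed to the stitching step.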
Preferably, in the above image stitching method, the step of calculating, for each first edge image, when at least one second edge image exists, the cosine similarity between the first edge image and each second edge image includes:
obtaining the first edge image and each second edge image;
calculating each cosine similarity between the first edge and each second edge according to the first edge image and each second edge image.
Preferably, in the above image stitching method, the step of obtaining each edge image from each binary image includes:
obtaining each edge image from each binary image according to Formula One, Formula Two, Formula Three, and Formula Four;
the formulas for obtaining the edge images are as follows:
Formula One: A_i = [I_i : 1], extracting the left-edge image, where A_i represents the pixel matrix of the left-edge image and I_i represents the gray value of the i-th edge pixel; the gray value represents the brightness value of a pixel in the binary image;
Formula Two: B_j = [J_j : 1], extracting the right-edge image, where B_j represents the pixel matrix of the right-edge image and J_j represents the gray value of the j-th edge pixel;
Formula Three: C_k = [1 : K_k], extracting the top-edge image, where C_k represents the pixel matrix of the top-edge image and K_k represents the gray value of the k-th edge pixel;
Formula Four: D_h = [1 : H_h], extracting the bottom-edge image, where D_h represents the pixel matrix of the bottom-edge image and H_h represents the gray value of the h-th edge pixel.
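The bracket notation in Formulas One to Four is terse. One plausible literal reading, assumed here, is that each edge pixel's gray value is paired with a constant 1 in a two-column pixel matrix, taken along the corresponding border of the binary image:

```python
import numpy as np

def left_edge_matrix(binary):
    # Formula One read literally: each row pairs the gray value I_i of the
    # i-th left-edge pixel with a constant 1, i.e. A_i = [I_i : 1].
    col = binary[:, 0].astype(float)
    return np.stack([col, np.ones_like(col)], axis=1)

def right_edge_matrix(binary):
    # Formula Two: B_j = [J_j : 1] over the rightmost column.
    col = binary[:, -1].astype(float)
    return np.stack([col, np.ones_like(col)], axis=1)

def top_edge_matrix(binary):
    # Formula Three: C_k = [1 : K_k] over the top row (constant first).
    row = binary[0, :].astype(float)
    return np.stack([np.ones_like(row), row], axis=1)

def bottom_edge_matrix(binary):
    # Formula Four: D_h = [1 : H_h] over the bottom row.
    row = binary[-1, :].astype(float)
    return np.stack([np.ones_like(row), row], axis=1)
```

The exact meaning of the colon in the patent's bracket notation is not spelled out, so this pairing is an interpretation rather than the definitive construction.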
Preferably, in the above image stitching method, the step of calculating each cosine similarity between the first edge and each second edge according to the first edge image and each second edge image includes:
representing the first edge image by a pixel matrix A and the second edge image by a pixel matrix B', with B denoting the transpose of B'; the first edge image has the same pixel count as the second edge image;
calculating the cosine similarity between the first edge image and the second edge image according to Formula Five, in which Similarity denotes the cosine similarity, cos(θ) denotes the cosine of the angle θ, AB denotes the product of the two matrices, and ‖A‖‖B‖ denotes the product of the norm of matrix A and the norm of matrix B;
where Formula Five is:
Similarity = cos(θ) = AB / (‖A‖‖B‖)
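Formula Five is the ordinary cosine similarity, specialized to the case where the two edge images have equal pixel counts: if A is a 1×N row vector and B = B'ᵀ is N×1, the matrix product AB is a scalar. A minimal sketch (treating each edge image as a flat vector is an assumption):

```python
import numpy as np

def formula_five(first_edge, second_edge):
    # A is the first edge image as a 1xN row vector; B' is the second edge
    # image, also 1xN (pixel counts match by construction); B = B'.T is Nx1,
    # so AB is a scalar and Similarity = cos(theta) = AB / (||A|| ||B||).
    A = np.asarray(first_edge, dtype=float).reshape(1, -1)
    B = np.asarray(second_edge, dtype=float).reshape(1, -1).T
    den = np.linalg.norm(A) * np.linalg.norm(B)
    return (A @ B).item() / den if den else 0.0
```

The zero-denominator guard (both edges all zero) is a defensive addition not discussed in the text.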
According to another aspect of the present invention, an image stitching device based on a cosine similarity adaptive algorithm is provided, characterized by including:
an image digitization module, for obtaining a digital image of each fragment of an object;
an image grayscale processing module, for performing grayscale processing on each digital image to obtain each grayscale image;
a binarization module, for binarizing each grayscale image to obtain binary images;
an edge acquisition module, for obtaining each edge image from each binary image;
an edge matching module, for taking each edge image in turn as a first edge image and obtaining, from the other edge images besides the first edge image, second edge images whose pixel counts are consistent with the first edge image;
a cosine similarity module, for calculating, for each first edge image, when at least one second edge image exists, the cosine similarity between the first edge image and each second edge image;
a matching module, for stitching, for the first edge image and second edge image with the largest cosine similarity, the original image containing the first edge image to the digital image containing the second edge image.
Preferably, in the above image stitching device, the cosine similarity module includes:
an edge-pixel-matrix acquisition module, for obtaining the first edge image and each second edge image;
a cosine similarity calculation module, for obtaining each cosine similarity between the first edge and each second edge according to the first edge image and each second edge image.
Preferably, in the above image stitching device, the edge acquisition module includes:
a left-edge extraction module, for extracting the left-edge image according to Formula One;
a right-edge extraction module, for extracting the right-edge image according to Formula Two;
a top-edge extraction module, for extracting the top-edge image according to Formula Three;
a bottom-edge extraction module, for extracting the bottom-edge image according to Formula Four;
where Formulas One to Four are as given above, with A_i, B_j, C_k, and D_h the pixel matrices of the left-, right-, top-, and bottom-edge images respectively, and I_i, J_j, K_k, and H_h the gray values (brightness values of pixels in the binary image) of the i-th, j-th, k-th, and h-th edge pixels.
Preferably, in the above image stitching device, the cosine similarity calculation module includes:
a cosine similarity calculation submodule, for calculating the cosine similarity between the first edge image and the second edge image according to Formula Five;
where Formula Five is:
Similarity = cos(θ) = AB / (‖A‖‖B‖)
in which A represents the pixel matrix of the first edge image and B represents the transpose of the pixel matrix of the second edge image.
The embodiments of the present invention have the following advantages: they apply the characteristics of Video Networking and use the characteristics of cosine similarity to calculate the common edge shared by two fragment images, so that the accuracy of image stitching is improved; and because the image stitching process performs the cosine-similarity computation for large images in parallel, the stitching time is reduced.
Description of the drawings
Fig. 1 is a networking schematic diagram of the Video Networking of the present invention;
Fig. 2 is a hardware structure diagram of a node server of the present invention;
Fig. 3 is a hardware structure diagram of an access switch of the present invention;
Fig. 4 is a hardware structure diagram of an Ethernet protocol-conversion gateway of the present invention;
Fig. 5 is a flow chart of the steps of an image stitching method based on a cosine similarity adaptive algorithm of the present invention;
Fig. 6 is a flow chart of the steps of another image stitching method based on a cosine similarity adaptive algorithm of the present invention;
Fig. 6a-1 to Fig. 6a-19 are schematic diagrams of the digital images of the article fragments to be stitched in Embodiment Two of the present invention;
Fig. 6b is a grayscale image represented by gray values in Embodiment Two of the present invention;
Fig. 6c is a binary image of a fragment in Embodiment Two of the present invention;
Fig. 6d is the image obtained by stitching in Embodiment Two of the present invention;
Fig. 7 is an image stitching device based on a cosine similarity adaptive algorithm of the present invention;
Fig. 8 is another image stitching device based on a cosine similarity adaptive algorithm of the present invention.
Specific embodiment
In order to make the above objectives, features, and advantages of the present invention clearer and easier to understand, the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
Video Networking is an important milestone in network development. It is a real-time network that can realize real-time transmission of high-definition video, pushing numerous Internet applications toward HD video: high definition, face to face.
Using real-time HD video switching technology, Video Networking integrates the required services on one network platform, such as HD video conferencing, video monitoring, emergency command, digital broadcast television, time-shifted TV, web-based instruction, intelligent on-site monitoring and analysis, live broadcasting, video on demand (VOD), TV mail, personal video recording (PVR), intranet (self-managed) channels, intelligent video broadcast control, and information publishing: dozens of video, voice, picture, text, communication, and data services in all, incorporated into a single system platform and realizing HD-quality video playback through a TV or computer.
To help those skilled in the art better understand the embodiments of the present invention, Video Networking is introduced below. Some of the technologies applied in Video Networking are as follows:
Network Technology
The network-technology innovation of Video Networking improves traditional Ethernet to face the potentially huge video traffic on the network. Unlike simple network packet switching (Packet Switching) or circuit switching (Circuit Switching), Video Networking technology uses packet switching to satisfy streaming demands. Video Networking technology possesses the flexibility, simplicity, and low price of packet switching while also providing the quality and security guarantees of circuit switching, realizing the seamless connection of a whole-network switched virtual circuit with the data format.
Switching Technology (Switching Technology)
Video Networking uses the two advantages of Ethernet, asynchrony and packet switching, and eliminates Ethernet's defects on the premise of full compatibility. It possesses end-to-end seamless connection of the whole network, connects user terminals directly, and directly carries IP data packets. User data requires no format conversion anywhere in the network. Video Networking is a higher-level form of Ethernet; it is a real-time exchange platform that can realize the whole-network, large-scale, high-definition real-time video transmission that the Internet cannot currently realize, pushing numerous Internet video applications toward high definition and unification.
Server Technology (Server Technology)
The server technology of Video Networking and the unified video platform differs from that of a traditional server: its streaming-media transmission is built on a connection-oriented basis; its data-handling capacity is independent of traffic and communication time; and a single network layer can contain both signaling and data transmission. For voice and video services, the complexity of streaming-media processing on Video Networking and the unified video platform is much lower than that of data processing, and efficiency is improved by a hundred times or more over a traditional server.
Storage Technology (Storage Technology)
To adapt to media content of vast capacity and super-high traffic, the ultra-high-speed storage technology of the unified video platform employs a state-of-the-art real-time operating system. The program information in a server instruction is mapped to a specific hard-disk space; media content no longer passes through the server but is delivered directly to the user terminal in an instant, and the typical user waiting time is less than 0.2 seconds. The optimized sector distribution greatly reduces the mechanical movement of hard-disk head seeking; resource consumption is only 20% of that of an IP Internet of the same grade, yet the concurrent traffic generated is more than 3 times that of a traditional disk array, and the overall efficiency is improved by 10 times or more.
Network Security Technology (Network Security Technology)
The structural design of Video Networking, through independent service permission control each time, complete isolation of equipment and user data, and similar means, thoroughly eradicates from the structure itself the network security problems that plague the Internet. It generally needs no antivirus programs or firewalls, prevents attacks by hackers and viruses, and provides users with a structurally carefree secure network.
Service Innovation Technology (Service Innovation Technology)
The unified video platform merges services with transmission: whether a single user, a private-line user, or the sum total of a network, all are only one automatic connection. A user terminal, set-top box, or PC attaches directly to the unified video platform and obtains rich and colorful multimedia video services of various forms. The unified video platform uses a "menu-style" table schema to replace traditional complicated application programs, can realize complicated applications with very little code, and realizes "endless" new-service innovation.
The networking of Video Networking is as follows:
Video Networking is a centrally controlled network structure. The network can be of the tree, star, ring, or another such type, but on this basis a centralized control node is needed in the network to control the whole network.
As shown in Fig. 1, Video Networking is divided into two parts: the access network and the metropolitan area network (MAN).
The equipment of the access-network part can mainly be divided into 3 classes: node servers, access switches, and terminals (including various set-top boxes, encoding boards, memories, etc.). A node server is connected with access switches; an access switch can be connected with multiple terminals and can be connected to Ethernet.
The node server is the node that performs centralized control functions in the access network; it can control the access switches and terminals. The node server can be directly connected with an access switch, or directly connected with a terminal.
Similarly, the equipment of the metropolitan-area-network part can also be divided into 3 classes: metropolitan area servers, node switches, and node servers. A metropolitan area server is connected with node switches, and a node switch can be connected with multiple node servers.
The node server here is the node server of the access-network part; that is, a node server belongs both to the access-network part and to the metropolitan-area-network part.
The metropolitan area server is the node that performs centralized control functions in the metropolitan area network; it can control the node switches and node servers. A metropolitan area server can be directly connected to a node switch, or directly connected to a node server.
It can be seen that the whole Video Networking network is a layered, centrally controlled network structure, and the network controlled under a node server or metropolitan area server can be of various structures such as tree, star, or ring.
Figuratively speaking, the access-network part can form a unified video platform (the part in the dashed box), and multiple unified video platforms can form the Video Networking; each unified video platform can be interconnected through metropolitan-area and wide-area Video Networking.
Classification of Video Networking equipment
1.1 In the embodiments of the present invention, the equipment in the Video Networking can mainly be divided into 3 classes: servers, switches (including Ethernet protocol-conversion gateways), and terminals (including various set-top boxes, encoding boards, memories, etc.). The Video Networking as a whole can be divided into a metropolitan area network (or national network, worldwide network, etc.) and an access network.
1.2 The equipment of the access-network part can mainly be divided into 3 classes: node servers, access switches (including Ethernet protocol-conversion gateways), and terminals (including various set-top boxes, encoding boards, memories, etc.).
The particular hardware structure of each access-network device is:
Node server:
As shown in Fig. 2, it mainly includes a network interface module 201, a switching engine module 202, a CPU module 203, and a disk array module 204.
Packets coming in from the network interface module 201, the CPU module 203, and the disk array module 204 all enter the switching engine module 202. The switching engine module 202 performs a lookup in the address table 205 on each incoming packet to obtain the packet's routing information, and stores the packet into the queue of the corresponding packet buffer 206 according to that routing information; if the queue of the packet buffer 206 is nearly full, the packet is discarded. The switching engine module 202 polls all packet-buffer queues and forwards from a queue if the following conditions are met: 1) the port's send buffer is not full; 2) the queue's packet counter is greater than zero. The disk array module 204 mainly realizes control over the hard disks, including initialization, reading, and writing; the CPU module 203 is mainly responsible for protocol processing with the access switches and terminals (not shown), for configuring the address table 205 (including the downlink-protocol-packet address table, the uplink-protocol-packet address table, and the data-packet address table), and for configuring the disk array module 204.
Access switch:
As shown in Fig. 3, it mainly includes a network interface module (a downstream network interface module 301 and an upstream network interface module 302), a switching engine module 303, and a CPU module 304.
A packet (upstream data) coming in from the downstream network interface module 301 enters the packet detection module 305. The packet detection module 305 detects whether the destination address (DA), source address (SA), packet type, and packet length of the packet meet the requirements; if so, a corresponding stream identifier (stream-id) is allocated and the packet enters the switching engine module 303, otherwise it is discarded. A packet (downstream data) coming in from the upstream network interface module 302 enters the switching engine module 303; a data packet coming in from the CPU module 304 enters the switching engine module 303. The switching engine module 303 performs a lookup in the address table 306 on each incoming packet to obtain the packet's routing information. If a packet entering the switching engine module 303 is going from a downstream network interface to an upstream network interface, it is stored into the queue of the corresponding packet buffer 307 in association with its stream identifier (stream-id); if the queue of that packet buffer 307 is nearly full, it is discarded. If a packet entering the switching engine module 303 is not going from a downstream network interface to an upstream network interface, it is stored into the data-packet queue of the corresponding packet buffer 307 according to the packet's routing information; if the queue of that packet buffer 307 is nearly full, it is discarded.
The switching engine module 303 polls all packet-buffer queues, in two cases in the embodiments of the present invention:
If the queue is going from a downstream network interface to an upstream network interface, forwarding occurs when the following conditions are met: 1) the port's send buffer is not full; 2) the queue's packet counter is greater than zero; 3) a token generated by the rate control module is obtained.
If the queue is not going from a downstream network interface to an upstream network interface, forwarding occurs when the following conditions are met: 1) the port's send buffer is not full; 2) the queue's packet counter is greater than zero.
The rate control module 308 is configured by the CPU module 304, and at programmable intervals generates tokens for all packet-buffer queues going from downstream network interfaces to upstream network interfaces, in order to control the code rate of upstream forwarding.
The CPU module 304 is mainly responsible for protocol processing with the node server, for configuring the address table 306, and for configuring the rate control module 308.
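The two polling cases above differ only in the token condition. A small sketch of the forwarding decision, with hypothetical field names not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Queue:
    packets: int      # the queue's packet counter
    upstream: bool    # True if the queue goes downstream-to-upstream
    tokens: int = 0   # tokens granted by the rate-control module

def may_forward(q: Queue, port_tx_buffer_full: bool) -> bool:
    # Conditions from the access-switch polling loop: the port's send buffer
    # must not be full and the queue must be non-empty; upstream-bound queues
    # additionally need a token from the rate-control module.
    if port_tx_buffer_full or q.packets == 0:
        return False
    if q.upstream:
        return q.tokens > 0
    return True
```

Token generation itself happens asynchronously at programmable intervals and is not modeled here.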
Ethernet protocol-conversion gateway:
As shown in Fig. 4, it mainly includes a network interface module (a downstream network interface module 401 and an upstream network interface module 402), a switching engine module 403, a CPU module 404, a packet detection module 405, a rate control module 408, an address table 406, a packet buffer 407, a MAC adding module 409, and a MAC removing module 410.
A data packet coming in from the downstream network interface module 401 enters the packet detection module 405. The packet detection module 405 detects whether the Ethernet MAC DA, Ethernet MAC SA, Ethernet length or frame type, Video Networking destination address DA, Video Networking source address SA, Video Networking packet type, and packet length of the data packet meet the requirements; if so, a corresponding stream identifier (stream-id) is allocated, then the MAC DA, MAC SA, and length or frame type (2 bytes) are removed by the MAC removing module 410 and the packet enters the corresponding order buffer; otherwise it is discarded.
The downstream network interface module 401 detects the send buffer of the port; if a packet is present, it learns the Ethernet MAC DA of the corresponding terminal from the packet's Video Networking destination address DA, adds the terminal's Ethernet MAC DA, the MAC SA of the Ethernet protocol-conversion gateway, and the Ethernet length or frame type, and sends the packet.
The functions of the other modules in the Ethernet protocol-conversion gateway are similar to those of the access switch.
Terminal:
It mainly includes a network interface module, a service processing module, and a CPU module. For example, a set-top box mainly includes a network interface module, a video/audio encoding-decoding engine module, and a CPU module; an encoding board mainly includes a network interface module, a video encoding engine module, and a CPU module; a memory mainly includes a network interface module, a CPU module, and a disk array module.
1.3 The equipment of the metropolitan-area-network part can mainly be divided into 2 classes: node switches and metropolitan area servers. A node switch mainly includes a network interface module, a switching engine module, and a CPU module; a metropolitan area server mainly includes a network interface module, a switching engine module, and a CPU module.
2. Video Networking packet definitions
2.1 Access-network packet definition
A packet of the access network mainly includes the following parts: destination address (DA), source address (SA), reserved bytes, payload (PDU), and CRC.
As shown in the following table, a packet of the access network mainly includes the following parts:
DA | SA | Reserved | Payload | CRC |
Where:
The destination address (DA) consists of 8 bytes; the first byte represents the type of the packet (such as various protocol packets, multicast packets, unicast packets, etc., with up to 256 possibilities), the second through sixth bytes are the metropolitan-area network address, and the seventh and eighth bytes are the access-network address.
The source address (SA) also consists of 8 bytes and is defined identically to the destination address (DA).
The reserved bytes consist of 2 bytes.
The payload part has a different length according to the type of datagram: 64 bytes for various protocol packets, and 32 + 1024 = 1056 bytes for single-group unicast packets; it is of course not restricted to these 2 kinds.
The CRC consists of 4 bytes, and its computation method follows the standard Ethernet CRC algorithm.
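Under the field sizes given in section 2.1, an access-network frame can be sliced as follows. The field names and the byte order of the CRC field are assumptions made for illustration, not definitions from the patent:

```python
import struct

def parse_access_packet(frame: bytes) -> dict:
    # Layout per section 2.1: DA (8 bytes: byte 0 = packet type, bytes 1-5 the
    # metro-area address, bytes 6-7 the access-net address), SA (8 bytes, same
    # layout as DA), 2 reserved bytes, a variable-length payload, and a 4-byte
    # CRC (read here as big-endian, an assumption).
    if len(frame) < 8 + 8 + 2 + 4:
        raise ValueError("frame shorter than the fixed header and trailer")
    da, sa = frame[0:8], frame[8:16]
    return {
        "type": da[0],
        "da_metro": da[1:6],
        "da_access": da[6:8],
        "sa": sa,
        "payload": frame[18:-4],
        "crc": struct.unpack(">I", frame[-4:])[0],
    }
```

Validating the CRC against the standard Ethernet polynomial is left out, since the text only names the algorithm.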
2.2 Metropolitan Area Network (MAN) packet definitions
The topology of Metropolitan Area Network (MAN) is pattern, may there is 2 kinds, connection even of more than two kinds, i.e. node switching between two equipment
2 kinds can be all can exceed that between machine and node server, node switch and node switch, node switch and node server
Connection.But the metropolitan area net address of metropolitan area network equipment is unique, is closed to accurately describe the connection between metropolitan area network equipment
System, introduces parameter in embodiments of the present invention:Label uniquely describes a metropolitan area network equipment.
The label in this specification is defined similarly to an MPLS (Multi-Protocol Label Switching) label. Suppose there are two connections between device A and device B; then a packet traveling from device A to device B has 2 labels, and a packet from device B to device A likewise has 2 labels. Labels are divided into incoming labels and outgoing labels: suppose the label with which a packet enters device A (the incoming label) is 0x0000; the label it carries when leaving device A (the outgoing label) may become 0x0001. The admission flow of the metropolitan area network is a network-entry process under centralized control, which means that both address allocation and label allocation in the metropolitan area network are dictated by the metropolitan server, with node switches and node servers executing passively. This differs from MPLS label distribution, which is the result of mutual negotiation between switches and servers.
As shown in the table below, a metropolitan area network packet mainly includes the following parts:
DA | SA | Reserved | Label | Payload | CRC |
namely destination address (DA), source address (SA), reserved bytes (Reserved), label, payload (PDU), and CRC. The format of the label may be defined as follows: the label is 32 bits, of which the high 16 bits are reserved and only the low 16 bits are used; it sits between the reserved bytes and the payload of the packet.
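The 32-bit label field (high 16 bits reserved, only the low 16 bits used) can be sketched as follows; the lookup-table helper and its names are illustrative assumptions, not part of the specification.

```python
LABEL_MASK = 0xFFFF

def label_value(field32: int) -> int:
    """Extract the label: only the low 16 bits of the 32-bit field are used."""
    return field32 & LABEL_MASK

def rewrite_label(field32: int, table: dict) -> int:
    """Swap the incoming label for the outgoing label, keeping the high bits.

    The table models the centrally allocated mapping, e.g. a packet entering
    device A with incoming label 0x0000 may leave with outgoing label 0x0001.
    """
    incoming = label_value(field32)
    outgoing = table[incoming]
    return (field32 & ~LABEL_MASK & 0xFFFFFFFF) | outgoing
```

For example, `rewrite_label(0x0000, {0x0000: 0x0001})` yields `0x0001`.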
Based on the above characteristics of the video network, one of the core concepts of the embodiments of the present invention is proposed: following the video network protocol, the set-top box at the local end requests the server to control the camera connected to the set-top box at the opposite end, and the server, according to the request, instructs the opposite set-top box to receive and adjust the camera.
Referring to Fig. 5, a flow chart of the steps of an embodiment of image stitching based on a cosine similarity adaptive algorithm according to the present invention is shown; it may specifically include the following steps:
Step S501: obtain a digital image of each fragment of the object.
After an article is damaged, its edges cannot be expected to be regular straight lines or arcs. The pieces to be stitched are referred to as fragments: the fragments of porcelain, paper, photographs, pottery, or other articles after they have been broken. Each fragment is photographed and the photograph is digitized, becoming a digital image; every fragment of the object to be stitched must be photographed and digitized in this way to obtain its digital image. A digital image, also called a digitized image, represents a two-dimensional image with a finite set of numeric pixel values. It can be represented as an array or matrix, with both position and intensity discrete. Produced by digitizing an analog image and taking the pixel as its basic element, a digital image can be stored and processed by a digital computer or digital circuit.
Step S502: perform grayscale processing on each digital image to obtain each grayscale image.
The digital images above are mostly color images and need to be converted to grayscale: the color range of each pixel is reduced from more than 16 million (256*256*256) variations to 256, which reduces the amount of subsequent image computation.
Step S503: binarize each grayscale image to obtain a binary image.
The grayscale value of each pixel of the grayscale image is extracted, and the image is binarized according to those grayscale values; that is, each pixel of the grayscale image is represented as 0 or 1.
In the image after grayscale processing, the grayscale value of each pixel can be represented by a number between 0 and 255. A threshold of 125 is set: if a pixel's value is greater than 125, the pixel is recorded as 1; if it is not greater than 125, the pixel is recorded as 0. Following this procedure, the binary image of the grayscale-processed image is obtained.
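Steps S502 and S503 can be sketched as follows, with an image modeled as nested lists of (R, G, B) tuples. The luminance weights used for grayscale conversion are an assumption (the text fixes no formula); the threshold of 125 follows the text, with values above 125 recorded as 1 and the rest as 0.

```python
THRESHOLD = 125

def to_gray(rgb_image):
    """Reduce each pixel from 256*256*256 colors to 256 gray levels."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_image]

def binarize(gray_image, threshold=THRESHOLD):
    """Record a pixel as 1 if its gray value exceeds the threshold, else 0."""
    return [[1 if v > threshold else 0 for v in row] for row in gray_image]
```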
Step S504: obtain each edge image from each binary image.
The positions of the non-zero pixels in the binary image are extracted; the one-pixel-wide ring of non-zero binarized pixels along the outer boundary constitutes the edge pixels. After the edge is extracted, the left-edge pixels, right-edge pixels, top-edge pixels, and bottom-edge pixels are each placed into a matrix; these matrices are called edge matrices.
If stitching follows the left-right edge matching method, the left-edge pixels and right-edge pixels are extracted and placed into their respective edge matrices;
or, if stitching follows the top-bottom edge matching method, the top-edge pixels and bottom-edge pixels are extracted and placed into their respective edge matrices;
or, if stitching follows the four-way (top, bottom, left, and right) edge matching method, the top-edge, bottom-edge, left-edge, and right-edge pixels are extracted and placed into their respective edge matrices.
Each edge matrix corresponds one-to-one with its edge image.
The above operations are performed on the binary image of each fragment to extract all edge images.
Step S505: taking each edge image in turn as a first edge image, obtain, from the other edge images apart from the first edge image, the second edge images whose pixel counts are equal to that of the first edge image.
Each edge image is taken in turn as the first edge image and matched against the other edge images; that is, the pixel count of the first edge is compared with the pixel count of each other edge image, and if the counts are equal, that edge is a second edge image matching the first edge image.
If stitching uses the left and right edges, the left edge and right edge are extracted in step S504.
If stitching uses the top and bottom edges, the top edge and bottom edge are extracted in step S504.
If the number of second edges found is 0, there is no edge matching the first edge; the first edge is then an outer edge of the object being stitched, and no image shares this first edge as a common border.
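The pre-matching of step S505 can be sketched as follows, with each edge modeled simply as a list of pixel values; the helper name is hypothetical.

```python
def candidate_matches(edges):
    """Map each edge index to the indices of the other edges with the same
    pixel count (the candidate second edges for that first edge)."""
    result = {}
    for i, first in enumerate(edges):
        result[i] = [j for j, other in enumerate(edges)
                     if j != i and len(other) == len(first)]
    return result
```

An empty candidate list means the first edge is an outer edge of the object, with no matching fragment.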
Step S506: for each first edge image, when at least one second edge image exists, calculate the cosine similarity between the first edge image and each second edge image.
Each edge image is taken in turn as the first edge image, and the cosine similarity with each corresponding second edge is calculated. The cosine similarity ranges from 0 to 1; the larger the value, the higher the degree of match between the first edge and the second edge. Each cosine similarity is recorded together with the first-edge and second-edge pixels it corresponds to.
For example, if the first edge image is a1 and its corresponding second edge images are b1, b2, and b3, then the similarity c11 denotes the cosine similarity between first edge image a1 and second edge image b1, c12 that between a1 and b2, and c13 that between a1 and b3. Placing the cosine similarities into a matrix, c11 in the first row, first column of the matrix represents the cosine similarity between first edge image a1 and second edge image b1.
The cosine similarities are placed into a cosine similarity matrix in the above manner, and the correspondence between each cosine similarity and the first edge matrix and second edge matrix is recorded.
Step S507: for the first edge image and second edge image with the maximum cosine similarity, stitch the digital image containing the first edge image to the digital image containing the second edge image.
For each first edge image, the maximum of its cosine similarities is found. This indicates that the first edge image and that second edge image form the common border of the digital images containing them; that is, the two digital images were broken apart along the first edge image and the second edge image.
By stitching the original image containing each first edge image to the digital image containing the second edge image with the maximum corresponding cosine similarity, the stitching of the whole object can be completed.
In the embodiment of the present invention, the edges of the fragments are preliminarily matched by pixel count, and the cosine similarities of the corresponding edges are then calculated, so that fragments sharing a common edge can be found quickly and accurately; matching accuracy is greatly improved, and the time taken by stitching is shortened.
Referring to Fig. 6, a flow chart of the steps of an embodiment of edge-image stitching based on a cosine similarity adaptive algorithm according to the present invention is shown; it may specifically include the following steps:
Step S601: obtain a digital image of each fragment of the object.
After an article is damaged, its edges cannot be expected to be regular straight lines or arcs. The pieces to be stitched are referred to as fragments: the fragments of porcelain, paper, photographs, pottery, or other articles after they have been broken. Each fragment is photographed and the photograph is digitized, becoming a digital image; every fragment of the object to be stitched must be photographed and digitized in this way to obtain its digital image. A digital image, also called a digitized image, represents a two-dimensional image with a finite set of numeric pixel values. It can be represented as an array or matrix, with both position and intensity discrete. Produced by digitizing an analog image and taking the pixel as its basic element, a digital image can be stored and processed by a digital computer or digital circuit.
Figs. 6a-1 to 6a-19 show the digital image of each fragment of the object to be stitched; the images selected in this embodiment are paper fragments.
Step S602: perform grayscale processing on each digital image to obtain each grayscale image.
The digital images above are mostly color images and need to be converted to grayscale: the color range of each pixel is reduced from more than 16 million (256*256*256) variations to 256, which reduces the amount of subsequent image computation.
As shown in Fig. 6b, taking a single picture as an example, each picture is converted to grayscale, with a number between 0 and 255 representing the brightness of each pixel, yielding a grayscale image represented by grayscale values.
Step S603: binarize each grayscale image to obtain a binary image.
The brightness value of each pixel in the grayscale image is extracted (this brightness value is the grayscale value), and the image is binarized according to the grayscale values of its pixels; that is, each pixel of the grayscale image is represented as 0 or 1.
In the image after grayscale processing, the grayscale value of each pixel can be represented by a number between 0 and 255. A threshold of 125 is set: if a pixel's value is greater than 125, the pixel is recorded as 1; if it is not greater than 125, the pixel is recorded as 0. Following this procedure, the binary image of the grayscale-processed image is obtained.
As shown in Fig. 6c, a grayscale image is binarized to obtain the binary image.
Step S604: obtain each edge image from each binary image.
The positions of the non-zero pixels in the binary image are extracted; the one-pixel-wide ring of non-zero binarized pixels along the outer boundary constitutes the edge pixels. After the edge is extracted, the left-edge pixels, right-edge pixels, top-edge pixels, and bottom-edge pixels are each placed into a matrix; these matrices are called edge matrices.
The left edge image of the binary image is extracted according to formula one and is a column vector;
the right edge image of the binary image is extracted according to formula two and is a column vector;
the top edge image of the binary image is extracted according to formula three and is a row vector;
the bottom edge image of the binary image is extracted according to formula four and is a row vector.
Formula one: A_i = [I_i:1], extracting the left edge image, where A_i denotes the pixel matrix of the left edge image and I_i denotes the gray value of the i-th edge pixel; the gray value represents the brightness value of the pixel in the binary image.
Formula two: B_j = [J_j:1], extracting the right edge image, where B_j denotes the pixel matrix of the right edge image and J_j denotes the gray value of the j-th edge pixel; the gray value represents the brightness value of the pixel in the binary image.
Formula three: C_k = [1:K_k], extracting the top edge image, where C_k denotes the pixel matrix of the top edge image and K_k denotes the gray value of the k-th edge pixel; the gray value represents the brightness value of the pixel in the binary image.
Formula four: D_h = [1:H_h], extracting the bottom edge image, where D_h denotes the pixel matrix of the bottom edge image and H_h denotes the gray value of the h-th edge pixel; the gray value represents the brightness value of the pixel in the binary image.
The above operations are performed on the binary image of each fragment to extract all edge images.
If stitching follows the left-right edge matching method, the left-edge pixels are extracted using formula one and the right-edge pixels using formula two, and each set is placed into its edge matrix;
or, if stitching follows the top-bottom edge matching method, the top-edge pixels are extracted using formula three and the bottom-edge pixels using formula four, and each set is placed into its edge matrix;
or, if stitching follows the four-way (top, bottom, left, and right) edge matching method, the top-edge pixels are extracted using formula three, the bottom-edge pixels using formula four, the left-edge pixels using formula one, and the right-edge pixels using formula two, and each set is placed into its edge matrix.
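Under the assumption that formulas one to four denote, in MATLAB-like colon notation, the outermost columns and rows of the binary image, the four extractions can be sketched as follows, with the image given as a list of rows:

```python
def left_edge(img):
    """Formula one: the first column, as a column vector A."""
    return [row[0] for row in img]

def right_edge(img):
    """Formula two: the last column, as a column vector B."""
    return [row[-1] for row in img]

def top_edge(img):
    """Formula three: the first row, as a row vector C."""
    return list(img[0])

def bottom_edge(img):
    """Formula four: the last row, as a row vector D."""
    return list(img[-1])
```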
Step S605: taking each edge image in turn as a first edge image, obtain, from the other edge images apart from the first edge image, the second edge images whose pixel counts are equal to that of the first edge image.
Each edge image is taken in turn as the first edge image, and the pixels of the first edge image are extracted from its edge matrix;
the pixel counts of the edge matrices of the other edge images apart from the first edge are extracted, and every edge image whose edge matrix has the same pixel count as the first edge image is a second edge image.
If the number of second edges found is 0, there is no edge matching the first edge; the first edge is then an outer edge of the object being stitched, and no image shares this first edge as a common border.
Step S606: obtain the pixel matrix of the first edge and the pixel matrix of each second edge.
Following step S604, the pixel matrix A of the first edge and the pixel matrix B′ of each second edge are obtained.
Since the first edge image and second edge image match, matrix A and matrix B′ contain the same number of elements: for example, if matrix A is 1×19, then matrix B′ is also 1×19.
Step S607: obtain each cosine similarity between the first edge and each second edge from the pixel matrix of the first edge and the pixel matrix of each second edge.
The first edge image is represented by pixel matrix A and the second edge image by pixel matrix B′, where matrix B denotes the transpose of pixel matrix B′.
The cosine similarity between the first edge image and a second edge image is calculated according to formula five, in which similarity denotes the cosine similarity, cos(θ) denotes the cosine of the angle θ, AB denotes the product of the two matrices, and ‖A‖‖B‖ denotes the product of the modulus of matrix A and the modulus of matrix B.
Formula five is:
similarity = cos(θ) = AB / (‖A‖ · ‖B‖)
Expanding the matrix product and the moduli, formula five can be converted into:
cos(θ) = ( Σ_{i=1}^{n} A_i·B_i ) / ( sqrt(Σ_{i=1}^{n} A_i²) · sqrt(Σ_{i=1}^{n} B_i²) )
where n denotes the number of elements of matrix A or matrix B′ (if the matrix is 1×19, then n is 19); in the summation, i = 1 means the sum starts at i = 1 and runs to n; n and i are positive integers.
Suppose matrix A is 1×19 and matrix B′ is 1×19, so that B, the transpose of B′, is 19×1; then
AB = A₁·B₁ + A₂·B₂ + A₃·B₃ + … + A₁₈·B₁₈ + A₁₉·B₁₉.
Record this result as XS, i.e. XS = AB. Since A is a 1×19 matrix and B is a 19×1 matrix, by the rules of matrix multiplication XS is a 1×1 matrix, i.e. a single value.
‖A‖ denotes the modulus of matrix A; since A is a vector, its modulus is obtained by summing the squares of the elements of A and taking the square root, and the modulus of matrix B is obtained in the same way. Letting T denote the product of the moduli of A and B yields formula six.
Formula six: T = sqrt(sum(A.^2) * sum(B.^2))
From the foregoing, formula seven can be derived from formula five:
Formula seven: similarity = XS / T
The cosine similarity between the first edge and each second edge image is calculated according to formula seven, and each cosine similarity is recorded, with the first edge image as the row label, together with the correspondence between the first edge image and the second edge image.
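Formulas five to seven amount to the standard cosine similarity of the two edge vectors. A minimal sketch, with XS the dot product AB and T the product of the moduli:

```python
import math

def cosine_similarity(a, b):
    """similarity = XS / T, per formulas five to seven."""
    xs = sum(x * y for x, y in zip(a, b))                  # XS = AB
    t = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return xs / t if t else 0.0                            # guard zero vectors
```

Identical edge vectors give a similarity of 1.0, orthogonal ones 0.0.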
Step S608: for the first edge image and second edge image with the maximum cosine similarity, stitch the original image containing the first edge image to the digital image containing the second edge image.
Each cosine similarity is calculated according to formula seven, and the maximum cosine similarity corresponding to each first edge image is then extracted according to formula eight.
Formula eight: M_i = line(max(max(similarity, i))), where i denotes the row number of the first edge image.
The first edge image and second edge image corresponding to M_i indicate that the first edge image and the second edge image form the common border of the digital images containing them; that is, the two digital images were broken apart along the first edge image and the second edge image.
By stitching the original image containing each first edge image to the digital image containing the second edge image with the maximum corresponding cosine similarity, the stitching of the whole object can be completed.
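Formula eight reduces to a row-wise argmax over the cosine similarity matrix. A sketch, using `None` for pairs whose pixel counts did not match in step S605:

```python
def best_match_per_row(similarity):
    """For each first edge (a row), pick the second edge with the maximum
    cosine similarity; rows with no candidates (outer edges) are skipped."""
    best = {}
    for i, row in enumerate(similarity):
        scored = [(s, j) for j, s in enumerate(row) if s is not None]
        if scored:
            _, j = max(scored)
            best[i] = j
    return best
```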
As shown in Fig. 6d, the document fragments are stitched in the above manner to obtain the complete document.
Besides the stitching of documents described in the present invention, cultural relics, porcelain, or other objects can also be stitched in practice; the present invention places no limit on the objects to be stitched.
In the embodiment of the present invention, the edges of the fragments are preliminarily matched by pixel count, and the cosine similarities of the corresponding edges are then calculated, so that fragments sharing a common edge can be found quickly and accurately; matching accuracy is greatly improved, and the time taken by stitching is shortened.
It should be noted that, for simplicity of description, the method embodiments are expressed as a series of combined actions; those skilled in the art should know, however, that the embodiments of the present invention are not limited by the described order of actions, since according to the embodiments some steps may be performed in other orders or simultaneously. Furthermore, those skilled in the art should also know that the embodiments described in this specification are preferred embodiments, and that the actions involved are not necessarily required by the embodiments of the present invention.
Referring to Fig. 7, a device diagram of an embodiment of image stitching based on a cosine similarity adaptive algorithm according to the present invention is shown; it may specifically include the following modules:
Image digitization module 701: for obtaining the digital image of each fragment of the object;
Image grayscale processing module 702: for performing grayscale processing on each digital image to obtain each grayscale image;
Binarization module 703: for binarizing each grayscale image to obtain a binary image;
Edge acquisition module 704: for obtaining each edge image from each binary image;
Edge matching module 705: for taking each edge image in turn as a first edge image and obtaining, from the other edge images apart from the first edge image, the second edge images whose pixel counts are equal to that of the first edge image;
Cosine similarity module 706: for calculating, for each first edge image when at least one second edge image exists, the cosine similarity between the first edge image and each second edge image;
Matching module 707: for stitching, for the first edge image and second edge image with the maximum cosine similarity, the digital image containing the first edge image to the digital image containing the second edge image.
Each fragment of the object to be stitched is converted from an optical display into a digital image by the image digitization module 701; each digital image is converted to grayscale by the grayscale processing module 702, yielding each grayscale image; the binarization module 703 represents each pixel of each grayscale image as 0 or 1 according to its grayscale value, yielding each binary image; the edge acquisition module 704 extracts the edges of each binary image and feeds them to the edge matching module 705, which obtains each first edge image and the second edge images whose pixel counts match it; each first edge image and its corresponding second edge images are fed to the cosine similarity module 706, which obtains the cosine similarity between the first edge image and each second edge image; each cosine similarity is fed to the matching module 707, which, for the first edge image and second edge image with the maximum cosine similarity, stitches the original image containing the first edge image to the digital image containing the second edge image.
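The chain of modules 704 to 707 can be condensed into one sketch. The fragments are modeled directly as binary images (the output of modules 701 to 703), only left-right edge matching is shown, and all helper names are illustrative assumptions:

```python
import math

def left_edge(img):
    return [row[0] for row in img]

def right_edge(img):
    return [row[-1] for row in img]

def cosine(a, b):
    """The formula-seven similarity XS / T."""
    t = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return sum(x * y for x, y in zip(a, b)) / t if t else 0.0

def stitch_pairs(fragments):
    """For each fragment, find the fragment whose left edge best matches
    its right edge (modules 704-707)."""
    pairs = {}
    for i, f in enumerate(fragments):
        r = right_edge(f)
        candidates = [(cosine(r, left_edge(g)), j)
                      for j, g in enumerate(fragments)
                      if j != i and len(g) == len(f)]   # pixel-count prematch
        if candidates:
            _, j = max(candidates)
            pairs[i] = j
    return pairs
```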
In the embodiment of the present invention, the edges of the fragments are preliminarily matched by pixel count, and the cosine similarities of the corresponding edges are then calculated, so that fragments sharing a common edge can be found quickly and accurately; matching accuracy is greatly improved, and the time taken by stitching is shortened.
Referring to Fig. 8, a device diagram of another embodiment of image stitching based on a cosine similarity adaptive algorithm according to the present invention is shown; it may specifically include the following modules:
Image digitization module 801: for obtaining the digital image of each fragment of the object;
Image grayscale processing module 802: for performing grayscale processing on each digital image to obtain each grayscale image;
Binarization module 803: for binarizing each grayscale image to obtain a binary image;
Edge acquisition module 804: for obtaining each edge image from each binary image;
Edge matching module 805: for taking each edge image in turn as a first edge image and obtaining, from the other edge images apart from the first edge image, the second edge images whose pixel counts are equal to that of the first edge image;
Cosine similarity module 806: for calculating, for each first edge image when at least one second edge image exists, the cosine similarity between the first edge image and each second edge image;
Matching module 807: for stitching, for the first edge image and second edge image with the maximum cosine similarity, the digital image containing the first edge image to the digital image containing the second edge image.
Preferably, the cosine similarity module 806 further includes:
Edge pixel matrix acquisition module 8061: for obtaining the pixel matrix of the first edge and the pixel matrix of each second edge;
Cosine similarity calculation module 8062: for obtaining each cosine similarity between the first edge and each second edge by formula five, from the pixel matrix of the first edge and the pixel matrix of each second edge.
Each fragment of the object to be stitched is converted from an optical display into a digital image by the image digitization module 801; each digital image is converted to grayscale by the grayscale processing module 802, yielding each grayscale image; the binarization module 803 represents each pixel of each grayscale image as 0 or 1 according to its grayscale value, yielding each binary image; the edge acquisition module 804 extracts the edges of each binary image and feeds them to the edge matching module 805, which obtains each first edge image and the second edge images whose pixel counts match it; each first edge image and its corresponding second edge images are fed to the edge pixel matrix acquisition module 8061 within the cosine similarity module 806, which feeds the first edge matrix and second edge matrices to the cosine similarity calculation module 8062; the latter obtains each cosine similarity between the first edge and each second edge by formula five, from the pixel matrix of the first edge and the pixel matrix of each second edge; each cosine similarity is fed to the matching module 807, which, for the first edge image and second edge image with the maximum cosine similarity, stitches the original image containing the first edge image to the digital image containing the second edge image.
Preferably, if fragment stitching follows the left-right edge matching mode, the edge acquisition module 804 extracts the edge pixel matrix of the left edge using the left edge extraction module 8041 and the edge pixel matrix of the right edge using the right edge extraction module 8042;
or, if fragment stitching follows the top-bottom edge matching mode, the edge acquisition module 804 extracts the edge pixel matrix of the top edge using the top edge extraction module 8043 and the edge pixel matrix of the bottom edge using the bottom edge extraction module 8044;
or, if fragment stitching follows the four-way (top, bottom, left, and right) edge matching mode, the edge acquisition module 804 extracts the left edge image using the left edge extraction module 8041, the right edge image using the right edge extraction module 8042, the top edge image using the top edge extraction module 8043, and the bottom edge image using the bottom edge extraction module 8044.
The cosine similarity calculation module 8062 receives the edge images extracted by the edge acquisition module 804 and calculates the cosine similarity between the first edge image and each corresponding edge image.
As the device embodiments are basically similar to the method embodiments, their description is relatively brief; for the relevant details, refer to the description of the method embodiments.
In the embodiment of the present invention, the edges of the fragments are preliminarily matched by pixel count, and the cosine similarities of the corresponding edges are then calculated, so that fragments sharing a common edge can be found quickly and accurately; matching accuracy is greatly improved, and the time taken by stitching is shortened.
The embodiments in this specification are described in a progressive manner: each embodiment focuses on its differences from the others, and for the identical or similar parts the embodiments may be consulted against one another.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a device, or a computer program product. Accordingly, the embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the embodiments of the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical memory) containing computer-usable program code.
The embodiments of the present invention are described with reference to flowcharts and/or block diagrams of methods, terminal devices (systems), and computer program products according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be realized by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing terminal device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing terminal device produce a device for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing terminal device to work in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing terminal device, such that a series of operational steps is performed on the computer or other programmable terminal device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable terminal device provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the present invention have been described, those skilled in the art, once aware of the basic inventive concept, can make additional changes and modifications to these embodiments. The appended claims are therefore intended to be construed as including the preferred embodiments and all changes and modifications that fall within the scope of the embodiments of the present invention.
Finally, it should also be noted that, herein, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include", and any of their variants are intended to cover non-exclusive inclusion, so that a process, method, article, or terminal device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or terminal device. In the absence of further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or terminal device that includes that element.
The image stitching method based on a cosine similarity adaptive algorithm and the corresponding image stitching device provided by the present invention have been described in detail above. Specific examples are used herein to set forth the principles and embodiments of the present invention; the description of the above embodiments is intended only to help in understanding the method of the present invention and its core idea. Meanwhile, those of ordinary skill in the art may, in accordance with the idea of the present invention, make changes to the specific embodiments and application scope. In conclusion, the contents of this description should not be construed as limiting the present invention.
Claims (8)
1. An image stitching method based on a cosine similarity adaptive algorithm, characterized by comprising:
obtaining a digital image of each fragment of an object;
performing grayscale processing on each digital image to obtain a grayscale image;
binarizing each grayscale image to obtain a binary image;
obtaining edge images from each binary image;
taking each edge image in turn as a first edge image and, from the edge images other than the first edge image, obtaining second edge images whose pixel count is consistent with that of the first edge image;
for each first edge image, when at least one second edge image exists, calculating the cosine similarity value between the first edge image and each second edge image;
for the first edge image and second edge image with the largest cosine similarity value, stitching the original image containing the first edge image with the digital image containing the second edge image.
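The pipeline claimed above can be sketched in a few lines. This is a minimal illustration, not the patented implementation: the function names, the default `threshold`, and the luminosity weights are assumptions, and plain NumPy stands in for whatever imaging stack the device would actually use.

```python
import numpy as np

def to_gray(rgb):
    # Grayscale processing: luminosity-weighted sum of the RGB channels.
    return rgb @ np.array([0.299, 0.587, 0.114])

def binarize(gray, threshold=128):
    # Binarization: pixels at or above the threshold become 1, the rest 0.
    return (gray >= threshold).astype(np.uint8)

def cosine_similarity(a, b):
    # Cosine similarity of two edge vectors: (A . B) / (||A|| ||B||).
    a, b = a.ravel().astype(float), b.ravel().astype(float)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom else 0.0

def best_match(first_edge, candidate_edges):
    # Among candidates with the same pixel count as the first edge,
    # pick the one with the largest cosine similarity value.
    candidates = [e for e in candidate_edges if e.size == first_edge.size]
    return max(candidates, key=lambda e: cosine_similarity(first_edge, e))
```

In use, each fragment's four edges would be matched against the edges of every other fragment, and the pair with the largest similarity value is stitched first.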
2. The method according to claim 1, characterized in that, for each first edge image, when at least one second edge image exists, the step of calculating the cosine similarity value between the first edge image and each second edge image comprises:
obtaining the first edge image and each second edge image;
calculating, from the first edge image and each second edge image, the cosine similarity value between the first edge and each second edge.
3. The method according to claim 1, characterized in that the step of obtaining edge images from each binary image comprises:
obtaining the edge images from each binary image according to Formula 1, Formula 2, Formula 3, and Formula 4;
the formulas for obtaining the edge images are as follows:
Formula 1: A_i = [I_i : 1], extracting the left edge image; where A_i denotes the pixel matrix of the left edge image and I_i denotes the gray value of the i-th edge pixel; the gray value denotes the brightness value of the pixel in the binary image;
Formula 2: B_j = [J_j : 1], extracting the right edge image; where B_j denotes the pixel matrix of the right edge image and J_j denotes the gray value of the j-th edge pixel; the gray value denotes the brightness value of the pixel in the binary image;
Formula 3: C_k = [1 : K_k], extracting the top edge image; where C_k denotes the pixel matrix of the top edge image and K_k denotes the gray value of the k-th edge pixel; the gray value denotes the brightness value of the pixel in the binary image;
Formula 4: D_h = [1 : H_h], extracting the bottom edge image; where D_h denotes the pixel matrix of the bottom edge image and H_h denotes the gray value of the h-th edge pixel; the gray value denotes the brightness value of the pixel in the binary image.
4. The method according to claim 2, characterized in that the step of calculating, from the first edge image and each second edge image, the cosine similarity value between the first edge and each second edge comprises:
representing the first edge image by a pixel matrix A and the second edge image by a pixel matrix B′, with matrix B denoting the transpose of the pixel matrix B′; the first edge image has the same pixel count as the second edge image;
calculating the cosine similarity value between the first edge image and the second edge image according to Formula 5; where similarity denotes the cosine similarity value, cos(θ) denotes the cosine of the angle θ, A·B denotes the product of the two matrices, and ‖A‖‖B‖ denotes the product of the norm of matrix A and the norm of matrix B;
wherein Formula 5 is:
similarity = cos(θ) = (A·B) / (‖A‖‖B‖).
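Formula 5 is the standard cosine similarity between the two edge matrices, flattened to vectors; a value of 1 means the brightness patterns along the two edges are proportional. A minimal sketch, assuming the edges arrive as NumPy arrays of equal size (the function name is illustrative):

```python
import numpy as np

def formula_five(A, B_prime):
    # Formula 5: similarity = cos(theta) = (A . B) / (||A|| ||B||),
    # where B is the transpose of the second edge's pixel matrix B'.
    a = np.ravel(A).astype(float)
    b = np.ravel(B_prime.T).astype(float)  # transposing an edge vector only reshapes it
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom else 0.0
```

Identical edges score 1.0, while edges with no overlapping bright pixels score 0.0, which is why the method stitches the pair with the largest value.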
5. An image stitching device based on a cosine similarity adaptive algorithm, characterized by comprising:
an image digitization module, configured to obtain a digital image of each fragment of an object;
an image grayscale processing module, configured to perform grayscale processing on each digital image to obtain a grayscale image;
a binarization module, configured to binarize each grayscale image to obtain a binary image;
an edge acquisition module, configured to obtain edge images from each binary image;
an edge matching module, configured to take each edge image in turn as a first edge image and, from the edge images other than the first edge image, obtain second edge images whose pixel count is consistent with that of the first edge image;
a cosine similarity module, configured to, for each first edge image, when at least one second edge image exists, calculate the cosine similarity value between the first edge image and each second edge image;
a matching module, configured to, for the first edge image and second edge image with the largest cosine similarity value, stitch the original image containing the first edge image with the digital image containing the second edge image.
6. The device according to claim 5, characterized in that the cosine similarity module comprises:
an edge pixel matrix acquisition module, configured to obtain the first edge image and each second edge image;
a cosine similarity calculation module, configured to calculate, from the first edge image and each second edge image, the cosine similarity value between the first edge and each second edge.
7. The device according to claim 5, characterized in that the edge acquisition module comprises:
a left edge extraction module, configured to extract the left edge image according to Formula 1;
a right edge extraction module, configured to extract the right edge image according to Formula 2;
a top edge extraction module, configured to extract the top edge image according to Formula 3;
a bottom edge extraction module, configured to extract the bottom edge image according to Formula 4;
wherein:
Formula 1: A_i = [I_i : 1], extracting the left edge image; where A_i denotes the pixel matrix of the left edge image and I_i denotes the gray value of the i-th edge pixel; the gray value denotes the brightness value of the pixel in the binary image;
Formula 2: B_j = [J_j : 1], extracting the right edge image; where B_j denotes the pixel matrix of the right edge image and J_j denotes the gray value of the j-th edge pixel; the gray value denotes the brightness value of the pixel in the binary image;
Formula 3: C_k = [1 : K_k], extracting the top edge image; where C_k denotes the pixel matrix of the top edge image and K_k denotes the gray value of the k-th edge pixel; the gray value denotes the brightness value of the pixel in the binary image;
Formula 4: D_h = [1 : H_h], extracting the bottom edge image; where D_h denotes the pixel matrix of the bottom edge image and H_h denotes the gray value of the h-th edge pixel; the gray value denotes the brightness value of the pixel in the binary image.
8. The device according to claim 6, characterized in that the cosine similarity calculation module comprises:
a cosine similarity calculation submodule, configured to calculate the cosine similarity value between the first edge image and the second edge image according to Formula 5;
wherein Formula 5 is:
similarity = cos(θ) = (A·B) / (‖A‖‖B‖);
where A denotes the pixel matrix of the first edge image, and B denotes the transpose of the pixel matrix of the second edge image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611056684.2A CN108109108A (en) | 2016-11-25 | 2016-11-25 | A kind of image split-joint method and device based on cosine similarity adaptive algorithm |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108109108A true CN108109108A (en) | 2018-06-01 |
Family
ID=62204461
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611056684.2A Pending CN108109108A (en) | 2016-11-25 | 2016-11-25 | A kind of image split-joint method and device based on cosine similarity adaptive algorithm |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108109108A (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101673395A (en) * | 2008-09-10 | 2010-03-17 | 深圳华为通信技术有限公司 | Image mosaic method and image mosaic device |
CN103559697A (en) * | 2013-10-03 | 2014-02-05 | 王浩 | Scrap paper lengthwise cutting splicing and recovering algorithm based on FFT |
CN103679671A (en) * | 2014-01-12 | 2014-03-26 | 王浩 | Transverse and vertical sliced shredded paper splicing and recovery algorithm of FFT (Fast Fourier Transform) integrated comprehensive evaluation method |
- 2016-11-25: Chinese application CN201611056684.2A filed (published as CN108109108A, status: Pending)
Non-Patent Citations (1)
Title |
---|
WANG Weina et al., "Splicing Method for Non-overlapping Document Fragments", Journal of Jilin Institute of Chemical Technology (《吉林化工学院学报》) * |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110069653A (en) * | 2019-03-13 | 2019-07-30 | 平安科技(深圳)有限公司 | Method, apparatus, medium and the electronic equipment of profile diagram search pictures are drawn based on grass |
CN110251004A (en) * | 2019-07-16 | 2019-09-20 | 深圳市杉川机器人有限公司 | Sweeping robot and its cleaning method and computer readable storage medium |
CN110251004B (en) * | 2019-07-16 | 2022-03-11 | 深圳市杉川机器人有限公司 | Sweeping robot, sweeping method thereof and computer-readable storage medium |
CN111415298A (en) * | 2020-03-20 | 2020-07-14 | 北京百度网讯科技有限公司 | Image splicing method and device, electronic equipment and computer readable storage medium |
CN111415298B (en) * | 2020-03-20 | 2023-06-02 | 北京百度网讯科技有限公司 | Image stitching method and device, electronic equipment and computer readable storage medium |
CN116996503A (en) * | 2023-08-03 | 2023-11-03 | 纽扣数字智能科技(深圳)集团有限公司 | Desktop image transmission method, system, electronic equipment and medium |
CN117173161A (en) * | 2023-10-30 | 2023-12-05 | 杭州海康威视数字技术股份有限公司 | Content security detection method, device, equipment and system |
CN117173161B (en) * | 2023-10-30 | 2024-02-23 | 杭州海康威视数字技术股份有限公司 | Content security detection method, device, equipment and system |
CN117257333A (en) * | 2023-11-17 | 2023-12-22 | 深圳翱翔锐影科技有限公司 | True dual-energy X-ray bone densitometer based on semiconductor detector |
CN117257333B (en) * | 2023-11-17 | 2024-02-20 | 深圳翱翔锐影科技有限公司 | True dual-energy X-ray bone densitometer based on semiconductor detector |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| CB02 | Change of applicant information | Address after: 100000 Dongcheng District, Beijing, Qinglong Hutong 1, 1103 house of Ge Hua building. Applicant after: Video Link Power Information Technology Co., Ltd. Address before: 100000 Beijing Dongcheng District gogoa building A1103-1113. Applicant before: BEIJING VISIONVERA INTERNATIONAL INFORMATION TECHNOLOGY CO., LTD. |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20180601 |