CN103793521B - Image processing method and device - Google Patents
Image processing method and device
- Publication number
- CN103793521B CN103793521B CN201410055503.9A CN201410055503A CN103793521B CN 103793521 B CN103793521 B CN 103793521B CN 201410055503 A CN201410055503 A CN 201410055503A CN 103793521 B CN103793521 B CN 103793521B
- Authority
- CN
- China
- Prior art keywords
- scenic spot
- scenic spot information
- picture
- extraction
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/40—Processing or translation of natural language
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Computational Linguistics (AREA)
- General Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Processing Or Creating Images (AREA)
Abstract
The present invention provides an image processing method, the method including: obtaining a selected picture; obtaining the shooting location of the picture; searching for scenic spot information matching the shooting location of the picture; and superimposing the scenic spot information on the picture. With this image processing method, the capacity of a picture to carry information can be improved, enriching the information contained in the picture. In addition, the present invention also provides a picture processing apparatus, as well as another image processing method and apparatus.
Description
Technical field
The present invention relates to the technical field of image processing, and more particularly to an image processing method and apparatus.
Background technology
Existing mobile terminals all have a photographing function. When traveling, people often like to take pictures with a mobile terminal and upload them to a social networking platform to share with friends. With a traditional photographing tool, a captured picture typically carries only information such as the shooting time; the picture's capacity to carry information is insufficient, the amount of information it can carry is limited, and the information contained in the picture is therefore not rich.
Summary of the invention
Based on this, it is necessary, in view of the above technical problem, to provide a picture processing method and apparatus capable of improving a picture's capacity to carry information.
An image processing method, the method including:
obtaining a selected picture;
obtaining the shooting location of the picture;
searching for scenic spot information matching the shooting location of the picture;
superimposing the scenic spot information on the picture.
A picture processing apparatus, the apparatus including:
a picture selection module, for obtaining a selected picture;
a shooting location acquisition module, for reading the shooting parameter information of the picture and extracting the shooting location of the picture from the shooting parameter information;
a scenic spot information matching module, for searching for scenic spot information matching the shooting location of the picture;
a picture processing module, for superimposing the scenic spot information on the picture.
With the above image processing method and apparatus, the shooting location of a picture is obtained, scenic spot information matching that shooting location is then obtained, and the matched scenic spot information is superimposed on the picture. Because scenic spot information can be matched according to the shooting location of the picture and automatically superimposed on it, a picture stamped with scenic spot information is obtained, which improves the picture's capacity to carry information and thus enriches the information contained in the picture.
In addition, another image processing method and apparatus capable of improving a picture's capacity to carry information are also provided.
An image processing method, the method including:
shooting a picture with a terminal;
obtaining the position of the terminal;
searching for scenic spot information matching the position of the terminal;
superimposing the scenic spot information on the picture.
A picture processing apparatus, the apparatus including:
a shooting module, for shooting a picture;
a terminal position acquisition module, for obtaining the position of the terminal;
a scenic spot information matching module, for searching for scenic spot information matching the position of the terminal;
a picture processing module, for superimposing the scenic spot information on the picture.
With the above image processing method and apparatus, the position of the terminal is obtained when the terminal shoots a picture, scenic spot information matching that position is then obtained, and the scenic spot information is superimposed on the shot picture. Because this method and apparatus can automatically match scenic spot information according to the shooting location when the terminal takes a photo, a picture stamped with scenic spot information is obtained, which improves the picture's capacity to carry information and enriches the information contained in the picture.
Brief description of the drawings
Fig. 1 is a flow diagram of an image processing method in one embodiment;
Fig. 2 is a flow diagram of searching for scenic spot information matching the shooting location of a picture in one embodiment;
Fig. 3 is a module diagram of a picture processing apparatus in one embodiment;
Fig. 4 is a module diagram of a scenic spot information matching module in one embodiment;
Fig. 5 is a module diagram of a picture processing apparatus in another embodiment;
Fig. 6 is a flow diagram of an image processing method in another embodiment;
Fig. 7 is a flow diagram of searching for scenic spot information matching the position of a terminal in another embodiment;
Fig. 8 is a module diagram of a picture processing apparatus in a further embodiment;
Fig. 9 is a module diagram of a scenic spot information matching module in another embodiment;
Fig. 10 is a module diagram of a picture processing apparatus in yet another embodiment;
Fig. 11 is a schematic diagram of a picture stamped with scenic spot information;
Fig. 12 is a module diagram of a computer system for implementing an embodiment of the present invention.
Detailed description of the embodiments
In order to make the purpose, technical solution, and advantages of the present invention clearer, the present invention is further described below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit it.
As shown in Fig. 1, in one embodiment, an image processing method is provided. The method is described as applied to various terminals capable of storing and displaying pictures, including but not limited to desktop computers, laptop computers, personal digital assistants, tablet computers, smartphones, e-book readers, and MP3 (Moving Picture Experts Group Audio Layer III) or MP4 (Moving Picture Experts Group Audio Layer IV) players. The method includes:
Step 102: obtain a selected picture.
Pictures are saved locally on the terminal; these pictures may be photos shot in advance and stored in the terminal's local photo album. The user can select a picture to be processed, for example by browsing the terminal's local album and tapping the thumbnail of a picture.
Step 104: obtain the shooting location of the picture.
In one embodiment, step 104 includes: reading the shooting parameter information of the picture, and extracting the shooting location of the picture from the shooting parameter information.
Specifically, shooting parameter information can be generated when the picture is shot and written to the header of the picture file. The shooting parameter information includes, but is not limited to, the shooting time, color encoding, picture width, picture height, picture resolution, and shooting location. The position of the terminal can be obtained when the picture is shot; this position is the shooting location of the picture and may be the geographic longitude and latitude of the terminal. For example, in one embodiment, the shooting parameter information of the picture is the picture's EXIF (Exchangeable Image File) information. Specifically, the EXIF information can be read from the header of the picture file, and the shooting location of the picture can then be extracted from it.
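As a concrete illustration of this extraction step, EXIF stores the GPS latitude and longitude as degrees, minutes, and seconds, each encoded as a (numerator, denominator) rational, plus an N/S or E/W reference letter. The sketch below assumes those tags have already been read from the file header; the sample values are invented.

```python
def dms_to_decimal(dms, ref):
    """Convert EXIF GPS rationals [(deg), (min), (sec)] to signed decimal degrees."""
    deg, minutes, seconds = (num / den for num, den in dms)
    value = deg + minutes / 60.0 + seconds / 3600.0
    # South latitudes and west longitudes are negative in decimal form.
    return -value if ref in ("S", "W") else value

# Example rationals as they might appear in a GPSLatitude tag (values invented).
lat = dms_to_decimal([(22, 1), (32, 1), (2970, 100)], "N")
```

Here `(2970, 100)` encodes 29.70 seconds, so the result is 22° 32′ 29.70″ N ≈ 22.5416 in decimal degrees, the form most convenient for the distance calculations in the later steps.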
In another embodiment, step 104 includes: positioning the terminal and obtaining the position of the terminal; the shooting location of the picture is then the position of the terminal.
When a picture is shot, the position of the terminal may not be obtainable, so the shooting location of the picture cannot be acquired and the generated shooting parameter information does not contain it. When the shooting location cannot be extracted from the picture's shooting parameter information, the position of the terminal can be obtained instead. Specifically, the terminal can be positioned by a GPS module installed in it to obtain its position (geographic longitude and latitude), and the position of the terminal is used as the shooting location of the picture.
Step 106: search for scenic spot information matching the shooting location of the picture.
In one embodiment, step 106 includes: searching the scenic spot database for scenic spot information whose distance from the shooting location of the picture is within a preset range, and extracting the nearest scenic spot information as the matched scenic spot information.
Scenic spot information and the corresponding geographic locations are stored in the scenic spot database. The scenic spot information includes at least one of a scenic spot name, a scenic spot type, and a scenic spot icon. For example, a scenic spot name may be "OCT East Tea Stream Valley" with the scenic spot type "eco-tourism scenic spot", and the corresponding geographic location of the scenic spot (geographic longitude and latitude) is stored in the scenic spot database.
Specifically, the distance between each scenic spot and the shooting location of the picture can be calculated from the geographic location stored for the scenic spot information in the scenic spot database and the shooting location of the picture; the scenic spot information whose distance is within the preset range is found, and the scenic spot information nearest to the shooting location of the picture is then extracted.
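The lookup just described can be sketched as follows, under the assumption that "distance" means great-circle distance between two latitude/longitude pairs (the text does not fix the metric). The spot names, coordinates, and preset range are invented for illustration.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(h))

def nearest_spot(shoot_lat, shoot_lon, spots, preset_range_m):
    """Return the nearest spot within the preset range, or None if none qualifies."""
    ranked = [(haversine_m(shoot_lat, shoot_lon, lat, lon), name)
              for name, lat, lon in spots]
    in_range = [(d, name) for d, name in ranked if d <= preset_range_m]
    return min(in_range)[1] if in_range else None

spots = [("Spot A", 22.600, 114.300), ("Spot B", 22.540, 114.280)]
match = nearest_spot(22.541, 114.281, spots, 2000.0)
```

With these invented coordinates, only "Spot B" (about 150 m away) falls inside the 2 km preset range, so it is returned as the matched scenic spot information.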
Step 108: superimpose the matched scenic spot information on the picture.
As described above, the scenic spot information includes at least one of a scenic spot name (text), a scenic spot type (text), and a scenic spot icon (graphic). Specifically, a layer of scenic spot information can be created on the picture and the scenic spot information added at a specified location of that layer; alternatively, the scenic spot information can be embedded into the picture as pixels, yielding a picture stamped with scenic spot information. After the picture stamped with scenic spot information is generated, it can be shared through a social networking platform or stored locally on the terminal. Further, multiple preset templates can be stored locally on the terminal; a template defines the display location of the scenic spot information and of other information (such as the shooting time or current weather). After the matched scenic spot information is obtained, it can be placed at the corresponding location in a template, the template with the added scenic spot information superimposed on the picture, and finally the template and the picture fused to generate the picture stamped with scenic spot information.
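The template step above can be modeled as a mapping from information slots to the display locations the template defines. The slot names, coordinates, and return shape below are illustrative assumptions, not a format fixed by this description.

```python
# A preset template: each slot name maps to the (x, y) display location it defines.
TEMPLATE = {
    "spot_name": (20, 440),
    "shot_time": (20, 470),
}

def fill_template(template, info):
    """Place each available piece of information at the location the template
    defines; slots with no matching information are simply left empty."""
    return [(slot, xy, info[slot]) for slot, xy in template.items() if slot in info]

placements = fill_template(TEMPLATE, {"spot_name": "OCT East Tea Stream Valley"})
```

The resulting placement list is what a rendering step would then fuse with the picture, e.g. by drawing each text item at its template-defined coordinates.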
In this embodiment, because scenic spot information can be matched according to the shooting location of the picture and automatically superimposed on it, a picture stamped with scenic spot information is obtained, improving the picture's capacity to carry information and thus enriching the information contained in the picture.
In one embodiment, as shown in Fig. 2, the step of searching for scenic spot information matching the shooting location of the picture includes:
Step 202: calculate the distance between each scenic spot in the scenic spot database and the shooting location of the picture.
Each piece of scenic spot information stored in the scenic spot database corresponds to the geographic location of the scenic spot; the distance between the two can be calculated from the geographic location of the scenic spot and the shooting location of the picture.
Step 204: obtain the heat value corresponding to each piece of scenic spot information in the scenic spot database.
In one embodiment, scenic spot information and corresponding heat values are stored in a scenic spot database at the server side. The heat value of a piece of scenic spot information indicates how much attention it receives: the larger the heat value, the higher the degree of attention. Further, the scenic spot database can adjust a heat value according to how frequently the scenic spot information is chosen by a large number of terminals; for example, the heat value equals the number of times the scenic spot information has been selected.
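Under the "heat value equals selection count" example, the server-side bookkeeping reduces to a counter keyed by scenic spot; the sketch below shows that interpretation with invented spot names.

```python
from collections import Counter

heat = Counter()  # heat value per scenic spot, starting at zero

# Each selection reported by a terminal bumps the chosen spot's heat value by one.
for chosen in ["Tea Stream Valley", "Window of the World", "Tea Stream Valley"]:
    heat[chosen] += 1
```

After these three selections, "Tea Stream Valley" carries a heat value of 2 and "Window of the World" a heat value of 1.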
Step 206: calculate the matching degree of each piece of scenic spot information in the scenic spot database according to the calculated distance and the corresponding heat value, and extract the scenic spot information with the largest matching degree as the matched scenic spot information.
In one embodiment, the matching degree of a piece of scenic spot information in the scenic spot database is inversely proportional to the calculated distance and directly proportional to the corresponding heat value. For example, the matching degree is calculated according to the following formula: a = b / c, where a is the matching degree of the scenic spot information, b is its heat value, and c is the distance between the scenic spot and the shooting location of the picture.
The scenic spot information with the largest matching degree is then both close to the shooting location of the picture and high in heat value, and is therefore most likely to correspond to the scenic spot in the picture. In this embodiment, the matching degree is calculated from the distance between the scenic spot and the shooting location of the picture together with the scenic spot's heat value, and the scenic spot information with the largest matching degree is superimposed on the picture, so that the superimposed scenic spot information best fits the scenic spot in the picture, improving the accuracy of adding scenic spot information to the picture.
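A small worked instance of the formula a = b / c: the matching degree a is the heat value b divided by the distance c, and the spot with the largest a wins. The spot names and numbers are invented for illustration.

```python
spots = [
    # (name, heat value b, distance c to the shooting location, meters)
    ("Tea Stream Valley", 800, 400.0),
    ("Knight Valley",     300, 250.0),
]

def matching_degree(heat, distance):
    return heat / distance  # a = b / c

best = max(spots, key=lambda s: matching_degree(s[1], s[2]))
```

Here "Knight Valley" is nearer (250 m vs. 400 m), but "Tea Stream Valley" is selected because its higher heat value gives a matching degree of 2.0 against 1.2 — exactly the trade-off between proximity and popularity that step 206 describes.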
Further, in one embodiment, in step 206, the scenic spot information whose distance is within the preset range may first be extracted; the heat values of the extracted scenic spot information are obtained; the matching degrees of the extracted scenic spot information are then calculated from the distances between the extracted scenic spots and the shooting location of the picture and the corresponding heat values, and the scenic spot information with the largest matching degree is taken as the matched scenic spot information. In this embodiment, scenic spot information within the preset range of the shooting location is first extracted from the scenic spot database, and only then are the corresponding heat values obtained, which reduces the amount of computation when calculating matching degrees and improves picture processing efficiency.
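The two-step variant can be sketched as follows: first keep only the spots within the preset range, then fetch heat values and compute matching degrees for that smaller set. Distances are assumed precomputed, and the names, values, and range are illustrative.

```python
PRESET_RANGE_M = 1000.0  # preset range, meters (illustrative)

def match_spot(distances, get_heat):
    """distances: {spot: distance in meters}; get_heat: callable returning a
    spot's heat value. Returns the spot with the largest matching degree, or
    None when no spot lies within the preset range."""
    # Step 1: range filter, before any heat-value lookups.
    nearby = {s: d for s, d in distances.items() if d <= PRESET_RANGE_M}
    if not nearby:
        return None
    # Step 2: heat values are fetched only for the filtered spots (a = b / c).
    return max(nearby, key=lambda s: get_heat(s) / nearby[s])

result = match_spot({"A": 500.0, "B": 2000.0, "C": 800.0},
                    {"A": 10, "B": 999, "C": 40}.get)
```

Spot "B" is discarded by the range filter despite its large heat value, so only two heat-value lookups are needed, and "C" wins with a matching degree of 0.05 against "A"'s 0.02.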
In one embodiment, the image processing method further includes: extracting the shooting time of the picture from the shooting parameter information, and superimposing the shooting time on the picture. In this embodiment, in addition to the scenic spot information obtained from the scenic spot database, the shooting time of the picture is also superimposed on it, generating a new picture stamped with both scenic spot information and shooting time. The generated picture can be shared to a social networking platform or stored locally on the terminal, and the scenic spot information and shooting time related to the picture can be read directly from it.
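The shooting time extracted from EXIF is a string in the "YYYY:MM:DD HH:MM:SS" form used by the EXIF DateTime tags; before superimposing it on the picture it can be reformatted for display. The output format below is an illustrative choice, not one mandated by this description.

```python
from datetime import datetime

def display_time(exif_time):
    """Parse an EXIF DateTimeOriginal string and reformat it for the overlay."""
    parsed = datetime.strptime(exif_time, "%Y:%m:%d %H:%M:%S")
    return parsed.strftime("%Y-%m-%d %H:%M")

stamp = display_time("2014:02:18 15:04:05")
```

`stamp` is then the text placed at the shooting-time slot of the template alongside the scenic spot name.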
As shown in Fig. 3, in one embodiment, a picture processing apparatus 300 is also provided, including:
A picture selection module 302, for obtaining a selected picture.
A shooting location acquisition module 304, for obtaining the shooting location of the picture.
A scenic spot information matching module 306, for searching for scenic spot information matching the shooting location of the picture.
A picture processing module 308, for superimposing the matched scenic spot information on the picture.
In one embodiment, the shooting location acquisition module 304 is configured to read the shooting parameter information of the picture and extract the shooting location of the picture from it.
In another embodiment, the shooting location acquisition module 304 is configured to position the terminal and obtain its position; the shooting location of the picture is then the position of the terminal.
In one embodiment, the scenic spot information matching module 306 is configured to search the scenic spot database for scenic spot information whose distance from the shooting location of the picture is within a preset range, and extract the nearest scenic spot information as the matched scenic spot information.
In one embodiment, as shown in Fig. 4, the scenic spot information matching module 306 includes:
A distance calculation module 316, for calculating the distance between each scenic spot in the scenic spot database and the shooting location of the picture.
A heat value acquisition module 326, for obtaining the heat values of the scenic spot information in the scenic spot database.
A matching degree calculation module 336, for calculating the matching degrees of the scenic spot information in the scenic spot database from the distances calculated by the distance calculation module 316 and the heat values.
A scenic spot information determination module 346, for extracting the scenic spot information with the largest matching degree as the matched scenic spot information.
In one embodiment, the matching degree of a piece of scenic spot information in the scenic spot database is inversely proportional to the calculated distance and directly proportional to the corresponding heat value. For example, the matching degree calculation module 336 is configured to calculate the matching degree according to the following formula: a = b / c, where a is the matching degree of the scenic spot information, b is its heat value, and c is the distance between the scenic spot and the shooting location of the picture.
In one embodiment, the scenic spot information matching module 306 further includes a scenic spot information extraction module 356, for extracting the scenic spot information whose distance is within the preset range.
The heat value acquisition module 326 is further configured to obtain the heat values of the scenic spot information extracted by the scenic spot information extraction module 356.
The matching degree calculation module 336 is further configured to calculate the matching degrees of the extracted scenic spot information from the distances between the extracted scenic spots and the shooting location of the picture and the corresponding heat values. In this embodiment, the scenic spot information determination module 346 takes the scenic spot information with the largest matching degree as the matched scenic spot information.
In one embodiment, the scenic spot information includes at least one of a scenic spot name, a scenic spot type, and a scenic spot icon. As shown in Fig. 5, the picture processing apparatus 300 further includes:
A shooting time acquisition module 310, for extracting the shooting time of the picture from its shooting parameter information. In this embodiment, the picture processing module 308 is further configured to superimpose the shooting time on the picture.
As shown in Fig. 6, in one embodiment, another image processing method is provided. The method is described as applied to various terminals with a shooting function, including but not limited to laptop computers, personal digital assistants, tablet computers, smartphones, e-book readers, and MP3 (Moving Picture Experts Group Audio Layer III) or MP4 (Moving Picture Experts Group Audio Layer IV) players. The method includes:
Step 602: shoot a picture with the terminal.
A camera is installed in the terminal, and pictures can be shot by the camera. In this embodiment, shooting a picture of a scenic spot is taken as an example.
Step 604: obtain the position of the terminal.
In one embodiment, a GPS (Global Positioning System) module is installed in the terminal, and the position of the terminal (geographic longitude and latitude) can be obtained through the GPS module.
Step 606: search for scenic spot information matching the position of the terminal.
In one embodiment, step 606 includes: searching the scenic spot database for scenic spot information whose distance from the position of the terminal is within a preset range, and extracting the nearest scenic spot information as the matched scenic spot information.
Scenic spot information and the corresponding geographic locations (i.e., the geographic longitude and latitude corresponding to the scenic spot information) are stored in the scenic spot database. The scenic spot information includes at least one of a scenic spot name, a scenic spot type, and a scenic spot icon. Specifically, the distance between each scenic spot and the position of the terminal can be calculated from the geographic location stored for the scenic spot information in the scenic spot database and the position of the terminal; the scenic spot information whose distance is within the preset range is found, and the scenic spot information nearest to the position of the terminal is then extracted.
Step 608: superimpose the matched scenic spot information on the picture.
The scenic spot information includes at least one of a scenic spot name (text), a scenic spot type (text), and a scenic spot icon (graphic). Specifically, a layer of scenic spot information can be created on the picture and the scenic spot information added at a specified location of that layer; alternatively, the scenic spot information can be embedded into the picture as pixels, yielding a picture stamped with scenic spot information. After the picture stamped with scenic spot information is generated, it can be shared through a social networking platform or stored locally on the terminal. Further, multiple preset templates can be stored locally on the terminal; a template defines the display location of the scenic spot information and of other information (such as the shooting time or current weather). After the matched scenic spot information is obtained, it can be placed at the corresponding location in a template, the template with the added scenic spot information superimposed on the picture, and finally the template and the picture fused to generate the picture stamped with scenic spot information.
In this embodiment, when the terminal takes a photo, scenic spot information can be automatically matched according to the position of the terminal (that is, the shooting location of the photo), so that a picture stamped with scenic spot information is obtained, improving the picture's capacity to carry information and enriching the information contained in the picture.
In one embodiment, as shown in Fig. 7, the step of searching for scenic spot information matching the position of the terminal includes:
Step 702: calculate the distance between each scenic spot in the scenic spot database and the position of the terminal.
Each piece of scenic spot information stored in the scenic spot database corresponds to the geographic location of the scenic spot; the distance between the two can be calculated from the geographic location of the scenic spot and the position of the terminal.
Step 704: obtain the heat value corresponding to each piece of scenic spot information in the scenic spot database.
In one embodiment, scenic spot information and corresponding heat values are stored in a scenic spot database at the server side. The heat value of a piece of scenic spot information indicates how much attention it receives: the larger the heat value, the higher the degree of attention. Further, the scenic spot database can adjust a heat value according to how frequently the scenic spot information is selected; for example, the heat value equals the number of times the scenic spot information has been selected.
Step 706: calculate the matching degree of each piece of scenic spot information in the scenic spot database according to the calculated distance and the heat value, and extract the scenic spot information with the largest matching degree as the matched scenic spot information.
In one embodiment, the matching degree of a piece of scenic spot information in the scenic spot database is inversely proportional to the calculated distance and directly proportional to the corresponding heat value. For example, the matching degree is calculated according to the following formula: a = b / c, where a is the matching degree of the scenic spot information, b is its heat value, and c is the distance between the scenic spot and the position of the terminal.
The scenic spot information with the largest matching degree is then both close to the position of the terminal and high in heat value, and is therefore most likely to correspond to the scenic spot in the picture. In this embodiment, the matching degree is calculated from the distance between the scenic spot and the position of the terminal together with the scenic spot's heat value, and the scenic spot information with the largest matching degree is superimposed on the picture, so that the superimposed scenic spot information best fits the scenic spot in the picture, improving the accuracy of adding scenic spot information to the picture.
Further, in one embodiment, in step 706, the scenic spot information whose distance is within the preset range may first be extracted; the heat values of the extracted scenic spot information are obtained; the matching degrees of the extracted scenic spot information are then calculated from the distances between the extracted scenic spots and the position of the terminal and the corresponding heat values, and the scenic spot information with the largest matching degree is taken as the matched scenic spot information. In this embodiment, scenic spot information within the preset range of the terminal's position is first extracted from the scenic spot database, and only then are the corresponding heat values obtained, which reduces the amount of computation when calculating matching degrees and improves picture processing efficiency.
In one embodiment, the image processing method further includes: obtaining the shooting time at which the terminal shot the picture, and superimposing the shooting time on the shot picture. In this embodiment, in addition to the scenic spot information obtained from the scenic spot database, the shooting time is also superimposed on the picture, generating a new picture stamped with both scenic spot information and shooting time. The generated picture can be shared to a social networking platform or stored locally on the terminal, and the scenic spot information and shooting time related to the picture can be read directly from it.
As shown in Fig. 8, in one embodiment, another picture processing apparatus 800 is also provided, including:
A shooting module 802, for shooting a picture.
The shooting module 802 can be a picture acquisition device, such as a camera.
A terminal position acquisition module 804, for obtaining the position of the terminal.
In one embodiment, the terminal position acquisition module 804 is configured to obtain the geographic longitude and latitude of the terminal through a GPS positioning module.
A scenic spot information matching module 806, for searching for scenic spot information matching the position of the terminal.
A picture processing module 808, for superimposing the matched scenic spot information on the picture.
In one embodiment, the scenic spot information matching module 806 is configured to search the scenic spot database for scenic spot information whose distance from the position of the terminal is within a preset range, and extract the nearest scenic spot information as the matched scenic spot information.
In one embodiment, as shown in Fig. 9, the scenic spot information matching module 806 includes:
A distance calculation module 816, for calculating the distance between each scenic spot in the scenic spot database and the position of the terminal.
A heat value acquisition module 826, for obtaining the heat values of the scenic spot information in the scenic spot database.
A matching degree calculation module 836, for calculating the matching degrees of the scenic spot information in the scenic spot database from the distances calculated by the distance calculation module 816 and the heat values.
A scenic spot information determination module 846, for extracting the scenic spot information with the largest matching degree as the matched scenic spot information.
In one embodiment, the matching degree of a piece of scenic spot information in the scenic spot database is inversely proportional to the calculated distance and directly proportional to the corresponding heat value. For example, the matching degree calculation module 836 is configured to calculate the matching degree according to the following formula: a = b / c, where a is the matching degree of the scenic spot information, b is its heat value, and c is the distance between the scenic spot and the position of the terminal.
The sight spot information matching module may further include a sight spot information extraction module 856, configured to extract the sight spot information whose distance is within a preset range.
The hot value acquisition module 826 is further configured to obtain the hot values corresponding to the sight spot information extracted by the sight spot information extraction module 856.
The matching degree computing module 836 is further configured to calculate the matching degrees corresponding to the extracted sight spot information according to the distances between the extracted sight spot information and the location of the terminal and the hot values corresponding to the extracted sight spot information. In this embodiment, the sight spot information determining module 846 is configured to obtain the sight spot information with the highest matching degree as the matched sight spot information.
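The matching-degree selection described above (filter by preset range, score each candidate as a = b / c, keep the highest score) can be sketched as follows; the triple layout and the preset range are illustrative assumptions:

```python
def matching_degree(heat_value, distance_m):
    """Matching degree a = b / c: grows with the spot's hot value (selection
    count) and shrinks with its distance from the shooting position."""
    return heat_value / distance_m

def pick_matched_spot(candidates, preset_range_m=2000.0):
    """candidates: (spot_name, distance_m, heat_value) triples, with distances
    already computed against the picture's camera site. Returns the name of
    the spot with the highest matching degree inside the preset range, or
    None when nothing is in range."""
    in_range = [
        (matching_degree(heat, dist), name)
        for name, dist, heat in candidates
        if 0 < dist <= preset_range_m
    ]
    return max(in_range)[1] if in_range else None
```

Dividing hot value by distance means a very popular spot can outrank a slightly nearer but obscure one, which matches the patent's stated goal of picking the most relevant label rather than merely the nearest.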
In one embodiment, the sight spot information includes at least one of a sight spot name, a sight spot type, and a sight spot icon. As shown in FIG. 10, the picture processing unit 800 further includes:
a shooting time acquisition module 810, configured to obtain the shooting time at which the terminal took the picture.
In this embodiment, the picture processing module 808 is further configured to superimpose the shooting time on the picture.
FIG. 11 is a schematic diagram of a picture stamped with sight spot information, obtained after performing the image processing method of the above embodiments. As shown in FIG. 11, picture 1102 is originally a picture without superimposed sight spot information; it may be a picture selected from the terminal's local storage, or a picture the terminal has just shot. If picture 1102 is selected from local storage, the camera site and shooting time of the picture are extracted from its acquisition parameter information, and the sight spot information 1104 matching the camera site is then searched for. If picture 1102 has just been shot by the terminal, the location of the terminal is obtained, the sight spot information 1104 matching the terminal's location is searched for, and the shooting time at which the terminal took the picture is obtained. The sight spot information 1104 includes a sight spot name 1104a and a sight spot type 1104b. In addition, the shooting time 1106 may also be superimposed on picture 1102, thereby generating a new picture.
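When the camera site is extracted from a picture's acquisition parameter (EXIF) information, the GPS coordinate is stored as degree/minute/second values plus an 'N'/'S'/'E'/'W' hemisphere reference (the EXIF GPS IFD convention). A minimal sketch of the conversion to the signed decimal degrees used for distance lookups; the function name and tuple layout are illustrative assumptions:

```python
def exif_gps_to_decimal(dms, ref):
    """Convert an EXIF GPS coordinate — a (degrees, minutes, seconds) tuple
    plus its 'N'/'S'/'E'/'W' reference — into a signed decimal degree value.
    Southern and western hemispheres are negative."""
    degrees, minutes, seconds = dms
    decimal = degrees + minutes / 60.0 + seconds / 3600.0
    return -decimal if ref in ("S", "W") else decimal
```

The resulting decimal latitude/longitude pair is what the sight spot matching step compares against the coordinates stored in the sight spot database.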
FIG. 12 is a block diagram of a computer system 1000 that can implement embodiments of the present invention. The computer system 1000 is merely one example of a computer environment suitable for the invention and should not be construed as imposing any limitation on the invention's scope of use; nor should the invention be construed as depending on any one component, or combination of components, of the illustrated exemplary computer system 1000.
The computer system 1000 shown in FIG. 12 is one example of a computer system suitable for the invention; other architectures with different subsystem configurations may also be used. For example, well-known devices such as desktop computers, notebooks, personal digital assistants, smart phones, tablet computers, portable media players, and set-top boxes may be suitable for some embodiments of the invention, but the invention is not limited to the devices enumerated above.
As shown in FIG. 12, the computer system 1000 includes a processor 1010, a memory 1020, and a system bus 1022. Various system components, including the memory 1020 and the processor 1010, are connected to the system bus 1022. The processor 1010 is hardware for executing computer program instructions through basic arithmetic and logical operations in the computer system. The memory 1020 is a physical device for temporarily or permanently storing computer programs or data (for example, program state information). The system bus 1022 may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus. The processor 1010 and the memory 1020 can exchange data through the system bus 1022. The memory 1020 includes a read-only memory (ROM) or flash memory (neither is shown in the figure) and a random access memory (RAM), where RAM typically refers to the main memory loaded with the operating system and application programs.
The computer system 1000 further includes a display interface 1030 (for example, a graphics processing unit), a display device 1040 (for example, a liquid crystal display), an audio interface 1050 (for example, a sound card), and audio equipment 1060 (for example, loudspeakers). The display device 1040 and the audio equipment 1060 are media devices for experiencing multimedia content.
The computer system 1000 generally includes a storage device 1070. The storage device 1070 may be selected from a variety of computer-readable media, where computer-readable media refers to any available media that can be accessed by the computer system 1000, including both removable and fixed media. For example, computer-readable media include, but are not limited to, flash memory (micro SD cards), CD-ROM, digital versatile discs (DVD) or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other media that can be used to store the required information and can be accessed by the computer system 1000.
The computer system 1000 further includes an input device 1080 and an input interface 1090 (for example, an I/O controller). A user may enter instructions and information into the computer system 1000 through an input device 1080 such as a keyboard, a mouse, or a touch panel on the display device 1040. The input device 1080 is typically connected to the system bus 1022 through the input interface 1090, but may also be connected through other interfaces or bus structures, such as a universal serial bus (USB).
The computer system 1000 may make logical connections with one or more network devices in a network environment. A network device may be a personal computer, a server, a router, a smart phone, a tablet computer, or another common network node. The computer system 1000 connects to a network device through a local area network (LAN) interface 1100 or a mobile communication unit 1110. A local area network (LAN) is a computer network of interconnected computers within a limited area, such as a home, a school, a computer laboratory, or an office building, using a network medium. WiFi and twisted-pair wired Ethernet are the two most common technologies for building a LAN. WiFi is a technology that enables the computer system 1000 to exchange data with other devices, or to connect to a wireless network, through radio waves. The mobile communication unit 1110 can answer and place calls over a radio communication line while moving throughout a wide geographic area. In addition to telephony, the mobile communication unit 1110 also supports Internet access in a 2G, 3G, or 4G cellular communication system that provides mobile data services.
It should be pointed out that other computer systems, with more or fewer subsystems than the computer system 1000, can also be suitable for the invention. For example, the computer system 1000 may include a Bluetooth unit capable of exchanging data over short distances, an image sensor for taking photographs, and an accelerometer for measuring acceleration.
As described in detail above, a computer system 1000 suitable for the invention can perform the specified operations of the image processing method. The computer system 1000 performs these operations by having the processor 1010 run software instructions held in a computer-readable medium. These software instructions may be read into the memory 1020 from the storage device 1070, or from another device through the LAN interface 1100. The software instructions stored in the memory 1020 cause the processor 1010 to perform the above-described image processing method. In addition, the invention can equally be implemented by hardware circuits, or by hardware circuits combined with software instructions. Therefore, implementation of the invention is not limited to any specific combination of hardware circuits and software.
The embodiments described above express only several implementations of the invention; their description is relatively specific and detailed, but should not therefore be interpreted as limiting the scope of the patent claims. It should be pointed out that those of ordinary skill in the art can make various modifications and improvements without departing from the inventive concept, and all of these fall within the protection scope of the invention. Therefore, the protection scope of this patent shall be determined by the appended claims.
Claims (14)
1. An image processing method, the method comprising:
obtaining a selected picture, the picture including a picture saved locally on a terminal;
obtaining a camera site corresponding to the picture;
searching for sight spot information matching the camera site corresponding to the picture, including: calculating the distance between sight spot information in a sight spot database and the camera site corresponding to the picture, obtaining the hot value corresponding to the sight spot information in the sight spot database, extracting the sight spot information whose distance is within a preset range, obtaining the hot value corresponding to the extracted sight spot information, calculating the matching degree corresponding to the extracted sight spot information according to the distance between the extracted sight spot information and the camera site of the picture and the hot value corresponding to the extracted sight spot information, and obtaining the sight spot information with the highest matching degree as the matched sight spot information, the matching degree being equal to the ratio of the hot value to the calculated distance; and
superimposing the sight spot information on the picture;
wherein the position is a geographical latitude and longitude;
the hot value is the number of times the sight spot information has been selected; and
calculating the matching degree corresponding to the extracted sight spot information according to the distance between the extracted sight spot information and the camera site of the picture and the hot value corresponding to the extracted sight spot information includes: calculating the matching degree corresponding to the extracted sight spot information according to the formula a = b / c, where a is the matching degree corresponding to the extracted sight spot information, b is the hot value corresponding to the extracted sight spot information, and c is the distance between the extracted sight spot information and the camera site of the picture.
2. The method according to claim 1, wherein the step of obtaining the camera site corresponding to the picture includes:
reading acquisition parameter information in the picture and extracting the camera site corresponding to the picture from the acquisition parameter information; or
positioning the terminal and obtaining the location of the terminal, the camera site corresponding to the picture being the location of the terminal.
3. The method according to claim 1, wherein the step of searching for sight spot information matching the camera site corresponding to the picture includes:
searching the sight spot database for sight spot information whose distance from the camera site corresponding to the picture is within a preset range, and extracting the closest sight spot information as the matched sight spot information.
4. The method according to claim 2, wherein the sight spot information includes at least one of a sight spot name, a sight spot type, and a sight spot icon; the method further comprising:
extracting the shooting time of the picture from the acquisition parameter information; and
superimposing the shooting time on the picture.
5. An image processing method, the method comprising:
shooting a picture by a terminal;
obtaining the location of the terminal;
searching for sight spot information matching the location of the terminal, including: calculating the distance between sight spot information in a sight spot database and the camera site corresponding to the picture, obtaining the hot value corresponding to the sight spot information in the sight spot database, extracting the sight spot information whose distance is within a preset range, obtaining the hot value corresponding to the extracted sight spot information, calculating the matching degree corresponding to the extracted sight spot information according to the distance between the extracted sight spot information and the camera site of the picture and the hot value corresponding to the extracted sight spot information, and obtaining the sight spot information with the highest matching degree as the matched sight spot information, the matching degree being equal to the ratio of the hot value to the calculated distance; and
superimposing the sight spot information on the picture;
wherein the location is a geographical latitude and longitude;
the hot value is the number of times the sight spot information has been selected; and
calculating the matching degree corresponding to the extracted sight spot information according to the distance between the extracted sight spot information and the camera site of the picture and the hot value corresponding to the extracted sight spot information includes: calculating the matching degree corresponding to the extracted sight spot information according to the formula a = b / c, where a is the matching degree corresponding to the extracted sight spot information, b is the hot value corresponding to the extracted sight spot information, and c is the distance between the extracted sight spot information and the camera site of the picture.
6. The method according to claim 5, wherein the step of searching for sight spot information matching the location of the terminal includes:
searching the sight spot database for sight spot information whose distance from the location of the terminal is within a preset range, and extracting the closest sight spot information as the matched sight spot information.
7. The method according to claim 5, wherein the sight spot information includes at least one of a sight spot name, a sight spot type, and a sight spot icon; the method further comprising:
obtaining the shooting time at which the terminal shot the picture; and
superimposing the shooting time on the picture.
8. A picture processing unit, wherein the unit includes:
a picture selecting module, configured to obtain a selected picture, the picture including a picture saved locally on a terminal;
a camera site acquisition module, configured to obtain the camera site corresponding to the picture, the position being a geographical latitude and longitude;
a sight spot information matching module, configured to search for sight spot information matching the camera site corresponding to the picture; and
a picture processing module, configured to superimpose the sight spot information on the picture;
the sight spot information matching module including:
a distance calculation module, configured to calculate the distance between sight spot information in a sight spot database and the camera site corresponding to the picture;
a hot value acquisition module, configured to obtain the hot value corresponding to the sight spot information in the sight spot database, the hot value being the number of times the sight spot information has been selected;
a matching degree computing module, configured to calculate the matching degree corresponding to the sight spot information in the sight spot database according to the calculated distance and the hot value, the matching degree being equal to the ratio of the hot value to the calculated distance; and
a sight spot information determining module, configured to extract the sight spot information with the highest matching degree as the matched sight spot information;
the sight spot information matching module further including a sight spot information extraction module, configured to extract the sight spot information whose distance is within a preset range;
the hot value acquisition module being further configured to obtain the hot value corresponding to the extracted sight spot information; and
the matching degree computing module being further configured to calculate the matching degree corresponding to the extracted sight spot information according to the distance between the extracted sight spot information and the camera site corresponding to the picture and the hot value corresponding to the extracted sight spot information, and specifically configured to calculate the matching degree corresponding to the extracted sight spot information according to the formula a = b / c, where a is the matching degree corresponding to the extracted sight spot information, b is the hot value corresponding to the extracted sight spot information, and c is the distance between the extracted sight spot information and the camera site of the picture.
9. The unit according to claim 8, wherein the camera site acquisition module is configured to read acquisition parameter information in the picture and extract the camera site corresponding to the picture from the acquisition parameter information; or
the camera site acquisition module is configured to position the terminal and obtain the location of the terminal, the camera site corresponding to the picture being the location of the terminal.
10. The unit according to claim 8, wherein the sight spot information matching module is configured to search the sight spot database for sight spot information whose distance from the camera site corresponding to the picture is within a preset range, and to extract the closest sight spot information as the matched sight spot information.
11. The unit according to claim 9, wherein the sight spot information includes at least one of a sight spot name, a sight spot type, and a sight spot icon; the unit further including:
a shooting time acquisition module, configured to extract the shooting time of the picture from the acquisition parameter information;
the picture processing module being further configured to superimpose the shooting time on the picture.
12. A picture processing unit, wherein the unit includes:
a shooting module, configured to shoot a picture;
a terminal location acquisition module, configured to obtain the location of a terminal, the location being a geographical latitude and longitude;
a sight spot information matching module, configured to search for sight spot information matching the location of the terminal; and
a picture processing module, configured to superimpose the sight spot information on the picture;
the sight spot information matching module including:
a distance calculation module, configured to calculate the distance between sight spot information in a sight spot database and the location of the terminal;
a hot value acquisition module, configured to obtain the hot value corresponding to the sight spot information in the sight spot database, the hot value being the number of times the sight spot information has been selected;
a matching degree computing module, configured to calculate the matching degree corresponding to the sight spot information in the sight spot database according to the calculated distance and the hot value, the matching degree being equal to the ratio of the hot value to the calculated distance; and
a sight spot information determining module, configured to extract the sight spot information with the highest matching degree as the matched sight spot information;
the sight spot information matching module further including a sight spot information extraction module, configured to extract the sight spot information whose distance is within a preset range;
the hot value acquisition module being further configured to obtain the hot value corresponding to the extracted sight spot information; and
the matching degree computing module being further configured to calculate the matching degree corresponding to the extracted sight spot information according to the distance between the extracted sight spot information and the location of the terminal and the hot value corresponding to the extracted sight spot information, and specifically configured to calculate the matching degree corresponding to the extracted sight spot information according to the formula a = b / c, where a is the matching degree corresponding to the extracted sight spot information, b is the hot value corresponding to the extracted sight spot information, and c is the distance between the extracted sight spot information and the camera site of the picture.
13. The unit according to claim 12, wherein the sight spot information matching module is configured to search the sight spot database for sight spot information whose distance from the location of the terminal is within a preset range, and to extract the closest sight spot information as the matched sight spot information.
14. The unit according to claim 12, wherein the sight spot information includes at least one of a sight spot name, a sight spot type, and a sight spot icon; the unit further including:
a shooting time acquisition module, configured to obtain the shooting time at which the terminal shot the picture;
the picture processing module being further configured to superimpose the shooting time on the picture.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410055503.9A CN103793521B (en) | 2014-02-18 | 2014-02-18 | Image processing method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103793521A CN103793521A (en) | 2014-05-14 |
CN103793521B true CN103793521B (en) | 2018-04-27 |
Family
ID=50669187
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410055503.9A Active CN103793521B (en) | 2014-02-18 | 2014-02-18 | Image processing method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103793521B (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104599307B (en) * | 2015-02-11 | 2018-05-15 | 成都品果科技有限公司 | A kind of method for showing picture using animation on mobile terminals |
CN106713840B (en) * | 2016-06-28 | 2018-09-04 | 腾讯科技(深圳)有限公司 | Virtual information display methods and device |
CN107577679A (en) * | 2016-07-04 | 2018-01-12 | 西安中兴新软件有限责任公司 | Recommend method and device in a kind of destination |
CN107270907B (en) * | 2017-05-22 | 2020-11-06 | 南京广电猫猫网络科技有限公司 | Tourist attraction information sharing method, route planning method and information sharing system |
WO2018223346A1 (en) * | 2017-06-08 | 2018-12-13 | 深圳市乃斯网络科技有限公司 | Method and system for positioning in photograph sharing |
CN107451186A (en) * | 2017-06-22 | 2017-12-08 | 珠海市魅族科技有限公司 | Photo processing method and device, computer installation and readable storage medium storing program for executing |
CN110019894B (en) * | 2017-07-21 | 2022-12-06 | 北京搜狗科技发展有限公司 | Position searching method and device |
CN109947966A (en) * | 2017-09-12 | 2019-06-28 | 中兴通讯股份有限公司 | Method, picture servers and the computer readable storage medium of shared pictorial information |
CN108399038A (en) * | 2018-01-17 | 2018-08-14 | 链家网(北京)科技有限公司 | A kind of picture synthetic method and mobile terminal |
CN113377482A (en) * | 2021-07-06 | 2021-09-10 | 浙江商汤科技开发有限公司 | Display method, display device, electronic equipment and computer-readable storage medium |
CN116521042B (en) * | 2023-06-26 | 2024-03-19 | 北京高德云信科技有限公司 | Line sharing object generation method, device, equipment, medium and program product |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6657661B1 (en) * | 2000-06-20 | 2003-12-02 | Hewlett-Packard Development Company, L.P. | Digital camera with GPS enabled file management and a device to determine direction |
CN101103626A (en) * | 2005-11-11 | 2008-01-09 | 索尼株式会社 | Imaging/reproducing device |
CN101540800A (en) * | 2008-03-19 | 2009-09-23 | 索尼爱立信移动通信日本株式会社 | Mobile terminal device and computer program |
CN101702165A (en) * | 2009-10-30 | 2010-05-05 | 高翔 | Live-action information system and method thereof based on GPS positioning and direction identification technology |
CN102385636A (en) * | 2011-12-22 | 2012-03-21 | 陈伟 | Intelligent searching method and device |
CN102722900A (en) * | 2012-06-05 | 2012-10-10 | 深圳市中兴移动通信有限公司 | Method and device for automatically adding introduction information to shot picture/video |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101625688A (en) * | 2008-07-11 | 2010-01-13 | 鸿富锦精密工业(深圳)有限公司 | Photo management system and management method |
CN101753807B (en) * | 2009-12-16 | 2012-11-28 | 惠州Tcl移动通信有限公司 | Image pick-up device |
CN202171801U (en) * | 2011-01-27 | 2012-03-21 | 深圳市美赛达科技有限公司 | Generating device including position information pictures and processing device including position information pictures |
US20130129142A1 (en) * | 2011-11-17 | 2013-05-23 | Microsoft Corporation | Automatic tag generation based on image content |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103793521B (en) | Image processing method and device | |
US20160358042A1 (en) | Electronic Travel Album Generating Method and Computing Device | |
EP3786894A1 (en) | Method, device and apparatus for repositioning in camera orientation tracking process, and storage medium | |
EP2960852B1 (en) | Information processing device, information processing method, and program | |
US9584694B2 (en) | Predetermined-area management system, communication method, and computer program product | |
EP2071841A2 (en) | Method, apparatus and computer program product for displaying virtual media items in a visual media | |
US20090167919A1 (en) | Method, Apparatus and Computer Program Product for Displaying an Indication of an Object Within a Current Field of View | |
US9600932B2 (en) | Three dimensional navigation among photos | |
KR20140063839A (en) | Automatic privacy management for image sharing networks | |
US20200293179A1 (en) | Prioritization for presentation of media based on sensor data collected by wearable sensor devices | |
US10922479B2 (en) | Method and electronic device for creating an electronic signature | |
US9641768B2 (en) | Filter realization method and apparatus of camera application | |
CN104835105B (en) | Picture processing method and device | |
KR20170054746A (en) | Method and electronic device selecting an area associated with contents | |
CN106911885A (en) | Electronic equipment and method, photo taking | |
CN110215706A (en) | Location determining method, device, terminal and the storage medium of virtual objects | |
US20150130833A1 (en) | Map superposition method and electronic device | |
CN105096355B (en) | Image processing method and system | |
CN104902163B (en) | Image processing method and device, Picture Generation Method and device | |
KR20140097668A (en) | Method for providing mobile photobook service based on online | |
GB2513865A (en) | A method for interacting with an augmented reality scene | |
CN107888827A (en) | Image processing method and related product | |
US20140327806A1 (en) | Method and electronic device for generating thumbnail image | |
CN106909280A (en) | Electronic equipment and method, photo taking | |
KR101877901B1 (en) | Method and appratus for providing vr image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||