CN105635972A - Image processing method and apparatus - Google Patents
Image processing method and apparatus
- Publication number
- CN105635972A CN105635972A CN201610136635.3A CN201610136635A CN105635972A CN 105635972 A CN105635972 A CN 105635972A CN 201610136635 A CN201610136635 A CN 201610136635A CN 105635972 A CN105635972 A CN 105635972A
- Authority
- CN
- China
- Prior art keywords
- image
- positional information
- shooting time
- terminal
- module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W64/00—Locating users or terminals or network equipment for network management purposes, e.g. mobility management
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
Abstract
The invention provides an image processing method and apparatus. The method comprises the following steps: obtaining position information and a shooting time corresponding to a first image shot by a terminal having a positioning function; obtaining a second image shot by a terminal having no positioning function and a shooting time corresponding to the second image; determining the position information of the second image according to the position information and the shooting time corresponding to the first image and the shooting time corresponding to the second image; and processing the second image according to the position information of the second image. A terminal device can classify and display the second image according to the position information of the second image, and thus the user experience is improved.
Description
Technical field
The present disclosure relates to image processing technologies, and in particular to an image processing method and apparatus.
Background technology
People traveling outdoors usually carry both a camera and a mobile phone. In general, the mobile phone has a positioning function while the camera does not. A terminal with a positioning function can locate the position at which the user shoots an image (for example, the longitude and latitude of the shooting position). The user stores the images shot by the terminal with the positioning function, together with the corresponding position information, on a terminal device such as a computer or a tablet, where the position information is attached to each image as an image attribute. The terminal device can then classify, organize, and catalogue the images according to their corresponding position information.
Likewise, the user may also store images shot by a terminal without a positioning function on the above terminal device. However, a terminal without a positioning function cannot locate the position at which an image is shot, which hinders the terminal device's classification, organization, and cataloguing of such images.
Summary of the invention
To overcome the problems in the related art, the present disclosure provides an image processing method and apparatus. The technical solutions are as follows:
According to a first aspect of the embodiments of the present disclosure, an image processing method is provided, comprising:
obtaining position information and a shooting time corresponding to a first image shot by a terminal having a positioning function;
obtaining a second image shot by a terminal having no positioning function and a shooting time corresponding to the second image;
determining the position information of the second image according to the position information and the shooting time corresponding to the first image and the shooting time corresponding to the second image; and
processing the second image according to the position information of the second image.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effect: the position information of the second image is determined according to the position information and shooting time corresponding to the first image and the shooting time corresponding to the second image, so that a terminal device can classify, display, and otherwise process the second image according to its position information, thereby improving the user experience.
Optionally, determining the position information of the second image according to the position information and shooting time corresponding to the first image and the shooting time corresponding to the second image comprises:
calculating the time difference between the shooting time corresponding to each first image and the shooting time corresponding to the second image; and
determining the position information of the first image with the smallest time difference as the position information of the second image.
Optionally, determining the position information of the second image according to the position information and shooting time corresponding to the first image and the shooting time corresponding to the second image comprises:
applying an interpolation algorithm to the position information and shooting times corresponding to all first images to obtain an interpolation function relating position information to shooting time; and
determining the position information of the second image according to the shooting time corresponding to the second image and the interpolation function.
The position information of the second image can be determined by either of the above two methods, so that the terminal device can classify, display, and otherwise process the second image according to its position information, thereby improving the user experience.
Optionally, processing the second image according to the position information of the second image comprises: classifying all second images according to their position information, thereby improving the user experience. The present disclosure is not limited to the classification operation; the processing may also be cataloguing or other operations.
Optionally, the method further comprises:
determining the position information of the second image as an attribute information item of the second image.
An image processing apparatus provided by the embodiments of the present disclosure is introduced below. The apparatus corresponds to the foregoing method, with identical technical effects, which are not repeated here.
According to a second aspect of the embodiments of the present disclosure, an image processing apparatus is provided, comprising:
a first acquisition module, configured to obtain position information and a shooting time corresponding to a first image shot by a terminal having a positioning function;
a second acquisition module, configured to obtain a second image shot by a terminal having no positioning function and a shooting time corresponding to the second image;
a first determination module, configured to determine the position information of the second image according to the position information and shooting time obtained by the first acquisition module and the shooting time obtained by the second acquisition module; and
a processing module, configured to process the second image according to the position information of the second image.
Optionally, the first determination module comprises a calculation submodule and a first determination submodule:
the calculation submodule is configured to calculate the time difference between the shooting time corresponding to each first image and the shooting time corresponding to the second image;
the first determination submodule is configured to determine the position information of the first image with the smallest time difference as the position information of the second image.
Optionally, the first determination module comprises an acquisition submodule and a second determination submodule:
the acquisition submodule is configured to apply an interpolation algorithm to the position information and shooting times corresponding to all first images to obtain an interpolation function relating position information to shooting time;
the second determination submodule is configured to determine the position information of the second image according to the shooting time corresponding to the second image and the interpolation function.
Optionally, the processing module is configured to classify all second images according to their position information.
Optionally, the apparatus further comprises a second determination module, configured to determine the position information of the second image as an attribute information item of the second image.
According to a third aspect of the embodiments of the present disclosure, an image processing apparatus is provided, comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
obtain position information and a shooting time corresponding to a first image shot by a terminal having a positioning function;
obtain a second image shot by a terminal having no positioning function and a shooting time corresponding to the second image;
determine the position information of the second image according to the position information and shooting time corresponding to the first image and the shooting time corresponding to the second image; and
process the second image according to the position information of the second image.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effect: the position information of the second image is determined from the position information and shooting time corresponding to the first image and the shooting time corresponding to the second image, so that a terminal device can classify, display, and otherwise process the second image according to its position information, thereby improving the user experience.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the specification, serve to explain the principles of the present disclosure.
Fig. 1 is a flowchart of an image processing method according to an exemplary embodiment;
Fig. 1A is a schematic diagram of an interface in an implementation of the exemplary embodiment of Fig. 1;
Fig. 2 is a block diagram of an image processing apparatus according to an exemplary embodiment;
Fig. 3 is a block diagram of an image processing apparatus according to another exemplary embodiment;
Fig. 4 is a block diagram of an image processing apparatus according to yet another exemplary embodiment;
Fig. 5 is a block diagram of an image processing apparatus according to yet another exemplary embodiment;
Fig. 6 is a block diagram of an image processing apparatus 600 according to an exemplary embodiment.
The above accompanying drawings illustrate specific embodiments of the present disclosure and are described in more detail hereinafter. These drawings and the text description are not intended to limit the scope of the disclosed concept in any way, but rather to illustrate the concept of the present disclosure for those skilled in the art by reference to specific embodiments.
Detailed description
Exemplary embodiments are described in detail here, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numerals in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; on the contrary, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
At present, people traveling outdoors often carry both a camera and a mobile phone, and in general the mobile phone has a positioning function while the camera does not. Of course, the opposite situation also exists, in which the mobile phone has no positioning function while the camera does. In short, the terminals a user carries can be divided into terminals having a positioning function and terminals having no positioning function. A terminal with a positioning function can locate the position at which the user shoots an image (for example, the longitude and latitude of the shooting position). The user stores the images shot by the terminal with the positioning function, together with the corresponding position information, on a terminal device such as a computer or a tablet, and likewise stores on this terminal device the images shot by the terminal without a positioning function. However, the terminal without a positioning function cannot locate the position at which an image is shot, which hinders the terminal device's classification, organization, and cataloguing of such images. On this basis, the embodiments of the present disclosure provide an image processing method and apparatus.
Fig. 1 is a flowchart of an image processing method according to an exemplary embodiment. The present embodiment is illustrated with the image processing method applied to a terminal device, which may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, or the like. As shown in Fig. 1, the method comprises the following steps:
In step S101, position information and a shooting time corresponding to a first image shot by a terminal having a positioning function are obtained.
The positioning technologies involved in the embodiments of the present disclosure fall into two kinds: one is positioning based on the Global Positioning System (GPS), and the other is positioning based on the base stations of a mobile operator. The GPS-based positioning method uses the GPS positioning module in the terminal to send the terminal's position signal to a positioning backend, thereby realizing terminal positioning. Base-station positioning determines the terminal's location by measuring the distances from base stations to the terminal. The position information corresponding to the first image is generally the longitude and latitude of the user's position when the first image is shot. The terminal device can obtain the Exchangeable Image File (EXIF) information of the first image, which generally includes at least one of the following: shooting conditions such as aperture, shutter, white balance, focal length, and shooting time; the terminal brand, model, and color coding; sound recorded at the time of shooting; the above position information; a thumbnail; and so on.
In step S102, a second image shot by a terminal having no positioning function and a shooting time corresponding to the second image are obtained.
The terminal device can obtain the EXIF information of the second image, which generally includes at least one of the following: shooting conditions such as aperture, shutter, white balance, focal length, and shooting time; the terminal brand, model, and color coding; sound recorded at the time of shooting; a thumbnail; and so on. It should be noted that the EXIF information of the second image does not include position information corresponding to the second image.
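The difference between the two EXIF payloads can be sketched as follows. This is a minimal illustration, not part of the claimed method: it assumes the EXIF fields have already been decoded into a plain dictionary (for example by an EXIF-reading library), the field names DateTimeOriginal, GPSLatitude, and GPSLongitude follow the EXIF naming convention, and the sample values are hypothetical.

```python
from datetime import datetime
from typing import Optional, Tuple

def parse_exif(exif: dict) -> Tuple[datetime, Optional[Tuple[float, float]]]:
    """Extract the shooting time and, if present, (latitude, longitude)
    from an EXIF dictionary. EXIF stores timestamps as 'YYYY:MM:DD HH:MM:SS'."""
    shot_at = datetime.strptime(exif["DateTimeOriginal"], "%Y:%m:%d %H:%M:%S")
    if "GPSLatitude" in exif and "GPSLongitude" in exif:
        return shot_at, (exif["GPSLatitude"], exif["GPSLongitude"])
    return shot_at, None  # a second image carries no position information

# A first image (terminal with positioning) and a second image (without):
first = {"DateTimeOriginal": "2016:01:12 09:00:00",
         "GPSLatitude": 39.9042, "GPSLongitude": 116.4074}
second = {"DateTimeOriginal": "2016:01:12 09:03:00"}

print(parse_exif(first))   # shooting time plus (latitude, longitude)
print(parse_exif(second))  # shooting time plus None
```

The method below fills in exactly this missing None for second images.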
Fig. 1A is a schematic diagram of an interface in an implementation of the exemplary embodiment of Fig. 1. As shown in Fig. 1A, a user can usually right-click an image on the terminal device to display its EXIF information. Taking a terminal device running the Windows 7 operating system as an example, right-clicking an image and opening the "Details" pane reveals a GPS item, which is the above position information.
In step S103, the position information of the second image is determined according to the position information and shooting time corresponding to the first image and the shooting time corresponding to the second image.
In effect, the terminal device estimates the position of the second image from the position corresponding to the first image, according to the relation between the shooting time corresponding to the first image and the shooting time corresponding to the second image.
In step S104, the second image is processed according to the position information of the second image.
After the position information of the second image is determined, the terminal device can determine it as an attribute information item of the second image.
Optionally, the terminal device may also display, using its own electronic map such as Google Maps or AutoNavi, the concrete place name corresponding to this position information (the longitude and latitude of the shooting position), for example the name of a scenic spot or an intersection.
Optionally, the terminal device may also classify all second images according to their position information. For example, the second images shot at location A can be grouped into one class, and the second images shot at location B into another. Of course, the terminal device may also classify all first and second images together according to their respective position information.
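The classification operation described above can be sketched as a simple grouping by position label. This is a minimal illustration under the assumption that each image has already been reduced to a (name, position) pair; the file names and location labels are hypothetical.

```python
from collections import defaultdict

def classify_by_location(images):
    """Group images by their position information.
    Each image is a (name, position) pair; position is any hashable label."""
    groups = defaultdict(list)
    for name, position in images:
        groups[position].append(name)
    return dict(groups)

images = [("img1.jpg", "A"), ("img2.jpg", "B"), ("img3.jpg", "A")]
print(classify_by_location(images))
# {'A': ['img1.jpg', 'img3.jpg'], 'B': ['img2.jpg']}
```

The same grouping works whether the labels are place names or (longitude, latitude) pairs, so first and second images can be classified together once every image carries position information.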
The embodiments of the present disclosure provide an image processing method, comprising: obtaining position information and a shooting time corresponding to a first image shot by a terminal having a positioning function; obtaining a second image shot by a terminal having no positioning function and a shooting time corresponding to the second image; determining the position information of the second image according to the position information and shooting time corresponding to the first image and the shooting time corresponding to the second image; and processing the second image according to the position information of the second image. The terminal device can thereby classify, display, and otherwise process the second image according to its position information, thus improving the user experience.
On the basis of the above embodiment, step S103 may be implemented in the following optional manners:
In one optional manner, determining the position information of the second image according to the position information and shooting time corresponding to the first image and the shooting time corresponding to the second image comprises: calculating the time difference between the shooting time corresponding to each first image and the shooting time corresponding to the second image; and determining the position information of the first image with the smallest time difference as the position information of the second image.
For example, the shooting time corresponding to the second image is 2016-1-12 9:03, and the position information and shooting times corresponding to all first images are as in Table 1:
The shooting time of first image 1 is 2016-1-12 9:00 | The position information of first image 1 is A |
The shooting time of first image 2 is 2016-1-12 9:10 | The position information of first image 2 is B |
The shooting time of first image 3 is 2016-1-12 9:20 | The position information of first image 3 is C |
The shooting time of first image 4 is 2016-1-12 9:25 | The position information of first image 4 is D |
The shooting time of first image 5 is 2016-1-12 9:30 | The position information of first image 5 is E |
The time differences between the shooting time corresponding to each first image and the shooting time corresponding to the second image are calculated to be 3 minutes, 7 minutes, 17 minutes, 22 minutes, and 27 minutes, respectively. The position information A of first image 1, which has the smallest time difference, is determined as the position information of the second image.
In particular, more than one first image may have the smallest time difference; in this case, the terminal device can randomly select one of the position information items of these first images as the position information of the second image.
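The first optional manner can be sketched as follows, using the data of Table 1. This is an illustrative sketch rather than a definitive implementation: on a tie in the time difference it simply keeps the first candidate encountered, whereas the embodiment permits a random choice among the tied first images.

```python
from datetime import datetime

def nearest_position(first_images, second_time):
    """Return the position information of the first image whose shooting
    time is closest to the shooting time of the second image."""
    best = min(first_images, key=lambda img: abs(img["time"] - second_time))
    return best["position"]

# First images from Table 1; the second image was shot at 9:03.
first_images = [
    {"time": datetime(2016, 1, 12, 9, 0),  "position": "A"},
    {"time": datetime(2016, 1, 12, 9, 10), "position": "B"},
    {"time": datetime(2016, 1, 12, 9, 20), "position": "C"},
    {"time": datetime(2016, 1, 12, 9, 25), "position": "D"},
    {"time": datetime(2016, 1, 12, 9, 30), "position": "E"},
]
print(nearest_position(first_images, datetime(2016, 1, 12, 9, 3)))  # A
```

The smallest difference is 3 minutes (first image 1), so position A is assigned to the second image, matching the worked example above.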
In another optional manner, determining the position information of the second image according to the position information and shooting time corresponding to the first image and the shooting time corresponding to the second image comprises: applying an interpolation algorithm to the position information and shooting times corresponding to all first images to obtain an interpolation function relating position information to shooting time; and determining the position information of the second image according to the shooting time corresponding to the second image and the interpolation function.
For example, the position information and shooting times corresponding to a plurality of first images are obtained, and the interpolation function relating position information to shooting time is determined by an interpolation algorithm, which in the embodiments of the present disclosure may be the Lagrange interpolation algorithm, the Newton interpolation algorithm, or the like; the embodiments of the present disclosure do not limit this. After the interpolation function is determined, the shooting time corresponding to the second image can be substituted into it to obtain the position information corresponding to the second image.
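The second optional manner can be sketched with Lagrange interpolation, one of the algorithms named above. The sample times and coordinates are hypothetical; in practice latitude and longitude would each be interpolated as a separate function of shooting time, and only latitude is shown here.

```python
def lagrange_interpolate(points, x):
    """Evaluate the Lagrange interpolation polynomial through the given
    (x_i, y_i) points at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if i != j:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# Shooting times (minutes past 9:00) vs. latitude of three first images:
times_lat = [(0, 39.90), (10, 39.92), (20, 39.96)]
# Substitute the shooting time of a second image shot at 9:03:
lat = lagrange_interpolate(times_lat, 3)
print(round(lat, 4))  # 39.9039
```

Unlike the nearest-time manner, interpolation yields an estimate between the known positions, which can be more accurate when the user moves steadily between shots.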
The embodiments of the present disclosure obtain the position information corresponding to the second image by either of the above two optional manners, so that the terminal device can classify, display, and otherwise process the second image according to its position information, thereby improving the user experience.
The following are apparatus embodiments of the present disclosure, which can be used to perform the method embodiments of the present disclosure. For details not disclosed in the apparatus embodiments, please refer to the method embodiments of the present disclosure.
Fig. 2 is a block diagram of an image processing apparatus according to an exemplary embodiment. The image processing apparatus can be implemented as part or all of a terminal device by software, hardware, or a combination of both. The image processing apparatus may comprise:
a first acquisition module 21, configured to obtain position information and a shooting time corresponding to a first image shot by a terminal having a positioning function;
a second acquisition module 22, configured to obtain a second image shot by a terminal having no positioning function and a shooting time corresponding to the second image;
a first determination module 23, configured to determine the position information of the second image according to the position information and shooting time obtained by the first acquisition module 21 and the shooting time obtained by the second acquisition module 22; and
a processing module 24, configured to process the second image according to the position information of the second image determined by the first determination module 23.
In summary, the image processing apparatus provided by the embodiments of the present disclosure can determine the position information of the second image from the position information and shooting time obtained by the first acquisition module and the shooting time obtained by the second acquisition module, so that the terminal device can classify, display, and otherwise process the second image according to its position information, thereby improving the user experience.
Fig. 3 is a block diagram of an image processing apparatus according to another exemplary embodiment. The image processing apparatus can be implemented as part or all of a terminal device by software, hardware, or a combination of both. The image processing apparatus may comprise:
a first acquisition module 21, configured to obtain position information and a shooting time corresponding to a first image shot by a terminal having a positioning function;
a second acquisition module 22, configured to obtain a shooting time corresponding to a second image shot by a terminal having no positioning function;
a first determination module 23, configured to determine the position information of the second image according to the position information and shooting time obtained by the first acquisition module 21 and the shooting time obtained by the second acquisition module 22; and
a processing module 24, configured to process the second image according to the position information of the second image determined by the first determination module 23.
Optionally, the first determination module 23 comprises a calculation submodule 231 and a first determination submodule 232, wherein the calculation submodule 231 is configured to calculate the time difference between the shooting time corresponding to each first image and the shooting time corresponding to the second image, and the first determination submodule 232 is configured to determine the position information of the first image with the smallest time difference as the position information of the second image.
In summary, the calculation submodule calculates the time difference between the shooting time corresponding to each first image and the shooting time corresponding to the second image, and the first determination submodule determines the position information of the first image with the smallest time difference as the position information of the second image, so that the terminal device can classify, display, and otherwise process the second image according to its position information, thereby improving the user experience.
Fig. 4 is a block diagram of an image processing apparatus according to yet another exemplary embodiment. The image processing apparatus can be implemented as part or all of a terminal device by software, hardware, or a combination of both. The image processing apparatus may comprise:
a first acquisition module 21, configured to obtain position information and a shooting time corresponding to a first image shot by a terminal having a positioning function;
a second acquisition module 22, configured to obtain a shooting time corresponding to a second image shot by a terminal having no positioning function;
a first determination module 23, configured to determine the position information of the second image according to the position information and shooting time obtained by the first acquisition module 21 and the shooting time obtained by the second acquisition module 22; and
a processing module 24, configured to process the second image according to the position information of the second image determined by the first determination module 23.
Optionally, the first determination module 23 comprises an acquisition submodule 233 and a second determination submodule 234, wherein the acquisition submodule 233 is configured to apply an interpolation algorithm to the position information and shooting times corresponding to all first images to obtain an interpolation function relating position information to shooting time, and the second determination submodule 234 is configured to determine the position information of the second image according to the shooting time corresponding to the second image and the interpolation function.
In summary, the acquisition submodule applies an interpolation algorithm to the position information and shooting times corresponding to all first images to obtain an interpolation function relating position information to shooting time, and the second determination submodule determines the position information of the second image according to the shooting time corresponding to the second image and the interpolation function, so that the terminal device can classify, display, and otherwise process the second image according to its position information, thereby improving the user experience.
Fig. 5 is a block diagram of an image processing apparatus according to yet another exemplary embodiment. The image processing apparatus can be implemented as part or all of a terminal device by software, hardware, or a combination of both. The image processing apparatus may comprise:
a first acquisition module 21, configured to obtain position information and a shooting time corresponding to a first image shot by a terminal having a positioning function;
a second acquisition module 22, configured to obtain a shooting time corresponding to a second image shot by a terminal having no positioning function;
a first determination module 23, configured to determine the position information of the second image according to the position information and shooting time obtained by the first acquisition module 21 and the shooting time obtained by the second acquisition module 22; and
a processing module 24, configured to classify all second images according to their position information.
The apparatus further comprises a second determination module 25, configured to determine the position information of the second image determined by the first determination module 23 as an attribute information item of the second image.
In summary, the second determination module attaches the determined position information of the second image to the second image as an attribute, so that the terminal device can classify, display, and otherwise process the second image according to its position information, thereby improving the user experience.
Fig. 6 is a block diagram of an image processing apparatus 600 according to an exemplary embodiment. For example, the apparatus 600 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, or the like.
Referring to Fig. 6, the apparatus 600 may comprise one or more of the following components: a processing component 602, a memory 604, a power component 606, a multimedia component 608, an audio component 610, an input/output (I/O) interface 612, a sensor component 614, and a communication component 616.
The processing component 602 generally controls the overall operation of the apparatus 600, such as operations associated with display, telephone calls, data communication, camera operation, and recording operation. The processing component 602 may comprise one or more processors 620 to execute instructions so as to complete all or part of the steps of the above-described method. In addition, the processing component 602 may comprise one or more modules to facilitate interaction between the processing component 602 and other components. For example, the processing component 602 may comprise a multimedia module to facilitate interaction between the multimedia component 608 and the processing component 602.
The memory 604 is configured to store various types of data to support the operation of the apparatus 600. Examples of such data include instructions for any application program or method operated on the apparatus 600, contact data, phonebook data, messages, pictures, videos, and so on. The memory 604 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disc.
The power component 606 provides power for the various components of the apparatus 600. The power component 606 may comprise a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 600.
The multimedia component 608 comprises a touch display screen providing an output interface between the apparatus 600 and a user. In some embodiments, the touch display screen may comprise a liquid crystal display (LCD) and a touch panel (TP). The touch panel comprises one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure related to the touch or swipe operation. In some embodiments, the multimedia component 608 comprises a front camera and/or a rear camera. When the apparatus 600 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each of the front camera and the rear camera may be a fixed optical lens system or may have focal length and optical zoom capability.
The audio component 610 is configured to output and/or input audio signals. For example, the audio component 610 comprises a microphone (MIC); when the apparatus 600 is in an operating mode, such as a call mode, a recording mode, or a speech recognition mode, the microphone is configured to receive external audio signals. The received audio signals may be further stored in the memory 604 or sent via the communication component 616. In some embodiments, the audio component 610 also comprises a loudspeaker for outputting audio signals.
The I/O interface 612 provides an interface between the processing component 602 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, or the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
The sensor component 614 comprises one or more sensors for providing status assessments of various aspects of the apparatus 600. For example, the sensor component 614 can detect the open/closed state of the apparatus 600 and the relative positioning of components, such as the display and keypad of the apparatus 600; the sensor component 614 can also detect a change in the position of the apparatus 600 or of a component of the apparatus 600, the presence or absence of user contact with the apparatus 600, the orientation or acceleration/deceleration of the apparatus 600, and a change in the temperature of the apparatus 600. The sensor component 614 may comprise a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor component 614 may also comprise an optical sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 614 may also comprise an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 616 is configured to facilitate wired or wireless communication between the apparatus 600 and other devices. The apparatus 600 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 616 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 616 also comprises a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 600 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements, for performing the above-described image processing method.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium comprising instructions, such as the memory 604 comprising instructions; the above instructions are executable by the processor 620 of the apparatus 600 to perform the above-described method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
A non-transitory computer-readable storage medium is provided such that, when the instructions in the storage medium are executed by the processor of the apparatus 600, the apparatus 600 is enabled to perform an image processing method.
The method comprises: acquiring position information and a shooting time corresponding to a first image taken by a terminal having a positioning function;
acquiring a second image taken by a terminal without a positioning function and a shooting time corresponding to the second image;
determining the position information of the second image according to the position information and shooting time corresponding to the first image and the shooting time corresponding to the second image; and
processing the second image according to the position information of the second image.
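One concrete way to carry out the determining step above is the minimum-time-difference variant described in this disclosure: assign the second image the position of whichever first image was shot closest in time. The sketch below is illustrative only; the function name and the `(shooting_time, position)` pair format are assumptions.

```python
def nearest_position(first_images, second_shot_time):
    """first_images: list of (shooting_time, position) pairs recorded by
    the terminal that has a positioning function. Returns the position of
    the first image whose shooting time is closest to second_shot_time."""
    best = min(first_images, key=lambda p: abs(p[0] - second_shot_time))
    return best[1]
```

For example, a second image shot between two geotagged first images inherits the position of the temporally closer one, which is a reasonable approximation when both terminals travel together.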
Those skilled in the art will easily conceive of other embodiments of the present disclosure after considering the specification and practicing the invention disclosed herein. The present application is intended to cover any variations, uses, or adaptations of the present disclosure that follow the general principles of the present disclosure and include common knowledge or conventional technical means in the art not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the present disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and illustrated in the accompanying drawings, and that various modifications and changes can be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.
Claims (11)
1. An image processing method, characterized by comprising:
acquiring position information and a shooting time corresponding to a first image taken by a terminal having a positioning function;
acquiring a second image taken by a terminal without a positioning function and a shooting time corresponding to the second image;
determining the position information of the second image according to the position information and shooting time corresponding to the first image and the shooting time corresponding to the second image; and
processing the second image according to the position information of the second image.
2. The method according to claim 1, characterized in that determining the position information of the second image according to the position information and shooting time corresponding to the first image and the shooting time corresponding to the second image comprises:
calculating the time difference between the shooting time corresponding to each first image and the shooting time corresponding to the second image; and
determining the position information of the first image whose corresponding time difference is smallest as the position information of the second image.
3. The method according to claim 1, characterized in that determining the position information of the second image according to the position information and shooting time corresponding to the first image and the shooting time corresponding to the second image comprises:
applying an interpolation algorithm to the position information and shooting times corresponding to all the first images to obtain an interpolation function relating position information to shooting time; and
determining the position information of the second image according to the shooting time corresponding to the second image and the interpolation function.
4. The method according to any one of claims 1 to 3, characterized in that processing the second image according to the position information of the second image comprises:
classifying all the second images according to the position information of all the second images.
5. The method according to any one of claims 1 to 3, characterized by further comprising:
determining the position information of the second image as an attribute information item of the second image.
6. An image processing apparatus, characterized by comprising:
a first acquisition module, configured to acquire position information and a shooting time corresponding to a first image taken by a terminal having a positioning function;
a second acquisition module, configured to acquire a second image taken by a terminal without a positioning function and a shooting time corresponding to the second image;
a first determination module, configured to determine the position information of the second image according to the position information and shooting time corresponding to the first image acquired by the first acquisition module and the shooting time corresponding to the second image acquired by the second acquisition module; and
a processing module, configured to process the second image according to the position information of the second image.
7. The apparatus according to claim 6, characterized in that the first determination module comprises a calculation submodule and a first determining submodule:
the calculation submodule is configured to calculate the time difference between the shooting time corresponding to each first image and the shooting time corresponding to the second image; and
the first determining submodule is configured to determine the position information of the first image whose corresponding time difference is smallest as the position information of the second image.
8. The apparatus according to claim 6, characterized in that the first determination module comprises an obtaining submodule and a second determining submodule:
the obtaining submodule is configured to apply an interpolation algorithm to the position information and shooting times corresponding to all the first images to obtain an interpolation function relating position information to shooting time; and
the second determining submodule is configured to determine the position information of the second image according to the shooting time corresponding to the second image and the interpolation function.
9. The apparatus according to any one of claims 6 to 8, characterized in that
the processing module is configured to classify all the second images according to the position information of all the second images.
10. The apparatus according to any one of claims 6 to 8, characterized by further comprising:
a second determination module, configured to determine the position information of the second image as an attribute information item of the second image.
11. An image processing apparatus, characterized in that the apparatus comprises:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
acquire position information and a shooting time corresponding to a first image taken by a terminal having a positioning function;
acquire a second image taken by a terminal without a positioning function and a shooting time corresponding to the second image;
determine the position information of the second image according to the position information and shooting time corresponding to the first image and the shooting time corresponding to the second image; and
process the second image according to the position information of the second image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610136635.3A CN105635972A (en) | 2016-03-10 | 2016-03-10 | Image processing method and apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105635972A true CN105635972A (en) | 2016-06-01 |
Family
ID=56050373
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610136635.3A Pending CN105635972A (en) | 2016-03-10 | 2016-03-10 | Image processing method and apparatus |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105635972A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102997927A (en) * | 2011-09-09 | 2013-03-27 | 中国电信股份有限公司 | Information acquisition and processing method and apparatus |
CN103064937A (en) * | 2012-12-25 | 2013-04-24 | 广东欧珀移动通信有限公司 | Method and device for storing photos and based on shooting address and shooting time |
CN103167395A (en) * | 2011-12-08 | 2013-06-19 | 腾讯科技(深圳)有限公司 | Picture positioning method and system based on mobile terminal navigation function |
CN103870599A (en) * | 2014-04-02 | 2014-06-18 | 联想(北京)有限公司 | Shooting data collecting method, device and electronic equipment |
CN104866500A (en) * | 2014-02-25 | 2015-08-26 | 腾讯科技(深圳)有限公司 | Method and device for displaying pictures in classified manner |
CN105320689A (en) * | 2014-07-29 | 2016-02-10 | 中兴通讯股份有限公司 | Photo information processing method and device as well as terminal |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112313472A (en) * | 2018-10-29 | 2021-02-02 | 松下知识产权经营株式会社 | Information presentation method, information presentation device, and information presentation system |
CN112313472B (en) * | 2018-10-29 | 2023-12-26 | 松下知识产权经营株式会社 | Information presentation method, information presentation device, and information presentation system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104105064B (en) | The method and device of location equipment | |
CN106231378A (en) | The display packing of direct broadcasting room, Apparatus and system | |
CN104519282A (en) | Image shooting method and device | |
CN106202194A (en) | The storage method and device of screenshot picture | |
CN103916940A (en) | Method and device for acquiring photographing position | |
CN104731880A (en) | Image ordering method and device | |
CN105069050A (en) | Search response method, apparatus and system | |
CN105654533A (en) | Picture editing method and picture editing device | |
CN105323244A (en) | Method and device for network identification | |
CN105260360A (en) | Named entity identification method and device | |
CN106203650A (en) | Call a taxi and ask sending method and device | |
CN105354017A (en) | Information processing method and apparatus | |
CN105549300A (en) | Automatic focusing method and device | |
CN105208284A (en) | Photographing reminding method and device | |
CN105227739A (en) | On-vehicle Bluetooth broadcasting method and device | |
CN105100193A (en) | Cloud business card recommendation method and device | |
CN105491518A (en) | Method and device for social reminding based on wearable devices | |
CN105407160A (en) | Interface display method and device | |
CN105631450A (en) | Character identifying method and device | |
CN104836721A (en) | Group session message reminding method and group session message reminding device | |
CN103957502A (en) | Location-based service program selecting method, device and terminal | |
CN105677435A (en) | Function invoking method, apparatus and terminal | |
CN105843894A (en) | Information recommending method and device | |
CN105551047A (en) | Picture content detecting method and device | |
CN105487774A (en) | Image grouping method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20160601 |