CN107967339B - Image processing method, image processing device, computer-readable storage medium and computer equipment


Info

Publication number
CN107967339B
CN107967339B
Authority
CN
China
Prior art keywords
image
processed
images
time range
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711278857.XA
Other languages
Chinese (zh)
Other versions
CN107967339A (en)
Inventor
柯秀华
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201711278857.XA priority Critical patent/CN107967339B/en
Publication of CN107967339A publication Critical patent/CN107967339A/en
Application granted granted Critical
Publication of CN107967339B publication Critical patent/CN107967339B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 - Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/583 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually, using metadata automatically derived from the content
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/40 - Scenes; Scene-specific elements in video content
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M 1/00 - Substation equipment, e.g. for use by subscribers
    • H04M 1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 - User interfaces with means for local support of applications that increase the functionality
    • H04M 1/7243 - User interfaces with interactive means for internal management of messages
    • H04M 1/72439 - User interfaces with interactive means for internal management of messages, for image or video messaging
    • H04M 1/72448 - User interfaces with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72451 - User interfaces adapting functionality according to schedules, e.g. using calendar applications
    • H04M 1/72454 - User interfaces adapting functionality according to context-related or environment-related conditions
    • H04M 1/72469 - User interfaces for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H04M 1/72472 - User interfaces wherein the items are sorted according to specific criteria, e.g. frequency of use

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Environmental & Geological Engineering (AREA)
  • Studio Devices (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present application relates to an image processing method, an image processing apparatus, a computer-readable storage medium and a computer device. The method comprises the following steps: acquiring a plurality of images to be processed whose shooting times fall within a preset time range; performing scene recognition on the plurality of images to be processed, and obtaining the image scene corresponding to each of them; merging identical image scenes among the image scenes corresponding to the plurality of images to be processed, to obtain the scene information corresponding to the preset time range; and associating the preset time range with a date in an application calendar, and displaying the scene information corresponding to the date in a calendar interface. With this method, the scene information corresponding to the images can be compiled along the time dimension and displayed in association with dates in the calendar. The image information is thereby extracted and visualized, so that a user can review and recall a journey from the scene information attached to each date in the calendar, which improves search efficiency.

Description

Image processing method, image processing device, computer-readable storage medium and computer equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to an image processing method and apparatus, a computer-readable storage medium, and a computer device.
Background
With the development of intelligent computer devices, more and more users take images with them. A shot image can record the time, place, people, scenery and the like of the current scene. Recording this information in image form makes it convenient for a user to recall, from the image, the scene it corresponds to.
Disclosure of Invention
The embodiments of the present application provide an image processing method, an image processing apparatus, a computer-readable storage medium and a computer device, which can acquire travel information corresponding to images.
An image processing method comprising:
acquiring a plurality of images to be processed whose shooting times fall within a preset time range;
performing scene recognition on the plurality of images to be processed, and obtaining the image scene corresponding to each of the images to be processed;
merging identical image scenes among the image scenes corresponding to the plurality of images to be processed, to obtain scene information corresponding to the preset time range;
and associating the preset time range with a date in an application calendar, and displaying the scene information corresponding to the date in a calendar interface.
An image processing apparatus comprising:
an acquisition module, configured to acquire a plurality of images to be processed whose shooting times fall within a preset time range;
a recognition module, configured to perform scene recognition on the plurality of images to be processed and obtain the image scene corresponding to each of the images to be processed;
a merging module, configured to merge identical image scenes among the image scenes corresponding to the plurality of images to be processed, to obtain scene information corresponding to the preset time range;
and a display module, configured to associate the preset time range with a date in an application calendar and display the scene information corresponding to the date in a calendar interface.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method as set forth above.
A computer device comprising a memory and a processor, the memory having stored therein computer readable instructions which, when executed by the processor, cause the processor to perform a method as described above.
With this method, after the images to be processed are obtained, the computer device can compile the scene information corresponding to the images along the time dimension and display it in association with dates in the calendar. The image information is thereby extracted and visualized, so that a user can review and recall a journey from the scene information attached to each date in the calendar, which improves search efficiency.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from them without creative effort.
FIG. 1 is a flow diagram of a method of image processing in one embodiment;
FIG. 2 is an interface diagram that illustrates context information in a calendar interface, under an embodiment;
FIG. 3 is a flow chart of an image processing method in another embodiment;
FIG. 4 is an interface diagram showing a travel record at a mobile terminal interface in one embodiment;
FIG. 5 is a flowchart of an image processing method in another embodiment;
FIG. 6 is a flowchart of an image processing method in another embodiment;
FIG. 7 is a flowchart of an image processing method in another embodiment;
FIG. 8 is a block diagram showing the structure of an image processing apparatus according to one embodiment;
FIG. 9 is a block diagram showing the structure of an image processing apparatus in another embodiment;
FIG. 10 is a block diagram showing the structure of an image processing apparatus in another embodiment;
FIG. 11 is a block diagram of a partial structure of a mobile phone related to a computer device provided in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
FIG. 1 is a flow diagram of a method of image processing in one embodiment. As shown in fig. 1, an image processing method includes:
step 102, acquiring a plurality of images to be processed with the shooting time of the images within a preset time range.
The computer equipment can acquire a plurality of images to be processed with the shooting time of the images within a preset time range. The preset time range can be a time range input by a user or a time range set by a computer device. The preset time range may include a time range counted in time units, for example, if "day" is used as a time unit, the time range of 11/30/2017 includes 0: 0/2017/11/30/0/23: 59/s; also included are time ranges expressed as start time and end time, for example, 12 o 'clock 30 min 50 sec at 11/30/2017 to 12 o' clock 56 min 30 sec at 11/30/2017.
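The time-range filter of step 102 can be sketched as follows. This is a minimal illustration, not the patented implementation: the image records (dictionaries with hypothetical `name` and `shot_at` fields) stand in for stored image files whose shooting times would, as described above, be read from the file metadata.

```python
from datetime import datetime

def images_in_range(images, start, end):
    """Keep only the images whose shooting time falls within [start, end]."""
    return [img for img in images if start <= img["shot_at"] <= end]

photos = [
    {"name": "a.jpg", "shot_at": datetime(2017, 11, 30, 12, 40, 0)},
    {"name": "b.jpg", "shot_at": datetime(2017, 11, 29, 9, 0, 0)},
]
# The "day" unit from the example: 0:00:00 to 23:59:59 on November 30, 2017.
day = images_in_range(
    photos,
    datetime(2017, 11, 30, 0, 0, 0),
    datetime(2017, 11, 30, 23, 59, 59),
)
print([p["name"] for p in day])  # ['a.jpg']
```

A range expressed by an arbitrary start and end time is handled by the same function, since only the comparison bounds change.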
Step 104: performing scene recognition on the plurality of images to be processed, and obtaining the image scene corresponding to each of them.
After the computer device acquires the images to be processed, it can obtain their image information. An image to be processed may be an image shot by the computer device itself, or one downloaded by the device over a data network or a wireless local area network.
The image information acquired for an image to be processed includes the shooting time, the shooting location and the image scene. When the computer device shoots an image, it records the shooting time and stores it in the image file, so the device can read the shooting time directly from the file. If a positioning function such as GPS (Global Positioning System) is enabled while shooting, the current positioning information, i.e. the shooting location, is also recorded in the stored image file, and the device can obtain it by querying the file. The image scene refers to the scene content of the image to be processed, such as sea, forest, building, river or mountain. The computer device can identify the scene of an image through a scene recognition model. The scene recognition model may be a decision model built in advance by machine learning: a large number of sample images covering each scene are collected, such as river images, ocean images, forest images, building images and mountain images; each sample image is labelled with its scene; and the labelled samples are used as the input of a machine learning model, which is trained into the scene recognition model.
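The patent does not fix a particular model architecture for step 104, so the recognition step can only be sketched abstractly: a function that applies some trained classifier to each image and attaches the result. The `fake_model` below is a stand-in used purely so the sketch runs; it is not a real scene recognizer.

```python
SCENES = ("sea", "forest", "building", "river", "mountain")

def recognize_scenes(images, model):
    """Attach the scene predicted by the given classifier to each image."""
    return [dict(img, scene=model(img["pixels"])) for img in images]

# Stand-in "model" for illustration only; a real scene recognition model
# would be trained on labelled sample images of each scene as described.
def fake_model(pixels):
    return SCENES[sum(pixels) % len(SCENES)]

tagged = recognize_scenes([{"pixels": [1, 2, 3]}, {"pixels": [0]}], fake_model)
print([t["scene"] for t in tagged])  # ['forest', 'sea']
```

The point of the shape is that recognition is independent per image, so the trained model can be swapped in without changing the surrounding pipeline.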
Step 106: merging identical image scenes among the image scenes corresponding to the plurality of images to be processed, to obtain scene information corresponding to the preset time range.
After the image scenes corresponding to the plurality of images to be processed are obtained, the computer device can check whether any of the image scenes are the same. "The same image scenes" means image scenes that are identical: for example, if the image scenes corresponding to two images to be processed are both desert, the image scenes of the two images are merged. Merging identical image scenes yields the scene information corresponding to the preset time range, i.e. the set of merged image scenes, which may contain one or more image scenes.
In one embodiment, if the scene information corresponding to the preset time range includes a plurality of image scenes, the computer device acquires the number of images to be processed corresponding to each image scene and the time range corresponding to each image scene, where the time range of an image scene is computed from the shooting times of the images to be processed belonging to that scene.
The computer device can associate each image scene in the scene information with its images to be processed, and count the number of images and the time range per image scene. Specifically, after collecting the shooting times of the images to be processed that correspond to an image scene, the computer device takes the earliest and the latest shooting time among them, and generates the time range of that image scene from those two times.
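The merge-and-count logic of step 106 and the per-scene time range just described can be sketched together. The image records (dictionaries with hypothetical `scene` and `shot_at` fields) are assumed to come out of the recognition step.

```python
from collections import defaultdict
from datetime import datetime

def merge_scenes(images):
    """Merge identical image scenes; for each scene, count its images and
    compute the time range from the earliest to the latest shooting time."""
    times = defaultdict(list)
    for img in images:
        times[img["scene"]].append(img["shot_at"])
    return {scene: {"count": len(ts), "range": (min(ts), max(ts))}
            for scene, ts in times.items()}

info = merge_scenes([
    {"scene": "sea", "shot_at": datetime(2017, 11, 27, 10, 0)},
    {"scene": "sea", "shot_at": datetime(2017, 11, 29, 16, 30)},
    {"scene": "mountain", "shot_at": datetime(2017, 11, 30, 8, 0)},
])
print(info["sea"]["count"])  # 2
```

Each key of the result is one merged image scene, so the scene information for the preset time range is simply the set of keys.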
Step 108: associating the preset time range with a date in an application calendar, and displaying the scene information corresponding to the date in a calendar interface.
The computer device may also associate the preset time range with a calendar in an application. The calendar may be a standalone calendar application, or one embedded in another application, such as the system "Calendar" on a mobile phone or a calendar embedded in a mail client. When associating the preset time range with the calendar, if the preset time range is counted in calendar units, it is associated with the calendar dates directly; for example, the range from November 1, 2017 to November 30, 2017 is associated with those dates in the calendar. If the preset time range is expressed by a start time and an end time, the dates corresponding to the start time and the end time are obtained. If both fall on the same day, that date is associated; if they do not fall on the same day, the associated dates comprise the date of the start time, the date of the end time, and every date in between. For example, if the start time is 13:00:00 on October 1, 2017 and the end time is 1:05:04 on October 20, 2017, the associated dates run from October 1, 2017 to October 20, 2017.
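The start-time/end-time case above reduces to expanding the pair into a list of calendar dates, which can be sketched as follows (a shorter two-day span is used in the example for brevity).

```python
from datetime import date, datetime, timedelta

def associated_dates(start, end):
    """All calendar dates from the start time's date to the end time's
    date, inclusive."""
    d, last = start.date(), end.date()
    days = []
    while d <= last:
        days.append(d)
        d += timedelta(days=1)
    return days

span = associated_dates(datetime(2017, 10, 1, 13, 0, 0),
                        datetime(2017, 10, 3, 1, 5, 4))
print(len(span))  # 3: October 1, 2, and 3, 2017
```

When start and end fall on the same day, the loop runs once and a single date is associated, matching the same-day case described above.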
After associating the preset time range with dates in the application calendar, the computer device may present the scene information on the associated dates in the calendar interface. For example, if the preset time range is November 27, 2017 to November 31, 2017 and the scene information corresponding to it is the sea, then the scene information displayed on November 27, 28, 29, 30 and 31, 2017 in the calendar is the sea. The scene information of the preset time range thus becomes the scene information of its associated dates. When displaying the scene information, the computer device may present it as text or as an image.
In one embodiment, when the scene information corresponding to the preset time range includes a plurality of image scenes, the time range corresponding to each image scene may be acquired, each such time range may be associated with dates in the calendar, and each image scene may be displayed on its associated dates. For example, suppose the scene information between November 27, 2017 and November 31, 2017 includes the sea and the mountain, with the sea covering November 27 to November 29, 2017 and the mountain covering November 30 to November 31, 2017. Then the calendar displays the sea as the scene information from November 27 to November 29, 2017, and the mountain from November 30 to November 31, 2017.
Take a mobile terminal as an example of the computer device. FIG. 2 is an interface diagram that illustrates scene information in a calendar interface, under an embodiment. As shown in FIG. 2, when the user selects the date November 6, 2017 in the calendar interface, the scene information corresponding to that date, the beach, is displayed on the current calendar interface.
With this method, after the images to be processed are obtained, the computer device can compile the scene information corresponding to the images along the time dimension and display it in association with dates in the calendar. The image information is thereby extracted and visualized, so that a user can review and recall a journey from the scene information attached to each date in the calendar, which improves search efficiency.
In one embodiment, after step 108, the method further includes:
and step 110, acquiring schedule information corresponding to a date in the calendar, and generating a travel record according to the schedule information corresponding to the date and the scene information.
And 112, displaying the travel record corresponding to the date on a display interface of the computer equipment according to the sequence of the dates.
The computer device can obtain schedule information corresponding to a date in the calendar; the schedule information is either added by the user or added by the computer device. For example, the schedule information for November 30, 2017 may be a meeting in city A. If the computer device detects both schedule information and scene information corresponding to a date in the calendar, it generates a travel record from the two. For example, if the schedule information for November 30, 2017 is a meeting in city A and the scene is a building, the travel record generated for that date is "November 30, 2017: meeting in city A, building". The computer device can obtain the travel records corresponding to a number of dates and display them in date order.
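The record generation of steps 110 and 112 can be sketched as a merge of two per-date mappings. The string-keyed dictionaries are an assumption made for the sketch; in the described embodiment the schedule would come from the calendar application and the scenes from step 106.

```python
def travel_records(schedule, scenes):
    """Combine per-date schedule information and scene information into
    travel records, returned in date order."""
    records = []
    for day in sorted(set(schedule) | set(scenes)):
        parts = [p for p in (schedule.get(day), scenes.get(day)) if p]
        records.append((day, ", ".join(parts)))
    return records

records = travel_records(
    {"2017-11-30": "meeting in city A"},
    {"2017-11-30": "building", "2017-12-01": "sea"},
)
print(records[0])  # ('2017-11-30', 'meeting in city A, building')
```

ISO-formatted date strings are used so that lexicographic sorting equals chronological sorting; with `datetime.date` keys the same code works unchanged.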
As shown in FIG. 4, the travel records corresponding to the dates are displayed in the computer device interface in date order: February 13, 2016, tourism in city A, forest; March 7, 2016, business in city A, ocean; April 9, 2016, travel in city A, grassland; May 10, 2016, business in city A, mountain.
According to this method, the travel records corresponding to dates in the calendar are generated by combining the schedule information and the scene information for each date, and displayed together in chronological order, so that a journey can be traced back in time order and the efficiency of reviewing it is improved.
In one embodiment, after step 108, the method further includes:
and step 114, respectively acquiring shooting places of the multiple images to be processed, and obtaining travel information corresponding to the preset time range according to the shooting places and the scene information.
And step 116, displaying the travel information corresponding to the preset time ranges on a display interface of the computer equipment according to the sequence of time.
The computer device can obtain the shooting locations of the images to be processed and derive the travel information corresponding to the preset time range from the shooting locations and the image scenes. When acquiring the travel information, the computer device can merge identical image scenes among the images to be processed, and can likewise merge identical shooting locations. When merging shooting locations, the location unit for merging can be set, for example merging in units of "city" or in units of "province".
After the merged shooting locations and image scenes are obtained, the travel information corresponding to the preset time range can be obtained. Once the computer device has the travel information for several preset time ranges, it can display them on its display interface in time order.
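Merging shooting locations at a configurable unit can be sketched as follows. The nested `place` dictionaries with `city` and `province` keys are a hypothetical representation chosen for the sketch; the embodiment only specifies that merging can happen at either granularity.

```python
def merge_places(images, unit="city"):
    """Merge identical shooting locations at the chosen granularity,
    e.g. 'city' or 'province'."""
    return sorted({img["place"][unit] for img in images})

shots = [
    {"place": {"province": "province P", "city": "city A"}},
    {"place": {"province": "province P", "city": "city A"}},
    {"place": {"province": "province P", "city": "city B"}},
]
print(merge_places(shots, unit="city"))      # ['city A', 'city B']
print(merge_places(shots, unit="province"))  # ['province P']
```

Coarsening the unit from "city" to "province" collapses more shooting locations into one, which is exactly the effect of the configurable merge unit described above.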
According to the method, the journey information is generated according to the image scene and the shooting place of the image, visual display of the image information is facilitated, and a user can conveniently and quickly check the journey information.
In one embodiment, before acquiring the plurality of images to be processed whose shooting times fall within the preset time range, the method further includes: if the number of images whose shooting times fall within the preset time range exceeds a first threshold, taking the images whose shooting times fall within the preset time range as the images to be processed.
Before acquiring the images to be processed, the computer device can screen the images it stores, and the images to be processed are obtained through this screening. The screening proceeds as follows: the computer device detects whether the number of images whose shooting times fall within the preset time range exceeds a first threshold. The preset time range may correspond to different time units; with "day" as the unit, for example, the range may be 24 hours, i.e. the device acquires the images shot within the same day. After obtaining the images whose shooting times fall within the preset time range, the computer device can detect whether their number exceeds the first threshold, which may be a value preset by the computer device or one set by the user. When the number of images exceeds the first threshold, the images whose shooting times fall within the preset time range are taken as the images to be processed. For example, with "day" as the time unit, if the computer device detects that more than 30 images were captured on November 21, 2017, the images captured on that day are taken as the images to be processed.
In one embodiment, an image to be processed is an image obtained by shooting that is either unmanned or has a face area below a specified value, for example a face area below 5% of the total image area. After acquiring the images whose shooting times fall within the preset time range, the computer device can detect whether each image was obtained by shooting; if so, it further detects whether the image contains no face or a face area below the specified value, and keeps the shot images that are unmanned or whose face area is below that value. This screening filters out non-shot images such as screenshots and images generated by image-editing software, as well as portrait images such as selfies and group photos. The computer device then detects whether the number of screened images exceeds the first threshold, and if it does, takes the screened images as the images to be processed.
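The screening of this embodiment can be sketched as one filter plus the threshold check. The boolean `shot_by_camera` flag and fractional `face_area` field are assumptions for the sketch; in practice they would come from image-source detection and face detection respectively. The threshold value 30 matches the example above but is, as stated, configurable.

```python
FIRST_THRESHOLD = 30    # example value; the embodiment leaves it configurable
FACE_AREA_LIMIT = 0.05  # "face area below 5% of the total image area"

def screen_images(images):
    """Keep camera-shot images that contain no face or whose face area is
    below the limit; use them as images to be processed only when their
    number exceeds the first threshold."""
    kept = [img for img in images
            if img["shot_by_camera"] and img["face_area"] < FACE_AREA_LIMIT]
    return kept if len(kept) > FIRST_THRESHOLD else []

day = [{"shot_by_camera": True, "face_area": 0.0} for _ in range(31)]
print(len(screen_images(day)))      # 31: over the threshold, so kept
print(len(screen_images(day[:5])))  # 0: too few images, nothing processed
```

Returning an empty list below the threshold is what prevents the frequent, low-yield processing that the next paragraphs describe as a power-consumption concern.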
In general, a user takes a small number of images in daily life, but when the user goes on a trip, travels, or the like, the user takes a large number of images.
In the method of this embodiment, the computer device takes the images within a preset time range as images to be processed only if it detects that their number reaches the specified value. That is, only when the computer device stores a large number of images for a certain period of time are the images of that period treated as images to be processed. This avoids the increase in power consumption that frequent image-information extraction would otherwise cause.
In one embodiment, after step 108, the method further includes:
and step 116, clustering the images to be processed corresponding to the same scene information to generate an atlas.
Step 118, if a touch instruction for the scene information displayed on the display interface of the computer device is received, displaying an atlas corresponding to the scene information on the display interface of the computer device.
After the computer device obtains the scene information corresponding to each time point, it can collect the images to be processed that correspond to the same scene information; if one piece of scene information corresponds to several images to be processed, those images are generated into an atlas. The computer device can acquire each piece of scene information corresponding to each time point, acquire the images to be processed corresponding to each, and cluster the images with the same scene information into an atlas. For example, suppose November 21, 2017 corresponds to two pieces of scene information: the first with location "No. 110, street B, city A" and scene "building"; the second with location "park D, district C, city A" and scene "grass". The computer device may generate one atlas from images 1, 2 and 3 corresponding to the first, and another atlas from images 4, 5 and 6 corresponding to the second.
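The clustering of step 116 is a grouping by scene information, which can be sketched directly; the (location, scene) tuple used as the grouping key is an assumption matching the example above.

```python
from collections import defaultdict

def build_atlases(images):
    """Cluster the images that share the same scene information; each
    cluster forms an atlas."""
    atlases = defaultdict(list)
    for img in images:
        atlases[img["scene_info"]].append(img["name"])
    return dict(atlases)

atlases = build_atlases([
    {"name": "image 1", "scene_info": ("city A, street B", "building")},
    {"name": "image 2", "scene_info": ("city A, street B", "building")},
    {"name": "image 3", "scene_info": ("park D, city A", "grass")},
])
print(atlases[("city A, street B", "building")])  # ['image 1', 'image 2']
```

Displaying the atlas for a touched piece of scene information (step 118) then reduces to a single dictionary lookup on that key.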
The computer device can display the scene information corresponding to the time point on its interface; after the computer device obtains a selection instruction for the displayed scene information, the atlas corresponding to the selected scene information is displayed on the current interface or on a jump interface. The selection instruction can be a touch instruction acting on the interface of the computer device, or a voice instruction from the user.
Take a mobile terminal as an example of the computer device. When the computer device obtains a touch operation on the scene information "Scenic Spot E, City C; river" on the timeline, it jumps to a mobile terminal interface and displays the atlas corresponding to "Scenic Spot E, City C; river" in that interface.
With this method, after the computer device obtains a selection instruction for scene information, it can display the atlas corresponding to the selected scene information. In other words, the computer device associates the displayed scene information with the corresponding images, so that the user can view the corresponding images directly from the scene information, which simplifies the user's operation steps.
In one embodiment, after step 108, the method further includes:
Step 120, if a face image is detected in the atlas, obtaining the face identifier corresponding to the face in the face image.
Step 122, if the computer device stores contact information corresponding to the face identifier, sending the atlas to the computer device corresponding to the contact.
The computer device can use a face recognition algorithm to detect whether the atlas contains a face image; if it does, the face identifier corresponding to the face in the face image can be further obtained. The face identifier is a character string that uniquely identifies the face, and can consist of digits, letters, symbols, and the like.
After the computer device obtains the face identifier, it can look up whether contact information corresponding to the face identifier is stored on the device. This lookup can proceed as follows:
(1) The computer device obtains the mark-up information of the face in the face image, which can be the name corresponding to the face. The computer device searches the stored contacts for that name; if a contact with that name exists, the computer device takes that contact as the one corresponding to the face.
(2) The computer device can also obtain the avatars of the stored contacts and perform similarity matching between the face corresponding to the face identifier and each stored avatar; if the matching succeeds, that contact is the contact corresponding to the face region.
After the computer device identifies the contact corresponding to the face identifier, it can check whether contact information for that contact is stored. The contact information can be a mobile phone number, a landline number, a social account, and the like. When the contact information is stored, the computer device sends the atlas to the computer device corresponding to the contact.
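The two lookup strategies above can be sketched as follows. This is heavily simplified: the `similarity` callable stands in for a real face-matching model, and the contact data shape and the 0.8 threshold are assumptions for illustration only:

```python
def find_contact_for_face(face_name, face_embedding, contacts,
                          similarity, threshold=0.8):
    """Look up the contact matching a detected face.

    Strategy (1): match the face's mark-up name against stored
    contact names.  Strategy (2): fall back to similarity matching
    between the face and each contact's stored avatar.

    `contacts` maps name -> {"avatar": ..., "info": ...};
    `similarity` is an assumed callable returning a score in [0, 1]."""
    if face_name and face_name in contacts:
        return contacts[face_name]          # strategy (1): name match
    best_name, best_score = None, threshold
    for name, entry in contacts.items():    # strategy (2): avatar match
        score = similarity(face_embedding, entry["avatar"])
        if score >= best_score:
            best_name, best_score = name, score
    return contacts.get(best_name)          # None if nothing matched
```

If the returned contact entry carries stored contact information, the atlas can then be sent to that contact's device.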
With this method, when the computer device detects a face image in the atlas, it can recognize the face in the image and then send the atlas to the computer device corresponding to that face. In other words, the computer device shares the atlas with the users who appear in the images, so the user does not need to share the images manually one by one, which simplifies the operation steps and saves the user's time.
FIG. 8 is a block diagram showing the structure of an image processing apparatus according to an embodiment. As shown in fig. 8, an image processing apparatus includes:
The acquiring module 802 is configured to acquire multiple to-be-processed images whose shooting times fall within a preset time range.
The identifying module 804 is configured to perform scene identification on the multiple images to be processed, and respectively obtain image scenes corresponding to the multiple images to be processed.
The merging module 806 is configured to merge the same image scenes in the image scenes corresponding to the multiple images to be processed to obtain scene information corresponding to the preset time range.
The display module 808 is configured to associate the preset time range with a date in the application calendar, and display scene information corresponding to the date in the calendar interface.
In an embodiment, the merging module 806 is further configured to, if the scene information corresponding to the preset time range includes a plurality of image scenes, obtain the number of to-be-processed images corresponding to each image scene, and obtain the time range corresponding to each image scene. The time range corresponding to each image scene is obtained according to the shooting time statistics of the image to be processed corresponding to each image scene.
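The per-scene statistics described in this embodiment — the image count per scene and the time range spanned by each scene's shooting times — can be sketched as follows (a minimal illustration; the data shapes are assumptions):

```python
def time_range_per_scene(images):
    """For each image scene, compute the number of to-be-processed
    images and the time range spanned by their shooting times.

    `images` is a list of (scene, shooting_time) pairs; shooting
    times only need to be comparable (datetimes, timestamps, ...)."""
    stats = {}
    for scene, t in images:
        count, earliest, latest = stats.get(scene, (0, t, t))
        stats[scene] = (count + 1, min(earliest, t), max(latest, t))
    return stats  # scene -> (count, first_shot, last_shot)
```

The (first_shot, last_shot) pair is exactly the "time range corresponding to each image scene" obtained from the shooting-time statistics.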
In an embodiment, the obtaining module 802 is further configured to obtain schedule information corresponding to a date in a calendar, and generate a travel record according to the schedule information corresponding to the date and the scene information. The display module 808 is further configured to display the travel record corresponding to the date on the display interface of the computer device according to the sequence of the dates.
In an embodiment, the obtaining module 802 is further configured to obtain shooting locations of a plurality of images to be processed, and obtain trip information corresponding to the preset time range according to the shooting locations and the scene information. The display module 808 is further configured to display the trip information corresponding to the multiple preset time ranges on the display interface of the computer device according to the sequence of time.
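The trip-information step above — combining shooting locations with scene information and merging identical locations — can be sketched as follows. The text-joining format is an assumption for illustration; the embodiment does not prescribe one:

```python
def build_trip_info(images):
    """Combine shooting locations with scene information into trip
    information for a preset time range, merging duplicate locations.

    `images` is a list of (location, scene) pairs."""
    trips = {}
    for location, scene in images:
        # identical shooting locations are merged into one entry
        trips.setdefault(location, set()).add(scene)
    return [f"{loc}: {', '.join(sorted(scenes))}"
            for loc, scenes in trips.items()]
```

Displaying these entries in chronological order of their time ranges then yields the trip list shown on the display interface.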
In an embodiment, the obtaining module 802 is further configured to, if the number of images whose shooting times fall within the preset time range exceeds a first threshold, take those images as the images to be processed.
Fig. 9 is a block diagram showing the structure of an image processing apparatus in another embodiment. As shown in fig. 9, the image processing apparatus includes: an acquisition module 902, an identification module 904, a merging module 906, a presentation module 908, and a clustering module 910. The acquiring module 902, the identifying module 904, the merging module 906, and the displaying module 908 have the same functions as the corresponding modules in fig. 8.
The clustering module 910 is configured to cluster the to-be-processed images corresponding to the same scene information to generate an atlas.
The display module 908 is configured to display an atlas corresponding to the scene information on the display interface of the computer device if a touch instruction for the scene information displayed on the display interface of the computer device is received.
Fig. 10 is a block diagram showing the structure of an image processing apparatus in yet another embodiment. As shown in fig. 10, the image processing apparatus includes: an acquisition module 1002, an identification module 1004, a merging module 1006, a presentation module 1008, and a sending module 1010. The acquiring module 1002, the identifying module 1004, the merging module 1006, and the displaying module 1008 have the same functions as the corresponding modules in fig. 8.
The obtaining module 1002 is configured to, if a face image is detected in the atlas, obtain a face identifier corresponding to a face in the face image.
The sending module 1010 is configured to send the image set to the computer device corresponding to the contact person if the computer device has stored the contact person information corresponding to the face identifier.
The division of the modules in the image processing apparatus is only for illustration, and in other embodiments, the image processing apparatus may be divided into different modules as needed to complete all or part of the functions of the image processing apparatus.
The embodiments of the present application also provide a computer-readable storage medium: one or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the following steps:
(1) acquiring a plurality of images to be processed whose shooting times are within a preset time range;
(2) performing scene recognition on the plurality of images to be processed, and respectively acquiring the image scenes corresponding to them;
(3) merging identical image scenes among the image scenes corresponding to the plurality of images to be processed, to obtain scene information corresponding to the preset time range;
(4) associating the preset time range with a date in an application calendar, and displaying the scene information corresponding to the date in the calendar interface.
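Step (3) above — deduplicating the per-image recognition results into the scene information for the time range — can be sketched as follows (a minimal illustration; the scene labels are placeholders):

```python
def merge_scenes(recognized):
    """Merge identical image scenes from per-image scene-recognition
    results into the scene information for the preset time range.

    `recognized` is a list of scene labels, one per to-be-processed
    image (e.g. the output of a scene-recognition model)."""
    scene_info = []
    for scene in recognized:
        if scene not in scene_info:  # merge duplicates, keep order
            scene_info.append(scene)
    return scene_info
```

The resulting list is the one-or-more image scenes that make up the scene information displayed for the date.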
In one embodiment, further performing: if the scene information corresponding to the preset time range comprises a plurality of image scenes, acquiring the number of the images to be processed corresponding to each image scene and the time range corresponding to each image scene. The time range corresponding to each image scene is obtained according to the shooting time statistics of the image to be processed corresponding to each image scene.
In one embodiment, further performing: and acquiring schedule information corresponding to a date in the calendar, and generating a travel record according to the schedule information corresponding to the date and the scene information. And displaying the travel record corresponding to the date on a display interface of the computer equipment according to the sequence of the dates.
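The travel-record generation described here can be sketched as follows. The record format and field labels are assumptions for illustration; the embodiment only requires that schedule information and scene information be combined per date and listed in date order:

```python
def make_travel_record(date, schedule_info, scene_info):
    """Combine a calendar date's schedule information with the scene
    information derived from that day's images into a travel record."""
    return f"{date} - schedule: {schedule_info}; scenes: {'; '.join(scene_info)}"

def show_travel_records(records):
    """Return travel records ordered by date, as they would be
    listed on the display interface."""
    return [make_travel_record(d, sch, sc)
            for d, sch, sc in sorted(records)]
```

Each entry is the textual travel record containing both schedule and scene information.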
In one embodiment, further performing: and respectively acquiring shooting places of the multiple images to be processed, and obtaining the travel information corresponding to the preset time range according to the shooting places and the scene information. And displaying the travel information corresponding to the preset time ranges on a display interface of the computer equipment according to the time sequence.
In one embodiment, before acquiring a plurality of images to be processed whose shooting time is within a preset time range, the following is further performed: and if the number of the images with the shooting moments within the preset time range exceeds a first threshold value, taking the images with the shooting moments within the preset time range as the images to be processed.
In one embodiment, further performing: and clustering the images to be processed corresponding to the same scene information to generate an atlas. And if a touch instruction of the scene information displayed on the display interface of the computer equipment is received, displaying an atlas corresponding to the scene information on the display interface of the computer equipment.
In one embodiment, further performing: and if the face image is detected in the atlas, acquiring a face identifier corresponding to the face in the face image. And if the computer equipment stores the contact person information corresponding to the face identification, sending the image set to the computer equipment corresponding to the contact person.
A computer program product containing instructions which, when run on a computer, cause the computer to perform the steps of:
(1) acquiring a plurality of images to be processed whose shooting times are within a preset time range;
(2) performing scene recognition on the plurality of images to be processed, and respectively acquiring the image scenes corresponding to them;
(3) merging identical image scenes among the image scenes corresponding to the plurality of images to be processed, to obtain scene information corresponding to the preset time range;
(4) associating the preset time range with a date in an application calendar, and displaying the scene information corresponding to the date in the calendar interface.
In one embodiment, further performing: if the scene information corresponding to the preset time range comprises a plurality of image scenes, acquiring the number of the images to be processed corresponding to each image scene and the time range corresponding to each image scene. The time range corresponding to each image scene is obtained according to the shooting time statistics of the image to be processed corresponding to each image scene.
In one embodiment, further performing: and acquiring schedule information corresponding to a date in the calendar, and generating a travel record according to the schedule information corresponding to the date and the scene information. And displaying the travel record corresponding to the date on a display interface of the computer equipment according to the sequence of the dates.
In one embodiment, further performing: and respectively acquiring shooting places of the multiple images to be processed, and obtaining the travel information corresponding to the preset time range according to the shooting places and the scene information. And displaying the travel information corresponding to the preset time ranges on a display interface of the computer equipment according to the time sequence.
In one embodiment, before acquiring a plurality of images to be processed whose shooting time is within a preset time range, the following is further performed: and if the number of the images with the shooting moments within the preset time range exceeds a first threshold value, taking the images with the shooting moments within the preset time range as the images to be processed.
In one embodiment, further performing: and clustering the images to be processed corresponding to the same scene information to generate an atlas. And if a touch instruction of the scene information displayed on the display interface of the computer equipment is received, displaying an atlas corresponding to the scene information on the display interface of the computer equipment.
In one embodiment, further performing: and if the face image is detected in the atlas, acquiring a face identifier corresponding to the face in the face image. And if the computer equipment stores the contact person information corresponding to the face identification, sending the image set to the computer equipment corresponding to the contact person.
The embodiments of the present application also provide a computer device. As shown in fig. 11, for convenience of explanation, only the parts related to the embodiments of the present application are shown; for specific technical details that are not disclosed, please refer to the method parts of the embodiments. The computer device may be any terminal device, including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, a vehicle-mounted computer, a wearable device, and the like. The following takes a mobile phone as an example:
fig. 11 is a block diagram of a partial structure of a mobile phone related to a computer device provided in an embodiment of the present application. Referring to fig. 11, the cellular phone includes: radio Frequency (RF) circuitry 1110, memory 1120, input unit 1130, display unit 1140, sensors 1150, audio circuitry 1160, wireless fidelity (WiFi) module 1170, processor 1180, and power supply 1190. Those skilled in the art will appreciate that the handset configuration shown in fig. 11 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The RF circuit 1110 may be used to receive and transmit signals during information transmission and reception or during a call; in particular, it may receive downlink information from a base station and hand it to the processor 1180 for processing, and may also transmit uplink data to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 1110 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
The memory 1120 may be used to store software programs and modules, and the processor 1180 executes the various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 1120. The memory 1120 may mainly include a program storage area and a data storage area: the program storage area may store the operating system and the application programs required for at least one function (such as a sound-playing function or an image-playing function), while the data storage area may store data created according to the use of the mobile phone (such as audio data or an address book). Further, the memory 1120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The input unit 1130 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone 1100. Specifically, the input unit 1130 may include a touch panel 1131 and other input devices 1132. Touch panel 1131, which may also be referred to as a touch screen, can collect touch operations of a user on or near the touch panel 1131 (for example, operations of the user on or near touch panel 1131 by using any suitable object or accessory such as a finger or a stylus pen), and drive corresponding connection devices according to a preset program. In one embodiment, the touch panel 1131 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 1180, and can receive and execute commands sent by the processor 1180. In addition, the touch panel 1131 can be implemented by using various types, such as resistive, capacitive, infrared, and surface acoustic wave. The input unit 1130 may include other input devices 1132 in addition to the touch panel 1131. In particular, other input devices 1132 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), and the like.
The display unit 1140 may be used to display information input by the user or information provided to the user and various menus of the cellular phone. The display unit 1140 may include a display panel 1141. In one embodiment, the Display panel 1141 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. In one embodiment, touch panel 1131 can cover display panel 1141, and when touch panel 1131 detects a touch operation thereon or nearby, the touch operation is transmitted to processor 1180 to determine the type of touch event, and then processor 1180 provides a corresponding visual output on display panel 1141 according to the type of touch event. Although in fig. 11, the touch panel 1131 and the display panel 1141 are two independent components to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 1131 and the display panel 1141 may be integrated to implement the input and output functions of the mobile phone.
The cell phone 1100 can also include at least one sensor 1150, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1141 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 1141 and/or the backlight when the mobile phone is moved to the ear. The motion sensor can include an acceleration sensor, which can detect the magnitude of acceleration in each direction and, when the phone is stationary, the magnitude and direction of gravity; it can be used for applications that recognize the phone's attitude (such as landscape/portrait switching) and for vibration-recognition functions (such as a pedometer or tap detection). The mobile phone may also be provided with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor.
The audio circuit 1160, speaker 1161, and microphone 1162 can provide an audio interface between the user and the mobile phone. The audio circuit 1160 can transmit the electrical signal converted from received audio data to the speaker 1161, which converts it into a sound signal for output. Conversely, the microphone 1162 converts a collected sound signal into an electrical signal, which the audio circuit 1160 receives and converts into audio data; the audio data is then processed by the processor 1180 and either sent to another mobile phone via the RF circuit 1110 or output to the memory 1120 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 1170, the mobile phone can help the user receive and send e-mails, browse web pages, access streaming media, and so on, providing the user with wireless broadband Internet access. Although fig. 11 shows the WiFi module 1170, it is not an essential part of the handset 1100 and may be omitted as needed.
The processor 1180 is a control center of the mobile phone, and is connected to various parts of the whole mobile phone through various interfaces and lines, and executes various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 1120 and calling data stored in the memory 1120, thereby performing overall monitoring of the mobile phone. In one embodiment, the processor 1180 may include one or more processing units. In one embodiment, the processor 1180 may integrate an application processor and a modem processor, wherein the application processor mainly handles operating systems, user interfaces, application programs, and the like; the modem processor handles primarily wireless communications. It will be appreciated that the modem processor described above may not be integrated within processor 1180.
The cell phone 1100 also includes a power supply 1190 (e.g., a battery) for providing power to various components, which may be logically coupled to the processor 1180 via a power management system, such that the power management system may be configured to manage charging, discharging, and power consumption.
In one embodiment, the cell phone 1100 may also include a camera, a bluetooth module, and the like.
In the embodiment of the present application, the processor 1180 included in the mobile terminal implements the following steps when executing the computer program stored in the memory:
(1) acquiring a plurality of images to be processed whose shooting times are within a preset time range;
(2) performing scene recognition on the plurality of images to be processed, and respectively acquiring the image scenes corresponding to them;
(3) merging identical image scenes among the image scenes corresponding to the plurality of images to be processed, to obtain scene information corresponding to the preset time range;
(4) associating the preset time range with a date in an application calendar, and displaying the scene information corresponding to the date in the calendar interface.
In one embodiment, further performing: if the scene information corresponding to the preset time range comprises a plurality of image scenes, acquiring the number of the images to be processed corresponding to each image scene and the time range corresponding to each image scene. The time range corresponding to each image scene is obtained according to the shooting time statistics of the image to be processed corresponding to each image scene.
In one embodiment, further performing: and acquiring schedule information corresponding to a date in the calendar, and generating a travel record according to the schedule information corresponding to the date and the scene information. And displaying the travel record corresponding to the date on a display interface of the computer equipment according to the sequence of the dates.
In one embodiment, further performing: and respectively acquiring shooting places of the multiple images to be processed, and obtaining the travel information corresponding to the preset time range according to the shooting places and the scene information. And displaying the travel information corresponding to the preset time ranges on a display interface of the computer equipment according to the time sequence.
In one embodiment, before acquiring a plurality of images to be processed whose shooting time is within a preset time range, the following is further performed: and if the number of the images with the shooting moments within the preset time range exceeds a first threshold value, taking the images with the shooting moments within the preset time range as the images to be processed.
In one embodiment, further performing: and clustering the images to be processed corresponding to the same scene information to generate an atlas. And if a touch instruction of the scene information displayed on the display interface of the computer equipment is received, displaying an atlas corresponding to the scene information on the display interface of the computer equipment.
In one embodiment, further performing: and if the face image is detected in the atlas, acquiring a face identifier corresponding to the face in the face image. And if the computer equipment stores the contact person information corresponding to the face identification, sending the image set to the computer equipment corresponding to the contact person.
Any reference to memory, storage, a database, or another medium used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link (Synchlink) DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The above examples express only several embodiments of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (9)

1. An image processing method, comprising:
acquiring a plurality of images to be processed whose shooting moments are within a preset time range;
carrying out scene recognition on the multiple images to be processed, and respectively acquiring image scenes corresponding to the multiple images to be processed;
merging the same image scenes in the image scenes corresponding to the multiple images to be processed to obtain scene information corresponding to the preset time range; the scene information comprises one or more image scenes;
associating the preset time range with a date in an application program calendar, and displaying scene information corresponding to the date in a calendar interface;
acquiring schedule information corresponding to a date in a calendar, and generating a travel record according to the schedule information corresponding to the date and scene information;
displaying the travel record corresponding to the date on a display interface of the computer equipment according to the sequence of the date; the travel record comprises schedule information and scene information displayed by text information;
respectively acquiring the shooting locations of the plurality of images to be processed, and obtaining trip information corresponding to the preset time range according to the shooting locations and the scene information; wherein identical shooting locations among the plurality of images to be processed can be merged.
2. The method of claim 1, further comprising:
if the scene information corresponding to the preset time range comprises a plurality of image scenes, acquiring the number of the images to be processed corresponding to each image scene and the time range corresponding to each image scene; and the time range corresponding to each image scene is obtained according to the shooting time statistics of the image to be processed corresponding to each image scene.
3. The method of claim 1 or 2, further comprising:
and displaying the travel information corresponding to the preset time ranges on a display interface of the computer equipment according to the time sequence.
4. The method according to claim 1 or 2, wherein before the acquiring of the plurality of images to be processed whose shooting times are within the preset time range, the method further comprises:
and if the number of the images with the shooting moments within the preset time range exceeds a first threshold value, taking the images with the shooting moments within the preset time range as images to be processed.
5. The method of claim 1 or 2, further comprising:
clustering the images to be processed corresponding to the same scene information to generate an atlas;
and if a touch instruction of the scene information displayed on a display interface of the computer equipment is received, displaying an atlas corresponding to the scene information on the display interface of the computer equipment.
6. The method of claim 5, further comprising:
if a face image is detected in the atlas, acquiring a face identifier corresponding to the face in the face image; and
if the computer device stores contact information corresponding to the face identifier, sending the atlas to the computer device corresponding to the contact.
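Claim 6's face-driven sharing reduces to a lookup from face identifier to stored contact. In this sketch, `send` stands in for whatever transport the device would use and is purely hypothetical, as are the identifier and contact formats.

```python
def share_atlas(face_ids, contacts, send):
    """For each face identifier detected in the atlas, send the atlas
    to the stored contact, if one exists; return who was notified."""
    shared = []
    for face_id in face_ids:
        contact = contacts.get(face_id)
        if contact is not None:
            send(contact)
            shared.append(contact)
    return shared

sent = []
contacts = {"face_1": "alice@example.com"}
result = share_atlas(["face_1", "face_2"], contacts, sent.append)
print(result)  # ['alice@example.com']
```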
7. An image processing apparatus, comprising:
an acquisition module, configured to acquire a plurality of images to be processed whose shooting times are within a preset time range;
a recognition module, configured to perform scene recognition on the plurality of images to be processed and to acquire the image scene corresponding to each of the images to be processed;
a merging module, configured to merge identical image scenes among the image scenes corresponding to the plurality of images to be processed, to obtain scene information corresponding to the preset time range; wherein the scene information comprises one or more image scenes; and
a display module, configured to associate the preset time range with a date in an application calendar and to display the scene information corresponding to the date in a calendar interface;
wherein the acquisition module is further configured to acquire schedule information corresponding to a date in the calendar and to generate a travel record according to the schedule information and the scene information corresponding to that date;
the display module is further configured to display, on a display interface of the computer device, the travel records in order of their dates, each travel record comprising the schedule information and the scene information presented as text; and
the acquisition module is further configured to acquire shooting locations of the plurality of images to be processed and to obtain travel information corresponding to the preset time range according to the shooting locations and the scene information, wherein identical shooting locations among the plurality of images to be processed can be merged.
8. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 6.
9. A computer device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the method of any of claims 1 to 6.
CN201711278857.XA 2017-12-06 2017-12-06 Image processing method, image processing device, computer-readable storage medium and computer equipment Active CN107967339B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711278857.XA CN107967339B (en) 2017-12-06 2017-12-06 Image processing method, image processing device, computer-readable storage medium and computer equipment


Publications (2)

Publication Number Publication Date
CN107967339A CN107967339A (en) 2018-04-27
CN107967339B true CN107967339B (en) 2021-01-26

Family

ID=61998433

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711278857.XA Active CN107967339B (en) 2017-12-06 2017-12-06 Image processing method, image processing device, computer-readable storage medium and computer equipment

Country Status (1)

Country Link
CN (1) CN107967339B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108647097B (en) * 2018-05-16 2021-04-13 Oppo广东移动通信有限公司 Text image processing method and device, storage medium and terminal
CN108845742B (en) * 2018-06-22 2020-05-05 腾讯科技(深圳)有限公司 Image picture acquisition method and device and computer readable storage medium
CN109981903B (en) * 2019-03-27 2020-12-18 联想(北京)有限公司 Image processing method and electronic equipment
CN110222567B (en) * 2019-04-30 2021-01-08 维沃移动通信有限公司 Image processing method and device
CN110738465A (en) * 2019-10-16 2020-01-31 江西科技学院 Course prompting method, device, equipment and storage medium based on image recognition
CN112069346A (en) * 2020-09-03 2020-12-11 Oppo广东移动通信有限公司 Picture processing method and device, electronic equipment and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101286166A (en) * 2007-04-12 2008-10-15 索尼株式会社 Information presenting apparatus, information presenting method, and computer program
CN102622433A (en) * 2012-02-28 2012-08-01 北京百纳威尔科技有限公司 Multimedia information search processing method and device with shooting function
CN103164472A (en) * 2011-12-16 2013-06-19 腾讯科技(深圳)有限公司 Processing method and processing device of user generated content in social networking system
CN103797786A (en) * 2011-09-02 2014-05-14 株式会社尼康 Electronic camera, image-processing apparatus, and image-processing program
CN104317932A (en) * 2014-10-31 2015-01-28 小米科技有限责任公司 Photo sharing method and device
CN104408428A (en) * 2014-11-28 2015-03-11 东莞宇龙通信科技有限公司 Processing method and device for same-scene photos
CN106528059A (en) * 2015-09-10 2017-03-22 百度在线网络技术(北京)有限公司 Method and device used for generating calendar prompt information
CN107133352A (en) * 2017-05-24 2017-09-05 北京小米移动软件有限公司 Photo display methods and device

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10271017B2 (en) * 2012-09-13 2019-04-23 General Electric Company System and method for generating an activity summary of a person
US20040085578A1 (en) * 2002-11-03 2004-05-06 Quek Su Mien Producing personalized photo calendar
CN102004725A (en) * 2009-09-01 2011-04-06 刘旸 Multimedia file classification method and server
CN102592213A (en) * 2011-12-26 2012-07-18 北京百纳威尔科技有限公司 Scene-based schedule reminding system and method
CN104216599A (en) * 2013-06-05 2014-12-17 北京壹人壹本信息科技有限公司 Method and device for reminding event recorded in note
CN105095213B (en) * 2014-04-22 2019-10-11 小米科技有限责任公司 Information correlation method and device
DE202015005394U1 (en) * 2014-08-02 2015-12-08 Apple Inc. Context-specific user interfaces
CN105989182A (en) * 2015-04-13 2016-10-05 乐视移动智能信息技术(北京)有限公司 Photo display method and intelligent terminal
CN106997567A (en) * 2016-01-26 2017-08-01 宇龙计算机通信科技(深圳)有限公司 User travel information generation method and device
CN107066166A (en) * 2016-09-05 2017-08-18 广东欧珀移动通信有限公司 Lock-screen picture display method and terminal
CN106776999A (en) * 2016-12-07 2017-05-31 北京小米移动软件有限公司 Multimedia data recommendation method and device


Also Published As

Publication number Publication date
CN107967339A (en) 2018-04-27

Similar Documents

Publication Publication Date Title
CN107967339B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN108022274B (en) Image processing method, image processing device, computer equipment and computer readable storage medium
WO2018010512A1 (en) Method and device for uploading phtograph file
CN107679559B (en) Image processing method, image processing device, computer-readable storage medium and mobile terminal
US11182593B2 (en) Image processing method, computer device, and computer readable storage medium
US20190327357A1 (en) Information presentation method and device
CN109660728B (en) Photographing method and device
US11274932B2 (en) Navigation method, navigation device, and storage medium
CN109286728B (en) Call content processing method and terminal equipment
CN107948729B (en) Rich media processing method and device, storage medium and electronic equipment
CN109409235B (en) Image recognition method and device, electronic equipment and computer readable storage medium
CN108449481A (en) Contact information recommendation method and terminal
CN108053184B (en) Item prompting method, mobile terminal and computer readable storage medium
CN110955788A (en) Information display method and electronic equipment
CN114827069A (en) Multimedia data sharing method and device
CN111222001A (en) Method for marking image, mobile terminal and storage medium
CN108287863B (en) Message record cleaning method and device
CN108668016B (en) Information processing method, device, mobile terminal and computer readable storage medium
CN107729857B (en) Face recognition method and device, storage medium and electronic equipment
CN108287873B (en) Data processing method and related product
CN108021668A (en) Image processing method and apparatus, electronic device, and computer-readable storage medium
CN110785987B (en) Information processing method, device, mobile terminal and computer readable storage medium
CN107613093A (en) Communication message display method and user terminal
CN110753914A (en) Information processing method, storage medium and mobile terminal
CN111034152B (en) Information processing method, device, mobile terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18 Haibin Road, Wusha, Chang'an Town, Dongguan, Guangdong 523860

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: No. 18 Haibin Road, Wusha, Chang'an Town, Dongguan, Guangdong 523860

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

GR01 Patent grant