CN110881101A - Shooting method, mobile terminal and device with storage function - Google Patents


Info

Publication number
CN110881101A
CN110881101A (application CN201811038372.8A)
Authority
CN
China
Prior art keywords
mobile terminal
current
scene
shooting
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201811038372.8A
Other languages
Chinese (zh)
Inventor
宋特
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qiku Internet Technology Shenzhen Co Ltd
Original Assignee
Qiku Internet Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qiku Internet Technology Shenzhen Co Ltd filed Critical Qiku Internet Technology Shenzhen Co Ltd
Priority to CN201811038372.8A priority Critical patent/CN110881101A/en
Publication of CN110881101A publication Critical patent/CN110881101A/en
Withdrawn legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a shooting method, a mobile terminal, and a device with a storage function. The shooting method comprises the following steps: when the mobile terminal is detected to be in a shooting state, acquiring a picture to be shot and current environment information; extracting first scene information from the picture to be shot and second scene information from the current environment information; identifying the current shooting scene by combining the first scene information and the second scene information; and determining the shooting parameters of the mobile terminal according to the current shooting scene. In this way, both the shooting effect and the user experience can be improved.

Description

Shooting method, mobile terminal and device with storage function
Technical Field
The present invention relates to the field of photography, and in particular, to a photography method, a mobile terminal, and a device having a storage function.
Background
With the development of science and technology, the shooting capability of mobile terminals keeps improving, and taking photos with a mobile terminal has become commonplace. To improve the user's shooting experience and help the user take better photos, the camera presets multiple shooting modes and shooting parameters. In actual use, however, the user may fail to select a proper shooting mode, or may not understand the shooting parameters and choose inappropriate ones, and therefore cannot obtain a good shooting effect.
Disclosure of Invention
The invention mainly solves the technical problem of how to improve the shooting effect and the user experience.
In order to solve the technical problems, the invention adopts a technical scheme that: provided is a photographing method including: when the mobile terminal is detected to be in a shooting state, acquiring a picture to be shot and current environment information; extracting first scene information from the picture to be shot and extracting second scene information from the current environment information; identifying a current shooting scene by combining the first scene information and the second scene information; and determining the shooting parameters of the mobile terminal according to the current shooting scene.
Wherein the extracting first scene information from the picture to be shot comprises: identifying at least one of a person, text, or an object in the picture to be shot; and/or the extracting second scene information from the current environment information comprises: retrieving an event related to the current environment information; and/or acquiring weather information and/or lighting conditions related to the current environment information.
Wherein the retrieving the event related to the current environment information comprises: retrieving events related to the environment information from a calendar, a chat log and a memo stored in the mobile terminal; and/or networked retrieval of events related to the environmental information.
Wherein the acquiring of the current environment information includes: acquiring at least one of the current location, the current time, and the current orientation of the mobile terminal.
Wherein, the acquiring the current environment information further comprises: judging whether the mobile terminal is in an indoor environment or not according to the current location of the mobile terminal; if the mobile terminal is in an indoor environment, acquiring an internal space map of a building where the mobile terminal is located; and determining the floor where the mobile terminal is located and the surrounding environment characteristics of the floor where the mobile terminal is located by using the internal space map.
Wherein, the acquiring the current environment information further comprises: if the mobile terminal is in an outdoor environment, acquiring a map of an area where the mobile terminal is located; and obtaining the current surrounding environment characteristics of the mobile terminal according to the map of the area where the mobile terminal is located.
Wherein the method further comprises: when the current shooting scene cannot be identified by combining the first scene information and the second scene information, obtaining the shooting scene identified by at least one other mobile terminal; selecting, among the shooting scenes identified by the at least one other mobile terminal, the scene identified by the most terminals as the current shooting scene; and/or performing a fusion calculation on the shooting scenes identified by the at least one other mobile terminal to obtain a comprehensive scene, and taking the comprehensive scene as the current shooting scene.
Wherein the method further comprises: and adding an identifier for the shot picture or video according to the identified current scene.
In order to solve the technical problem, the invention adopts another technical scheme: a mobile terminal is provided, including a processor, a memory, a communication circuit, and a camera, the processor being coupled to the memory, the communication circuit, and the camera; wherein the memory is configured to store program instructions for implementing the shooting method described above, and the processor cooperates with the communication circuit and the camera to execute the program instructions stored in the memory to implement the shooting method described above.
In order to solve the technical problem, the invention adopts another technical scheme that: there is provided an apparatus having a storage function, storing program instructions executable to implement the steps in the method as described above.
The invention has the following beneficial effects: unlike the prior art, the invention combines the information extracted from the picture to be shot with the current environment information as the basis for selecting the shooting parameters, so that proper shooting parameters can be selected quickly and accurately, improving the shooting effect and the user experience.
Drawings
Fig. 1 is a schematic flow chart of a first embodiment of a photographing method provided by the present invention;
fig. 2 is a schematic flowchart of an embodiment of a method for acquiring current environment information in a shooting method provided by the present invention;
fig. 3 is a schematic flow chart of a second embodiment of the photographing method provided by the present invention;
fig. 4 is a schematic structural diagram of an embodiment of a mobile terminal provided in the present invention;
fig. 5 is a schematic structural diagram of an embodiment of the apparatus with a storage function according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
Referring to fig. 1, fig. 1 is a schematic flow chart of a shooting method according to a first embodiment of the present invention.
S101: and when the mobile terminal is detected to be in a shooting state, acquiring a picture to be shot and current environment information.
In a specific implementation scenario, whether the mobile terminal is in the shooting state is detected, for example, whether shooting software is started or whether any one or more cameras on the mobile terminal are in the working state is detected. And when the mobile terminal is detected to be in a shooting state, acquiring a picture to be shot and current environment information. The picture to be shot can be obtained by the camera in the working state, that is, the picture currently obtained by the camera in the working state can be used as the picture to be shot. The current environment information includes at least one of a current location of the mobile terminal, a current time, and a current orientation of the mobile terminal. The current location of the mobile terminal can be obtained through a global positioning system in the mobile terminal, the current time can be obtained through a clock module in the mobile terminal, and the current orientation of the mobile terminal can be obtained through a compass in the mobile terminal.
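The environment-gathering step of S101 can be sketched as follows. This is a minimal illustration, not the patent's implementation; all names and structures are invented, and any unavailable sensor simply leaves its field empty.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional, Tuple

@dataclass
class EnvironmentInfo:
    location: Optional[Tuple[float, float]]  # (latitude, longitude) from GPS
    time: datetime                           # from the terminal's clock module
    bearing_deg: Optional[float]             # compass heading, 0 = due north

def gather_environment(gps_fix, clock_now, compass_heading):
    """Collect whichever environment signals are available; a missing
    sensor reading simply leaves its field as None."""
    return EnvironmentInfo(location=gps_fix, time=clock_now,
                           bearing_deg=compass_heading)

env = gather_environment((22.54, 114.06), datetime(2018, 9, 6, 14, 30), 315.0)
```

A bearing of 315 degrees here would correspond to shooting toward the northwest.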
In another implementation scenario, in order to further determine more accurate shooting parameters, a map of a current area may be obtained in combination with a current location, and more detailed environmental information may be obtained in combination with map analysis, specifically, please refer to fig. 2, where fig. 2 is a schematic flowchart of an embodiment of a method for obtaining current environmental information in a shooting method provided by the present invention.
S201: judging whether the mobile terminal is in an indoor environment or not according to the current location of the mobile terminal;
in a specific implementation scenario, whether the mobile terminal is indoors is determined according to its current location. Because indoor and outdoor light differ in intensity and color, determining whether the current location is indoors helps select the required shooting parameters more accurately. A map of the area to which the current location belongs may be obtained, and whether the location is indoors may be judged from the map. Alternatively, the latitude and longitude of the current location may be looked up, and whether the location is indoors may be judged from the search result.
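The indoor-versus-outdoor judgment of S201 might be sketched as below. The bounding-box footprints are a deliberate simplification for illustration; a real implementation would use polygon footprints from a map service or an indoor-positioning system.

```python
def is_indoor(location, building_footprints):
    """Return True when the (lat, lon) fix falls inside any known building
    footprint; footprints are simplified here to bounding boxes
    (lat_min, lat_max, lon_min, lon_max)."""
    lat, lon = location
    return any(lat_min <= lat <= lat_max and lon_min <= lon <= lon_max
               for lat_min, lat_max, lon_min, lon_max in building_footprints)

# Hypothetical footprint of a shopping mall
mall = (22.540, 22.542, 114.060, 114.063)
```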
S202: and if the mobile terminal is in an indoor environment, acquiring an internal space map of a building where the mobile terminal is located.
In a specific implementation scenario, when it is determined that the mobile terminal is in an indoor environment, an internal space map of the building in which it is located is obtained. For example, when it is determined that the mobile terminal is located in a shopping mall or an office building, the internal space map of the mall or building is searched for and acquired over the network.
Specifically, in this implementation scenario, when it is determined that the mobile terminal is located in a shopping mall, the internal space map of the shopping mall is searched for and acquired online, and various shopping information about the mall, such as advertisements, promotions, and group-buying offers, is also retrieved, because such information names merchants and their specific locations. For example, a promotion may give the merchant's brand and its exact address, such as shop No. XX on floor X of the mall. Since the merchants in a shopping mall change frequently, a screening threshold can be set when acquiring the shopping information: for example, information published within the last month is treated as valid, older information is treated as invalid, and invalid information is ignored.
In another implementation scenario, when the mobile terminal is determined to be in an office building, the internal space map is searched for and acquired online, and tenant information about the office building, such as advertisements and recruitment notices, is also retrieved, because such information names companies and their specific locations. For example, an advertisement may give a company's name and its exact address, such as room No. XX on floor X of the building. Since the companies in an office building may change frequently, a screening threshold can likewise be set: for example, information published within the last month is treated as valid, and older information is ignored.
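The one-month screening threshold described in the two paragraphs above might look like this (the record fields are hypothetical):

```python
from datetime import datetime, timedelta

def filter_fresh(items, now, max_age_days=30):
    """Keep only entries published within the screening threshold
    (one month by default); older entries count as invalid."""
    cutoff = now - timedelta(days=max_age_days)
    return [item for item in items if item["published"] >= cutoff]

now = datetime(2018, 9, 6)
ads = [
    {"shop": "No. 12, floor 5", "published": datetime(2018, 8, 20)},  # valid
    {"shop": "No. 3, floor 2",  "published": datetime(2018, 6, 1)},   # stale
]
fresh = filter_fresh(ads, now)
```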
S203: and determining the floor where the mobile terminal is located and the surrounding environment characteristics of the floor where the mobile terminal is located by using the internal space map.
In one specific implementation scenario, the floor on which the mobile terminal is located and the surrounding features of that floor are determined from the acquired interior space map of the building (e.g., a shopping mall or an office building). For example, when it is determined that the mobile terminal is located in a shopping mall, its interior space map is acquired, and the map is then used to determine that the terminal is on the 5th floor, at a restaurant on that floor. Further, the decoration style of the restaurant can be looked up online: for example, the restaurant may be retro-styled with dim, faint-yellow lighting, or a fast-food restaurant such as KFC or McDonald's with bright lighting. As another example, if it is determined that the mobile terminal is in an office building and the floor on which it is located is the rooftop, the illumination is natural light from outside.
The determination may further be made in conjunction with the current orientation of the mobile terminal. For example, if the mobile terminal is currently on the 5th floor of a shopping mall and shooting toward the northwest, the mall's space map may show that the northwest corner of the 5th floor is a floor-to-ceiling glass window.
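Combining the floor with the viewing direction, per S203, could be sketched as follows, assuming for illustration that the interior map has been reduced to a per-floor table of compass sectors (an invented structure, not the patent's):

```python
def locate_in_building(indoor_map, floor, bearing_deg):
    """indoor_map maps floor -> {compass sector -> feature}. The compass
    bearing is folded into one of eight 45-degree sectors."""
    sectors = ["north", "northeast", "east", "southeast",
               "south", "southwest", "west", "northwest"]
    sector = sectors[int(((bearing_deg + 22.5) % 360) // 45)]
    return indoor_map.get(floor, {}).get(sector, "unknown")

# Hypothetical: the mall's 5th floor has a glass window at its northwest corner
mall_map = {5: {"northwest": "floor-to-ceiling glass window"}}
```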
S204: and if the mobile terminal is in the outdoor environment, acquiring a map of the area where the mobile terminal is located.
In a specific implementation scenario, when the mobile terminal is in an outdoor environment, a map of the area where it is located is obtained. For example, when it is determined that the mobile terminal is in an outdoor environment such as a park or a scenic spot, a map of the park or scenic spot is searched for and acquired over the network.
Specifically, in this implementation scenario, when it is determined that the mobile terminal is located in a scenic spot, a map of the scenic spot is searched for and acquired over the network, and related information such as news, introductions, travel notes, and reviews of the spot is also retrieved. Such information may describe the specific landscape, surroundings, and vegetation of different areas of the spot; for example, a description may mention the spot's typical terrain, rocks, trees, and notable viewpoints. Since the landscape of a scenic spot changes over time, a screening threshold can be set when acquiring the related information: for example, information from within the last month is treated as valid, older information is treated as invalid, and invalid information is ignored.
S205: and obtaining the current surrounding environment characteristics of the mobile terminal according to the map of the area where the mobile terminal is located.
In a specific implementation scenario, the current surrounding features of the mobile terminal are determined from its current location together with the map of the area where it is located. For example, if the mobile terminal is determined to be in a park, its position is located on the acquired park map, showing that it is currently on a lawn of the park. As another example, after the mobile terminal is determined to be in a scenic spot and the spot's map is acquired, its position on the map shows that it is currently in a canyon of the spot.
In addition, the current orientation of the mobile terminal may be further taken into account. For example, if the mobile terminal is on a lawn of a park and shooting toward the northwest, the park map may show a rockery to the northwest of the current position, and the rockery may block sunlight.
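Finding what lies in the viewing direction, as in the rockery example above, might be sketched like this, with the area map reduced to named features at local east/north coordinates (a hypothetical representation):

```python
import math

def surroundings_in_view(area_map, position, bearing_deg, fov_deg=90):
    """area_map: {name: ((x_east, y_north), blocks_light)} in local metres.
    Return the features lying within half the field of view of the bearing."""
    px, py = position
    hits = []
    for name, ((fx, fy), blocks_light) in area_map.items():
        # Bearing to the feature, measured clockwise from north
        angle = math.degrees(math.atan2(fx - px, fy - py)) % 360
        diff = abs(angle - bearing_deg) % 360
        if min(diff, 360 - diff) <= fov_deg / 2:
            hits.append((name, blocks_light))
    return hits

# Hypothetical park: a light-blocking rockery to the northwest, a pond to the east
park = {"rockery": ((-30.0, 30.0), True), "pond": ((40.0, 0.0), False)}
```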
S102: and extracting first scene information from the picture to be shot and extracting second scene information from the current environment information.
In a specific implementation scenario, extracting the first scene information from the picture to be shot may involve performing face recognition on the picture; if the mobile terminal is networked, the search combines the network and the pictures in the phone's memory to check whether a recognizable person appears in the picture, for example, one of the user's frequent contacts or a celebrity. Text recognition may also be performed on the picture; if the terminal is networked, the search again combines the network and local pictures, recognizing, for example, the words "evening party", "court", or "happy birthday" in the picture. Object recognition may also be performed in the same way, identifying, for example, a football, a bonfire, a birthday cake, or a suitcase in the picture to be shot.
If the mobile terminal is offline, for example because the user has disabled networking or the communication quality of the current environment is poor, then when the first scene information is extracted from the picture to be shot, face recognition, text recognition, or object recognition is performed only on the data in the terminal's memory. The specific implementation is otherwise the same as in the networked case and is not repeated here.
In this implementation scenario, extracting the second scene information from the environment information may involve retrieving an event related to the current location and current time. Such an event may be retrieved over the network: for example, if the current location is a sports field and the current time is 2:30 pm, searching the internet with these two pieces of information may find one or more news reports indicating that a match is being played on that field at that time. Events related to the current location and time may also be retrieved from the calendar, chat logs, and memos stored in the mobile terminal: for example, a memo entry may show that today is someone's birthday, or the chat logs may contain conversations related to the current location, such as a message saying that a birthday party is being held today at the restaurant where the terminal is currently located.
Extracting the second scene information from the environment information may also involve obtaining the weather at the current location and time, and/or further determining the current lighting conditions in combination with the orientation of the mobile terminal. For example, if the weather at the current location is clear, outdoor natural light is sufficient. Combined further with the orientation: if the time is 7 am, the sun is in the east, and the mobile terminal is shooting toward the east, the background light may be too strong and the subject may be underexposed.
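The backlight reasoning above can be expressed as a small heuristic. The 45-degree threshold and the weather encoding are assumptions made for illustration, not values from the patent:

```python
def needs_fill_light(sun_azimuth_deg, shoot_bearing_deg, weather):
    """Crude lighting heuristic: cloudy weather means weak light overall;
    on a clear day, shooting roughly toward the sun backlights the subject."""
    if weather == "cloudy":
        return True
    diff = abs(sun_azimuth_deg - shoot_bearing_deg) % 360
    return min(diff, 360 - diff) <= 45  # roughly into the sun -> backlit
```

At 7 am with the sun in the east (azimuth about 90 degrees), shooting east triggers the backlight case described in the text.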
If the mobile terminal is in a disconnected state, for example, the user is defaulted to halt, or the communication quality of the current environment is poor, the second scene information extracted from the environment information can only retrieve the events related to the located current place and the obtained current time from the calendar, the chat record and the memorandum stored in the mobile terminal. The specific implementation is basically the same as that of the mobile terminal in the networking state, and details are not repeated here.
S103: and identifying the current shooting scene by combining the first scene information and the second scene information.
In a specific implementation scenario, the current shooting scene is identified by combining the first scene information extracted from the picture to be shot with the second scene information extracted from the current environment information. For example, the first scene information extracted from the picture is a football field, a football, and a certain star; the second scene information is that a football match is being held on the field where the terminal is currently located, the weather is clear and sunny, the current time is 11 am, the terminal is facing south, and the angle between the light and the horizontal plane is close to 90 degrees, so no fill light is needed. The current shooting scene is then identified as a sports shooting scene that requires no fill light.
For another example, the text recognized in the picture to be shot is "happy birthday" and the recognized objects include a birthday cake; the current environment information shows that the mobile terminal is located in a restaurant in a shopping mall, an online search shows the restaurant is retro-styled with dim lighting so fill light is needed, and the user's chat logs mention that a birthday party is being held at this restaurant today. The current shooting scene is then identified as a food shooting scene that requires correcting the yellow color cast and supplementing light.
For another example, a lawn and blue sky are recognized in the picture to be shot, and the recognized person is one of the user's frequent contacts, such as a girlfriend. The current environment information shows that the mobile terminal is on a lawn in a park, the weather is cloudy, the terminal is facing due west, and the current time is 7 am, so the sun is in the east and the light is weak. The current shooting scene is then identified as a portrait shooting scene that requires fill light.
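The scene identification of S103 could be sketched as a rule table over the combined label sets. The rules and label strings below are purely illustrative; the patent does not specify a concrete rule set:

```python
def identify_scene(first_info, second_info):
    """first_info: labels from the preview frame; second_info: labels
    derived from the environment. Illustrative rules only."""
    labels = set(first_info) | set(second_info)
    if "football field" in labels and {"football", "match"} & labels:
        return "sports"
    if {"birthday cake", "happy birthday"} & labels:
        return "food"
    if {"person", "frequent contact"} & labels:
        return "portrait"
    return None  # unidentified: fall back to nearby terminals (Fig. 3)
```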
S104: and determining the shooting parameters of the mobile terminal according to the current shooting scene.
In a specific embodiment, the shooting parameters are determined from the recognized shooting scene. For example, for a sports shooting scene that needs no fill light, the shooting speed parameter is set to snapshot, the lighting parameters keep their defaults, stabilization is enabled, and the other shooting parameters are adjusted accordingly.
For another example, for a food shooting scene that needs fill light, the lighting parameters are set to supplement light, a food filter is added, and the other shooting parameters are adjusted accordingly.
For another example, for a portrait shooting scene that needs fill light, the lighting parameters are set to supplement light, the beauty function is enabled, and the other shooting parameters are adjusted accordingly.
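Mapping the identified scene to a parameter bundle, per S104, might look like the following. The parameter names are invented for illustration and do not correspond to any real camera API:

```python
# Hypothetical preset bundles per scene
SCENE_PARAMS = {
    "sports":   {"shutter": "snapshot", "stabilization": True},
    "food":     {"filter": "food", "tone_fix": "yellow"},
    "portrait": {"beauty": True},
}

def shooting_parameters(scene, fill_light_needed):
    """Start from the preset bundle for the scene, then record the fill-light
    decision from the lighting analysis."""
    params = dict(SCENE_PARAMS.get(scene, {}))
    params["fill_light"] = fill_light_needed
    return params
```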
In other implementation scenarios, after the shooting parameters are determined from the shooting scene, they may be further adjusted according to the user's usual habits. For example, if the user habitually enables background blur in portrait shooting scenes, background blur is enabled whenever a portrait scene is identified. Likewise, if the user habitually uses fast burst mode in sports shooting scenes, fast burst mode is enabled automatically when a sports scene is identified.
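The habit-based adjustment described above could be layered on top of the scene presets as a simple override (the habit structure is an assumption for illustration):

```python
def apply_user_habits(params, scene, habits):
    """habits: {scene: {parameter: value}} learned from past sessions;
    habit values override the scene presets."""
    merged = dict(params)
    merged.update(habits.get(scene, {}))
    return merged

# Hypothetical habit: the user always blurs the background in portraits
merged = apply_user_habits({"beauty": True}, "portrait",
                           {"portrait": {"background_blur": True}})
```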
To further improve the accuracy of scene identification, the first scene information can be refined when it is extracted from the picture to be shot, and individual parameters within a shooting scene can be adjusted according to the recognized content. For example, when face recognition shows that the current subject is the user's girlfriend, the beauty parameter is enabled for the portrait shooting scene; when the subject is only an ordinary friend, it is not. Similarly, when object recognition shows that the current subject is a dessert, a warm-tone filter is added to the object shooting scene to make the dessert look more appetizing, whereas when the subject is a vegetable dish, a cool-tone filter is added to make it look fresher.
In this implementation scenario, after a photo or video is shot with the parameters determined by the identified shooting scene, the identified scene is used as a tag for the photo or video. For example, if the identified scene is a friend's birthday party, the photo or video may be tagged "friend's birthday party" or similar. A "friend's birthday party" folder may also be created to store the photos or videos taken this time, so that the user can find them quickly later.
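Tagging and filing the shot, as described above, might be sketched as follows. This dry run only computes the tag and destination path; all names are hypothetical:

```python
import os

def tag_and_file(photo_path, scene_tag, library_root):
    """Attach the recognized scene as a tag and compute a per-tag folder
    path, without touching the filesystem."""
    return {"path": photo_path,
            "tag": scene_tag,
            "folder": os.path.join(library_root, scene_tag)}

record = tag_and_file("IMG_0042.jpg", "friend's birthday party", "photos")
```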
As described above, combining the information extracted from the picture to be shot with the current environment information as the basis for selecting the shooting parameters allows proper shooting parameters to be selected quickly and accurately, improving the shooting effect and the user experience.
Referring to fig. 3, fig. 3 is a flowchart illustrating a shooting method according to a second embodiment of the present invention.
S301: and when the current shooting scene cannot be identified by combining the first scene information and the second scene information, obtaining the shooting scene identified by at least one other mobile terminal.
In a specific implementation scenario, the current shooting scene cannot be identified from the first and second scene information, for example because the mobile terminal cannot connect to the network and too little information is available. The mobile terminal then acquires the shooting scene recognized by at least one other terminal, for example by connecting to it via a wireless method such as Bluetooth.
S302: selecting the most shooting scenes in the shooting scenes identified by the at least one other mobile terminal as the current shooting scenes; and/or performing fusion calculation on the shooting scenes identified by the at least one other mobile terminal to obtain a comprehensive scene, and taking the comprehensive scene as the current shooting scene.
In a specific implementation scenario, the mobile terminal acquires the shooting scene identified by one other mobile terminal and takes that scene as the current shooting scene. In another implementation scenario, the mobile terminal acquires the shooting scenes identified by several other mobile terminals; when they agree, the common scene is taken as the current shooting scene, and when they disagree, the scene reported by the most terminals is taken. For example, if shooting scenes are obtained from five other terminals, four of which identify shooting scene A and one of which identifies shooting scene B, scene A is taken as the current shooting scene.
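The majority-vote selection above can be sketched with a counter (illustrative only):

```python
from collections import Counter

def majority_scene(peer_scenes):
    """Pick the scene reported by the most nearby terminals;
    returns None when no reports are available."""
    if not peer_scenes:
        return None
    return Counter(peer_scenes).most_common(1)[0][0]
```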
In another implementation scenario, the mobile terminal acquires the shooting scenes identified by a plurality of other mobile terminals, performs a fusion calculation on them to obtain a comprehensive scene, and takes the comprehensive scene as the current shooting scene. For example, if the shooting scenes recognized by five other mobile terminals are obtained, and three of them recognize shooting scene A while two recognize shooting scene B, scenes A and B are fused, and the parameters corresponding to scenes A and B are set to intermediate values.
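One way to realize the "intermediate values" fusion described above is to average the numeric parameters of the competing candidate scenes. A minimal sketch, assuming each scene maps to a dictionary of numeric shooting parameters (the parameter names are hypothetical, as the patent does not specify them):

```python
def fuse_scenes(candidate_params):
    """Fuse candidate scenes by averaging their numeric shooting parameters."""
    keys = set().union(*candidate_params)
    fused = {}
    for key in keys:
        values = [params[key] for params in candidate_params if key in params]
        fused[key] = sum(values) / len(values)
    return fused

# Scenes A and B disagree, so their parameters are set to intermediate values.
scene_a = {"iso": 400, "exposure_ms": 10}
scene_b = {"iso": 800, "exposure_ms": 30}
print(fuse_scenes([scene_a, scene_b])["iso"])  # → 600.0
```

A weighted average (weighting each scene by how many terminals reported it) would be a natural refinement for the 3-versus-2 example in the text.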
In this implementation scenario, after a photo or video is shot using the shooting parameters determined from the identified current shooting scene, the identified current shooting scene is used as an identifier of the photo or video. For example, if the identified scene is a friend's birthday party, the photo or video may be tagged with "birthday party of friend" or the like. A "birthday party of friend" folder may also be created, in which the photos or videos taken this time are stored, so that they can be found quickly when the user later needs to search for them.
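Filing the captured photo under a scene-named folder, as described above, could look like the following sketch; the paths and the label format are hypothetical:

```python
import shutil
from pathlib import Path

def file_photo_by_scene(photo, scene_label, library_root):
    """Move a captured photo into a folder named after the identified scene."""
    scene_dir = Path(library_root) / scene_label
    scene_dir.mkdir(parents=True, exist_ok=True)  # e.g. creates "friends_birthday_party/"
    target = scene_dir / Path(photo).name
    shutil.move(str(photo), str(target))
    return target
```

A real gallery app would more likely record the scene label in the photo's metadata or a media database rather than move files, but the folder approach mirrors the embodiment's description.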
As can be seen from the above description, in this embodiment, when the mobile terminal cannot identify the current shooting scene on its own, it finds a suitable shooting scene by acquiring the scenes identified by other mobile terminals. This improves the accuracy of the selected shooting scene, and thereby the shooting effect and the user experience.
Referring to fig. 4, fig. 4 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention. The mobile terminal 10 includes a processor 11, a memory 12, a communication circuit 13 and a camera 14; the processor 11 is coupled to the memory 12, the communication circuit 13 and the camera 14. The memory 12 is used for storing program instructions, and the processor 11, in conjunction with the communication circuit 13 and the camera 14, executes the program instructions in the memory 12 to communicate and perform the following method:
When the processor 11 of the mobile terminal 10 detects that the camera 14 is in the working state, the camera 14 acquires a picture to be shot, and the processor 11 and the communication circuit 13 acquire the current environment information. The current environment information may be the current location of the mobile terminal 10 obtained by the global positioning system in the communication circuit 13, the current time obtained by the processor 11, or the current orientation of the mobile terminal 10. The processor 11 extracts first scene information from the picture to be shot and second scene information from the current environment information, identifies the current shooting scene by combining the first scene information and the second scene information, and determines the shooting parameters of the mobile terminal 10 according to the current shooting scene.
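The final step, determining shooting parameters from the identified scene, is not specified further in the patent; a plausible minimal realization is a lookup table keyed by scene. All scene names and parameter values below are hypothetical:

```python
# Hypothetical scene-to-parameter table; names and values are illustrative only.
SCENE_PARAMS = {
    "night": {"iso": 1600, "exposure_ms": 100, "white_balance": "auto"},
    "portrait": {"iso": 200, "exposure_ms": 10, "white_balance": "daylight"},
    "indoor_party": {"iso": 800, "exposure_ms": 33, "white_balance": "tungsten"},
}
DEFAULT_PARAMS = {"iso": 400, "exposure_ms": 20, "white_balance": "auto"}

def shooting_parameters(scene):
    """Return the camera parameters for an identified scene, falling back to defaults."""
    return SCENE_PARAMS.get(scene, DEFAULT_PARAMS)

print(shooting_parameters("night")["iso"])  # → 1600
```

On an actual handset these values would map onto the camera driver's capture-request fields rather than a plain dictionary.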
In an implementation scenario, the processor 11 extracts the first scene information from the picture to be shot by performing face recognition, character recognition or object recognition on it, so as to identify the people, characters or objects it contains. Extracting the second scene information from the current environment information includes: if the communication circuit 13 is connected to a network, the processor 11 retrieves events related to the current environment information over the network and from the calendar, chat log and memo stored in the memory 12 of the mobile terminal 10, and/or acquires weather information and/or lighting conditions related to the current environment information. If the communication circuit 13 cannot connect to a network, the processor 11 retrieves events related to the environment information only from the calendar, chat log and memo stored in the mobile terminal.
In other implementation scenarios, in order to identify the shooting scene more accurately, the processor 11 determines whether the mobile terminal 10 is in an indoor environment according to its current location. If it is, the communication circuit 13 obtains an internal space map of the building where the mobile terminal is located, and the processor 11, in combination with the communication circuit 13, uses the internal space map to determine the floor where the mobile terminal 10 is located and the surrounding environment features of that floor. If the processor 11 determines that the current location of the mobile terminal 10 is an outdoor environment, the communication circuit 13 obtains a map of the area where the mobile terminal 10 is located, and the current surrounding environment features of the mobile terminal 10 are obtained from that map.
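The indoor/outdoor branching above can be sketched as follows; `get_indoor_map` and `get_area_map` stand in for whatever map services the communication circuit would query, and the returned fields are hypothetical:

```python
def surrounding_features(location, is_indoor, get_indoor_map, get_area_map):
    """Select the map source by environment and extract surrounding-environment features."""
    if is_indoor:
        indoor_map = get_indoor_map(location)  # internal space map of the building
        return {"floor": indoor_map["floor"], "features": indoor_map["nearby"]}
    area_map = get_area_map(location)          # map of the area around the terminal
    return {"features": area_map["nearby"]}

# Stub map services for illustration only.
fake_indoor = lambda loc: {"floor": 3, "nearby": ["food court", "cinema"]}
fake_area = lambda loc: {"nearby": ["park", "lake"]}
print(surrounding_features({"lat": 22.5, "lon": 114.0}, True, fake_indoor, fake_area))
# → {'floor': 3, 'features': ['food court', 'cinema']}
```

Passing the map services in as callables keeps the branching logic testable without a real positioning backend.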
In another implementation scenario, when the current shooting scene cannot be identified by combining the first scene information and the second scene information, the processor 11 connects to at least one other mobile terminal to obtain the shooting scene identified by the at least one other mobile terminal. The processor 11 then selects the most frequently identified shooting scene among them as the current shooting scene, or performs a fusion calculation on the shooting scenes identified by the at least one other mobile terminal to obtain a comprehensive scene, which is used as the current shooting scene.
After the camera 14 takes a picture using the shooting parameters determined from the identified current shooting scene, the processor 11 uses the identified current shooting scene as an identifier of the photo or video. For example, if the identified scene is a friend's birthday party, the photo or video may be tagged with "birthday party of friend" or the like. The processor 11 may also create a "birthday party of friend" folder in the memory 12, in which the photos or videos taken this time are stored, so that they can be found quickly when the user later needs to search for them.
As can be seen from the above description, the mobile terminal in this embodiment combines the scene information extracted from the picture to be shot with the current environment information during shooting, so as to accurately determine the current shooting scene and determine the shooting parameters of the mobile terminal accordingly. This improves the accuracy of the selected shooting scene, and thereby the shooting effect and the user experience.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an embodiment of a device with a storage function according to the present invention. The device 20 with a storage function stores at least one program instruction 21, and the program instruction 21 is used for performing the methods shown in fig. 1-3. In one embodiment, the device with a storage function may be a storage chip in an apparatus, a hard disk, a removable hard disk or another readable and writable storage tool such as a flash disk or an optical disk, or a server or the like.
As can be seen from the above description, the program instructions stored in the device with a storage function of this embodiment may be used to: obtain a picture to be shot and current environment information when the mobile terminal is in a shooting state; extract first scene information and second scene information from the picture to be shot and the current environment information respectively; identify the current shooting scene by combining the first scene information and the second scene information; and determine the shooting parameters of the mobile terminal according to the current shooting scene. This improves the accuracy of the selected shooting scene, and thereby the shooting effect and the user experience.
In the shooting method provided by the present invention, when the mobile terminal shoots, the scene information extracted from the picture to be shot is combined with the current environment information to accurately determine the current shooting scene, and the shooting parameters of the mobile terminal are determined according to the current shooting scene. This improves the accuracy of the selected shooting scene, the shooting effect, and the user experience.
The above description is only an embodiment of the present invention and is not intended to limit its scope; all equivalent structural or process modifications made using the contents of the present specification and drawings, whether applied directly or indirectly in other related technical fields, are likewise included in the scope of the present invention.

Claims (10)

1. A photographing method, characterized by comprising:
when the mobile terminal is detected to be in a shooting state, acquiring a picture to be shot and current environment information;
extracting first scene information from the picture to be shot and extracting second scene information from the current environment information;
identifying a current shooting scene by combining the first scene information and the second scene information;
determining shooting parameters of the mobile terminal according to the current shooting scene.
2. The method according to claim 1, wherein the extracting first scene information from the picture to be shot comprises:
identifying at least one of people, characters and objects in the picture to be shot; and/or
The extracting second scene information from the current environment information includes:
retrieving an event related to the current environmental information; and/or
acquiring weather information and/or illumination conditions related to the current environment information.
3. The method of claim 2, wherein retrieving events related to the current environmental information comprises:
retrieving events related to the environment information from a calendar, a chat log and a memo stored in the mobile terminal; and/or
retrieving events related to the environment information over a network.
4. The method of claim 1, wherein obtaining current environment information comprises:
acquiring at least one of the current location, the current time and the current orientation of the mobile terminal.
5. The method of claim 4, wherein obtaining current environmental information further comprises:
judging whether the mobile terminal is in an indoor environment or not according to the current location of the mobile terminal;
if the mobile terminal is in an indoor environment, acquiring an internal space map of a building where the mobile terminal is located;
determining, by using the internal space map, the floor where the mobile terminal is located and the surrounding environment features of the floor where the mobile terminal is located.
6. The method of claim 5, wherein obtaining current environmental information further comprises:
if the mobile terminal is in an outdoor environment, acquiring a map of an area where the mobile terminal is located;
and obtaining the current surrounding environment characteristics of the mobile terminal according to the map of the area where the mobile terminal is located.
7. The method of claim 1, further comprising:
when the current shooting scene cannot be identified by combining the first scene information and the second scene information, obtaining the shooting scene identified by at least one other mobile terminal;
selecting the most frequently identified shooting scene among the shooting scenes identified by the at least one other mobile terminal as the current shooting scene; and/or performing a fusion calculation on the shooting scenes identified by the at least one other mobile terminal to obtain a comprehensive scene, and taking the comprehensive scene as the current shooting scene.
8. The method of claim 1, further comprising:
and adding an identifier for the shot picture or video according to the identified current scene.
9. A mobile terminal, comprising: a processor, a memory, and communication circuitry and a camera, the processor coupled to the memory, the communication circuitry and the camera;
wherein the memory is for storing program instructions for implementing the photographing method according to any one of claims 1 to 8;
the processor, the communication circuit and the camera are configured to execute program instructions stored by the memory to implement the photographing method according to any one of claims 1 to 8.
10. An apparatus having storage functionality, wherein program instructions are stored, which program instructions are executable to implement the steps in the method according to any one of claims 1 to 8.
CN201811038372.8A 2018-09-06 2018-09-06 Shooting method, mobile terminal and device with storage function Withdrawn CN110881101A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811038372.8A CN110881101A (en) 2018-09-06 2018-09-06 Shooting method, mobile terminal and device with storage function


Publications (1)

Publication Number Publication Date
CN110881101A true CN110881101A (en) 2020-03-13

Family

ID=69727163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811038372.8A Withdrawn CN110881101A (en) 2018-09-06 2018-09-06 Shooting method, mobile terminal and device with storage function

Country Status (1)

Country Link
CN (1) CN110881101A (en)


Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1936685A (en) * 2005-09-21 2007-03-28 索尼株式会社 Photographic device, method of processing information, and program
CN101686323A (en) * 2008-09-26 2010-03-31 三洋电机株式会社 Imaging apparatus and mode appropriateness evaluating method
CN102647449A (en) * 2012-03-20 2012-08-22 西安联客信息技术有限公司 Intelligent shooting method and intelligent shooting device based on cloud service and mobile terminal
CN103945088A (en) * 2013-01-21 2014-07-23 华为终端有限公司 Method and device for scene recognition
CN103973979A (en) * 2014-04-23 2014-08-06 小米科技有限责任公司 Method and device for configuring shooting parameters
CN104092942A (en) * 2014-07-17 2014-10-08 深圳市中兴移动通信有限公司 Shooting method and device
CN105407281A (en) * 2015-11-13 2016-03-16 努比亚技术有限公司 Scene based photographing device and method
CN105516507A (en) * 2015-12-25 2016-04-20 联想(北京)有限公司 Information processing method and electronic equipment
WO2016061011A2 (en) * 2014-10-15 2016-04-21 Microsoft Technology Licensing, Llc Camera capture recommendation for applications
CN107820020A (en) * 2017-12-06 2018-03-20 广东欧珀移动通信有限公司 Method of adjustment, device, storage medium and the mobile terminal of acquisition parameters
CN108270969A (en) * 2018-01-17 2018-07-10 上海爱优威软件开发有限公司 A kind of method and terminal device of adjust automatically exposal model
WO2018147581A1 (en) * 2017-02-09 2018-08-16 Samsung Electronics Co., Ltd. Method and apparatus for selecting capture configuration based on scene analysis


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111787235A (en) * 2020-08-12 2020-10-16 努比亚技术有限公司 Shooting control method and device and computer readable storage medium
CN111787235B (en) * 2020-08-12 2022-07-12 努比亚技术有限公司 Shooting control method and device and computer readable storage medium
CN113596327A (en) * 2021-07-21 2021-11-02 维沃移动通信(杭州)有限公司 Shooting method, shooting device, electronic equipment and storage medium
CN113596327B (en) * 2021-07-21 2024-01-23 维沃移动通信(杭州)有限公司 Shooting method, shooting device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
US11895068B2 (en) Automated content curation and communication
CN109068056B (en) Electronic equipment, filter processing method of image shot by electronic equipment and storage medium
CN101924992B (en) Method, system and equipment for acquiring scene information through mobile terminal
CN107358639B (en) Photo display method and photo display system based on intelligent terminal
CN104243798B (en) Image processing apparatus, server and storage medium
CN106851104B (en) A kind of method and device shot according to user perspective
WO2016173423A1 (en) Image processing method, apparatus and device, and computer storage medium
JP6231387B2 (en) Server, client terminal, system, and recording medium
JP4984044B2 (en) Image capturing system, image capturing condition setting method, terminal and server used therefor
CN108600632B (en) Photographing prompting method, intelligent glasses and computer readable storage medium
WO2006028108A1 (en) Image processing system and method, and terminal and server used for the same
US20090193021A1 (en) Camera system and method for picture sharing based on camera perspective
CN103533241A (en) Photographing method of intelligent filter lens
CN102647449A (en) Intelligent shooting method and intelligent shooting device based on cloud service and mobile terminal
WO2018059206A1 (en) Terminal, method of acquiring video, and data storage medium
CN106462946A (en) Processing digital photographs in response to external applications
JP2003274320A (en) Imaging device and device and method for image information processing
CN102164345A (en) Method for recording positional information in picture taken by mobile phone
CN110881101A (en) Shooting method, mobile terminal and device with storage function
WO2015154383A1 (en) Photographing method and photographing terminal
WO2019205170A1 (en) Photographic method and terminal device
CN102196147A (en) Method for recording position information in pictures photographed by digital camera
CN106231198B (en) Shoot the method and device of image
JP2002077805A (en) Camera with photographing memo function
TWI494864B (en) Method and system for searching image and computer program product using the method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20200313