CN110740263B - Image processing method and terminal equipment - Google Patents


Info

Publication number
CN110740263B
CN110740263B (application CN201911054922.XA)
Authority
CN
China
Prior art keywords
picture
input
terminal device
target time
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911054922.XA
Other languages
Chinese (zh)
Other versions
CN110740263A (en)
Inventor
邱昌鑫
杨飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201911054922.XA priority Critical patent/CN110740263B/en
Publication of CN110740263A publication Critical patent/CN110740263A/en
Application granted granted Critical
Publication of CN110740263B publication Critical patent/CN110740263B/en
Legal status: Active

Classifications

    • H — ELECTRICITY
        • H04 — ELECTRIC COMMUNICATION TECHNIQUE
            • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
                    • H04N23/60 — Control of cameras or camera modules
                        • H04N23/62 — Control of parameters via user interfaces
                        • H04N23/63 — Control of cameras or camera modules by using electronic viewfinders

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

An embodiment of the invention provides an image processing method and a terminal device, relating to the technical field of image processing. The method comprises: receiving a first input while the terminal device displays a location model corresponding to a target location; selecting a first framing picture in the location model in response to the first input; receiving a second input; selecting a target time period in response to the second input; processing the first framing picture according to illumination information corresponding to the target time period to obtain a second framing picture; and displaying the second framing picture. Because the second framing picture is the framing effect of the first framing picture under the illumination of the target time period, the user can obtain it in advance, as a reference before shooting, without scouting the location in person. This simplifies the preparation and observation work before shooting and improves the convenience of framing and shooting.

Description

Image processing method and terminal equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method and a terminal device.
Background
With the widespread adoption of terminal devices such as smartphones and smart cameras, terminal devices have become indispensable electronic products in people's lives. As their shooting functions develop, people increasingly use terminal devices to take photographs. For example, when traveling, people can photograph buildings and landscapes along the way with a smart camera.
In real life, when people want to take a striking photograph at a specific location (such as a tourist destination), they usually need to reach the location in advance and scout conditions such as the viewing angle and the best time period for framing; an excellent result can be obtained only by shooting under suitable framing conditions. This makes the preparation and observation work before shooting very complicated, so the convenience of framing and shooting is low.
Disclosure of Invention
An embodiment of the invention provides an image processing method and a terminal device, aiming to solve the problem of low convenience in framing and shooting caused by very complicated preparation and observation work before shooting.
In order to solve the technical problem, the invention is realized as follows:
In a first aspect, an embodiment of the present invention provides an image processing method applied to a terminal device, where the method includes:
receiving a first input while the terminal device displays a location model corresponding to a target location;
selecting a first framing picture in the location model in response to the first input;
receiving a second input;
selecting a target time period in response to the second input;
processing the first framing picture according to illumination information corresponding to the target time period to obtain a second framing picture;
and displaying the second framing picture.
In a second aspect, an embodiment of the present invention further provides a terminal device, where the terminal device includes:
a first receiving module, configured to receive a first input while the terminal device displays a location model corresponding to a target location;
a first selection module, configured to select a first framing picture in the location model in response to the first input;
a second receiving module, configured to receive a second input;
a second selection module, configured to select a target time period in response to the second input;
a processing module, configured to process the first framing picture according to illumination information corresponding to the target time period to obtain a second framing picture;
and a first display module, configured to display the second framing picture.
In a third aspect, an embodiment of the present invention further provides a terminal device, where the terminal device includes a processor, a memory, and a computer program stored on the memory and executable on the processor, and when executed by the processor, the computer program implements the steps of the image processing method described above.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the image processing method as described above.
In the embodiment of the invention, while displaying the location model corresponding to the target location, the terminal device receives and responds to a first input from the user and selects a first framing picture in the location model. The terminal device then receives and responds to a second input, selecting a target time period, and processes the first framing picture according to the illumination information corresponding to that time period to obtain and display a second framing picture. Because the second framing picture is the framing effect of the first framing picture under the illumination of the target time period, the user can obtain it in advance, as a reference before shooting, without scouting the location in person. This simplifies the preparation and observation work before shooting and improves the convenience of framing and shooting.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without inventive labor.
FIG. 1 is a flow chart of an image processing method according to an embodiment of the present invention;
FIG. 2 is a flow chart of another image processing method according to an embodiment of the present invention;
FIG. 3 is a schematic view of an interface provided by an embodiment of the present invention;
fig. 4 is a block diagram of a terminal device according to an embodiment of the present invention;
fig. 5 is a block diagram of another terminal device according to an embodiment of the present invention;
fig. 6 is a schematic diagram of the hardware structure of a terminal device for implementing various embodiments of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In various embodiments of the present invention, it should be understood that the sequence numbers of the following processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Referring to fig. 1, an embodiment of the present invention provides an image processing method applied to a terminal device, and as shown in fig. 1, the image processing method includes:
In step 101, the terminal device receives a first input while displaying a location model corresponding to a target location.
In the embodiment of the invention, the user can click a remote-framing option in the photographing software of the terminal device and enter a target location; the photographing software transmits the location information corresponding to the target location to its server, and the server returns a location model corresponding to the target location, which is then displayed in the photographing software. The target location may be a specific building, an administrative area, or the like, which is not specifically limited in the embodiment of the present invention.
In the embodiment of the invention, the user can browse the location model in the photographing software of the terminal device, move a "viewfinder window" icon over the location model, and select the desired first framing picture. The terminal device thereby receives a first input used by the user to select a first framing picture in the location model, where the first framing picture corresponds to location information, building information, object information, shooting angle information, and the like.
In step 102, the terminal device selects a first framing picture in the location model in response to the first input.
In the embodiment of the invention, after receiving the first input, the terminal device may select, in response to the first input, the first framing picture delimited by the "viewfinder window" in the location model.
In step 103, the terminal device receives a second input.
In the embodiment of the invention, the user can select the target time period in the photographing software of the terminal device, and the terminal device can receive the second input of the target time period selected by the user.
In step 104, the terminal device selects the target time period in response to the second input.
In the embodiment of the present invention, in response to the second input, the terminal device may determine the time period entered by the user, or selected by clicking or the like, as the target time period.
In step 105, the terminal device processes the first framing picture according to the illumination information corresponding to the target time period to obtain a second framing picture.
In the embodiment of the present invention, the illumination information corresponding to the location in the first framing picture during the target time period may include illumination information of light-emitting devices and illumination information of ambient natural light. When the user selects the target time period, the terminal device first identifies the light-emitting devices in the first framing picture using image recognition, and then acquires first illumination information for those devices during the target time period, either from a light-collecting device mounted on the light-emitting device or from the light-emitting device itself. The terminal device may further determine second illumination information, corresponding to the ambient natural light at the location in the first framing picture during the target time period, based on sunshine big data collected by the software. Finally, the terminal device processes the first framing picture according to at least one of these two kinds of illumination information to obtain the second framing picture, which is the framing picture of the first framing picture during the target time period.
The light-emitting device may be an outdoor display screen, a smart street lamp, or the like; such devices emit light of their own and thus affect the exposure of the first framing picture. The ambient natural light is mainly sunlight.
It should be noted that if no light-emitting device, such as an outdoor display screen or a smart street lamp, is present in the first framing picture, the first illumination information for the target time period need not be acquired.
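The processing in step 105 can be sketched in a few lines of code. This is a minimal illustration, not the patent's implementation: the picture is modeled as a grid of grayscale pixel values, and `illumination_by_period` is a hypothetical lookup table of intensity factors per time period.

```python
illumination_by_period = {  # hypothetical intensity factors per time period
    "noon": 1.0,
    "dusk": 0.45,
    "night": 0.15,
}

def apply_illumination(picture, target_period):
    """Return a second framing picture: the first framing picture re-exposed
    under the illumination factor of the selected target time period."""
    factor = illumination_by_period[target_period]
    # Scale each 8-bit pixel and clamp to the valid range.
    return [[min(255, round(px * factor)) for px in row] for row in picture]

first_view = [[200, 180], [120, 60]]
second_view = apply_illumination(first_view, "dusk")  # dimmer dusk rendering
```

A real implementation would combine per-source intensity, direction, and range, but the overall shape — look up illumination for the period, transform the first framing picture, return the second — follows the step described above.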
In step 106, the terminal device displays the second framing picture.
In the embodiment of the present invention, when the second framing picture is obtained, the terminal device may display it to the user together with its corresponding parameters, such as the shooting time, location information, shooting angle information, exposure duration, special-effect filter, and whether the flash is turned on; the user may save these parameters.
In the embodiment of the invention, while displaying the location model corresponding to the target location, the terminal device receives and responds to a first input from the user and selects a first framing picture in the location model. The terminal device then receives and responds to a second input, selecting a target time period, and processes the first framing picture according to the illumination information corresponding to that time period to obtain and display a second framing picture. Because the second framing picture is the framing effect of the first framing picture under the illumination of the target time period, the user can obtain it in advance, as a reference before shooting, without scouting the location in person. This simplifies the preparation and observation work before shooting and improves the convenience of framing and shooting.
Referring to fig. 2, another embodiment of the present invention provides an image processing method applied to a terminal device, as shown in fig. 2, the image processing method includes:
in step 201, the terminal device receives a first input when displaying a location model corresponding to a target location.
In an embodiment of the present invention, when the target location input by the user is a location in a city, the location model may be a street view model. When the target location input by the user is a location in a natural landscape, the location model may be a landscape model.
In step 202, the terminal device selects a first framing picture in the location model in response to the first input.
In an embodiment of the present invention, after receiving the first input, the terminal device may determine, in response to the first input, the first framing picture delimited by the "viewfinder window" in the location model.
In step 203, the terminal device receives a second input.
In the embodiment of the invention, the user can select the target time period in the photographing software of the terminal device, and the terminal device can receive the second input of the target time period selected by the user.
In one alternative, the terminal device may display a timeline of the day; for example, each point on the timeline may represent 1 minute, 5 minutes, 30 minutes, and so on. The user may select a target time period within the day by moving a slider along the timeline.
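The timeline control described above amounts to a simple mapping from slider position to clock time. The sketch below assumes a fixed per-point granularity (the 5-minute default is illustrative):

```python
def slider_to_time(position, minutes_per_point=5):
    """Map a timeline slider position to an (hour, minute) pair within one day.

    Each slider point represents `minutes_per_point` minutes; positions past
    the end of the day wrap around.
    """
    total = (position * minutes_per_point) % (24 * 60)
    return total // 60, total % 60

# With 5-minute granularity, position 223 corresponds to 18:35,
# matching the example target time period shown in fig. 3.
```

The inverse mapping (clock time back to slider position) would let the software restore a previously saved target time period when the user reopens the timeline.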
In step 204, the terminal device selects a target time period in response to the second input.
In the embodiment of the present invention, in response to the second input, the terminal device may determine the time period entered by the user, or selected by clicking or the like, as the target time period.
In step 205, the terminal device determines first illumination information corresponding to the light-emitting device during the target time period and/or second illumination information corresponding to the ambient natural light during the target time period.
In the embodiment of the present invention, when the target time period selected by the user is a night period, ambient natural light has little influence on the light in the first framing picture, so the terminal device may determine only the first illumination information corresponding to the light-emitting devices in the first framing picture during the target time period.
When the target time period selected by the user is a daytime period, light-emitting devices such as street lamps are usually off, and devices such as outdoor display screens have little influence on the light in the first framing picture, so the terminal device may determine only the second illumination information corresponding to the ambient natural light of the first framing picture.
Of course, to make the effect picture processed according to the illumination information closer to reality, the terminal device may determine both the first illumination information corresponding to the light-emitting devices during the target time period and the second illumination information corresponding to the ambient natural light during the target time period.
Optionally, the light-emitting device may specifically be an internet-of-things light-emitting device. Because such a device is connected to the internet of things, information such as its light-emitting time period and light-emitting intensity can be obtained more conveniently through the service functions of the internet of things.
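The day/night source selection just described can be sketched as a small decision function. The hour boundaries below are illustrative assumptions; the patent only distinguishes daytime, night, and the general case:

```python
def select_illumination_sources(target_hour):
    """Return which illumination information to determine for a target hour:
    'first' = light-emitting devices, 'second' = ambient natural light."""
    if 6 <= target_hour < 18:
        # Daytime: lamps are usually off, sunlight dominates.
        return {"second"}
    if target_hour >= 21 or target_hour < 5:
        # Deep night: natural light is negligible, devices dominate.
        return {"first"}
    # Dawn/dusk: both contributions matter and are superimposed.
    return {"first", "second"}
```

Classifying the period first avoids computing illumination for a source that will not meaningfully affect the second framing picture.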
In step 206, the terminal device processes the first framing picture according to the first illumination information and/or the second illumination information to obtain a second framing picture.
Optionally, the first illumination information may include at least one of the illumination intensity, illumination direction, and light-emitting range of the light-emitting device during the target time period, and the second illumination information may include at least one of the illumination intensity, illumination direction, and light-emitting range of the ambient natural light during the target time period.
In the embodiment of the invention, the terminal device may determine the first illumination information corresponding to the light-emitting devices in the first framing picture during the target time period, and simulate the device illumination of the first framing picture in that period according to at least one of the illumination intensity, illumination direction, and light-emitting range of those devices.
The terminal device may likewise determine the second illumination information corresponding to the ambient natural light at the location of the first framing picture during the target time period, and simulate the natural illumination of the first framing picture in that period according to at least one of the illumination intensity, illumination direction, and light-emitting range of the ambient natural light.
The terminal device can then superimpose the first illumination information and the second illumination information on the first framing picture, comprehensively simulating the illumination of the first framing picture during the target time period, to obtain the second framing picture.
In the embodiment of the invention, the user can adjust the target time period in the photographing software of the terminal device; the photographing software processes the first framing picture according to the first illumination information and/or the second illumination information of each target time period to obtain the corresponding second framing picture, so the user can choose the picture corresponding to the time period in which he or she most wants to shoot.
In step 207, the terminal device displays the second framing picture.
In the embodiment of the present invention, when the second framing picture is obtained, the terminal device may display it to the user together with its corresponding parameters, such as the shooting time, location information, shooting angle information, exposure duration, special-effect filter, and whether the flash is turned on; the user may save these parameters.
For example, referring to fig. 3, for a target time period of 18:35, the second framing picture corresponds to the following parameters: ISO sensitivity: 150; shutter speed: 6 s; aperture: f/2.8; exposure compensation: none. The user may click the "save" option 301 to save these parameters.
In step 208, the terminal device receives a third input.
In the embodiment of the invention, when the user arrives, during the target time period, at the place where the second framing picture is to be shot and opens the preview interface of the photographing software, the terminal device can receive a third input used by the user to start monitoring the picture in the preview interface.
In step 209, the terminal device starts monitoring the preview screen in response to the third input.
In the embodiment of the invention, while the photographing software is open, the terminal device can monitor the preview picture captured by its camera.
In step 210, the terminal device takes a snapshot when the preview picture matches the second framing picture.
In the embodiment of the invention, when the similarity between the preview picture and the second framing picture is greater than a similarity threshold, the photographing software of the terminal device can automatically take a snapshot using the parameters of the second framing picture saved by the user. The user thus obtains a photo whose effect is almost identical to the second framing picture; that is, the user gets the photo he or she wanted. During shooting, the user does not need to scout the location in advance, which simplifies the preparation and observation work before shooting and improves the convenience of framing and shooting.
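The matching logic of steps 209-210 can be sketched as below. The mean-absolute-difference similarity measure and the 0.95 threshold are assumptions; the patent does not specify a particular metric:

```python
def similarity(preview, reference):
    """Similarity in [0, 1] between two equal-length 8-bit grayscale frames;
    1.0 means the frames are identical."""
    diffs = [abs(p - r) for p, r in zip(preview, reference)]
    return 1.0 - sum(diffs) / (255 * len(diffs))

def should_snapshot(preview, reference, threshold=0.95):
    """Trigger a snapshot once the monitored preview picture is close
    enough to the saved second framing picture."""
    return similarity(preview, reference) > threshold

reference = [100, 150, 200]          # saved second framing picture
near_match = [101, 149, 200]         # preview close to the reference
mismatch = [10, 60, 90]              # preview still far from the reference
```

In practice the comparison would run on full frames (and perhaps on features rather than raw pixels), but the loop is the same: monitor, compare against the stored second framing picture, and capture with the saved parameters when the threshold is crossed.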
In step 211, the terminal device obtains reference information, where the reference information includes at least one of: historical photos related to the first framing picture, the current time, the current weather condition, and the current light-source condition.
In the embodiment of the present invention, the historical photos related to the first framing picture may be photos related to the first framing picture taken by other users and collected by the photographing software.
In step 212, the terminal device generates recommended shooting parameters according to the reference information.
In the embodiment of the present invention, the terminal device may synthesize the reference information to generate recommended shooting parameters for the current condition of the target location. The recommended shooting parameters may include the shooting time, location information, shooting angle information, exposure duration, special-effect filter, and whether to turn on the flash.
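As a rough sketch of step 212, the recommendation could average exposure values from the historical photos of the first framing picture and adjust for the current weather. The averaging scheme, field names, and flash rule are illustrative assumptions only:

```python
def recommend_parameters(history, current_weather):
    """Derive recommended shooting parameters from historical photos
    (each a dict with hypothetical 'iso' and 'shutter_s' fields) and
    the current weather condition."""
    iso = sum(photo["iso"] for photo in history) / len(history)
    shutter = sum(photo["shutter_s"] for photo in history) / len(history)
    if current_weather == "overcast":
        iso *= 2  # dimmer scene: raise sensitivity (illustrative heuristic)
    return {"iso": round(iso), "shutter_s": shutter, "flash": iso >= 400}

history = [{"iso": 100, "shutter_s": 1 / 60}, {"iso": 200, "shutter_s": 1 / 30}]
params = recommend_parameters(history, "overcast")
```

A production recommender would also weigh the current time and light-source condition named in step 211; this sketch only shows the synthesis idea.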
In step 213, the terminal device displays the recommended shooting parameters.
In the embodiment of the invention, the terminal device can display a recommended-shooting-parameters option, and the user can click it to view the recommended parameters and shoot accordingly.
In the embodiment of the invention, the user can shoot according to the recommended shooting parameters. Because these parameters are generated from reference information that includes at least one of historical photos related to the first framing picture, the current time, the current weather condition, and the current light-source condition, they reflect the current condition of the location in the first framing picture. A photo taken with the recommended parameters therefore tends to have a better effect, improving shooting quality.
Optionally, in the embodiment of the present invention, each step implemented by the photographing software may be performed by the terminal device or by a server of the photographing software; the embodiment of the present invention is not limited in this respect.
In the embodiment of the invention, while displaying the location model corresponding to the target location, the terminal device receives and responds to a first input from the user and selects a first framing picture in the location model. The terminal device then receives and responds to a second input, selecting a target time period, and processes the first framing picture according to the illumination information corresponding to that time period to obtain and display a second framing picture. Because the second framing picture is the framing effect of the first framing picture under the illumination of the target time period, the user can obtain it in advance, as a reference before shooting, without scouting the location in person, which simplifies the preparation and observation work before shooting and improves the convenience of framing and shooting. In addition, upon receiving the third input, the terminal device can start monitoring the preview picture and take a snapshot when the preview picture matches the second framing picture, so the user obtains a photo almost identical in effect to the second framing picture. The terminal device can also shoot according to the recommended shooting parameters, so the resulting photos take into account on-site conditions such as the current time, the current weather, and the current light source, yielding a better picture effect and higher shooting quality.
Having described the image processing method according to the embodiment of the present invention, a terminal device according to the embodiment of the present invention will be described with reference to the accompanying drawings.
Referring to fig. 4, an embodiment of the present invention further provides a structural block diagram of a terminal device, and referring to fig. 4, the terminal device 400 includes:
a first receiving module 401, configured to receive a first input while the terminal device displays a location model corresponding to the target location;
a first selection module 402, configured to select a first framing picture in the location model in response to the first input;
a second receiving module 403, configured to receive a second input;
a second selection module 404, configured to select a target time period in response to the second input;
a processing module 405, configured to process the first framing picture according to the illumination information corresponding to the target time period to obtain a second framing picture;
and a first display module 406, configured to display the second framing picture.
Optionally, referring to fig. 5, the processing module 405 includes:
the determining sub-module 4051 is configured to determine first illumination information corresponding to the light-emitting device in the target time period, and/or second illumination information corresponding to ambient natural light in the target time period;
the processing sub-module 4052 is configured to process the first view frame according to the first illumination information and/or the second illumination information, so as to obtain a second view frame.
Optionally, referring to fig. 5, the terminal device 400 further includes:
a third receiving module 407, configured to receive a third input;
a monitoring module 408, configured to start monitoring a preview screen in response to the third input;
and a snapshot module 409, configured to take a snapshot when the preview picture matches the second framing picture.
Optionally, referring to fig. 5, the terminal device 400 further includes:
an obtaining module 410, configured to obtain reference information, where the reference information includes at least one of historical photos related to the first viewfinder picture, the current time, the current weather condition, and the current light source condition;
a generating module 411, configured to generate recommended shooting parameters according to the reference information;
and a second display module 412, configured to display the recommended shooting parameters.
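One way the generating module could map reference information to recommendations is sketched below. The dict keys and the rules are illustrative assumptions only; the patent does not disclose a concrete mapping from reference information to parameters.

```python
def recommend_parameters(reference):
    """Derive recommended shooting parameters from reference information.

    `reference` is a hypothetical dict with optional keys 'history_iso',
    'hour', 'weather', and 'light_source'. The thresholds and values below
    are illustrative rules of thumb, not the patented method.
    """
    iso = reference.get("history_iso", 100)   # seed from historical photos
    hour = reference.get("hour", 12)          # current time, 0-23
    weather = reference.get("weather", "clear")

    if weather in ("overcast", "rain") or hour < 7 or hour > 18:
        iso = max(iso, 400)       # low light: raise sensitivity
        shutter = "1/60"
    else:
        shutter = "1/250"         # plenty of light: freeze motion

    if reference.get("light_source") == "tungsten":
        white_balance = "3200K"   # warm indoor light
    else:
        white_balance = "5500K"   # daylight

    return {"iso": iso, "shutter": shutter, "white_balance": white_balance}
```

The second display module would then render the returned dict as the recommended shooting parameters.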
The terminal device provided in the embodiment of the present invention can implement each process implemented by the terminal device in the method embodiments of fig. 1 and fig. 2; details are not described herein again to avoid repetition.
In the embodiment of the present invention, when the terminal device displays the location model corresponding to the target location, it receives the user's first input through the first receiving module, and the first selecting module selects the first viewfinder picture in the location model in response. The terminal device then receives the user's second input, and the second selecting module selects the target time period in response. The processing module processes the first viewfinder picture according to the illumination information corresponding to the target time period to obtain the second viewfinder picture, that is, a preview of how the first viewfinder picture would look under the illumination of the target time period, which the first display module then displays. The user can therefore obtain the second viewfinder picture as a framing reference before shooting without visiting the location in person, which simplifies the preparation and observation work before shooting and improves the convenience of framing and shooting.
Fig. 6 is a schematic diagram of a hardware structure of a terminal device for implementing various embodiments of the present invention.
the terminal device 500 includes but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and a power supply 511. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 6 does not constitute a limitation of the terminal device, and that the terminal device may include more or fewer components than shown, or combine certain components, or a different arrangement of components. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 510 is configured to: receive a first input when the terminal device displays a location model corresponding to the target location; select a first viewfinder picture in the location model in response to the first input; receive a second input; select a target time period in response to the second input; process the first viewfinder picture according to the illumination information corresponding to the target time period to obtain a second viewfinder picture; and display the second viewfinder picture.
In the embodiment of the present invention, when the terminal device displays the location model corresponding to the target location, it receives and responds to the user's first input by selecting a first viewfinder picture in the location model. The terminal device then receives and responds to the user's second input by selecting a target time period, and processes the first viewfinder picture according to the illumination information corresponding to that time period to obtain and display a second viewfinder picture, that is, a preview of the first viewfinder picture under the illumination of the target time period. The user can therefore obtain the second viewfinder picture as a framing reference before shooting without visiting the location in person, which simplifies the preparation and observation work before shooting and improves the convenience of framing and shooting.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 501 may be used for receiving and sending signals during a message transceiving process or a call; specifically, it receives downlink data from a base station and forwards the data to the processor 510 for processing, and it also transmits uplink data to the base station. In general, the radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 501 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides the user with wireless broadband internet access through the network module 502, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502, or stored in the memory 509, into an audio signal and output it as sound. The audio output unit 503 may also provide audio output related to a specific function performed by the terminal device 500 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 504 is used to receive an audio or video signal. The input unit 504 may include a graphics processing unit (GPU) 5041 and a microphone 5042; the graphics processor 5041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 506. The image frames processed by the graphics processor 5041 may be stored in the memory 509 (or other storage medium) or transmitted via the radio frequency unit 501 or the network module 502. The microphone 5042 can receive sound and process it into audio data. In the phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 501.
The terminal device 500 further includes at least one sensor 505, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 5061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 5061 and/or the backlight when the terminal device 500 is moved to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal/vertical screen switching, related games, and magnetometer posture calibration) and for vibration identification related functions (such as a pedometer and tapping). The sensor 505 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described in detail herein.
The display unit 506 is used to display information input by the user or information provided to the user. The display unit 506 may include a display panel 5061, and the display panel 5061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 507 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 507 includes a touch panel 5071 and other input devices 5072. The touch panel 5071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near the touch panel 5071 using a finger, a stylus, or any suitable object or attachment). The touch panel 5071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 510, and receives and executes commands sent by the processor 510. In addition, the touch panel 5071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 5071, the user input unit 507 may also include other input devices 5072. Specifically, the other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, a switch key, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 5071 may be overlaid on the display panel 5061. When the touch panel 5071 detects a touch operation on or near it, the operation is transmitted to the processor 510 to determine the type of the touch event, and the processor 510 then provides a corresponding visual output on the display panel 5061 according to the type of the touch event. Although in fig. 6 the touch panel 5071 and the display panel 5061 are two independent components implementing the input and output functions of the terminal device, in some embodiments the touch panel 5071 and the display panel 5061 may be integrated to implement the input and output functions of the terminal device, which is not limited herein.
The interface unit 508 is an interface for connecting an external device to the terminal device 500. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 508 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal device 500, or may be used to transmit data between the terminal device 500 and an external device.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a storage program area and a storage data area, where the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the terminal device. Further, the memory 509 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The processor 510 is a control center of the terminal device, connects various parts of the entire terminal device by using various interfaces and lines, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 509 and calling data stored in the memory 509, thereby performing overall monitoring of the terminal device. Processor 510 may include one or more processing units; preferably, the processor 510 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The terminal device 500 may further include a power supply 511 (e.g., a battery) for supplying power to various components, and preferably, the power supply 511 may be logically connected to the processor 510 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 500 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a terminal device, including a processor 510, a memory 509, and a computer program stored in the memory 509 and executable on the processor 510. When executed by the processor 510, the computer program implements each process of the above image processing method embodiments and can achieve the same technical effect; details are not described here again to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (8)

1. An image processing method applied to a terminal device, characterized by comprising:
receiving a first input under the condition that the terminal device displays a location model corresponding to a target location;
selecting a first viewfinder picture in the location model in response to the first input;
receiving a second input;
selecting a target time period in response to the second input;
processing the first viewfinder picture according to illumination information corresponding to the target time period to obtain a second viewfinder picture;
displaying the second viewfinder picture;
wherein, after displaying the second viewfinder picture, the method further comprises:
receiving a third input;
in response to the third input, starting to monitor a preview screen;
and taking a snapshot under the condition that the preview picture matches the second viewfinder picture.
2. The method according to claim 1, wherein the step of processing the first viewfinder picture according to the illumination information corresponding to the target time period to obtain the second viewfinder picture comprises:
determining first illumination information corresponding to a light-emitting device in the target time period and/or second illumination information corresponding to ambient natural light in the target time period;
and processing the first viewfinder picture according to the first illumination information and/or the second illumination information to obtain the second viewfinder picture.
3. The method according to claim 1, wherein after displaying the second viewfinder picture, the method further comprises:
acquiring reference information, wherein the reference information comprises at least one of historical photos related to the first viewfinder picture, the current time, the current weather condition, and the current light source condition;
generating recommended shooting parameters according to the reference information;
and displaying the recommended shooting parameters.
4. A terminal device, characterized in that the terminal device comprises:
a first receiving module, configured to receive a first input under the condition that the terminal device displays a location model corresponding to a target location;
a first selecting module, configured to select a first viewfinder picture in the location model in response to the first input;
a second receiving module, configured to receive a second input;
a second selecting module, configured to select a target time period in response to the second input;
a processing module, configured to process the first viewfinder picture according to the illumination information corresponding to the target time period to obtain a second viewfinder picture;
and a first display module, configured to display the second viewfinder picture;
wherein the terminal device further comprises:
a third receiving module, configured to receive a third input;
a monitoring module, configured to start monitoring a preview picture in response to the third input;
and a snapshot module, configured to take a snapshot under the condition that the preview picture matches the second viewfinder picture.
5. The terminal device of claim 4, wherein the processing module comprises:
a determining sub-module, configured to determine first illumination information corresponding to a light-emitting device in the target time period and/or second illumination information corresponding to ambient natural light in the target time period;
and a processing sub-module, configured to process the first viewfinder picture according to the first illumination information and/or the second illumination information to obtain the second viewfinder picture.
6. The terminal device according to claim 4, wherein the terminal device further comprises:
an obtaining module, configured to obtain reference information, wherein the reference information comprises at least one of historical photos related to the first viewfinder picture, the current time, the current weather condition, and the current light source condition;
a generating module, configured to generate recommended shooting parameters according to the reference information;
and a second display module, configured to display the recommended shooting parameters.
7. A terminal device, characterized by comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the image processing method according to any one of claims 1 to 3.
8. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the image processing method according to any one of claims 1 to 3.
CN201911054922.XA 2019-10-31 2019-10-31 Image processing method and terminal equipment Active CN110740263B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911054922.XA CN110740263B (en) 2019-10-31 2019-10-31 Image processing method and terminal equipment


Publications (2)

Publication Number Publication Date
CN110740263A CN110740263A (en) 2020-01-31
CN110740263B true CN110740263B (en) 2021-03-12

Family

ID=69270488

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911054922.XA Active CN110740263B (en) 2019-10-31 2019-10-31 Image processing method and terminal equipment

Country Status (1)

Country Link
CN (1) CN110740263B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112511737A (en) * 2020-10-29 2021-03-16 维沃移动通信有限公司 Image processing method and device, electronic equipment and readable storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101247481A (en) * 2007-02-16 2008-08-20 李西峙 System and method for producing and playing real-time three-dimensional movie/game based on role play
CN101515372A (en) * 2009-02-04 2009-08-26 北京石油化工学院 Visual analyzing and predicting method based on a virtual geological model
CN102483859A (en) * 2009-09-29 2012-05-30 索尼计算机娱乐公司 Panoramic image display device and panoramic image display method
CN105844714A (en) * 2016-04-12 2016-08-10 广州凡拓数字创意科技股份有限公司 Augmented reality based scenario display method and system
CN107515674A (en) * 2017-08-08 2017-12-26 山东科技大学 It is a kind of that implementation method is interacted based on virtual reality more with the mining processes of augmented reality
CN108391445A (en) * 2016-12-24 2018-08-10 华为技术有限公司 A kind of virtual reality display methods and terminal
WO2019048102A1 (en) * 2017-09-06 2019-03-14 Brainlab Ag Determining the relative position between a thermal camera and a 3d camera using a hybrid phantom and hybrid phantom
CN109859307A (en) * 2018-12-25 2019-06-07 维沃移动通信有限公司 A kind of image processing method and terminal device
CN110058398A (en) * 2019-04-25 2019-07-26 深圳市声光行科技发展有限公司 A kind of VR telescope

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106954019A (en) * 2017-02-27 2017-07-14 捷开通讯(深圳)有限公司 The method of adjustment camera site and intelligent capture apparatus
CN109104564B (en) * 2018-06-26 2021-01-08 维沃移动通信有限公司 Shooting prompting method and terminal equipment




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant