CN109413333B - Display control method and terminal


Info

Publication number
CN109413333B
Authority
CN
China
Prior art keywords
target window
input
image
target
window
Prior art date
Legal status
Active
Application number
CN201811433680.0A
Other languages
Chinese (zh)
Other versions
CN109413333A (en
Inventor
索国国
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201811433680.0A priority Critical patent/CN109413333B/en
Publication of CN109413333A publication Critical patent/CN109413333A/en
Application granted granted Critical
Publication of CN109413333B publication Critical patent/CN109413333B/en

Classifications

    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N23/60: Control of cameras or camera modules
    • H04M1/0264: Details of the structure or mounting of specific components for a camera module assembly
    • H04M1/72454: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to context-related or environment-related conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Environmental & Geological Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a display control method and a terminal. The method includes: receiving a first input from a user; capturing an image in response to the first input; and displaying the captured image in a target window, where the target window is displayed in an image display area of a shooting preview interface and the area of the target window is smaller than that of the image display area. With this method, the captured image can be viewed in the target window while shooting continues, making image viewing more convenient: the user can decide, based on the captured image, whether to keep shooting or to adjust the shooting angle, shooting parameters, and so on before shooting again, thereby improving the shooting result.

Description

Display control method and terminal
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a display control method and a terminal.
Background
With the development of terminal technology, terminal functions have become increasingly diverse, and the camera has become an indispensable feature of many terminals. At present, when a user captures an image with a terminal camera, the captured image is presented as a thumbnail in a thumbnail control of the shooting preview interface. To view the captured image, the user has to tap the thumbnail control, leave the shooting preview interface, and enter the gallery before the original image can be displayed.
Therefore, the prior art has the problem that viewing images while capturing them is inconvenient.
Disclosure of Invention
Embodiments of the present invention provide a display control method and a terminal, aiming to solve the problem that viewing images during image capture is inconvenient.
To solve this technical problem, the present invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides a display control method, which is applied to a terminal, and the method includes:
receiving a first input of a user;
capturing an image in response to the first input;
displaying the photographed image on a target window;
the target window is displayed in an image display area of a shooting preview interface, and the area of the target window is smaller than that of the image display area.
In a second aspect, an embodiment of the present invention further provides a terminal. The terminal includes:
the first receiving module is used for receiving a first input of a user;
a photographing module for photographing an image in response to the first input;
the first display module is used for displaying the shot image on a target window;
the target window is displayed in an image display area of a shooting preview interface, and the area of the target window is smaller than that of the image display area.
In a third aspect, an embodiment of the present invention further provides a terminal, including a processor, a memory, and a computer program stored in the memory and capable of running on the processor, where the computer program, when executed by the processor, implements the steps of the display control method described above.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the display control method are implemented.
In embodiments of the present invention, a first input from a user is received; an image is captured in response to the first input; and the captured image is displayed in a target window, where the target window is displayed in the image display area of the shooting preview interface and the area of the target window is smaller than that of the image display area. In this way, the captured image can be viewed in the target window during shooting, making image viewing more convenient: the user can decide, based on the captured image, whether to keep shooting or to adjust the shooting angle, shooting parameters, and so on before shooting again, thereby improving the shooting result.
Drawings
Fig. 1 is a flowchart of a display control method according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating a target window displayed on a preview screen according to an embodiment of the present invention;
FIG. 3 is a flowchart of a display control method according to another embodiment of the present invention;
fig. 4 is a structural diagram of a terminal provided in an embodiment of the present invention;
fig. 5 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Embodiments of the present invention provide a display control method applied to a terminal, where the terminal may be a computer, a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a wearable device, or the like.
Referring to fig. 1, fig. 1 is a flowchart of a display control method according to an embodiment of the present invention, as shown in fig. 1, including the following steps:
step 101, receiving a first input of a user.
In an embodiment of the present invention, the first input may be a voice input, a touch input on a terminal screen or a hover touch input, a press input to a physical key of a terminal, or other inputs that may trigger image capturing.
Step 102, in response to the first input, capturing an image.
For example, the first input is a touch input to a shooting button of the terminal camera, and the image is shot when the user touches the shooting button of the terminal camera.
It should be noted that one or at least two images may be captured in response to the first input.
Step 103, displaying the shot image on a target window; the target window is displayed in an image display area of a shooting preview interface, and the area of the target window is smaller than that of the image display area.
In an embodiment of the present invention, the image display area is an area for displaying a preview image in the shooting preview interface. The target window is an area for displaying a captured image.
The display parameters (e.g., window size, display position, transparency, etc.) of the above-described target window may be preset. In practical applications, a setting option for the target window may be added in the camera setting in advance, and the display parameters of the target window may be set through the setting option. For example, the position and size of the target window may be set by setting coordinates, for example, vertex coordinates of the target window:
the coordinates of the top left vertex are: (set value/total number of rows, set value/total number of columns);
the coordinates of the vertices of the lower right corner are: (set value/total number of rows, set value/total number of columns);
where the total number of rows and the total number of columns are fixed values in the camera, each set value is greater than zero and smaller than the corresponding fixed value, and the target window is rectangular, so the position and size of the target window can be determined from the coordinates of these two vertices. In addition, when a parameter set by the user does not meet the preset display requirements, a prompt may be displayed.
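To make the coordinate scheme concrete, the following is a minimal Kotlin sketch (illustrative only; the names, integer set values, and pixel conversion are assumptions rather than the patent's implementation) that turns the two configured vertices into an on-screen rectangle and rejects out-of-range values so the caller can show a prompt:

```kotlin
import kotlin.math.roundToInt

data class RectPx(val left: Int, val top: Int, val right: Int, val bottom: Int)
data class NormalizedVertex(val row: Int, val col: Int)   // the "set values" from the settings option

/**
 * Converts the two configured vertices, expressed as set value / total rows and
 * set value / total columns, into a pixel rectangle inside the image display
 * area. Returns null so the caller can prompt the user when a set value does
 * not meet the preset display requirements.
 */
fun targetWindowRect(
    topLeft: NormalizedVertex,
    bottomRight: NormalizedVertex,
    totalRows: Int,            // fixed value in the camera
    totalCols: Int,            // fixed value in the camera
    displayWidthPx: Int,
    displayHeightPx: Int
): RectPx? {
    val inRange = listOf(topLeft, bottomRight).all {
        it.row in 1 until totalRows && it.col in 1 until totalCols
    }
    if (!inRange || topLeft.row >= bottomRight.row || topLeft.col >= bottomRight.col) {
        return null   // invalid parameters: prompt the user
    }
    fun toX(v: NormalizedVertex) = (v.col.toFloat() / totalCols * displayWidthPx).roundToInt()
    fun toY(v: NormalizedVertex) = (v.row.toFloat() / totalRows * displayHeightPx).roundToInt()
    return RectPx(toX(topLeft), toY(topLeft), toX(bottomRight), toY(bottomRight))
}
```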
For example, as shown in fig. 2, a target window 11 is displayed in the upper left corner of the image display area 10 of the shooting preview interface, wherein the size of the target window 11 is one quarter of the area of the image display area 10 of the shooting preview interface. The shooting preview interface is further provided with a shooting control 12 and a thumbnail control 13.
The display parameters of the target window may also be determined according to input parameters input by the user, for example, when the user input is a sliding input, the display parameters of the target window may be determined according to parameters of a sliding distance, a sliding time length, a sliding track, and the like of the sliding input.
It should be noted that, in embodiments of the present invention, the target window may be displayed immediately once the terminal screen shows the shooting preview interface; it may be displayed when an input instructing display of the target window is received; or it may be displayed when a captured image is detected.
For ease of understanding, the following description is made with reference to examples:
example one: when a user starts a camera of the terminal to enter a shooting preview interface, a target window can be immediately displayed in an image display area of the shooting preview interface, and at the moment, the target window can not display any content or can display a preview image. After the user captures an image, the currently captured image is displayed in the target window. Therefore, under the condition that the effect of the currently shot image is poor, the user can adjust the shooting angle, the shooting parameters and the like based on the image to continue shooting so as to obtain the image with better shooting effect, and under the condition that the image effect is good, shooting can be finished or other objects can be shot so as to avoid the image with similar shooting angle content.
Example two: when the user starts the terminal camera and enters the shooting preview interface, the target window may not be displayed immediately. After the user captures an image, the target window is displayed and the currently captured image is shown in it for the user to view.
With the display control method of the embodiments of the present invention, a first input from a user is received; an image is captured in response to the first input; and the captured image is displayed in a target window, where the target window is displayed in the image display area of the shooting preview interface and its area is smaller than that of the image display area. The captured image can therefore be viewed in the target window during shooting, making image viewing more convenient: the user can decide, based on the captured image, whether to keep shooting or to adjust the shooting angle, shooting parameters, and so on before shooting again, thereby improving the shooting result.
Referring to fig. 3, fig. 3 is a flowchart of a display control method according to another embodiment of the present invention. The main difference from the previous embodiment is that the display parameters of the target window can additionally be adjusted based on user input. In this embodiment, after the captured image is displayed in the target window, the method further includes: receiving a second input from the user; and adjusting a display parameter of the target window in response to the second input, where the display parameter includes at least one of a window size, a display position, a display duration, and a transparency.
As shown in fig. 3, the display control method provided in the embodiment of the present invention includes the following steps:
step 301, receiving a first input of a user.
This step may be the same as step 101, and is not described herein again to avoid repetition.
Step 302, in response to the first input, capturing an image.
This step may be the same as step 102, and is not described herein again to avoid repetition.
And 303, displaying the shot image in a target window.
This step may be the same as step 103, and is not described herein again to avoid repetition.
Step 304, receiving a second input of the user;
in the embodiment of the present invention, the second input may be a voice input, a touch input on a terminal screen, or a floating touch input. The touch input on the terminal screen may be a touch input for a shooting button, or a touch input for a target window.
Step 305, responding to the second input, and adjusting the display parameters of the target window; wherein the display parameter includes at least one of a window size, a display position, a display duration, and a transparency.
In embodiments of the present invention, after the second input from the user is received, the input parameters of the second input may be obtained, and the display parameters of the target window may be adjusted according to those input parameters. For example, when the second input is a sliding input on the terminal screen, the top-left vertex of the target window may be moved to the starting position of the slide, and the size of the target window may be adjusted according to the sliding distance (for example, the longer the slide, the larger the target window); alternatively, the transparency or display duration of the target window may be adjusted according to the sliding distance (for example, the longer the slide, the higher the transparency or the longer the display duration).
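As a rough Kotlin sketch of this step (the scaling factors and field names are assumptions, not values from the patent), a sliding second input could be mapped to new display parameters like this:

```kotlin
data class SlideInput(
    val startXPx: Int, val startYPx: Int,   // where the slide began
    val distancePx: Float                   // total sliding distance
)

data class DisplayParams(
    val leftPx: Int, val topPx: Int,        // top-left vertex of the target window
    val sizePx: Int,                        // window edge length (square window assumed)
    val transparency: Float,                // 0.0 = opaque, 1.0 = fully transparent
    val displayMillis: Long
)

// Adjusts the target window's display parameters from a sliding second input:
// the top-left corner jumps to the slide's start point, and the slide distance
// scales the size, transparency, and display duration.
fun adjustForSlide(slide: SlideInput, current: DisplayParams): DisplayParams = current.copy(
    leftPx = slide.startXPx,
    topPx = slide.startYPx,
    sizePx = (current.sizePx + slide.distancePx * 0.5f).toInt(),         // longer slide, larger window
    transparency = (slide.distancePx / 2000f).coerceIn(0f, 1f),          // longer slide, more transparent
    displayMillis = 2_000L + (slide.distancePx * 5).toLong()             // longer slide, shown longer
)
```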
Optionally, after receiving the second input from the user, the terminal may also obtain its own movement parameters (for example, the distance and direction the terminal moves in space) or the area where the user is holding the terminal, and adjust the size or position of the target window accordingly.
The display control method provided by the embodiment of the invention can not only realize that the shot image is viewed based on the target window in the process of shooting the image, but also flexibly adjust the display parameters of the target window, so that the displayed target window is more suitable for the use requirements of users.
Optionally, the second input is a first drag input for the shooting control;
the step 305, namely, the adjusting the display parameter of the target window in response to the second input, includes:
adjusting the size of the target window according to the size adjustment speed corresponding to the input parameter of the first dragging input, wherein the input parameter of the first dragging input comprises at least one of dragging direction, dragging distance, dragging duration and dragging speed;
or,
in response to the first drag input, resizing the target window to a preset window size.
In an embodiment, the size of the target window may be adjusted according to at least one of a dragging direction, a dragging distance, a dragging duration, a dragging speed, and the like in a case where the user drags the shooting control.
For example, a correspondence between the input parameters of the drag input and the resizing speed may be established in advance (for example, drag direction R1 corresponds to resizing speed v1, drag direction R2 corresponds to resizing speed v2, and so on), so that the corresponding resizing speed can be looked up quickly from the input parameters of the drag input and the target window resized accordingly. A mapping function between the input parameters of the drag input and the resizing speed may also be established in advance, for example v = k × T × D, where v denotes the resizing speed, k is an adjustable constant, T denotes the drag duration, and D denotes the drag distance.
The resizing speed may indicate the window area adjusted per unit time (for example, per second), such as the window area added or removed per second. The window area may be a relative area; for example, if the area of the target window is S1, the resizing speed may be 0.05 S1 per second.
In practical applications, when the resizing speed is determined from at least one of the drag distance, drag duration, drag speed, and the like, whether to enlarge or shrink the target window may additionally be determined from the drag direction. For example, a first drag direction corresponds to enlarging the target window, and a second drag direction corresponds to shrinking it.
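The mapping-function variant could be sketched in Kotlin as follows (the constant k and the enlarge/shrink convention per direction are assumptions):

```kotlin
/** Window area (relative units) to add or remove per second: v = k * T * D. */
fun resizeSpeed(k: Float, dragSeconds: Float, dragDistancePx: Float): Float =
    k * dragSeconds * dragDistancePx

enum class DragDirection { FIRST, SECOND }   // e.g. first = enlarge, second = shrink

/** Applies one second's worth of resizing, signed by the drag direction. */
fun nextWindowArea(currentArea: Float, speed: Float, direction: DragDirection): Float =
    when (direction) {
        DragDirection.FIRST -> currentArea + speed                        // enlarge
        DragDirection.SECOND -> (currentArea - speed).coerceAtLeast(0f)   // shrink
    }
```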
Optionally, when the first drag input is ended, or the shooting control returns to a position before the drag, or the size of the target window reaches a certain preset size (e.g., 80% of the shooting preview interface), the adjustment of the size of the target window may be stopped.
Optionally, in the process of adjusting the size of the target window, a position of a vertex of the target window may be kept unchanged, a position of a geometric center of the target window may also be kept unchanged, or a position of the image display window may also not be fixed, which is not limited in the embodiment of the present invention.
In the embodiment, the size adjustment speed is determined based on the input parameter of the first drag input, and the size of the target window is adjusted based on the determined size adjustment speed, so that the size of the target window can be adjusted gradually.
In another embodiment, the size of the target window may be directly adjusted to a preset window size in response to the first drag input, so as to improve the efficiency of size adjustment of the target window. The size of the preset window may be set according to actual requirements, for example, the size of the preset window is the size of a shooting preview interface, so that a user can conveniently preview a shot image in a full screen mode, or the size of the preset window is one eighth of the shooting preview interface, so as to reduce the influence of image display on a preview image.
For ease of understanding, the following description is made with reference to examples:
example one: in the process that the user presses the shooting control to shoot the image, the finger pressing the shooting control can not be loosened, and the shooting control is continuously dragged. At this time, the terminal displays the shot image in the target window, determines the size adjustment speed according to the dragging time of the shooting control by the user, and determines whether to adjust the display window to be larger or smaller according to the size adjustment speed based on the dragging direction, for example, the display window is adjusted to be larger according to the size adjustment speed when dragging to the left, and the display window is adjusted to be larger according to the size adjustment speed when dragging to the right.
Example two: while pressing the shooting control to capture an image, the user can keep the finger on the shooting control without releasing it and continue dragging the control. The terminal then displays the captured image in the target window and can directly resize the target window to the preset window size, for example, to the same size as the shooting preview interface, so that the user can conveniently preview the captured image in full screen.
Optionally, after the size of the target window is adjusted to the preset window size in response to the first drag input, the method further includes:
under the condition that a preset condition is met, restoring the size of the target window to the size of the window before adjustment;
wherein the preset condition comprises at least one of the following:
the first drag input ends;
the shooting control is restored to the display position before dragging;
receiving a third input, wherein the third input is received after the first drag input is finished;
and the adjusted display time length of the target window reaches a preset time length.
In an embodiment of the present invention, the third input may be any input received after the first drag input is ended, for example, a voice input, a touch input on a terminal screen, an input for a terminal key, a floating touch input, and the like.
In practical applications, after the dragging of the shooting control is stopped, the target window can be restored to the state before adjustment when an input of clicking a screen, sliding the screen, pressing a volume key, a power key, or the like is received.
The preset time period can be set according to actual requirements, for example, 2 seconds, 3 seconds, and the like.
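A rough Kotlin sketch of the restore check (the state fields are assumptions, and a real implementation would evaluate only the configured subset of conditions):

```kotlin
data class RestoreState(
    val firstDragEnded: Boolean,            // the first drag input has ended
    val controlBackAtOrigin: Boolean,       // shooting control is back at its pre-drag position
    val thirdInputReceived: Boolean,        // some input arrived after the drag ended
    val displayMillisSinceResize: Long      // how long the resized window has been shown
)

const val PRESET_DISPLAY_MILLIS = 2_000L    // e.g. 2 seconds; set according to actual requirements

// Returns true when at least one of the preset conditions holds, meaning the
// target window should be restored to its pre-adjustment size.
fun shouldRestore(s: RestoreState): Boolean =
    s.firstDragEnded ||
        s.controlBackAtOrigin ||
        (s.firstDragEnded && s.thirdInputReceived) ||
        s.displayMillisSinceResize >= PRESET_DISPLAY_MILLIS
```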
Restoring the target window to its pre-adjustment state in this way makes control of the target window more flexible. In particular, when the preset window size equals the size of the shooting preview interface, restoring the target window to its previous size lets the user conveniently go back to previewing the scene and continue shooting.
Optionally, the adjusting the display parameter of the target window in response to the second input includes:
responding to the second input, and acquiring the movement parameters of the terminal, wherein the movement parameters comprise a movement distance and a movement direction;
and adjusting the display position of the target window according to the movement parameters.
In an embodiment of the present invention, the second input may be a touch input on the shooting interface, or a touch input (e.g., a drag input) for the shooting control.
In practical applications, while pressing the shooting control to capture an image, the user can keep the finger on the shooting control, drag the control by a preset distance, and then move the terminal (for example, a mobile phone). The terminal's movement parameters, such as the movement distance and direction, can then be obtained from a gyroscope or similar sensor, and the display position of the target window adjusted based on them. For example, if the user moves the terminal a first distance to the left, the target window may be moved the same first distance to the right relative to the left boundary of the shooting preview interface, so that the window stays in the same place in the user's field of view.
Adjusting the display position of the target window based on the terminal's movement parameters makes position adjustment flexible and convenient.
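A minimal Kotlin sketch of this "follow the terminal" behaviour (the sensor integration and the sign convention are assumptions, not the patent's implementation):

```kotlin
data class WindowPosition(val xPx: Int, val yPx: Int)

/**
 * Offsets the target window opposite to the terminal's movement so that the
 * window appears stationary to the user. dxPx/dyPx are the terminal's
 * displacement converted to screen pixels (for example, integrated from
 * gyroscope readings); the result is clamped to the preview area.
 */
fun followTerminalMovement(
    current: WindowPosition,
    dxPx: Int, dyPx: Int,
    previewWidthPx: Int, previewHeightPx: Int,
    windowWidthPx: Int, windowHeightPx: Int
): WindowPosition {
    val newX = (current.xPx - dxPx).coerceIn(0, previewWidthPx - windowWidthPx)
    val newY = (current.yPx - dyPx).coerceIn(0, previewHeightPx - windowHeightPx)
    return WindowPosition(newX, newY)
}
```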
Optionally, before the captured image is displayed in the target window, the method further includes:
receiving a fourth input;
determining display parameters of a target window according to input parameters of the fourth input;
displaying a target window in the image display area according to the display parameters;
wherein the display parameter includes at least one of a window size, a display position, a display duration, and a transparency.
In an embodiment of the present invention, the fourth input may be a touch input on a terminal screen, an input for a terminal key, a floating touch input, or the like.
For example, when the shooting preview interface is displayed on the terminal screen and a closed track traced by the user on the interface is detected, the size and display position of the target window can be determined from that closed track. Alternatively, if the user keeps the finger on the shooting control after pressing it to capture an image and continues dragging the control, the size, display duration, and other parameters of the target window can be determined from the drag duration, drag distance, and so on (for example, the longer the drag distance, the larger the target window; the longer the drag time, the longer the display duration); and so on.
In this embodiment, the display parameters of the target window are determined from the user's input parameters, and the target window is then displayed in the image display area of the shooting preview interface according to those parameters. This makes the display of the target window more flexible, better matches the user's viewing needs, and reduces the amount of adjustment required later.
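As one possible reading of the closed-track case (the patent does not specify the mapping; the bounding-box rule below is an assumption), a Kotlin sketch:

```kotlin
data class PointPx(val x: Int, val y: Int)
data class RectPx(val left: Int, val top: Int, val right: Int, val bottom: Int)

// Derives the target window rectangle from a closed gesture track by taking
// the bounding box of the sampled touch points (assumed mapping).
fun windowFromClosedTrack(track: List<PointPx>): RectPx? {
    if (track.size < 3) return null  // not enough points to form a closed shape
    return RectPx(
        left = track.minOf { it.x },
        top = track.minOf { it.y },
        right = track.maxOf { it.x },
        bottom = track.maxOf { it.y }
    )
}
```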
Optionally, before the captured image is displayed in the target window, the method further includes:
determining a contact position of a user contacting a terminal screen;
determining an area within a preset range of the contact position as a first target display area;
the displaying the photographed image in a target window includes:
and displaying the target window in the first target display area, and displaying the shot image in the target window.
In practice, when a user holds the terminal, the user's fingers usually reach only a limited region of the terminal screen. The display area of the target window can therefore be determined from the position where a finger touches the screen while holding the terminal, making the target window easier for the user to operate.
In embodiments of the present invention, the preset range can be set according to actual requirements. For example, the circular region centred on the contact position with a preset radius may be determined as the first target display area.
In this embodiment, the target window can be displayed in the area within the preset range of the contact position, making it convenient for the user to operate the target window.
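A minimal Kotlin sketch of placing the window near the contact position (an assumed reading of the first target display area; it simply centres the window on the touch point and clamps it to the screen, leaving the preset-range check implicit):

```kotlin
data class PointPx(val x: Int, val y: Int)
data class RectPx(val left: Int, val top: Int, val right: Int, val bottom: Int)

/**
 * Centres the target window on the finger's contact position and clamps it to
 * the screen, which keeps the window within easy reach of the holding finger.
 */
fun placeNearContact(
    contact: PointPx,
    windowWidthPx: Int, windowHeightPx: Int,
    screenWidthPx: Int, screenHeightPx: Int
): RectPx {
    val left = (contact.x - windowWidthPx / 2).coerceIn(0, screenWidthPx - windowWidthPx)
    val top = (contact.y - windowHeightPx / 2).coerceIn(0, screenHeightPx - windowHeightPx)
    return RectPx(left, top, left + windowWidthPx, top + windowHeightPx)
}
```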
Optionally, before the captured image is displayed in the target window, the method further includes:
acquiring a main area of a preview image in the image display area;
determining a second target display area of the target window in the image display area according to the main body area;
displaying the target window in the second target display area;
and when the target window is displayed in the second target display area, the shielding area of the main body area is minimum.
In embodiments of the present invention, the photographed subject can be identified by analysing the preview image in the image display area of the shooting preview interface, and the subject area can be determined from the region where the subject is located. For example, a person or an animal may be the photographed subject; when a face is recognised in the preview image, the region containing the face can be obtained and determined as the subject area. Once the subject area is determined, the target window can be displayed at a position that does not occlude the subject area, or occludes as little of it as possible, so that the target window does not block the photographed subject in the preview image and degrade the shooting result.
In this embodiment, the target window is displayed in the second target display area, where it occludes the smallest possible portion of the subject area, reducing the extent to which the target window blocks the photographed subject in the preview image and affects the shooting result.
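A rough Kotlin sketch of choosing the second target display area (the candidate corner positions and the face-box input are assumptions; the patent only requires that the chosen area occludes the subject area least):

```kotlin
data class RectPx(val left: Int, val top: Int, val right: Int, val bottom: Int)

// Area of the intersection of two rectangles, 0 if they do not overlap.
fun overlapArea(a: RectPx, b: RectPx): Int {
    val w = minOf(a.right, b.right) - maxOf(a.left, b.left)
    val h = minOf(a.bottom, b.bottom) - maxOf(a.top, b.top)
    return if (w > 0 && h > 0) w * h else 0
}

/**
 * Picks the candidate placement (here, the four corners of the image display
 * area) whose overlap with the detected subject area is smallest.
 */
fun secondTargetDisplayArea(
    displayArea: RectPx,
    subjectArea: RectPx,
    windowW: Int,
    windowH: Int
): RectPx {
    val candidates = listOf(
        RectPx(displayArea.left, displayArea.top, displayArea.left + windowW, displayArea.top + windowH),        // top-left
        RectPx(displayArea.right - windowW, displayArea.top, displayArea.right, displayArea.top + windowH),      // top-right
        RectPx(displayArea.left, displayArea.bottom - windowH, displayArea.left + windowW, displayArea.bottom),  // bottom-left
        RectPx(displayArea.right - windowW, displayArea.bottom - windowH, displayArea.right, displayArea.bottom) // bottom-right
    )
    return candidates.minByOrNull { overlapArea(it, subjectArea) }!!
}
```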
Optionally, the displaying the captured image in the target window includes:
displaying M images in M target windows;
wherein the M target windows correspond one-to-one to the M images; M is an integer determined from the number N of images captured in a target time period, and M is less than or equal to N;
the target time period runs from t - T to the current time t, where T is a preset duration;
and the M images are the first M of the N images captured in the target time period when those images are sorted from latest to earliest by capture time.
In practical applications, in the case where a user captures a plurality of images within a target time period, a plurality of target windows may be displayed to display a plurality of images captured recently to increase the number of displayable images.
Alternatively, M may equal N when N is less than or equal to K, and M = K when N is greater than K, where K is a preset value (for example, K is 3 or 4). By capping the number of displayable images, the preview is not seriously affected by displaying too many images.
According to the embodiment of the invention, the plurality of recently shot images are displayed in the plurality of target windows, so that a user can conveniently check the plurality of shot images, and more reference information for shooting the images can be provided for the user.
Optionally, the size of the target windows may be adjusted according to the change of the number of the target windows, for example, the larger the number of the target windows, the smaller each target window is.
It should be noted that, in the case that the number of the target windows changes, the embodiment of the present invention may also keep the size of the target window for displaying the latest captured image unchanged, and adjust the sizes of the other target windows, so that the user may more accurately recognize the content of the latest captured image.
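A small Kotlin sketch of the selection logic described above (M = min(N, K) images from the last T units of time; the java.time types and names are illustrative assumptions):

```kotlin
import java.time.Duration
import java.time.Instant

data class CapturedImage(val id: Long, val capturedAt: Instant)

/**
 * Selects the images to show in the target windows: from the N images captured
 * in the last T units of time, take at most K, sorted from latest to earliest
 * (so M = min(N, K)).
 */
fun imagesForTargetWindows(
    all: List<CapturedImage>,
    now: Instant,
    lookback: Duration,          // the preset duration T
    maxWindows: Int              // the preset value K
): List<CapturedImage> {
    val windowStart = now.minus(lookback)
    return all
        .filter { !it.capturedAt.isBefore(windowStart) && !it.capturedAt.isAfter(now) }  // N images in [t - T, t]
        .sortedByDescending { it.capturedAt }                                            // latest first
        .take(maxWindows)                                                                // M = min(N, K)
}
```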
Optionally, the displaying the captured image in the target window includes:
displaying a shot first image on a first target window, and displaying a shot second image on a second target window, wherein the shooting time of the second image is later than that of the first image, and the second target window is positioned on a first side of the first target window;
after the displaying the photographed image in the target window, the method further includes:
receiving a first sliding input of a user for a shooting control;
adjusting a position of the second target window to be on a second side of the first target window in response to the first sliding input;
wherein the first side and the second side are different.
In the embodiment of the present invention, the first side may be any side, for example, a left side, a right side, an upper side, a lower side, and the like. The second side may be any side different from the first side.
For example, the first image may be displayed in the first object window after the first image is captured, the second object window may be displayed on the right side of the first object window after the second image is captured, and the second image may be displayed in the second object window. If the user slides the shooting control, the position of the second target window can be adjusted to the left side of the first target window, and if the user continues to slide the shooting control, the position of the second target window can be adjusted to the upper side of the first target window again, and so on.
According to the embodiment of the invention, the position of the second target window can be conveniently adjusted by sliding the shooting control.
Optionally, the first input is a slide input;
the displaying the photographed image in a target window includes:
acquiring input parameters of the sliding input;
acquiring a target side associated with the input parameter;
displaying a fourth target window on the target side of the third target window, and displaying the photographed image in the fourth target window;
wherein the third target window is a window that has been displayed in the image display area before the fourth target window is displayed.
In an embodiment of the present invention, the input parameters of the sliding input may include a sliding direction, a sliding distance, a sliding duration, a sliding frequency, and the like. The above target side may be one of left, right, upper, lower, and the like.
Optionally, an association between input parameters and display sides may be established in advance: for example, sliding left is associated with the left side, sliding right with the right side, sliding up with the upper side, and sliding down with the lower side; or a sliding distance in a first distance interval is associated with the left side, a distance in a second interval with the right side, a distance in a third interval with the left side, a distance in a fourth interval with the right side, and so on. The associated target side can then be determined from the input parameters, which in turn determines the display position of the window used to show the captured image.
The third target window may be any window that is displayed before the fourth target window, may be a window that is displayed most recently, or may be a window that is displayed earliest, which is not limited in this embodiment of the present invention.
In this embodiment, the image is captured in response to the sliding input while the position of the window used to display the captured image is determined from the input parameters of that sliding input, which makes the operation more convenient and flexible.
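One possible Kotlin sketch of the direction-to-side association (only one of the associations the text permits; the dominant-axis rule is an assumption):

```kotlin
enum class Side { LEFT, RIGHT, TOP, BOTTOM }

/**
 * Maps the dominant direction of a slide gesture to the side on which the new
 * (fourth) target window is shown relative to the existing (third) window.
 */
fun targetSideForSlide(dx: Float, dy: Float): Side =
    if (kotlin.math.abs(dx) >= kotlin.math.abs(dy)) {
        if (dx < 0) Side.LEFT else Side.RIGHT
    } else {
        if (dy < 0) Side.TOP else Side.BOTTOM
    }
```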
The following describes embodiments of the present invention with reference to examples:
the display control method provided by the embodiment of the invention comprises the following steps:
Step a1, displaying a target window in the image display area of the shooting preview interface.
In this step, the target window may be displayed with default display parameters (for example, size, position, and transparency), or with the display parameters used the last time the target window was displayed.
For example, when the user opens the camera and the shooting preview interface is displayed, a target window is shown in the image display area of the shooting preview interface; by default it sits in the top-left corner of the image display area and occupies one quarter of the area of that image display area.
Step a2, adjusting the size of the target window.
In this step, the size of the target window can be adjusted during shooting through different operations, so that the area used for the image preview and the area used for viewing the captured image are both appropriately sized.
For example, during shooting, pressing the shooting control (e.g., the shooting button) without releasing it and dragging upward may gradually enlarge the target window used to display the image, while dragging the shooting control downward may gradually shrink it; the speed at which the window grows or shrinks may be determined from the drag distance D and the drag time T.
Note that enlargement may stop when the target window reaches a first preset multiple (e.g., 0.8 or 0.65) of the shooting preview interface, so that image viewing does not unduly interfere with the image preview, and shrinking may stop when the target window reaches a second preset multiple (e.g., 0.2 or 0.15) of the shooting preview interface, so that the displayed image does not become too small to recognise.
Alternatively, when the user has not started to capture an image, the target window may have only a border and display the corresponding content of the camera preview. When the user photographs an image, the target window displays the photographed image.
Step a3, adjusting the display position of the target window.
The step can adjust the display position of the target window in the shooting process through different operation modes, so that the target window is located at a proper position, and a user can conveniently view the image.
For example, when the shooting control (e.g., the shooting button) is pressed during shooting and, without being released, dragged to the left by a certain amplitude (the amplitude can be set as required, for example, 1/20 of the diameter of the shooting button), the position of the target window is no longer fixed relative to the edge of the shooting preview interface. While the user keeps pressing the shooting control and moves the phone up, down, left, or right, the position of the target window moves correspondingly with the phone until it reaches the edge of the preview image, or until the target window reaches the position the user wants; when the user releases the shooting control, the control returns to its original position and the position of the target window is fixed. In this way the user can adjust the position of the target window so that it does not cover the main area of the camera's shooting preview interface and the subject in the preview remains visible.
Step a4, displaying the captured image in the target window.
For example, after the user captures an image, a thumbnail of it is shown in the thumbnail control and the captured image itself is shown in the target window, so the user can view the image without any further operation. The target window can also act as a gallery entry while presenting the image: tapping the target window opens the original captured image directly. If the user performs no operation after taking the picture, the target window displays the image for a preset time (for example, 5 seconds), or keeps displaying it until the user takes the next image, at which point the newly captured image is shown.
Optionally, there may be one or more target windows. In practice, the user may set in advance whether only the currently captured image is displayed or previously captured images are shown as well. For example, after the first image is captured, only one target window (the first target window) may be displayed on the shooting preview interface; after the second image is captured, if the user has chosen to also keep displaying previously captured images, the second image may be displayed in a second target window, which by default sits to the right of the first target window with their adjacent boundaries touching and their tops and bottoms aligned.
Optionally, when the user slides the shooting control to the right, the second target window may be shifted to the left of the first target window; when the user slides the shooting control to the right again, the second target window may be shifted to the upper or lower side of the first target window. After such operations the second target window may extend beyond the shooting preview interface, in which case all target windows in the interface may be resized adaptively so that the total area they occupy does not exceed a third preset multiple (e.g., 80%) of the whole shooting preview interface, allowing the user to view several recently captured images.
Optionally, the operation of dragging the shooting button may also trigger full-screen display of the image. For example, dragging the capture control once, the image in the target window may be displayed full screen, e.g., covering the entire capture preview interface. When the user performs any other operation, for example, clicking a screen, sliding the screen, double clicking the screen, pressing a volume key, pressing a power key, and the like, the target window returns to the previous state; or display automatically returns to the previous state after a certain length of time (e.g., 2 seconds).
Optionally, dragging the shooting button may also control the display duration. For example, while capturing the image the user does not lift the finger immediately but drags the shooting button in a certain direction, which may be any direction or a preset one. At the moment the image is captured, it is displayed in the target window; from the moment the shooting button is dragged, the image in the target window is enlarged to the whole camera screen or terminal screen, and it stays full screen until the user drags the shooting button back to its original position, or the full-screen display ends and the window returns to its original state when the user lifts the finger.
With this embodiment, the user can preview and shoot with the camera while viewing the image just captured, and can adjust the target window by dragging the shooting control, which is convenient to operate.
Referring to fig. 4, fig. 4 is a structural diagram of a terminal according to an embodiment of the present invention. As shown in fig. 4, the terminal 400 includes:
a first receiving module 401, configured to receive a first input of a user;
a photographing module 402 for photographing an image in response to the first input;
a first display module 403, configured to display the captured image in a target window;
the target window is displayed in an image display area of a shooting preview interface, and the area of the target window is smaller than that of the image display area.
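Purely for orientation, the module split might be expressed as Kotlin interfaces (an illustrative sketch; the patent defines functional modules, and the types below are assumptions):

```kotlin
// Illustrative only: the functional modules of terminal 400 as Kotlin interfaces.
class FirstInput
class Photo

interface FirstReceivingModule { fun receiveFirstInput(): FirstInput }
interface PhotographingModule { fun capture(input: FirstInput): Photo }
interface FirstDisplayModule {
    // Displays the captured image in a target window shown inside the image
    // display area of the shooting preview interface, with a smaller area.
    fun showInTargetWindow(photo: Photo)
}
```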
Optionally, the terminal further includes:
a second receiving module, configured to receive a second input of the user after the photographed image is displayed in the target window;
a first adjusting module, configured to adjust a display parameter of the target window in response to the second input;
wherein the display parameter includes at least one of a window size, a display position, a display duration, and a transparency.
Optionally, the second input is a first drag input for the shooting control;
the first adjusting module is specifically configured to:
adjusting the size of the target window according to the size adjustment speed corresponding to the input parameter of the first dragging input, wherein the input parameter of the first dragging input comprises at least one of dragging direction, dragging distance, dragging duration and dragging speed;
or,
in response to the first drag input, resizing the target window to a preset window size.
Optionally, the terminal further includes:
a restoring module, configured to restore the size of the target window to a window size before adjustment when a preset condition is met after the size of the target window is adjusted to a preset window size in response to the first drag input;
wherein the preset condition comprises at least one of the following:
the first drag input ends;
the shooting control is restored to the display position before dragging;
receiving a third input, wherein the third input is received after the first drag input is finished;
and the adjusted display time length of the target window reaches a preset time length.
Optionally, the first adjusting module is specifically configured to:
responding to the second input, and acquiring the movement parameters of the terminal, wherein the movement parameters comprise a movement distance and a movement direction;
and adjusting the display position of the target window according to the movement parameters.
Optionally, the terminal further includes:
a first determining module, configured to determine a contact position where a user contacts a terminal screen before the captured image is displayed in the target window;
the second determining module is used for determining an area within a preset range of the contact position as a first target display area;
the first display module is specifically configured to:
and displaying the target window in the first target display area, and displaying the shot image in the target window.
Optionally, the terminal further includes:
an acquisition module, configured to acquire a main area of a preview image in the image display area before the captured image is displayed in a target window;
a third determining module, configured to determine, according to the main body region, a second target display region of the target window in the image display region;
the second display module is used for displaying the target window in the second target display area;
and when the target window is displayed in the second target display area, the shielding area of the main body area is minimum.
Optionally, the first display module is specifically configured to:
displaying M images in M target windows;
wherein the M target windows correspond one-to-one to the M images; M is an integer determined from the number N of images captured in a target time period, and M is less than or equal to N;
the target time period runs from t - T to the current time t, where T is a preset duration;
and the M images are the first M of the N images captured in the target time period when those images are sorted from latest to earliest by capture time.
Optionally, the first display module is specifically configured to:
displaying a shot first image on a first target window, and displaying a shot second image on a second target window, wherein the shooting time of the second image is later than that of the first image, and the second target window is positioned on a first side of the first target window;
the terminal further comprises:
the third receiving module is used for receiving a first sliding input of a user for a shooting control after the shot image is displayed in the target window;
a second adjustment module, configured to adjust a position of the second target window in response to the first sliding input, so that the second target window is located on a second side of the first target window;
wherein the first side and the second side are different.
Optionally, the first input is a slide input;
the first display module is specifically configured to:
acquiring input parameters of the sliding input;
acquiring a target side associated with the input parameter;
displaying a fourth target window on the target side of the third target window, and displaying the photographed image in the fourth target window;
wherein the third target window is a window that has been displayed in the image display area before the fourth target window is displayed.
The terminal 400 provided in the embodiment of the present invention can implement each process implemented by the terminal in the method embodiments of fig. 1 and fig. 3, and is not described herein again to avoid repetition.
In the terminal 400 of the embodiment of the present invention, the first receiving module 401 receives a first input from the user; the photographing module 402 captures an image in response to the first input; and the first display module 403 displays the captured image in a target window, where the target window is displayed in the image display area of the shooting preview interface and its area is smaller than that of the image display area. The captured image can therefore be viewed in the target window during shooting, making image viewing more convenient: the user can decide, based on the captured image, whether to keep shooting or to adjust the shooting angle, shooting parameters, and so on before shooting again, thereby improving the shooting result.
Fig. 5 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention. Referring to fig. 5, the terminal 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and a power supply 511. Those skilled in the art will appreciate that the terminal configuration shown in fig. 5 is not intended to be limiting, and that the terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein, the processor 510 is configured to receive a first input from a user; capturing an image in response to the first input; displaying the photographed image on a target window; the target window is displayed in an image display area of a shooting preview interface, and the area of the target window is smaller than that of the image display area.
In this embodiment, the captured image can be viewed in the target window while shooting continues, making image viewing more convenient: the user can decide, based on the captured image, whether to keep shooting or to adjust the shooting angle, shooting parameters, and so on before shooting again, thereby improving the shooting result.
The processor 510 is further configured to:
receiving a second input of a user after the photographed image is displayed in the target window;
adjusting a display parameter of the target window in response to the second input;
wherein the display parameter includes at least one of a window size, a display position, a display duration, and a transparency.
Optionally, the second input is a first drag input for the shooting control;
the processor 510 is further configured to:
adjusting the size of the target window according to the size adjustment speed corresponding to the input parameter of the first dragging input, wherein the input parameter of the first dragging input comprises at least one of dragging direction, dragging distance, dragging duration and dragging speed;
or,
in response to the first drag input, resizing the target window to a preset window size.
Optionally, the processor 510 is further configured to:
after the target window is adjusted to the preset window size in response to the first dragging input, restoring the size of the target window to the size of the window before adjustment under the condition that a preset condition is met;
wherein the preset condition comprises at least one of the following:
the first drag input ends;
the shooting control is restored to the display position before dragging;
receiving a third input, wherein the third input is received after the first drag input is finished;
and the adjusted display time length of the target window reaches a preset time length.
Optionally, the processor 510 is further configured to:
responding to the second input, and acquiring the movement parameters of the terminal, wherein the movement parameters comprise a movement distance and a movement direction;
and adjusting the display position of the target window according to the movement parameters.
Optionally, the processor 510 is further configured to:
determining a contact position of a user contacting a terminal screen before the photographed image is displayed in a target window;
determining an area within a preset range of the contact position as a first target display area;
the displaying the photographed image in a target window includes:
and displaying the target window in the first target display area, and displaying the shot image in the target window.
Optionally, the processor 510 is further configured to:
acquire a subject area of the preview image in the image display area before the captured image is displayed in the target window;
determine a second target display area for the target window in the image display area according to the subject area;
and display the target window in the second target display area;
wherein, when the target window is displayed in the second target display area, the area of the subject area occluded by the target window is minimized.
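Minimising occlusion of the subject area can be sketched as picking, among a few candidate placements, the one whose rectangle intersection with the subject area is smallest; the corner candidates are an assumption of this sketch and any candidate set could be used.

```kotlin
// Pick the candidate placement whose rectangle intersection with the subject area is smallest.
data class Region(val x: Int, val y: Int, val w: Int, val h: Int)

fun overlapArea(a: Region, b: Region): Int {
    val ow = minOf(a.x + a.w, b.x + b.w) - maxOf(a.x, b.x)
    val oh = minOf(a.y + a.h, b.y + b.h) - maxOf(a.y, b.y)
    return if (ow > 0 && oh > 0) ow * oh else 0   // zero when the rectangles do not intersect
}

fun secondTargetDisplayArea(displayArea: Region, subjectArea: Region, winW: Int, winH: Int): Region {
    val candidates = listOf(
        Region(displayArea.x, displayArea.y, winW, winH),
        Region(displayArea.x + displayArea.w - winW, displayArea.y, winW, winH),
        Region(displayArea.x, displayArea.y + displayArea.h - winH, winW, winH),
        Region(displayArea.x + displayArea.w - winW, displayArea.y + displayArea.h - winH, winW, winH)
    )
    return candidates.minByOrNull { overlapArea(it, subjectArea) }!!  // list is non-empty, so !! is safe
}
```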
Optionally, the processor 510 is further configured to:
display M images in M target windows;
wherein the M target windows correspond to the M images one to one; M is an integer determined according to the number N of images captured within a target time period, and M is smaller than or equal to N;
the start time of the target time period is the current time T and the end time is T − t, where t is a preset duration;
and the M images are the first M of the N images captured within the target time period when those images are sorted from the latest capture time to the earliest.
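The selection of the M displayed images can be sketched as filtering the captures to the target time period and keeping the most recent ones; the Captured type and the maxWindows cap are assumptions standing in for however M is actually derived in a given implementation.

```kotlin
// Of the N images captured in the last presetDurationMs, show at most maxWindows of the newest ones.
data class Captured(val id: Long, val captureTimeMs: Long)

fun imagesForWindows(
    all: List<Captured>,
    nowMs: Long,
    presetDurationMs: Long,      // the preset duration t
    maxWindows: Int              // upper bound used to derive M
): List<Captured> {
    val inPeriod = all.filter { it.captureTimeMs in (nowMs - presetDurationMs)..nowMs }   // the N images
    val m = minOf(maxWindows, inPeriod.size)                                              // M <= N
    return inPeriod.sortedByDescending { it.captureTimeMs }.take(m)                       // newest first
}
```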
Optionally, the processor 510 is further configured to:
display a captured first image in a first target window and a captured second image in a second target window, wherein the second image is captured later than the first image, and the second target window is located on a first side of the first target window;
and, after the captured image has been displayed in the target window:
receive a first sliding input from the user on the shooting control;
adjust the position of the second target window to a second side of the first target window in response to the first sliding input;
wherein the first side and the second side are different.
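The side swap triggered by the sliding input can be sketched as flipping an enum, as below; the Side type and the default placement are assumptions of the sketch.

```kotlin
// Move the more recent window to the other side of the earlier one when the user slides on the control.
enum class Side { LEFT, RIGHT }

fun flip(side: Side): Side = if (side == Side.LEFT) Side.RIGHT else Side.LEFT

class SecondWindowPlacement(var sideOfFirstWindow: Side = Side.RIGHT) {
    fun onFirstSlidingInput() {
        sideOfFirstWindow = flip(sideOfFirstWindow)   // second window ends up on the other side
    }
}
```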
Optionally, the first input is a slide input;
the processor 510 is further configured to:
acquire input parameters of the sliding input;
acquire a target side associated with the input parameters;
display a fourth target window on the target side of a third target window, and display the captured image in the fourth target window;
wherein the third target window is a window that was already displayed in the image display area before the fourth target window is displayed.
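One plausible way to associate a target side with the sliding input's parameters is to compare the horizontal and vertical components of the slide, as sketched below; the four-way split and the tie handling are assumptions rather than the rule defined by this embodiment.

```kotlin
import kotlin.math.abs

// Derive the target side for the new (fourth) window from the sliding input's direction.
enum class TargetSide { LEFT, RIGHT, ABOVE, BELOW }

fun targetSideFor(dx: Float, dy: Float): TargetSide =
    if (abs(dx) >= abs(dy)) {
        if (dx >= 0) TargetSide.RIGHT else TargetSide.LEFT   // mostly horizontal slide
    } else {
        if (dy >= 0) TargetSide.BELOW else TargetSide.ABOVE  // mostly vertical slide
    }
```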
It should be understood that, in the embodiment of the present invention, the radio frequency unit 501 may be used to receive and send signals during message transmission and reception or during a call; specifically, it receives downlink data from a base station and forwards it to the processor 510 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 501 can also communicate with a network and other devices through a wireless communication system.
The terminal provides wireless broadband internet access to the user through the network module 502, such as helping the user send and receive e-mails, browse web pages, access streaming media, and the like.
The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502, or stored in the memory 509, into an audio signal and output it as sound. Moreover, the audio output unit 503 may also provide audio output related to a specific function performed by the terminal 500 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 504 is used to receive an audio or video signal. The input unit 504 may include a graphics processing unit (GPU) 5041 and a microphone 5042. The graphics processor 5041 processes image data of still images or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 506. The image frames processed by the graphics processor 5041 may be stored in the memory 509 (or another storage medium) or transmitted via the radio frequency unit 501 or the network module 502. The microphone 5042 may receive sound and process it into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 501.
The terminal 500 also includes at least one sensor 505, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor adjusts the brightness of the display panel 5061 according to the brightness of ambient light, and the proximity sensor turns off the display panel 5061 and/or the backlight when the terminal 500 is moved to the ear. As one kind of motion sensor, the accelerometer can detect the magnitude of acceleration in each direction (generally three axes) and, when stationary, the magnitude and direction of gravity; it can be used to identify the terminal posture (such as portrait/landscape switching, related games, and magnetometer posture calibration) and for vibration-related functions (such as a pedometer or tap detection). The sensor 505 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail here.
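As a hedged illustration of how a gravity reading supports portrait/landscape detection, the sketch below compares the gravity components along the three axes; the axis convention and the decision rule are assumptions, not the sensor framework's actual API.

```kotlin
import kotlin.math.abs

// When the device is roughly stationary, gravity dominates one axis; use that to guess the posture.
enum class Orientation { PORTRAIT, LANDSCAPE, FLAT }

fun orientationFromGravity(gx: Float, gy: Float, gz: Float): Orientation = when {
    abs(gz) > abs(gx) && abs(gz) > abs(gy) -> Orientation.FLAT       // lying on a table
    abs(gy) >= abs(gx)                     -> Orientation.PORTRAIT   // gravity along the long axis
    else                                   -> Orientation.LANDSCAPE
}
```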
The display unit 506 is used to display information input by the user or information provided to the user. The Display unit 506 may include a Display panel 5061, and the Display panel 5061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 507 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 507 includes a touch panel 5071 and other input devices 5072. The touch panel 5071, also referred to as a touch screen, may collect touch operations performed by a user on or near it (e.g., operations performed on or near the touch panel 5071 using a finger, a stylus, or any suitable object or attachment). The touch panel 5071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 510, and receives and executes commands sent by the processor 510. In addition, the touch panel 5071 may be implemented as a resistive, capacitive, infrared, or surface acoustic wave panel. Besides the touch panel 5071, the user input unit 507 may include other input devices 5072. In particular, the other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, a switch key, etc.), a trackball, a mouse, and a joystick, which are not described in detail here.
Further, the touch panel 5071 may be overlaid on the display panel 5061. When the touch panel 5071 detects a touch operation on or near it, the touch operation is transmitted to the processor 510 to determine the type of the touch event, and the processor 510 then provides a corresponding visual output on the display panel 5061 according to the type of the touch event. Although in Fig. 5 the touch panel 5071 and the display panel 5061 are shown as two independent components that implement the input and output functions of the terminal, in some embodiments the touch panel 5071 and the display panel 5061 may be integrated to implement the input and output functions of the terminal, which is not limited here.
The interface unit 508 is an interface for connecting an external device to the terminal 500. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 508 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the terminal 500 or may be used to transmit data between the terminal 500 and external devices.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like, and the data storage area may store data created according to the use of the terminal (such as audio data, a phonebook, etc.). Further, the memory 509 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 510 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 509 and calling data stored in the memory 509, thereby performing overall monitoring of the terminal. Processor 510 may include one or more processing units; preferably, the processor 510 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The terminal 500 may further include a power supply 511 (e.g., a battery) for supplying power to various components, and preferably, the power supply 511 may be logically connected to the processor 510 through a power management system, so that functions of managing charging, discharging, and power consumption are performed through the power management system.
In addition, the terminal 500 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a terminal, including a processor 510, a memory 509, and a computer program stored in the memory 509 and executable on the processor 510. When executed by the processor 510, the computer program implements each process of the above display control method embodiment and can achieve the same technical effect; to avoid repetition, the details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the display control method embodiment and can achieve the same technical effect; to avoid repetition, the details are not described here again. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises that element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (11)

1. A display control method is applied to a terminal and is characterized by comprising the following steps:
receiving a first input of a user;
capturing an image in response to the first input;
displaying the photographed image in a target window;
the target window is displayed in an image display area of a shooting preview interface, and the area of the target window is smaller than that of the image display area;
the first input is a sliding input;
the displaying of the photographed image in a target window includes:
acquiring input parameters of the sliding input;
acquiring a target side associated with the input parameter;
displaying a fourth target window on the target side of the third target window, and displaying the photographed image in the fourth target window;
wherein the third target window is a window that has been displayed in the image display area before the fourth target window is displayed.
2. The method of claim 1, wherein after displaying the captured image in a target window, the method further comprises:
receiving a second input of the user;
adjusting a display parameter of the target window in response to the second input;
wherein the display parameter includes at least one of a window size, a display position, a display duration, and a transparency.
3. The method of claim 2, wherein the second input is a first drag input for a shooting control;
the adjusting the display parameter of the target window in response to the second input comprises:
adjusting the size of the target window at a size adjustment speed corresponding to an input parameter of the first drag input, wherein the input parameter of the first drag input comprises at least one of drag direction, drag distance, drag duration, and drag speed;
alternatively,
in response to the first drag input, resizing the target window to a preset window size.
4. The method of claim 3, wherein after resizing the target window to a preset window size in response to the first drag input, the method further comprises:
under the condition that a preset condition is met, restoring the size of the target window to the size of the window before adjustment;
wherein the preset condition comprises at least one of the following:
the first drag input ends;
the shooting control is restored to the display position it had before the drag;
a third input is received after the first drag input ends;
the display duration of the adjusted target window reaches a preset duration.
5. The method of claim 2, wherein said adjusting display parameters of the target window in response to the second input comprises:
acquiring movement parameters of the terminal in response to the second input, wherein the movement parameters comprise a movement distance and a movement direction;
adjusting the display position of the target window according to the movement parameters.
6. The method of claim 1, wherein before the target window displays the captured image, the method further comprises:
determining a contact position of a user contacting a terminal screen;
determining, as a first target display area, an area that lies outside a preset range of the contact position;
the displaying of the photographed image in a target window includes:
displaying the target window in the first target display area, and displaying the captured image in the target window.
7. The method of claim 1, wherein before the target window displays the captured image, the method further comprises:
acquiring a subject area of a preview image in the image display area;
determining a second target display area of the target window in the image display area according to the subject area;
displaying the target window in the second target display area;
wherein, when the target window is displayed in the second target display area, the area of the subject area occluded by the target window is minimized.
8. The method of claim 1, wherein after the displaying of the photographed image in the target window, the method further comprises:
receiving a first sliding input from a user on a shooting control;
adjusting a position of the second target window to be on a second side of the first target window in response to the first sliding input;
wherein the first side and the second side are different.
9. A terminal, comprising:
the first receiving module is used for receiving a first input of a user;
a photographing module for photographing an image in response to the first input;
the first display module is used for displaying the shot image on a target window;
the target window is displayed in an image display area of a shooting preview interface, and the area of the target window is smaller than that of the image display area;
the first input is a sliding input;
the first display module is specifically configured to:
acquiring input parameters of the sliding input;
acquiring a target side associated with the input parameter;
displaying a fourth target window on the target side of the third target window, and displaying the photographed image in the fourth target window;
wherein the third target window is a window that has been displayed in the image display area before the fourth target window is displayed.
10. A terminal comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the display control method according to any one of claims 1 to 8.
11. A computer-readable storage medium, characterized in that a computer program is stored thereon, which computer program, when being executed by a processor, carries out the steps of the display control method according to any one of claims 1 to 8.
CN201811433680.0A 2018-11-28 2018-11-28 Display control method and terminal Active CN109413333B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811433680.0A CN109413333B (en) 2018-11-28 2018-11-28 Display control method and terminal

Publications (2)

Publication Number Publication Date
CN109413333A CN109413333A (en) 2019-03-01
CN109413333B true CN109413333B (en) 2022-04-01

Family

ID=65455961

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811433680.0A Active CN109413333B (en) 2018-11-28 2018-11-28 Display control method and terminal

Country Status (1)

Country Link
CN (1) CN109413333B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113497888B (en) * 2020-04-07 2023-05-02 华为技术有限公司 Photo preview method, electronic device and storage medium
CN112214621A (en) * 2020-10-19 2021-01-12 深圳市圆周率软件科技有限责任公司 Image viewing method and electronic equipment
CN112492205B (en) * 2020-11-30 2023-05-09 维沃移动通信(杭州)有限公司 Image preview method and device and electronic equipment
CN117119285A (en) * 2023-02-27 2023-11-24 荣耀终端有限公司 Shooting method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1947412A (en) * 2004-05-13 2007-04-11 索尼株式会社 Imaging unit, image screen display method, and user interface
CN105554553A (en) * 2015-12-15 2016-05-04 腾讯科技(深圳)有限公司 Method and device for playing video through floating window
CN106131394A (en) * 2016-06-15 2016-11-16 青岛海信移动通信技术股份有限公司 A kind of method and device taken pictures

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104980659A (en) * 2015-06-30 2015-10-14 广州三星通信技术研究有限公司 Electronic terminal for controlling shooting display and control method thereof
US20180203596A1 (en) * 2017-01-19 2018-07-19 Microsoft Technology Licensing, Llc Computing device with window repositioning preview interface

Also Published As

Publication number Publication date
CN109413333A (en) 2019-03-01

Similar Documents

Publication Publication Date Title
CN108513070B (en) Image processing method, mobile terminal and computer readable storage medium
CN108668083B (en) Photographing method and terminal
EP3780577B1 (en) Photography method and mobile terminal
CN111182205B (en) Photographing method, electronic device, and medium
CN108471498B (en) Shooting preview method and terminal
CN108495029B (en) Photographing method and mobile terminal
CN109361869B (en) Shooting method and terminal
CN108989672B (en) Shooting method and mobile terminal
CN109413333B (en) Display control method and terminal
CN109683777B (en) Image processing method and terminal equipment
CN110505400B (en) Preview image display adjustment method and terminal
CN110213440B (en) Image sharing method and terminal
CN111147752B (en) Zoom factor adjusting method, electronic device, and medium
CN110198413B (en) Video shooting method, video shooting device and electronic equipment
CN109683802B (en) Icon moving method and terminal
CN108228902B (en) File display method and mobile terminal
CN111432195A (en) Image shooting method and electronic equipment
CN108924422B (en) Panoramic photographing method and mobile terminal
CN108881721B (en) Display method and terminal
CN108132749B (en) Image editing method and mobile terminal
CN108174110B (en) Photographing method and flexible screen terminal
CN111597370A (en) Shooting method and electronic equipment
CN111405181B (en) Focusing method and electronic equipment
CN110536005B (en) Object display adjustment method and terminal
CN110321449B (en) Picture display method and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant