CN108769506B - Image acquisition method and device, mobile terminal and computer readable medium - Google Patents


Info

Publication number
CN108769506B
Authority
CN
China
Prior art keywords: screen, camera, touch, operation information, touch operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810340133.1A
Other languages
Chinese (zh)
Other versions
CN108769506A (en)
Inventor
舒茂非
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810340133.1A
Publication of CN108769506A
Priority to PCT/CN2019/080957
Application granted
Publication of CN108769506B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders

Abstract

The application discloses an image acquisition method, an image acquisition device, a mobile terminal and a computer-readable medium, and belongs to the technical field of double-sided-screen mobile terminals. The method comprises the following steps: when it is monitored that the first screen displays an image acquisition interface of a camera application, displaying a parameter adjustment interface of the camera device on the second screen; acquiring touch operation information input through the parameter adjustment interface; and adjusting camera parameters of the camera device according to the touch operation information. Because the user adjusts the camera parameters through the second screen while using the camera application on the first screen, the first screen is not needed for parameter adjustment; this avoids interfering with the user's use of the camera application and also prevents operations on the first screen from disturbing the viewfinder picture.

Description

Image acquisition method and device, mobile terminal and computer readable medium
Technical Field
The present application relates to the technical field of dual-screen mobile terminals, and more particularly, to an image acquisition method, an image acquisition device, a mobile terminal, and a computer-readable medium.
Background
At present, when a mobile terminal is used to take a picture, parameters such as the focal length and exposure compensation of the camera need to be set, and the existing approach requires making these adjustments within the camera interface itself. For example, adjusting the focal length often requires a one-finger or two-finger operation on the screen; this operation disturbs the viewfinder picture, and the cumbersome focusing operation makes photographing inconvenient.
Disclosure of Invention
The application provides an image acquisition method, an image acquisition device, a mobile terminal and a computer-readable medium to address the above drawbacks.
In a first aspect, an embodiment of the present application provides an image acquisition method applied to a mobile terminal, where the mobile terminal includes a first screen, a second screen, and a camera device, and the method includes: when it is monitored that the first screen displays an image acquisition interface of a camera application, displaying a parameter adjustment interface of the camera device on the second screen; acquiring touch operation information input through the parameter adjustment interface; and adjusting camera parameters of the camera device according to the touch operation information.
In a second aspect, an embodiment of the present application further provides an image acquisition apparatus applied to a mobile terminal, where the mobile terminal includes a first screen, a second screen, and a camera device, and the apparatus includes a display unit, an acquisition unit, and an adjustment unit. The display unit is configured to display a parameter adjustment interface of the camera device on the second screen when it is monitored that the first screen displays an image acquisition interface of a camera application. The acquisition unit is configured to acquire touch operation information input through the parameter adjustment interface. The adjustment unit is configured to adjust camera parameters of the camera device according to the touch operation information.
In a third aspect, an embodiment of the present application further provides a mobile terminal, including a first screen, a second screen, a camera device, a memory, and a processor, where the first screen, the second screen, and the memory are all coupled with the processor. The memory stores instructions that, when executed by the processor, cause the processor to: when the first screen is monitored to display an image acquisition interface of a camera application, displaying a parameter adjustment interface of the camera device on the second screen; acquiring touch operation information input based on the parameter adjustment interface; and adjusting camera parameters of the camera device according to the touch operation information.
In a fourth aspect, the present application also provides a computer-readable medium having program code executable by a processor, where the program code causes the processor to execute the above method.
The embodiments of the application provide an image acquisition method, an image acquisition device, a mobile terminal and a computer-readable medium: when it is monitored that an image acquisition interface of a camera application is displayed on the first screen, a parameter adjustment interface of the camera device is displayed on the second screen, and touch operation information acting on the second screen is acquired. The camera parameters to be adjusted are then adjusted according to the touch operation information. Therefore, while the user uses the camera application on the first screen, the camera parameters are adjusted through the second screen rather than the first screen; this avoids interfering with the user's use of the camera application and also prevents operations on the first screen from disturbing the viewfinder picture.
Additional features and advantages of embodiments of the present application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of embodiments of the present application. The objectives and other advantages of the embodiments of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 illustrates a schematic structural diagram of a mobile terminal according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a mobile terminal according to another embodiment of the present application from a first perspective;
fig. 3 is a schematic structural diagram, from a second perspective, of a mobile terminal provided by another embodiment of the present application;
FIG. 4 is a schematic diagram illustrating a preview interface provided by an embodiment of the present application;
FIG. 5 is a flowchart illustrating a method of an image capture method according to an embodiment of the present application;
fig. 6 is a schematic diagram illustrating a touch interface provided in an embodiment of the present application;
FIG. 7 is a schematic diagram illustrating a brightness adjustment interface provided by an embodiment of the present application;
FIG. 8 is a schematic diagram illustrating a preset swipe gesture provided by an embodiment of the present application;
FIG. 9 is a schematic diagram illustrating an exposure compensation parameter adjustment interface provided by an embodiment of the present application;
FIG. 10 is a flow chart of a method of image acquisition provided by another embodiment of the present application;
fig. 11 shows a block diagram of an image acquisition apparatus provided in an embodiment of the present application;
fig. 12 shows a block diagram of a mobile terminal according to an embodiment of the present application for executing the method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Referring to fig. 1 and 2, a mobile terminal 100 according to an embodiment of the present application is shown, where fig. 1 is a side view of the mobile terminal 100, fig. 2(a) is a front view, and fig. 2(b) is a rear view. The mobile terminal 100 includes a first screen 120, a second screen 130, and a power key 140, all disposed on the housing 12 of the mobile terminal 100. The first screen 120 and the second screen 130 are two relatively independent display screens. As an embodiment, the first screen 120 is disposed on a first side 101 of the housing 12 and the second screen 130 is disposed on a second side 102 of the housing 12, where the first side 101 and the second side 102 are two opposite surfaces facing away from each other. In some embodiments, the first side 101 is the front surface of the mobile terminal 100 and the second side 102 is the back surface, so that the mobile terminal 100 shown in fig. 1 and 2 is a double-sided-screen mobile terminal. In addition, the sizes of the first screen 120 and the second screen 130 are not limited and may be similar.
Referring to fig. 3, a mobile terminal 200 according to another embodiment of the present application is shown. The mobile terminal 200 includes a first housing 22, a second housing 23, a first screen 220 disposed on a first side 201 of the first housing 22, and a second screen 230 disposed on a second side 202 of the second housing 23, where the first side 201 and the second side 202 are two opposite faces. The first housing 22 and the second housing 23 are rotatably connected through a rotating member 240, and the first side 201 and the second side 202 can be moved toward or away from each other as driven by the rotating member 240. The rotating member 240 may include a rotating shaft, on which an extension portion of the first housing 22 and an extension portion of the second housing 23 are sleeved, thereby realizing the rotatable connection between the first housing 22 and the second housing 23. The mobile terminal 200 illustrated in fig. 3 is a folding-screen terminal. In addition, the mobile terminal 200 shown in fig. 3 further includes a power key 250 disposed on the first housing 22 or the second housing 23; as an embodiment, the power key 250 is disposed on the first housing 22.
A camera application is installed on the mobile terminal. When the camera application runs on the first screen or the second screen, a main interface of the camera is displayed on that screen. As shown in fig. 4, this interface is a preview interface for a preview image, where the preview image may be an image captured by the first camera or the second camera. The user can adjust camera parameters within the interface of the camera application, where the camera parameters include the focal length of the camera, the filter, the ISO value, the EV value, and other parameters.
However, at present, adjustment of camera parameters usually has to be performed within the camera interface. For example, adjusting the focal length often requires a one-finger or two-finger operation on the screen; on the one hand, the viewfinder picture is disturbed during the operation, and on the other hand, the cumbersome focusing operation makes photographing inconvenient.
Therefore, in order to remedy this technical defect and improve the user's experience when using a camera application, an embodiment of the present application provides an image acquisition method, as shown in fig. 5. Specifically, the method is applied to the above mobile terminal and includes steps S501 to S503.
S501: when it is monitored that the first screen displays an image acquisition interface of a camera application, a parameter adjustment interface of the camera device is displayed on the second screen.
An application running state table is stored in the mobile terminal. The table includes the identifiers of all application programs currently installed in the mobile terminal and the state corresponding to each identifier, for example, as shown in Table 1 below:
TABLE 1
Application identifier    State                       Time point
APP1                      Foreground running state    2017/11/3 13:20
APP2                      Background running state    2017/11/4 14:10
APP3                      Non-running state           2017/11/5 8:20
APP4                      Background running state    2017/11/5 10:03
APP5                      Background running state    2017/11/4 9:18
In Table 1, APP1 is the identifier of an application program; it may be content such as the name or the package name of the application, and is used to refer to the identity of the application. The corresponding time point is the time at which the application switched to the corresponding state. For example, at the time point 13:20 on November 3, 2017 recorded for APP1 in Table 1, APP1 began running on a screen of the mobile terminal, that is, it switched to the foreground running state.
The states of an application program include the foreground running state, the background running state, and the non-running state. The foreground running state means that the application runs on a screen through an interface, through which the user can interact with it, for example by inputting an execution instruction or viewing information. The background running state means that the application runs in the system's resource manager but generally has no interface. When the user starts the camera application, a screen is assigned to it, that is, the camera application is selected to run on the first screen or the second screen; the mobile terminal then records that the camera application is in the foreground running state and runs on, for example, the first screen. The non-running state means that the application has not been started, i.e., it is neither in the foreground running state nor in the background running state.
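As an illustrative sketch only (not part of the patent disclosure), the running-state table and the foreground lookup described above might be modeled as follows; the class and method names (`AppStateTable`, `set_state`, `foreground_screen`) are assumptions:

```python
from datetime import datetime

FOREGROUND, BACKGROUND, NOT_RUNNING = "foreground", "background", "not running"

class AppStateTable:
    """Hypothetical model of the application running state table (Table 1)."""

    def __init__(self):
        self._states = {}  # app identifier -> (state, screen, time of switch)

    def set_state(self, app_id, state, screen=None):
        # Record the new state together with the time point of the switch.
        self._states[app_id] = (state, screen, datetime.now())

    def foreground_screen(self, app_id):
        # Return the screen the app runs on, or None if it is not in the
        # foreground (i.e., background or non-running).
        state, screen, _ = self._states.get(app_id, (NOT_RUNNING, None, None))
        return screen if state == FOREGROUND else None

table = AppStateTable()
table.set_state("camera", FOREGROUND, screen="first screen")
table.set_state("APP2", BACKGROUND)
print(table.foreground_screen("camera"))  # -> first screen
print(table.foreground_screen("APP2"))    # -> None
```

A real terminal would update such a table on every launch, switch, and exit event, which is what makes the monitoring in step S501 possible.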
As an embodiment, a corresponding screen may also be preset for the camera application; if the display screen of the camera application is set to be the first screen, the camera application will necessarily run on the first screen whenever it runs in the foreground.
After determining which screen the camera application runs on, the current display interface of the camera application can be further determined, for example, the image acquisition interface of the camera application, on which a preview interface for a preview image is displayed. The preview image may be an image captured by the first camera or the second camera, and the interface may be the one shown in fig. 4.
When it is monitored that the first screen displays the image acquisition interface of the camera application, a parameter adjustment interface of the camera device is displayed on the second screen. In this adjustment interface the user can input touch operation information or other content, so the interface can serve as an information input interface.
In addition, to avoid accidental operation of the second screen while it runs an application program or shows the system desktop, a touchpad function may be set for the second screen, so that the second screen only collects touch operation information without responding to it. Specifically, the second screen has a touch response mode and a touchpad mode. In the touch response mode, the mobile terminal receives touch operation information through the second screen and controls the application running on the second screen to execute the corresponding operation. That is, in the touch response mode, the second screen can receive touch operation information input by the user and transmit it to the system, and the system determines the operation corresponding to that information. Specifically, the operation corresponds to the application running on the second screen, and the gesture input by the user targets the interface of that application: for example, the user may click a certain key in the application, thereby inputting touch operation information, and the mobile terminal responds to the gesture, performs the corresponding operation, and updates the content displayed on the second screen. In short, when the second screen is in the touch response mode, the second screen can receive touch operation information, the collected gesture can be acquired by the mobile terminal, and the mobile terminal can, according to the gesture, operate the actionable areas in the interface of the application running on the second screen; that is, in the touch response mode, the user can use the second screen normally to input touch operation information and trigger the corresponding operations.
In the touchpad mode, the second screen serves as a touchpad to receive touch operation information. Specifically, in the touchpad mode the second screen is used only as a touchpad: it only receives touch operation information, and the mobile terminal does not respond to it by controlling the application running on the second screen to execute the operation corresponding to the gesture. Instead, when the second screen is in the touchpad mode, the mobile terminal, after collecting the touch operation information, only determines whether that information can act on the camera application running on the first screen, for example, adjusting a camera parameter or controlling lens switching or photographing of the camera application.
For example, suppose the main interface of APP1 is displayed on the second screen, and a touch key in that interface allows the user to switch APP1 from the main interface to a detail interface. When the second screen is in the touch response mode, the user clicks the touch key of APP1, the mobile terminal acquires the touch operation information for the key, and switches APP1 from the main interface to the detail interface. When the second screen is in the touchpad mode, the user clicks the same touch key, and the second screen, acting as a touchpad, still collects the touch operation information; but even though the information corresponds to the touch key, the mobile terminal does not respond to it by controlling the operation of APP1. That is, only the touch operation information is collected, APP1 is not switched from the main interface to the detail interface, and accordingly the second screen appears frozen.
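The difference between the two modes in this example can be sketched as follows (a hypothetical Python model; `StubApp` and `route_second_screen_touch` are illustrative names, not from the patent):

```python
TOUCH_RESPONSE, TOUCHPAD = "touch response mode", "touchpad mode"

class StubApp:
    """Minimal stand-in for an application that can react to touches."""
    def __init__(self, name):
        self.name = name
        self.events = []                    # touches this app actually handled

    def handle(self, touch):
        self.events.append(touch)
        return f"{self.name} executed {touch}"

def route_second_screen_touch(mode, touch, second_screen_app, camera_app):
    """Route a touch collected on the second screen according to its mode."""
    if mode == TOUCH_RESPONSE:
        # Normal behaviour: the app shown on the second screen responds.
        return second_screen_app.handle(touch)
    # Touchpad mode: the touch is only collected; the second-screen app is
    # never driven, and the event may instead act on the camera application
    # running on the first screen (e.g. adjust a camera parameter).
    return camera_app.handle(touch)

app1, camera = StubApp("APP1"), StubApp("camera")
route_second_screen_touch(TOUCHPAD, "tap on APP1 key", app1, camera)
print(app1.events)    # -> [] : APP1 appears frozen in touchpad mode
print(camera.events)  # -> ['tap on APP1 key']
```

The key design point is that mode switching changes only the routing of the collected gesture, not the gesture collection itself.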
In the touchpad mode, the second screen may be in a touch-locked state, in which the second screen is black but touch operation information input by the user on the second screen can still be collected. In this case, the parameter adjustment interface on the second screen may be the lock screen interface of the second screen.
Specifically, the second screen of the mobile terminal comprises two independently working components: a display screen and a touch screen, where the touch screen is nearly transparent and is attached to the display screen. The touch screen recognizes the gesture signal and transmits it to the processor of the mobile terminal, while the display screen is responsible for displaying the corresponding images. In the touch-locked state, the second screen turns off the display screen while the touch screen remains powered; the second screen is then in a black-screen state, in which the user can still input touch gestures by sliding or clicking on the second screen, and these gestures can be collected by the mobile terminal.
As another embodiment, in the touchpad mode, a touch interface may be displayed on the second screen and used as the parameter adjustment interface, while gesture operations in the area outside the touch interface are shielded. As shown in fig. 6, the touch interface is displayed over the interface of APP2; specifically, the touch interface may be a popup window located on top of the interface of APP2. The system shields operations outside the touch interface, and the user can input touch operation information through the touch interface.
The state of the second screen is recorded in the mobile terminal. Specifically, the state of the second screen can be indicated by a screen state parameter of the second screen: the parameter takes different values in the touch response mode and the touchpad mode, so the current mode can be determined by reading the screen state parameter.
When the second screen is in the touchpad mode, operating the second screen controls the camera application, and accidental operation of the application program running on the second screen can be avoided.
S502: and acquiring touch operation information input based on the parameter adjustment interface.
S503: and adjusting camera parameters of the camera device according to the touch operation information.
The camera device may be the first camera or the second camera selected by the user; specifically, the user may select the first camera or the second camera in an interface of the camera application, for example, in the image acquisition interface, and the adjusted camera parameters are those of the selected camera. The camera device can then capture an image according to the adjusted camera parameters.
The user inputs touch operation information through the parameter adjustment interface on the second screen, and the camera parameters of the camera device are then adjusted according to that information. The camera parameters may include at least one of the exposure time, the sensitivity, and the focal length. The touch operation information may be a touch gesture comprising a sliding direction and a sliding length, and adjusting the camera parameters of the camera device according to the touch operation information may consist of adjusting them according to the sliding direction and sliding length of the touch gesture.
For example, the camera parameter to be adjusted, selected in the parameter adjustment interface, may be the exposure time, the sensitivity, or the focal length of the camera device, and the selected parameter is adjusted by the sliding direction and sliding length of a slide gesture input in the parameter adjustment interface. For instance, sliding from the bottom to the top of the screen increases the camera parameter to be adjusted, and the amount of increase may be determined by the sliding length, e.g., lengthening the focal length or increasing the exposure time or sensitivity. For details, reference may be made to the manner of adjusting camera parameters according to the slide gesture described in the subsequent embodiments.
In addition, in some embodiments, the camera parameter to be adjusted is determined by acquiring an adjustment instruction input through the parameter adjustment interface, where the adjustment instruction specifies the camera parameter to be adjusted, and the parameter is then determined from that instruction. In this manner, the user selects the camera parameter to be adjusted through the parameter adjustment interface on the second screen, thereby determining the parameter currently to be adjusted.
In other embodiments, the camera parameter to be adjusted is determined by looking up, in a preset correspondence between camera parameters and touch operation information, the camera parameter corresponding to the touch operation information acquired on the second screen.
Camera parameters corresponding to different touch operation information are stored in the mobile terminal; for example, touch operation information 1 corresponds to the focal length, touch operation information 2 corresponds to the ISO value, and touch operation information 3 corresponds to camera switching. Touch operation information 1 may be a slide from a first side of the second screen to a second side, where the first side and the second side are opposite sides; touch operation information 2 may be a slide from a third side of the second screen to a fourth side, where the third side and the fourth side are opposite sides; and touch operation information 3 may be two consecutive clicks on the second screen. For example, when two consecutive clicks are detected, the camera is switched once: if the camera currently used by the camera application is the first camera, then after the two consecutive clicks the camera in use is switched to the second camera.
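The preset correspondence described above might be sketched as follows (illustrative only; the gesture encodings and function names are assumptions, not the patent's actual data structures):

```python
# Assumed encodings of the three kinds of touch operation information; the
# mapping mirrors the preset correspondence described in the text.
GESTURE_TO_PARAMETER = {
    "slide first side -> second side": "focal length",
    "slide third side -> fourth side": "ISO value",
    "double click": "camera switch",
}

def parameter_for(touch_info):
    # Look up which camera parameter (if any) a gesture is preset to control.
    return GESTURE_TO_PARAMETER.get(touch_info)

def switch_camera(current):
    # Two consecutive clicks toggle between the first and second camera.
    return "second camera" if current == "first camera" else "first camera"

print(parameter_for("double click"))   # -> camera switch
print(switch_camera("first camera"))   # -> second camera
```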
In still other embodiments, the camera parameter to be adjusted may also be determined by judging whether the current interface of the camera application running on the first screen is the preview interface.
The mobile terminal can monitor the interface of each application program running in the foreground. Specifically, it can acquire the display request for a given interface of the application program; based on that request, the interface corresponding to it can be determined and displayed in response. For example, when the camera application is started, its start request is acquired; the interface corresponding to the start request is the main interface of the camera application, so the interface of the camera application can be determined to be the main interface. Likewise, when the user clicks a preset key in the main interface and causes the interface to switch, or when new content is added to the interface, the mobile terminal can also monitor the current interface.
It is then determined whether the current interface of the camera application running on the first screen is the preview interface, where the preview interface may be the main interface of the camera application, such as the interface shown in fig. 4, with no camera parameter corresponding to it. If the user clicks a certain camera parameter, the adjustment interface corresponding to that parameter is displayed. As shown in fig. 7, which is a brightness adjustment interface, the user clicks the preview window and the display switches to the brightness adjustment interface, which corresponds to the brightness parameter; the interface displays a preview window and a brightness adjustment bar, where, as shown in fig. 7, the adjustment bar bearing the sun symbol is the brightness adjustment bar. The touch gesture collected by the second screen is then used to adjust the brightness parameter. In this way, the parameter currently to be adjusted can be determined from the interface displayed on the first screen.
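Determining the parameter to adjust from the currently displayed interface can be sketched as a simple lookup (the interface labels below are hypothetical, not names used by the patent):

```python
# Assumed mapping from the camera app's current interface (monitored on the
# first screen) to the parameter a second-screen gesture should adjust.
INTERFACE_TO_PARAMETER = {
    "preview interface": None,                 # main interface: no parameter
    "brightness adjustment interface": "brightness",
    "exposure compensation interface": "EV value",
}

def parameter_to_adjust(current_interface):
    # None means the current interface has no bound parameter, so the
    # parameter must be determined another way (e.g. from the gesture itself).
    return INTERFACE_TO_PARAMETER.get(current_interface)

print(parameter_to_adjust("brightness adjustment interface"))  # -> brightness
print(parameter_to_adjust("preview interface"))                # -> None
```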
After the parameter currently to be adjusted has been determined, it can be adjusted according to the touch operation information; for example, when the touch operation information is a click action, the camera parameter can be adjusted according to the number or frequency of clicks.
As an implementation manner, if the touch operation information is a slide gesture, determining a camera parameter to be adjusted, and adjusting the determined camera parameter according to the touch operation information includes:
First, it is judged whether the current interface of the camera application running on the first screen is a preview interface. If it is the preview interface, no camera parameter corresponds to the current interface on the first screen, and the parameter to be adjusted is determined from the touch operation information itself.
Specifically, whether the touch operation information is a preset sliding gesture is judged; and if the gesture is a preset sliding gesture, judging that the camera parameter to be adjusted is a focal length.
Specifically, the preset slide gesture may be a gesture of sliding from one side of the second screen to the opposite side. For example, as shown in fig. 8, the preset slide gesture is a gesture of sliding from the first side edge of the second screen to the second side edge, sliding from the second side edge to the first side edge, or sliding back and forth between the two; more specifically, it may be a single-finger slide between the first and second side edges of the second screen.
After determining that the camera parameter to be adjusted is the focal length, the focal length is adjusted according to the sliding direction and sliding length of the touch operation information. Specifically, the sliding direction determines whether the focal length is increased or decreased, while the sliding length determines the magnitude of the change.
Specifically, if the sliding direction of the touch operation information is from the first side to the second side of the second screen, the focal length is increased according to the sliding length, wherein the first side and the second side are opposite sides. And if the sliding direction of the touch operation information is from the second side edge to the first side edge of the second screen, reducing the focal length according to the sliding length.
As a specific adjustment rule, every time the user slides 1 cm on the second screen, the focal length is linearly adjusted by a corresponding value, until the maximum or minimum focal length is reached, after which sliding operations in that direction are no longer responded to.
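The linear rule above (a fixed focal-length change per centimetre of sliding, clamped to the lens limits) can be sketched as follows. The step size and the focal-length limits are assumed values for illustration only:

```python
# Assumed limits and per-centimetre step; real values depend on the lens.
FOCAL_MIN, FOCAL_MAX = 1.0, 10.0
STEP_PER_CM = 0.5

def adjust_focal_length(current, slide_cm, increase):
    """Apply a slide of `slide_cm` centimetres to the focal length.
    `increase` reflects the sliding direction (e.g. first side edge to
    second side edge vs. the reverse). The result is clamped, matching
    the text's rule that sliding past the limit is no longer responded to."""
    delta = slide_cm * STEP_PER_CM * (1 if increase else -1)
    return min(FOCAL_MAX, max(FOCAL_MIN, current + delta))
```

For example, a 2 cm slide toward the zoom-in direction moves the focal length up by two steps, while a long slide near the limit simply stops at the limit.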
It should be noted that, because the gestures of the user holding the mobile terminal are different, and there is a possibility of a horizontal screen or a vertical screen, the manner of adjusting the focal length according to the sliding direction and the sliding length of the touch operation information is as follows: and determining the horizontal and vertical screen states of the mobile terminal, and adjusting the focal length according to the horizontal and vertical screen states and the sliding direction and the sliding length of the touch operation information.
Define the bottom of the mobile terminal as the first side edge and the top as the second side edge, where the bottom of the mobile terminal is the side provided with the home key or the microphone, and the top is the side provided with the camera or the earpiece. Assuming the first screen is arranged on the front of the mobile terminal and the second screen on the back, then viewed from the user facing the first screen, the edges in clockwise order from the second side edge are the third side edge, the first side edge, and the fourth side edge.
The screen states of the mobile terminal include four states: a first vertical screen state, a first horizontal screen state, a second vertical screen state, and a second horizontal screen state. In the first vertical screen state, the top of the mobile terminal faces up and the bottom faces down, that is, the receiver is at the top and the home key at the bottom, so the second side edge is on top and the first side edge at the bottom. In the first horizontal screen state, the top of the mobile terminal is on the left and the bottom on the right, so the fourth side edge is on top and the third side edge at the bottom. In the second vertical screen state, the bottom of the mobile terminal faces up and the top faces down, so the first side edge is on top and the second side edge at the bottom. In the second horizontal screen state, the top of the mobile terminal is on the right and the bottom on the left, so the third side edge is on top and the fourth side edge at the bottom.
Specifically, the mobile terminal may be provided with an attitude detection unit capable of detecting the attitude of the mobile terminal, such as a gyroscope or another sensor capable of attitude determination. The attitude of the mobile terminal can be determined from the data collected by the attitude detection unit, and thus which of the four states the terminal is in. For example, suppose the attitude detection unit includes a gyroscope. By the fixed-axis property of the gyroscope, the direction of its rotation axis remains unchanged; given a known preset attitude of the mobile terminal relative to that axis, when the terminal pitches, yaws, or rolls away from the preset attitude, comparing the angular relationship of the terminal relative to the axis with that of the preset attitude reveals the attitude change, so the current attitude of the mobile terminal can be determined.
As an embodiment, a posture of the mobile terminal in the first vertical screen state is used as a preset posture of the mobile terminal, and when the mobile terminal is rotated from the first vertical screen state to another screen state, coordinate transformation, a rotation direction and a rotation angle in a three-dimensional space of a screen can be detected, so that rotation data of the mobile terminal when the mobile terminal is stopped after rotation can be determined, wherein the rotation data can be information with six degrees of freedom, and thus the screen state can be determined.
In addition, the screen state of the mobile terminal may also be acquired through a function within the system of the mobile terminal. For example, the state of the screen can be obtained from the iOS enumeration typedef NS_ENUM(NSInteger, UIDeviceOrientation); the correspondence between the returned value and the screen state is shown in table 2 below:
TABLE 2
Function return result | Screen state
UIDeviceOrientationPortrait | First vertical screen state
UIDeviceOrientationPortraitUpsideDown | Second vertical screen state
UIDeviceOrientationLandscapeLeft | First horizontal screen state
UIDeviceOrientationLandscapeRight | Second horizontal screen state
The current screen status of the mobile terminal can be obtained according to table 2 above.
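The lookup in Table 2 can be sketched directly. The orientation names are the iOS-style values quoted in the text; the state strings are this document's own terms, and the fallback for other orientations (such as face-up) is an assumption:

```python
# Mapping from the system orientation value (Table 2) to this
# document's four screen states.
ORIENTATION_TO_STATE = {
    "UIDeviceOrientationPortrait": "first vertical screen state",
    "UIDeviceOrientationPortraitUpsideDown": "second vertical screen state",
    "UIDeviceOrientationLandscapeLeft": "first horizontal screen state",
    "UIDeviceOrientationLandscapeRight": "second horizontal screen state",
}

def screen_state(orientation):
    """Return the screen state for a system orientation value, or
    'unknown' for orientations outside the four states."""
    return ORIENTATION_TO_STATE.get(orientation, "unknown")
```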
It is judged whether the current interface of the camera application running on the first screen is a preview interface; if so, the horizontal/vertical screen state of the mobile terminal is detected.
When the mobile terminal is detected to be in the first or second vertical screen state, if the touch operation information input on the second screen slides between the first side edge and the second side edge, the camera parameter to be adjusted is judged to be the focal length, and the focal length is adjusted according to the sliding direction and sliding length between the first and second side edges.
Specifically, when the mobile terminal is in the first vertical screen state, if the sliding direction of the touch operation information is from the second side edge to the first side edge of the second screen, the focal length is increased according to the sliding length, and if the sliding direction of the touch operation information is from the first side edge to the second side edge of the second screen, the focal length is decreased according to the sliding length.
And when the mobile terminal is in a second vertical screen state, if the sliding direction of the touch operation information is from the first side edge to the second side edge of the second screen, the focal length is increased according to the sliding length, and if the sliding direction of the touch operation information is from the second side edge to the first side edge of the second screen, the focal length is decreased according to the sliding length.
When the mobile terminal is detected to be in the first or second horizontal screen state, if the touch operation information input on the second screen slides between the third side edge and the fourth side edge, the camera parameter to be adjusted is judged to be the focal length, and the focal length is adjusted according to the sliding direction and sliding length between the third and fourth side edges.
Specifically, when the mobile terminal is in the first landscape screen state, if the sliding direction of the touch operation information is from the third side to the fourth side of the second screen, the focal length is increased according to the sliding length, and if the sliding direction of the touch operation information is from the fourth side to the third side of the second screen, the focal length is decreased according to the sliding length.
And when the mobile terminal is in a second transverse screen state, if the sliding direction of the touch operation information is from the fourth side to the third side of the second screen, increasing the focal length according to the sliding length, and if the sliding direction of the touch operation information is from the third side to the fourth side of the second screen, decreasing the focal length according to the sliding length.
In this way, the user can always adjust the focal length with the same physical swipe gesture, regardless of how the terminal is held.
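The four cases above reduce to one rule table: for each screen state, exactly one (from-edge, to-edge) pair increases the focal length, and the opposite pair decreases it. The following sketch is an illustrative reading of those cases, with hypothetical state and edge names:

```python
# For each screen state, the (from_side, to_side) slide that INCREASES
# the focal length, taken from the four cases described in the text.
# The reverse slide decreases it.
ZOOM_IN_SLIDES = {
    "first_vertical":   ("second_side", "first_side"),
    "second_vertical":  ("first_side", "second_side"),
    "first_landscape":  ("third_side", "fourth_side"),
    "second_landscape": ("fourth_side", "third_side"),
}

def is_zoom_in(state, from_side, to_side):
    """True when a slide from `from_side` to `to_side` should increase
    the focal length in the given screen state."""
    return ZOOM_IN_SLIDES.get(state) == (from_side, to_side)
```

This makes the invariant explicit: as the terminal rotates, the edge pair rotates with it, so the gesture stays the same in the user's frame of reference.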
If the current interface of the camera application running on the first screen is not the preview interface, it is judged whether the current interface corresponds to a camera parameter adjustment interface. If it does, the camera parameter corresponding to that adjustment interface is taken as the parameter to be adjusted.
Specifically, when the user clicks the adjustment control corresponding to a certain camera parameter, the corresponding camera parameter adjustment interface is displayed. As shown in fig. 9, the user clicks the ISO button to display the exposure compensation parameter adjustment interface, and can then adjust the exposure compensation parameter through touch operation information on the second screen. Adjusting the exposure compensation parameter may be similar to adjusting the focal length, i.e., according to the sliding direction and sliding distance, which is not described again here. Since all camera parameters can be adjusted with similar gestures, the user need not adapt to different habits, improving the user experience.
Referring to fig. 10, an embodiment of the present application provides an image capturing method, where the method is used to improve user experience when a user uses a camera application, and specifically, the method is applied to the mobile terminal, and the method includes: s1001 to S1006.
S1001: when the first screen is monitored to display an image acquisition interface of the camera application, whether the second screen is in a touch response mode or a touch pad mode is judged.
The operation of S1002 is performed if the second screen is in the touch response mode, and the operation of S1003 is performed if the second screen is in the touchpad mode. Specifically, the aforementioned embodiments can be referred to for the touch response mode or the touch pad mode, and details are not repeated herein.
S1002: and switching the second screen into a touch pad mode, and displaying a parameter adjusting interface.
When the second screen is in the touch response mode, in order to avoid misoperation of the application program running on the second screen by a user, the second screen is switched to the touch pad mode.
S1003: and acquiring touch operation information acting on the second screen.
S1004: the camera parameters to be adjusted are determined.
S1005: and adjusting the camera parameters to be adjusted according to the touch operation information.
S1006: when the camera application is detected not to be run on the first screen, the second screen is switched from a touch pad mode to a touch response mode.
When the user exits the camera application running on the first screen, for example, when the camera application is switched to a background running mode or is closed, it is determined that the camera application is not running on the first screen, the second screen is switched from the touchpad mode to the touch response mode, and the user can continue to operate the running application program on the second screen.
In addition, in an embodiment of detecting that the camera application is not running on the first screen, when it is detected that the first screen is locked, it may be determined that the camera application is not running on the first screen.
Also, the above-described operation S1006 is not limited to being performed after S1005, and in particular, may be performed after S1002.
It should be noted that, the above steps are parts of detailed description, and reference may be made to the foregoing embodiments, which are not repeated herein.
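The flow S1001 to S1006 amounts to a small mode-switching controller for the second screen. The sketch below is a hedged illustration of that flow; the class, method names, and mode strings are hypothetical, and the real implementation would be driven by the terminal's window manager and touch subsystem:

```python
class SecondScreenController:
    """Illustrative controller for the second screen's two modes."""

    def __init__(self):
        # Default mode: the second screen responds to touch normally,
        # i.e. it runs its own applications.
        self.mode = "touch_response"

    def on_camera_interface_shown(self):
        # S1001/S1002: when the first screen shows the camera's image
        # acquisition interface, switch the second screen to touchpad
        # mode to avoid mis-operating applications running on it.
        if self.mode == "touch_response":
            self.mode = "touchpad"

    def on_camera_closed(self):
        # S1006: when the camera application is no longer running on the
        # first screen (closed, backgrounded, or screen locked), restore
        # normal touch response.
        self.mode = "touch_response"
```

Steps S1003 to S1005 (reading the gesture, determining the parameter, and applying the adjustment) would run while the controller is in touchpad mode.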
Referring to fig. 11, an embodiment of the present application provides an image capturing apparatus 1100, which is used to improve the user experience when a user uses a camera application. Specifically, the apparatus is applied to the above mobile terminal and includes: a display unit 1101, an obtaining unit 1102, and an adjusting unit 1103.
A display unit 1101, configured to display a parameter adjustment interface of the camera device on the second screen when it is monitored that the first screen displays an image capture interface of a camera application.
An obtaining unit 1102, configured to obtain touch operation information input based on the parameter adjustment interface.
An adjusting unit 1103, configured to adjust a camera parameter of the camera device according to the touch operation information.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Referring to fig. 1 and 2 again, based on the above method and apparatus, an embodiment of the present application further provides a mobile terminal 100, which includes an electronic body 10, where the electronic body 10 includes a housing 12, and a first screen 120 and a second screen 130 disposed on the housing 12. The housing 12 may be made of metal, such as steel or aluminum alloy. In this embodiment, the first screen 120 generally includes the first display screen 121, and may also include a circuit for responding to a touch operation performed on the first display screen 121, and the second screen 130 also generally includes the second display screen 131, and may also include a circuit for responding to a touch operation performed on the second display screen 131. The Display may be a Liquid Crystal Display (LCD), which in some embodiments is also a touch screen.
Referring to fig. 12, in an actual application scenario, the mobile terminal 100 may be used as a smartphone terminal. In this case, the electronic body 10 generally further includes one or more processors 102 (only one is shown in the figure), a memory 104, an RF (Radio Frequency) module 106, an audio circuit 110, a sensor 114, an input module 118, and a power module 122. It will be understood by those skilled in the art that the structure shown in fig. 12 is merely illustrative and is not intended to limit the structure of the electronic body 10. For example, the electronic body 10 may also include more or fewer components than shown in fig. 12, or have a different configuration from that shown in fig. 12.
Those skilled in the art will appreciate that all other components are peripheral devices with respect to the processor 102, and the processor 102 is coupled to the peripheral devices through a plurality of peripheral interfaces 124. The peripheral interface 124 may be implemented based on the following standards: Universal Asynchronous Receiver/Transmitter (UART), General Purpose Input/Output (GPIO), Serial Peripheral Interface (SPI), and Inter-Integrated Circuit (I2C), but is not limited to these standards. In some examples, the peripheral interface 124 may comprise only a bus; in other examples, the peripheral interface 124 may also include other components, such as one or more controllers, for example, a display controller for interfacing with the display screens (including the first display screen and the second display screen) or a memory controller for interfacing with a memory. These controllers may also be separate from the peripheral interface 124 and integrated within the processor 102 or a corresponding peripheral.
The memory 104 may be used to store software programs and modules, and the processor 102 executes various functional applications and data processing by executing the software programs and modules stored in the memory 104. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid state memory. In some examples, the memory 104 may further include memory remotely located from the processor 102, which may be connected to the electronic body portion 10 or display screen via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The RF module 106 is configured to receive and transmit electromagnetic waves and to convert between electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices. The RF module 106 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, memory, and so forth. The RF module 106 may communicate with various networks such as the internet, an intranet, or a wireless network, or with other devices via a wireless network. The wireless network may comprise a cellular telephone network, a wireless local area network, or a metropolitan area network. The wireless network may use various communication standards, protocols, and technologies, including, but not limited to, Global System for Mobile Communication (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Wireless Fidelity (Wi-Fi) (e.g., the Institute of Electrical and Electronics Engineers (IEEE) standards IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (WiMAX), protocols for e-mail, instant messaging, and short message communication, as well as any other suitable communication protocol, even including protocols that have not yet been developed.
The audio circuit 110, the earpiece 101, the sound jack 103, and the microphone 105 collectively provide an audio interface between a user and the electronics body portion 10. Specifically, the audio circuit 110 receives sound data from the processor 102, converts the sound data into an electrical signal, and transmits the electrical signal to the earpiece 101. The earpiece 101 converts the electrical signal into sound waves that can be heard by the human ear. The audio circuitry 110 also receives electrical signals from the microphone 105, converts the electrical signals to sound data, and transmits the sound data to the processor 102 for further processing. Audio data may be retrieved from the memory 104 or through the RF module 106. In addition, audio data may also be stored in the memory 104 or transmitted through the RF module 106.
The sensor 114 is disposed in the electronic body portion 10 or the first screen 120 or the second screen 130, and examples of the sensor 114 include, but are not limited to: acceleration sensor 114F, gyroscope 114G, magnetometer, and other sensors.
Specifically, the gyroscope 114G may detect the attitude of the mobile terminal, and thereby determine the orientation of the display screens, the screen state, and the like. The acceleration sensor 114F can detect the magnitude of acceleration in various directions (generally three axes) and, when stationary, the magnitude and direction of gravity; it can be used for applications that recognize the attitude of the mobile terminal (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tap detection). In addition, the electronic body 10 may also be provided with other sensors such as a barometer, a hygrometer, and a thermometer, which are not described again here.
In this embodiment, the input module 118 may include the touch screen 109 disposed on the first screen 120. The touch screen 109 may collect touch operations of the user on or near it (for example, operations performed with a finger, a stylus, or any other suitable object or accessory), so that the user's touch gesture may be obtained and the corresponding connected device driven according to a preset program. Optionally, the touch screen 109 may include a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, and sends the coordinates to the processor 102; it can also receive and execute commands sent by the processor 102. The touch detection function of the touch screen 109 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch screen 109, in other variations, the input module 118 may include other input devices, such as keys 107. The keys 107 may include, for example, character keys for inputting characters and control keys for activating control functions. Examples of such control keys include a "back to home" key, a power on/off key, and the like.
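The touch-detection-to-controller pipeline above can be illustrated with a minimal coordinate conversion. This is a hypothetical sketch only: the normalization scheme (raw signals in 0.0 to 1.0 per axis) and the rounding are assumptions, not the controller's actual behavior:

```python
def to_touch_point(raw_signal, screen_width, screen_height):
    """Convert a raw, normalized touch signal (nx, ny), each in the
    range 0.0-1.0, into pixel coordinates on the screen, as a touch
    controller might before handing the point to the processor."""
    nx, ny = raw_signal
    return (round(nx * (screen_width - 1)),
            round(ny * (screen_height - 1)))
```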
The first screen 120 and the second screen 130 are used for displaying information input by the user, information provided to the user, and various graphical user interfaces of the electronic body section 10, which may be composed of graphics, text, icons, numbers, videos, and any combination thereof, and in one example, the touch screen 109 may be disposed on the display screen so as to be integrated with the display screen, wherein one or both of the first display screen and the second display screen may be disposed with the touch screen.
The power module 122 is used to provide power supply to the processor 102 and other components. Specifically, the power module 122 may include a power management system, one or more power sources (e.g., batteries or ac power), a charging circuit, a power failure detection circuit, an inverter, a power status indicator light, and any other components associated with the generation, management, and distribution of power within the electronics body portion 10 or display module.
The mobile terminal further comprises a locator 119, the locator 119 being configured to determine an actual location of the mobile terminal. In this embodiment, the locator 119 implements the positioning of the mobile terminal by using a positioning service, which is understood to be a technology or a service for obtaining the position information (e.g., longitude and latitude coordinates) of the mobile terminal by using a specific positioning technology and marking the position of the positioned object on an electronic map.
It should be understood that the above-described mobile terminal is not limited to a smartphone terminal, but it should refer to a computer device that can be used in mobility. Specifically, the mobile terminal refers to a mobile computer device equipped with an intelligent operating system, and the mobile terminal includes, but is not limited to, a smart phone, a smart watch, a tablet computer, and the like.
Referring to fig. 3 again, based on the above method and apparatus, another embodiment of the present application further provides a mobile terminal 200, which includes an electronic body portion, where the electronic body portion includes a first housing 22 and a second housing 23, a first screen 220 disposed on the first housing 22, and a second screen 230 disposed on the second housing 23. The first housing 22 and the second housing 23 can be made of metal, such as steel or aluminum alloy. In this embodiment, the first screen 220 generally includes a first display 221 and may also include a circuit for responding to touch operations performed on the first display 221; likewise, the second screen 230 generally includes a second display 231 and may also include a circuit for responding to touch operations performed on the second display 231. The display may be a Liquid Crystal Display (LCD), which in some embodiments is also a touch screen.
In addition, in an actual application scenario, the mobile terminal 200 may be used as a smartphone terminal, where the descriptions of the first screen 220 and the second screen 230 may refer to the first screen 120 and the second screen 130 in fig. 12, and the remaining devices may also refer to fig. 12, which is not described again here.
To sum up, the embodiment of the present application provides an image acquisition method, an image acquisition device, a mobile terminal, and a computer readable medium, and when it is monitored that a camera application is running on a first screen and a second screen is in a touchpad mode, a touch gesture acting on the second screen is acquired. And the second screen is used as a touch pad to receive touch gestures in the touch pad mode. Then, determining the camera parameters to be adjusted; and adjusting the camera parameters to be adjusted according to the touch gesture. Therefore, when the user uses the camera application on the first screen, the parameter of the camera is adjusted through the second screen without using the first screen to adjust the parameter of the camera, so that the influence on the use of the camera application by the user is avoided, and the influence on the framing picture caused by the operation on the first screen is also avoided.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two or three, unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (mobile terminal) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be implemented by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, performs one of or a combination of the steps of the method embodiments. In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware, or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (14)

1. An image acquisition method applied to a mobile terminal, wherein the mobile terminal comprises a first screen, a second screen and a camera device, the method comprising:
when it is monitored that the first screen displays an image acquisition interface of a camera application, judging whether the second screen is in a touch response mode or a touch pad mode, wherein in the touch response mode, the mobile terminal receives touch operation information through the second screen and controls an application running on the second screen to perform a corresponding operation according to the touch operation information, and in the touch pad mode, the mobile terminal receives the touch operation information through the second screen but does not cause the application running on the second screen to perform the operation corresponding to the touch operation information;
if the second screen is in the touch response mode, switching the second screen to the touch pad mode, and displaying a parameter adjustment interface on the second screen, wherein the parameter adjustment interface is a screen locking interface;
acquiring touch operation information input based on the parameter adjustment interface;
and adjusting camera parameters of the camera device according to the touch operation information.
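The mode check and switch recited in claim 1 can be sketched in a few lines. This is an illustrative sketch only: the names (`SecondScreen`, `prepare_parameter_adjustment`, the mode strings) are assumptions invented for the example, not identifiers from the patent.

```python
from dataclasses import dataclass

# Hypothetical mode labels: in "touch_response" mode touches drive the app
# shown on the second screen; in "touch_pad" mode touches are captured but
# not forwarded to that app.
TOUCH_RESPONSE = "touch_response"
TOUCH_PAD = "touch_pad"

@dataclass
class SecondScreen:
    mode: str = TOUCH_RESPONSE
    interface: str = "home"

def prepare_parameter_adjustment(first_screen_shows_capture_ui: bool,
                                 second: SecondScreen) -> bool:
    """When the first screen shows the camera's image acquisition interface,
    force the second screen into touch pad mode and display the
    (lock-screen) parameter adjustment interface. Returns True if the
    second screen is now ready to collect touch operation information."""
    if not first_screen_shows_capture_ui:
        return False
    if second.mode == TOUCH_RESPONSE:
        second.mode = TOUCH_PAD
    second.interface = "parameter_adjustment"
    return True
```

The returned flag stands in for the subsequent steps of the claim (acquiring touch input and adjusting the camera parameters).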
2. The method of claim 1, wherein the adjusting the camera parameters of the camera device according to the touch operation information comprises:
determining the camera parameters of the camera device to be adjusted according to the touch operation information;
and adjusting the camera parameters to be adjusted according to the touch operation information.
3. The method according to claim 2, wherein the determining the camera parameters to be adjusted according to the touch operation information comprises:
acquiring an adjustment instruction input based on the parameter adjustment interface, wherein the adjustment instruction comprises the camera parameter to be adjusted;
and determining the camera parameter to be adjusted based on the adjustment instruction.
4. The method according to claim 2, wherein the determining the camera parameters to be adjusted according to the touch operation information comprises:
and determining, according to a preset correspondence between camera parameters and touch operation information, the camera parameter corresponding to the touch operation information acquired through the second screen, and taking that camera parameter as the camera parameter to be adjusted.
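The preset correspondence of claim 4 amounts to a lookup table from a kind of touch operation to the camera parameter it adjusts. A minimal sketch follows; the gesture names and parameter names are invented for illustration and are not specified by the patent.

```python
from typing import Optional

# Hypothetical preset correspondence between touch operation information
# (here reduced to a gesture kind) and camera parameters.
GESTURE_TO_PARAMETER = {
    "horizontal_swipe": "exposure_time",
    "vertical_swipe": "sensitivity",
    "pinch": "focal_length",
}

def parameter_to_adjust(gesture_kind: str) -> Optional[str]:
    # Unknown gestures map to no parameter, so the camera settings
    # remain untouched.
    return GESTURE_TO_PARAMETER.get(gesture_kind)
```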
5. The method according to claim 2, wherein the determining the camera parameters of the camera device to be adjusted according to the touch operation information comprises:
judging whether the image acquisition interface on the first screen is a preview interface;
if the image acquisition interface is a preview interface, judging whether the touch operation information is a preset sliding gesture;
and if the touch operation information is a preset sliding gesture, determining that the camera parameter to be adjusted is a focal length.
6. The method of claim 5, further comprising:
if not, judging whether a current interface running on the first screen corresponds to a camera parameter adjustment interface;
and if so, taking the camera parameter corresponding to the camera parameter adjustment interface as the camera parameter to be adjusted.
7. The method according to any one of claims 1 to 6, wherein the touch operation information is a touch gesture, and the adjusting the camera parameter of the camera device according to the touch operation information includes:
adjusting a camera parameter of the camera device according to the sliding direction and the sliding length of the touch gesture, wherein the camera parameter includes at least one of exposure time, sensitivity or focal length.
8. The method of claim 7, wherein adjusting the camera parameters of the camera device according to the sliding direction and the sliding length of the touch gesture comprises:
if the touch gesture slides from a first side edge of the second screen toward a second side edge, increasing the camera parameter according to the sliding length, wherein the first side edge and the second side edge are opposite side edges;
and if the touch gesture slides from the second side edge of the second screen toward the first side edge, decreasing the camera parameter according to the sliding length.
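The direction- and length-dependent adjustment of claims 7 and 8 can be sketched as a signed delta along the slide axis. The scale factor and clamping range below are illustrative assumptions; the claims specify only that sliding in one direction increases the parameter and sliding in the opposite direction decreases it, in proportion to the sliding length.

```python
def adjust_parameter(value: float, start_x: float, end_x: float,
                     units_per_pixel: float = 0.01,
                     lo: float = 0.0, hi: float = 100.0) -> float:
    """Adjust a camera parameter (e.g. exposure time, sensitivity, or
    focal length) from a swipe on the second screen.

    A swipe from the first side edge toward the second (end_x > start_x)
    increases the parameter; the opposite direction decreases it. The
    units_per_pixel scale and [lo, hi] clamp are assumed values chosen
    for the example.
    """
    delta = (end_x - start_x) * units_per_pixel  # signed sliding length
    return max(lo, min(hi, value + delta))
```

For example, a 200-pixel swipe toward the second side edge raises a parameter value of 50.0 to 52.0 under the assumed scale, while the reverse swipe lowers it to 48.0.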
9. The method of claim 1, wherein the mobile terminal comprises a first side and a second side, wherein the first screen is disposed on the first side, wherein the second screen is disposed on the second side, and wherein the first side and the second side are two opposite sides facing away from each other.
10. An image acquisition device, applied to a mobile terminal including a first screen, a second screen and a camera device, the device comprising:
a display unit, used for judging whether the second screen is in a touch response mode or a touch pad mode when it is monitored that the first screen displays an image acquisition interface of a camera application, wherein in the touch response mode, the mobile terminal receives touch operation information through the second screen and controls an application running on the second screen to perform a corresponding operation according to the touch operation information, and in the touch pad mode, the mobile terminal receives the touch operation information through the second screen but does not cause the application running on the second screen to perform the operation corresponding to the touch operation information; and, if the second screen is in the touch response mode, switching the second screen to the touch pad mode and displaying a parameter adjustment interface on the second screen, wherein the parameter adjustment interface is a screen locking interface;
the acquisition unit is used for acquiring touch operation information input based on the parameter adjustment interface;
and the adjusting unit is used for adjusting the camera parameters of the camera device according to the touch operation information.
11. The apparatus of claim 10, wherein the touch operation information is a touch gesture, and the adjustment unit is specifically configured to: adjusting a camera parameter of the camera device according to the sliding direction and the sliding length of the touch gesture, wherein the camera parameter includes at least one of exposure time, sensitivity or focal length.
12. A mobile terminal comprising a first screen, a second screen, a camera device, a memory, and a processor, the first screen, the second screen, and the memory all coupled with the processor; the memory stores instructions that, when executed by the processor, cause the processor to perform the method of any of claims 1-9.
13. The mobile terminal of claim 12, comprising a first side and a second side, wherein the first screen is disposed on the first side, and the second screen is disposed on the second side, and wherein the first side and the second side are two opposite sides facing away from each other.
14. A computer-readable medium having program code executable by a processor, wherein the program code causes the processor to perform the method of any of claims 1-9.
CN201810340133.1A 2018-04-16 2018-04-16 Image acquisition method and device, mobile terminal and computer readable medium Active CN108769506B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810340133.1A CN108769506B (en) 2018-04-16 2018-04-16 Image acquisition method and device, mobile terminal and computer readable medium
PCT/CN2019/080957 WO2019201088A1 (en) 2018-04-16 2019-04-02 Image collection method and apparatus, mobile terminal, and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810340133.1A CN108769506B (en) 2018-04-16 2018-04-16 Image acquisition method and device, mobile terminal and computer readable medium

Publications (2)

Publication Number Publication Date
CN108769506A CN108769506A (en) 2018-11-06
CN108769506B true CN108769506B (en) 2020-04-21

Family

ID=64010664

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810340133.1A Active CN108769506B (en) 2018-04-16 2018-04-16 Image acquisition method and device, mobile terminal and computer readable medium

Country Status (2)

Country Link
CN (1) CN108769506B (en)
WO (1) WO2019201088A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108769506B (en) * 2018-04-16 2020-04-21 Oppo广东移动通信有限公司 Image acquisition method and device, mobile terminal and computer readable medium
CN109639971B (en) * 2018-12-17 2021-01-08 维沃移动通信有限公司 Shooting focal length adjusting method and terminal equipment
CN112154643B (en) * 2019-08-20 2022-08-02 深圳市大疆创新科技有限公司 Motion camera, self-timer control method and device, movable platform and storage medium
CN110488921A (en) * 2019-08-20 2019-11-22 Oppo广东移动通信有限公司 Electronic equipment and its control method
CN110968229A (en) * 2019-11-29 2020-04-07 维沃移动通信有限公司 Wallpaper setting method and electronic equipment
CN113676709B (en) * 2020-05-14 2023-10-27 聚好看科技股份有限公司 Intelligent projection equipment and multi-screen display method
CN116069156A (en) * 2021-11-01 2023-05-05 华为技术有限公司 Shooting parameter adjusting method, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104881265A (en) * 2015-06-03 2015-09-02 上海华豚科技有限公司 Double-screen equipment capable of realizing image recording
CN106354306A (en) * 2016-08-26 2017-01-25 青岛海信电器股份有限公司 Response method and device for touching operation
CN106445340A (en) * 2016-09-21 2017-02-22 青岛海信电器股份有限公司 Method and device for displaying stereoscopic image by double-screen terminal
CN107317963A (en) * 2017-05-24 2017-11-03 努比亚技术有限公司 A kind of double-camera mobile terminal control method, mobile terminal and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100071754A (en) * 2008-12-19 2010-06-29 삼성전자주식회사 Photographing method according to multi input scheme through touch and key manipulation and photographing apparatus using the same
KR102477522B1 (en) * 2015-09-09 2022-12-15 삼성전자 주식회사 Electronic device and method for adjusting exposure of camera of the same
CN105141852B (en) * 2015-10-10 2018-08-28 珠海市横琴新区龙族科技有限公司 Control method under Dual-band Handy Phone screening-mode and control device
CN106453941B (en) * 2016-10-31 2019-10-01 努比亚技术有限公司 Double screen operating method and mobile terminal
CN106843730A (en) * 2017-01-19 2017-06-13 宇龙计算机通信科技(深圳)有限公司 A kind of mobile terminal double screen changing method and mobile terminal
CN108769506B (en) * 2018-04-16 2020-04-21 Oppo广东移动通信有限公司 Image acquisition method and device, mobile terminal and computer readable medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104881265A (en) * 2015-06-03 2015-09-02 上海华豚科技有限公司 Double-screen equipment capable of realizing image recording
CN106354306A (en) * 2016-08-26 2017-01-25 青岛海信电器股份有限公司 Response method and device for touching operation
CN106445340A (en) * 2016-09-21 2017-02-22 青岛海信电器股份有限公司 Method and device for displaying stereoscopic image by double-screen terminal
CN107317963A (en) * 2017-05-24 2017-11-03 努比亚技术有限公司 A kind of double-camera mobile terminal control method, mobile terminal and storage medium

Also Published As

Publication number Publication date
WO2019201088A1 (en) 2019-10-24
CN108769506A (en) 2018-11-06

Similar Documents

Publication Publication Date Title
CN108769506B (en) Image acquisition method and device, mobile terminal and computer readable medium
US11861161B2 (en) Display method and apparatus
US11928312B2 (en) Method for displaying different application shortcuts on different screens
US11237703B2 (en) Method for user-operation mode selection and terminals
CN108710456B (en) Application icon processing method and device and mobile terminal
CN108762859B (en) Wallpaper display method and device, mobile terminal and storage medium
CN108234875B (en) Shooting display method and device, mobile terminal and storage medium
EP3640732B1 (en) Method and terminal for acquire panoramic image
CN108777731B (en) Key configuration method and device, mobile terminal and storage medium
CN109067964B (en) Camera control method and device, mobile terminal and storage medium
CN108415641B (en) Icon processing method and mobile terminal
CN108366163B (en) Control method and device for camera application, mobile terminal and computer readable medium
CN109040351B (en) Camera control method and device, mobile terminal and storage medium
CN106445340B (en) Method and device for displaying stereoscopic image by double-screen terminal
CN110417960B (en) Folding method of foldable touch screen and electronic equipment
WO2018219275A1 (en) Focusing method and device, computer-readable storage medium, and mobile terminal
CN112424725A (en) Application program control method and electronic equipment
CN109101119B (en) Terminal control method and device and mobile terminal
CN110825474B (en) Interface display method and device and electronic equipment
CN109104521B (en) Method and device for correcting approaching state, mobile terminal and storage medium
CN111385525B (en) Video monitoring method, device, terminal and system
CN110221882B (en) Display method, display device, mobile terminal and storage medium
CN108762641B (en) Text editing method and terminal equipment
CN108650413B (en) Projection method, projection device, mobile terminal and storage medium
CN110362699B (en) Picture searching method and device, mobile terminal and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant