CN109167894B - Camera control method and device, mobile terminal and storage medium - Google Patents


Info

Publication number
CN109167894B
Authority
CN
China
Prior art keywords
camera
application
interface
terminal
call
Prior art date
Legal status
Active
Application number
CN201810623017.0A
Other languages
Chinese (zh)
Other versions
CN109167894A (en)
Inventor
姚娟
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810623017.0A
Publication of CN109167894A
Application granted
Publication of CN109167894B

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/57 - Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/62 - Control of parameters via user interfaces

Abstract

Embodiments of the invention disclose a camera control method and device, a mobile terminal, and a storage medium. The camera control method is applied to a mobile terminal that includes a terminal body and a camera disposed in the terminal body, the camera being able to move out of and retract into the terminal body. The method includes: monitoring, in real time, a running application that needs to call the camera; controlling the camera to move out of the terminal body when a first-type change in the state of the application is detected; and controlling the camera to retract into the terminal body when a second-type change in the state of the application is detected. The method automatically controls the movement of the camera according to the state of the application and improves the user experience.

Description

Camera control method and device, mobile terminal and storage medium
Technical Field
The present invention relates to the field of mobile terminal technologies, and in particular, to a camera control method and apparatus, a mobile terminal, and a storage medium.
Background
With the development of science and technology, mobile terminals (such as mobile phones) are widely used, offer more and more functions, and have become one of the essential electronic products of daily life. At present, however, components such as the camera of a mobile terminal occupy space on the screen side of the device, so the screen-to-body ratio is not high and the user experience suffers.
Disclosure of Invention
In view of the above problems, the present invention provides a camera control method and device, a mobile terminal, and a storage medium, which control the mobile terminal to move a retractable camera out or retract it according to the state of an application, thereby improving the user experience while allowing the mobile terminal to have a higher screen-to-body ratio.
In a first aspect, an embodiment of the present invention provides a camera control method applied to a mobile terminal, where the mobile terminal includes a terminal body and a camera disposed in the terminal body, and the camera can be moved out of and retracted into the terminal body. The method includes: monitoring, in real time, a running application that needs to call the camera; controlling the camera to move out of the terminal body when a first-type change in the state of the application is detected; and controlling the camera to retract into the terminal body when a second-type change in the state of the application is detected.
In a second aspect, an embodiment of the present invention provides a camera control method applied to a mobile terminal, where the mobile terminal includes a terminal body and a camera disposed in the terminal body, and the camera can be moved out of and retracted into the terminal body. The method includes: running a first application in the foreground of the mobile terminal and obtaining a first request of the first application, where the first request is used to call the camera; controlling the camera to move out of the terminal body according to the first request and starting the camera; and, when the first application is switched to the background, closing the camera and controlling the camera to retract into the terminal body.
In a third aspect, an embodiment of the present invention provides a camera control device applied to a mobile terminal, where the mobile terminal includes a terminal body and a camera disposed in the terminal body, and the camera can be moved out of and retracted into the terminal body. The device includes an application monitoring module, a first control module, and a second control module. The application monitoring module is configured to monitor, in real time, a running application that needs to call the camera; the first control module is configured to control the camera to move out of the terminal body when a first-type change in the state of the application is detected; and the second control module is configured to control the camera to retract into the terminal body when a second-type change in the state of the application is detected.
In a fourth aspect, an embodiment of the present invention provides a camera control device applied to a mobile terminal, where the mobile terminal includes a terminal body and a camera disposed in the terminal body, and the camera can be moved out of and retracted into the terminal body. The device includes a request acquisition module, a move-out control module, and a retraction control module. The request acquisition module is configured to run a first application in the foreground of the mobile terminal and obtain a first request of the first application, where the first request is used to call the camera; the move-out control module is configured to control the camera to move out of the terminal body according to the first request and start the camera; and the retraction control module is configured to close the camera and control the camera to retract into the terminal body when the first application is switched to the background.
In a fifth aspect, an embodiment of the present invention provides a mobile terminal, where the mobile terminal includes a terminal body and a camera disposed in the terminal body, the camera is movable out of and retractable with respect to the terminal body, the terminal body includes a memory and a processor, the camera and the memory are coupled to the processor, the memory stores instructions, and when the instructions are executed by the processor, the processor executes the camera control method provided in the first aspect.
In a sixth aspect, an embodiment of the present invention provides a mobile terminal, where the mobile terminal includes a terminal body and a camera disposed in the terminal body, the camera is movable out of and retractable with respect to the terminal body, the terminal body includes a memory and a processor, the camera and the memory are coupled to the processor, the memory stores instructions, and when the instructions are executed by the processor, the processor executes the camera control method provided in the second aspect.
In a seventh aspect, an embodiment of the present invention further provides a computer-readable storage medium having a program code executable by a processor, where the program code causes the processor to execute the camera control method provided in the first aspect.
In an eighth aspect, an embodiment of the present invention further provides a computer-readable storage medium having a program code executable by a processor, where the program code causes the processor to execute the camera control method provided in the second aspect.
Compared with the prior art, the camera control method and device, the mobile terminal, and the storage medium provided by the invention monitor, in real time, a running application that needs to call the camera, control the camera to move out of the terminal body when a first-type change in the state of the application is detected, and control the camera to retract into the terminal body when a second-type change in the state of the application is detected. The camera is thus moved automatically according to changes in the state of the application, which reduces user operations and improves the user experience, while also allowing the mobile terminal to have a higher screen-to-body ratio.
These and other aspects of the invention are apparent from and will be elucidated with reference to the embodiments described hereinafter.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a functional device of a mobile terminal according to an embodiment of the present application retracted into the terminal body;
fig. 2 is a schematic diagram of the functional device of fig. 1 moved out of the terminal body;
fig. 3 is a schematic diagram of a first perspective of a functional device of another mobile terminal according to an embodiment of the present application moved out of the terminal body;
fig. 4 is a schematic diagram of a first perspective of a functional device of a mobile terminal according to an embodiment of the present application moved out of the terminal body;
fig. 5 is a schematic diagram of a second perspective of the functional device of fig. 3 moved out of the terminal body;
fig. 6 is a schematic diagram of a first perspective of the functional device of fig. 3 retracted into the terminal body;
fig. 7 is a schematic diagram of a second perspective of the functional device of fig. 3 retracted into the terminal body;
fig. 8 is a flowchart illustrating a camera control method according to an embodiment of the present application;
fig. 9 is a flowchart illustrating a camera control method according to another embodiment of the present application;
fig. 10 is a flowchart illustrating a camera control method according to another embodiment of the present application;
fig. 11 shows a first interface schematic diagram of a mobile terminal provided in the embodiment of the present application;
fig. 12 is a schematic diagram illustrating a second interface of a mobile terminal according to an embodiment of the present application;
fig. 13 is a schematic diagram illustrating a third interface of a mobile terminal according to an embodiment of the present application;
fig. 14 is a schematic diagram illustrating a fourth interface of a mobile terminal according to an embodiment of the present application;
fig. 15 is a schematic diagram illustrating a fifth interface of a mobile terminal according to an embodiment of the present application;
fig. 16 is a schematic diagram illustrating a sixth interface of a mobile terminal according to an embodiment of the present application;
fig. 17 is a schematic diagram illustrating a seventh interface of the mobile terminal provided in the embodiment of the present application;
fig. 18 is a schematic diagram illustrating an eighth interface of a mobile terminal according to an embodiment of the present application;
fig. 19 is a flowchart illustrating a camera control method according to still another embodiment of the present application;
fig. 20 is a block diagram illustrating a configuration of a camera control device according to an embodiment of the present application;
fig. 21 is a block diagram showing a configuration of a camera control device according to still another embodiment of the present application;
fig. 22 is a block diagram illustrating a configuration of a mobile terminal according to an embodiment of the present application for executing a camera control method according to the embodiment of the present application;
fig. 23 is a block diagram illustrating a mobile terminal according to another embodiment of the present application, configured to execute a camera control method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In a mobile terminal such as a mobile phone or a tablet computer, the display screen is generally used to display content such as text, pictures, icons, or videos. With the development of touch technologies, more and more of the display screens provided on mobile terminals are touch display screens; when a touch display screen is provided and a touch operation by the user, such as dragging, clicking, double-clicking, or sliding, is detected on it, the terminal can respond to that touch operation.
As users demand ever higher definition and fineness of the displayed content, more mobile terminals adopt larger touch display screens to achieve a full-screen display effect. However, in the process of fitting a large touch display screen, it was found that functional devices arranged at the front of the mobile terminal, such as a front camera, a proximity light sensor, and a receiver, limit the area to which the touch display screen can extend.
Generally, a mobile terminal includes a front panel, a rear cover, and a bezel. The front panel consists of a forehead area, a middle screen area, and a lower key area. The forehead area usually accommodates the sound outlet of the receiver and functional devices such as the front camera, the middle screen area accommodates the touch display screen, and the lower key area carries one to three physical keys. With the development of the technology, the lower key area has gradually been eliminated, and the physical keys originally arranged there have been replaced by virtual keys on the touch display screen.
The functional devices arranged in the forehead area, such as the receiver sound outlet and the front camera, are important for supporting the functions of the mobile phone and cannot easily be removed, so it is very difficult to extend the display area of the touch display screen to cover the forehead area. After some research, the inventor found that functional devices such as the front camera, the proximity light sensor, and the earpiece may be disposed inside the terminal body of the mobile terminal and configured to rotate or slide, so that the camera can be moved out of and retracted into the terminal body: it is exposed outside the terminal body when in use and hidden inside it otherwise.
However, after the mounting position of the movable functional device has been determined, the functional device is usually not coordinated with the running state of the applications on the mobile terminal, and in some cases the user has to control the functional device manually; for example, the camera cannot be controlled automatically according to changes in the state of an application, which requires many user operations and degrades the user experience. Accordingly, the inventors propose, in the present application, a camera control method and device, a mobile terminal, and a storage medium that control the movement of the camera according to the application state.
The hardware environment of the mobile terminal to which the present application relates will be described first.
As shown in fig. 1, in one form, the mobile terminal 100 includes a terminal body 110 and a functional device 120. The functional device 120 can slide into the terminal body 110, where it is hidden, or slide out of the terminal body 110 and be exposed from the top of the terminal body 110. The functional device 120 includes, but is not limited to, a camera; there may be one camera, or two or more cameras. Each camera may face the front of the mobile terminal, the back of the mobile terminal, or another direction. The functional device 120 may further include a light supplement device. As an example, as shown in fig. 2, one camera 121 is disposed facing the front of the mobile terminal, and a camera 121 and a light supplement device 122 are disposed facing the back of the mobile terminal, the camera 121 and the light supplement device 122 being arranged in sequence along the sliding-out direction of the functional device. Here, a camera 121 facing the front of the mobile terminal means that the image-capturing portion of the camera faces the front of the mobile terminal, and a light supplement device facing the front of the mobile terminal means that its emitting portion faces the front of the mobile terminal; the same applies to devices facing the back of the mobile terminal. It can be understood that sliding the functional device 120 relative to the terminal body 110 moves the functional device 120 out of or into the terminal body 110: the camera 121 sliding out of the terminal body 110 is the camera moving out of the terminal body 110, and the camera 121 sliding into the terminal body 110 is the camera retracting into the terminal body 110.
Alternatively, referring to fig. 3 and 4, the mobile terminal 100 includes a terminal body 110 and a slider 130, and the functional device 120 is disposed on the slider 130. The slider 130 can slide relative to the terminal body 110, so that the functional device 120 either moves out of the terminal body 110 and is exposed with respect to it (as shown in fig. 3, 4, and 5) or retracts into the terminal body 110 and is hidden inside it (as shown in fig. 6 and 7). Note that, in the following description, the functional device being moved out of the terminal body 110 may refer to the functional device 120 being in the state shown in fig. 2, 3, 4, or 5, and the functional device being retracted into the terminal body 110 may refer to the state shown in fig. 1, 6, or 7.
Of course, the forms in which the functional device 120 moves out of and retracts into the terminal body 110 are not limited to those shown in fig. 1 to 7.
The camera control method and device, the mobile terminal, and the storage medium provided by the embodiments of the application monitor, in real time, a running application that needs to call the camera and control the camera to move out of the terminal body according to the monitored change in the state of the application, so that the camera is controlled automatically and the user experience is improved. The embodiments of the present application are described in detail below with reference to the accompanying drawings.
In an embodiment, referring to fig. 8, fig. 8 is a flowchart illustrating a camera control method according to an embodiment of the present application. In the camera control method, a running application that needs to call the camera is monitored in real time, and the camera is controlled to move out and retract according to the monitored change in the state of the application, which achieves automatic control of the camera and improves the user experience.
In a specific embodiment, the camera control method is applied to the camera control device 200 shown in fig. 20 and to a mobile terminal equipped with the camera control device 200. The specific process of this embodiment is described below taking a mobile terminal as an example; it can be understood that the mobile terminal in this embodiment may be a smartphone, a tablet computer, a wearable electronic device, or the like, which is not specifically limited here.
As will be described in detail with respect to the flow shown in fig. 8, the camera control method may specifically include the following steps:
step S110: and monitoring the application needing to call the camera in real time in operation.
After the camera of the mobile terminal is arranged in the terminal body, when the mobile terminal is used, the camera is moved out from the interior of the terminal body when being used every time, and after the camera is used, the camera is retracted into the terminal body.
The camera is usually used in connection with the application calling camera of the mobile terminal, so that the running applications needing to call the camera can be monitored in real time, and the movement of the camera is controlled according to the change of the running state of the applications. The application needing to call the camera can be determined according to whether the application has the call authority of the camera, namely the application with the call authority of the camera is the application needing to call the camera, and the application without the call authority of the camera is the application not needing to call the camera.
In this embodiment of the application, the application that needs to call the camera may be a system application, such as a system camera, a setting list, a system browser, system unlocking, and the like, or may also be a third-party application, such as an application camera, a live application, a social application, a third-party browser, an AR application, a payment application, other applications related to code scanning, and the like. Of course, the system application and the third-party application are only examples, and do not limit the system application and the third-party application in the embodiment of the present application.
In the embodiment of the application, the state of the application can be monitored in real time by using the process management of the system of the mobile terminal. Therefore, the monitoring result of the application state, such as whether the application is running, whether the application has the camera authority, the interface displayed by the application, the running state of the application and the like, can be acquired in real time. Of course, the monitoring results of the states applied above are merely examples, and do not constitute a limitation on the states applied in the embodiments of the present application.
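The embodiment above describes the monitoring in functional terms only. A minimal sketch of one possible realization on an Android-style platform is given below; the class name CameraCallerMonitor, the use of the process list, and the assumption that a system-level component has visibility into other packages are illustrative assumptions rather than anything specified by the patent.

```kotlin
import android.Manifest
import android.app.ActivityManager
import android.content.Context
import android.content.pm.PackageManager

// Hypothetical monitor: an application that holds the camera call permission is
// treated as an application that may need to call the camera (see the paragraph
// above); running packages are taken from the system's process management.
class CameraCallerMonitor(private val context: Context) {

    fun runningAppsThatMayCallCamera(): List<String> {
        val am = context.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
        val pm = context.packageManager
        return am.runningAppProcesses.orEmpty()
            .flatMap { it.pkgList.toList() }
            .distinct()
            .filter { pkg ->
                pm.checkPermission(Manifest.permission.CAMERA, pkg) ==
                    PackageManager.PERMISSION_GRANTED
            }
    }
}
```

Whether such a list is obtained this way or through other process-management facilities is left open by the embodiment.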
Step S120: when a first-type change in the state of the application is detected, control the camera to move out of the terminal body.
It can be understood that when the application enters a state in which the camera needs to be called, the camera must become usable; the camera can therefore be controlled to move out of the terminal body so that the application can complete its task using the moved-out camera.
In this embodiment of the application, when a first-type change in the application state is detected, it is determined that the application has entered a state in which the camera needs to be called, and the camera is therefore controlled to move out of the terminal body. The first-type change may be that the state of the application enters a state in which the camera needs to be called.
In this embodiment of the application, moving the camera out and retracting it can be achieved by controlling a motor that drives the camera. The motor can be controlled to rotate in a first direction to drive the camera out of the terminal body, and to rotate in a second direction opposite to the first direction to drive the camera into the terminal body.
Furthermore, by controlling the rotation speed and rotation time of the motor, the camera can be driven out to, or into, a designated position. It can be understood that the rotation speed of the motor determines how fast the camera moves out or in, and the rotation time determines how long the movement takes.
In addition, while the camera is being moved out of or into the terminal body, the distance between the camera and a surface of the terminal body can be detected, for example the distance to the top surface, so as to determine the position of the camera relative to the terminal body, judge whether the camera has moved into or out of the terminal body, and judge whether it has reached the designated position. Specifically, the distance between the camera and the surface of the terminal body may be determined from the change in resistance of a sliding rheostat, or detected by a distance sensor; the embodiments of the present application do not limit the way in which this distance is detected.
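The motor and position feedback described above are hardware-specific, so the sketch below only captures their behaviour behind hypothetical interfaces; the names CameraMotor, PositionSensor, and RetractableCamera, as well as the default speed, duration, and threshold values, are assumptions for illustration and not a real device API.

```kotlin
// Hypothetical abstraction of the retractable camera module: a motor that rotates
// in a first or second direction at a given speed for a given time, and a sensor
// (e.g. a sliding rheostat or a distance sensor) reporting the distance between
// the camera and the top surface of the terminal body.
enum class Direction { EXTEND, RETRACT }

interface CameraMotor {
    fun rotate(direction: Direction, speedRpm: Int, durationMs: Long)
}

interface PositionSensor {
    fun distanceToTopSurfaceMm(): Float
}

class RetractableCamera(
    private val motor: CameraMotor,
    private val sensor: PositionSensor,
    private val extendedDistanceMm: Float = 8.0f   // assumed "designated position"
) {
    fun moveOut(speedRpm: Int = 120, durationMs: Long = 400) {
        motor.rotate(Direction.EXTEND, speedRpm, durationMs)    // first direction
    }

    fun retract(speedRpm: Int = 120, durationMs: Long = 400) {
        motor.rotate(Direction.RETRACT, speedRpm, durationMs)   // second direction
    }

    // The camera counts as moved out once the measured distance reaches the
    // designated position.
    fun isMovedOut(): Boolean = sensor.distanceToTopSurfaceMm() >= extendedDistanceMm
}
```

Later sketches in this description reuse RetractableCamera whenever they need to extend, retract, or query the module.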
Step S130: when a second-type change in the state of the application is detected, control the camera to retract into the terminal body.
It can be understood that when the application enters a state in which the camera no longer needs to be called, the camera does not need to remain usable, so the camera can be controlled to retract into the terminal body, which protects the camera.
In this embodiment of the application, when a second-type change in the application state is detected, it is determined that the application has entered a state in which the camera does not need to be called, and the camera is therefore controlled to retract into the terminal body. The second-type change may be that the state of the application enters a state in which the camera does not need to be called.
In the camera control method provided by this embodiment of the application, a running application that needs to call the camera is monitored in real time; when a first-type change in the state of the application is detected, the camera is controlled to move out of the terminal body, and when a second-type change is detected, the camera is controlled to retract into the terminal body. The camera is thus moved out and retracted according to the state changes of the application that needs to call it, which meets the user's needs and improves the user experience.
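Tying the three steps together, a minimal dispatcher could look as follows; StateChange and CameraStateController are illustrative names, and RetractableCamera is the hypothetical controller sketched above.

```kotlin
// First-type change: the application needs the camera (step S120, extend).
// Second-type change: the application no longer needs it (step S130, retract).
enum class StateChange { FIRST_TYPE, SECOND_TYPE }

class CameraStateController(private val camera: RetractableCamera) {
    fun onMonitoredStateChange(change: StateChange) = when (change) {
        StateChange.FIRST_TYPE -> camera.moveOut()
        StateChange.SECOND_TYPE -> camera.retract()
    }
}
```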
In another embodiment, referring to fig. 9, fig. 9 is a flowchart illustrating a camera control method according to another embodiment of the present application. The flow shown in fig. 9 is described in detail below; the camera control method may specifically include the following steps:
Step S210: monitor, in real time, a running application that needs to call the camera.
Step S220: when a first-type change in the state of the application is detected, detect whether the camera has moved out of the terminal body.
In this embodiment of the application, when a first-type change in the application state is detected, the position of the camera can be detected, specifically whether the camera has moved out of the terminal body, so as to determine whether the camera needs to be controlled to move out.
It can be understood that when the application needs to call the camera, the camera must be outside the terminal body. If the camera is already outside the terminal body, it does not need to be moved out again; it is sufficient to keep it in the moved-out state so that the application can continue to use it. If the camera is not outside the terminal body, it must be controlled to move out so that it becomes usable.
In this embodiment of the application, whether the camera has moved out of the terminal body can be detected by detecting the position of the camera relative to the terminal body. As above, the position of the camera relative to the terminal body may be determined from the change in resistance of a sliding rheostat; the embodiments of the present application do not limit the specific way of detecting this position.
Step S230: if the camera has not moved out of the terminal body, control the camera to move out of the terminal body.
In step S220, if it is detected that the camera has not moved out of the terminal body, the camera is not yet usable and the application cannot use it, so the camera needs to be controlled to move out of the terminal body so that the application can perform the required task with it.
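A sketch of this position check, again based on the hypothetical RetractableCamera introduced earlier, is shown below.

```kotlin
// Steps S220/S230: only drive the motor when the camera is not already outside
// the terminal body; otherwise simply keep the extended state.
fun extendIfNeeded(camera: RetractableCamera) {
    if (!camera.isMovedOut()) {
        camera.moveOut()
    }
}
```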
Step S240: when a second-type change in the state of the application is detected, detect whether the camera has moved out of the terminal body.
In this embodiment of the application, when a second-type change in the application state is detected, the position of the camera can be detected, specifically whether the camera has moved out of the terminal body, so as to determine whether the camera needs to be controlled to retract into the terminal body.
It can be understood that when the application no longer needs to call the camera, the camera should be retracted into the terminal body so as to protect the camera and not affect the appearance of the mobile terminal. If the camera is currently outside the terminal body, it needs to be controlled to retract into the terminal body; if it is not outside the terminal body, it does not need to be retracted, and its current state simply needs to be maintained.
Step S250: if the camera has moved out of the terminal body, control the camera to retract into the terminal body.
In step S240, if it is detected that the camera has moved out of the terminal body, the camera has not yet been retracted, so it needs to be controlled to retract into the terminal body. The camera is then located inside the terminal body, which protects it and does not affect the user's visual experience of the mobile terminal.
In this embodiment of the application, as one way of doing this, controlling the camera to retract into the terminal body if it has moved out may include:
if the camera has moved out of the terminal body, monitoring whether a first-type change in the state of the application occurs within a preset time length; if so, controlling the camera to keep its extended state, and if not, controlling the camera to retract into the terminal body.
It can be understood that when a second-type change has occurred and the camera is outside the terminal body, the camera can be retracted after a certain delay, so that if the camera needs to be called again within a short time, the already moved-out camera can be used directly. Specifically, whether the camera will be needed within a short time is judged by monitoring whether a first-type change in the state of the application occurs within the preset time length. If a first-type change occurs, the camera is kept extended so that the application can use it for the required task; if not, it is determined that the camera will not be called within a short time, and the camera is controlled to retract into the terminal body.
In this way, when the camera needs to be called again shortly afterwards, the camera that has already moved out of the terminal body can be used directly; the number of move-out and retraction movements is reduced, the user experience is improved, and the impact on the service life of the camera is lessened.
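A minimal sketch of this delayed retraction is shown below. The 3-second window is an arbitrary assumption standing in for the "preset time length", and the scheduling via a single-threaded executor is only one possible realization; RetractableCamera is the hypothetical controller from the earlier sketch.

```kotlin
import java.util.concurrent.Executors
import java.util.concurrent.ScheduledFuture
import java.util.concurrent.TimeUnit

// After a second-type change, retraction is postponed; a first-type change that
// arrives within the window cancels it so the extended camera can be reused.
class DelayedRetractor(
    private val camera: RetractableCamera,
    private val delayMs: Long = 3_000   // assumed preset time length
) {
    private val scheduler = Executors.newSingleThreadScheduledExecutor()
    private var pendingRetract: ScheduledFuture<*>? = null

    fun onSecondTypeChange() {
        if (!camera.isMovedOut()) return                  // nothing to retract
        pendingRetract?.cancel(false)
        pendingRetract = scheduler.schedule(
            Runnable { camera.retract() }, delayMs, TimeUnit.MILLISECONDS
        )
    }

    fun onFirstTypeChange() {
        pendingRetract?.cancel(false)                     // keep the extended state
        pendingRetract = null
        if (!camera.isMovedOut()) camera.moveOut()
    }
}
```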
In the camera control method provided by this embodiment of the application, when a first-type change is detected in a running application that needs to call the camera, it is first detected whether the camera has moved out of the terminal body, and the camera is controlled to move out only when it has not, so that it can be used. When a second-type change is detected, it is likewise detected whether the camera has moved out of the terminal body, and the camera is controlled to retract only when it has, so as to protect it. The state of the camera relative to the terminal body is thus controlled automatically according to changes in the state of the application that needs to call the camera, without requiring the user to operate the camera, which improves the user experience and protects the camera.
In another embodiment, referring to fig. 10, fig. 10 is a flowchart illustrating a camera control method according to another embodiment of the present application. The flow shown in fig. 10 is described in detail below; the camera control method may specifically include the following steps:
Step S310: monitor, in real time, a running application that needs to call the camera.
Step S320: when a first-type change in the state of the application is detected, control the camera to move out of the terminal body.
In this embodiment of the application, the first-type change may include: the application enters an interface that needs to call the camera; or the application changes from running in the background to running in the foreground and its current interface is an interface that needs to call the camera.
It can be understood that when the application enters an interface that needs to call the camera, the application needs to call the camera, that is, the camera needs to be moved out of the terminal body, so the camera is controlled to move out. Likewise, when the application changes from running in the background to running in the foreground and the current interface is an interface that needs to call the camera, the application also needs to call the camera, so the camera is controlled to move out of the terminal body so that the application can perform the required task with it.
In this embodiment of the application, the interface that needs to call the camera may be a face authentication interface, a face unlocking interface, a code scanning interface, a photographing interface, a video recording interface, a video call interface, a live broadcast interface, an AR interface, a face identity input interface, or an image-to-text conversion interface.
Further, when the application enters a face identity authentication interface during payment, the camera is controlled to move out of the terminal body to capture a face image for identity authentication, so that the payment can be completed. Fig. 11 is a schematic diagram of an application entering a face authentication interface in which a face image is to be acquired; when the application enters this interface, the camera is controlled to move out of the terminal body, and the face authentication interface after the camera has moved out is shown in fig. 12. Of course, these interface diagrams are only examples and do not limit the face authentication interface in the present application.
When a face unlocking interface of the system or of a third-party application is entered, the camera is likewise controlled to move out of the terminal body to capture a face image for identity authentication, so that the screen or an application program can be unlocked.
When an application enters a scanning interface, for example when a bicycle-sharing application, a browser, a payment application, or a chat application that supports scanning enters an interface for scanning a barcode, the camera needs to be controlled to move out of the terminal body to capture the barcode and identify its information. Of course, the embodiments of the present application do not limit the specific form of the scanning interface; it may be another interface, such as one for scanning a document. Fig. 13 is a schematic diagram of an application entering a barcode scanning interface; when the application enters this interface, the camera is controlled to move out of the terminal body, and the barcode scanning interface after the camera has moved out is shown in fig. 14. Of course, these interface diagrams are only examples and do not limit the scanning interface in the present application.
When the application enters a photographing interface, a video recording interface, or the like, for example in the system camera or a third-party camera application, the camera also needs to be controlled to move out of the terminal body so that the application can use it to capture images of a target scene for storage, recording, and so on. Fig. 15 is a schematic diagram of a photographing interface of an application; when the application enters this interface, the camera is controlled to move out of the terminal body, and the photographing interface after the camera has moved out is shown in fig. 16. Of course, these interface diagrams are only examples and do not limit the photographing interface in the present application.
When the application enters a video call interface, for example when a social application enters a video chat interface, the camera needs to be controlled to move out of the terminal body so that the application can capture images of the user and transmit them to other terminals, achieving the purpose of the video call. Fig. 17 is a schematic diagram of a video chat interface; when the application enters this interface, the camera is controlled to move out of the terminal body, and the video chat interface after the camera has moved out is shown in fig. 18. Of course, these interface diagrams are only examples and do not limit the video chat interface in the present application.
Similarly, when the application enters a live broadcast interface, for example when a third-party live-streaming application enters its live broadcast interface, the camera is controlled to move out of the terminal body to capture images of the scene to be broadcast for transmission, achieving the purpose of live broadcasting.
When the application enters an AR (Augmented Reality) interface, for example when an AR game application enters an AR interface, the camera is also controlled to move out of the terminal body to capture images of the real scene and fuse them with the virtual scene, achieving the purpose of the augmented reality application.
When a face input interface of a third-party application or of a system application is entered, the camera is likewise controlled to move out of the terminal body to capture and store a face image. In addition, when the application enters an image-to-text scanning interface, the camera is controlled to move out of the terminal body to capture an image of the text in the target area for text conversion, obtaining the text in that area.
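The interfaces enumerated above can be summarized as a simple classification. The identifiers below are assumptions; in practice the current interface would be recognized through the process management mentioned earlier (for example, from the foreground activity).

```kotlin
// Interfaces that need the camera (entering one is a first-type change) versus
// interfaces that do not (leaving the first group for one of these is a
// second-type change).
enum class AppInterface {
    FACE_AUTHENTICATION, FACE_UNLOCK, CODE_SCANNING, PHOTOGRAPHING, VIDEO_RECORDING,
    VIDEO_CALL, LIVE_BROADCAST, AR, FACE_ENROLLMENT, TEXT_SCANNING,
    HOME, SETTINGS, CHAT_LIST   // examples of interfaces that do not need the camera
}

private val cameraInterfaces = setOf(
    AppInterface.FACE_AUTHENTICATION, AppInterface.FACE_UNLOCK, AppInterface.CODE_SCANNING,
    AppInterface.PHOTOGRAPHING, AppInterface.VIDEO_RECORDING, AppInterface.VIDEO_CALL,
    AppInterface.LIVE_BROADCAST, AppInterface.AR, AppInterface.FACE_ENROLLMENT,
    AppInterface.TEXT_SCANNING
)

fun needsCamera(current: AppInterface): Boolean = current in cameraInterfaces
```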
In this embodiment of the application, the application entering an interface that needs to call the camera may include: moving, within the same application, from an interface that does not need to call the camera to an interface that does; or switching from an interface of a first application that does not need to call the camera to an interface of a second application that does.
It can be understood that moving, within the same application, from an interface that does not need the camera to one that does means that the camera had not been moved out of the terminal body while the application was on the interface that did not need it, so the camera must be controlled to move out when the application enters the interface that needs it, so that the application can use the camera. Switching from an interface of a first application that does not need the camera to an interface of a second application that does represents switching between different applications: the previous interface belonged to the first application, which did not call the camera, so the camera had not been moved out of the terminal body; therefore, when the terminal switches to the second application and the interface after switching needs to call the camera, the camera must be controlled to move out of the terminal body so that the second application can perform the required task with it.
In this embodiment of the application, the application changing from running in the background to running in the foreground with its current interface needing to call the camera may include: entering the interface of the application that needs to call the camera from the multi-task interface; or entering it from the main interface.
It can be understood that the application may change from background to foreground by being entered through the multi-task interface or through the main interface; on the multi-task interface and the main interface the camera is not called, that is, the camera has not been moved out of the terminal body, so the camera needs to be controlled to move out of the terminal body so that the application can use it for the required task.
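A sketch of detecting the background-to-foreground transition on an Android-style platform is given below, using the standard Application.ActivityLifecycleCallbacks mechanism. Whether the resumed interface needs the camera is delegated to a caller-supplied predicate, for instance one built on the classification sketched above; this wiring is an assumption, not something the patent prescribes.

```kotlin
import android.app.Activity
import android.app.Application
import android.os.Bundle

// Calls onCameraInterfaceForeground() whenever an activity whose interface needs
// the camera comes to the foreground, whether it was reached from the multi-task
// interface or from the main interface.
class ForegroundWatcher(
    private val needsCamera: (Activity) -> Boolean,
    private val onCameraInterfaceForeground: () -> Unit
) : Application.ActivityLifecycleCallbacks {

    override fun onActivityResumed(activity: Activity) {
        if (needsCamera(activity)) onCameraInterfaceForeground()
    }

    // The remaining callbacks are not needed for this sketch.
    override fun onActivityCreated(activity: Activity, savedInstanceState: Bundle?) {}
    override fun onActivityStarted(activity: Activity) {}
    override fun onActivityPaused(activity: Activity) {}
    override fun onActivityStopped(activity: Activity) {}
    override fun onActivitySaveInstanceState(activity: Activity, outState: Bundle) {}
    override fun onActivityDestroyed(activity: Activity) {}
}
```

Such a watcher would be registered once via Application.registerActivityLifecycleCallbacks().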
In this embodiment of the application, when the camera is controlled to move out of the terminal body, the front camera or the rear camera can be opened according to which one the application calls: when the application calls the front camera, the front camera is controlled to open, and when the application calls the rear camera, the rear camera is controlled to open.
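Selecting the front or rear camera can be illustrated with the Camera2 API, assuming an Android-style platform; extending the module itself is handled separately by the hypothetical RetractableCamera controller.

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager

// Returns the id of the front or rear camera, depending on which one the
// application called; the caller then opens that camera after the module has
// been moved out of the terminal body.
fun findCameraId(context: Context, front: Boolean): String? {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    val wanted = if (front) CameraCharacteristics.LENS_FACING_FRONT
                 else CameraCharacteristics.LENS_FACING_BACK
    return manager.cameraIdList.firstOrNull { id ->
        manager.getCameraCharacteristics(id).get(CameraCharacteristics.LENS_FACING) == wanted
    }
}
```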
In this embodiment of the application, after controlling the camera to move out of the terminal body, the camera control method may further include:
step S330: and when the screen extinguishing signal is detected, controlling the camera to withdraw to the terminal main body.
It can be understood that after the camera moves out of the terminal body, namely the camera is in a state of moving to the outside of the terminal body, when a screen-off signal is detected, namely a key signal of a Power key (an on-off key or a Power key) of the mobile terminal is detected, it indicates that the camera is not required to be used at present, and therefore the camera is controlled to be retracted to the terminal body so as to protect the camera.
Of course, after the screen-off signal is detected, the camera is controlled to be retracted to the terminal body for use under any condition, that is, as long as the camera moves out of the terminal body, after the screen-off signal is detected, the camera is controlled to be retracted to the terminal body.
In addition, when the interface of the camera needs to be called before the screen is lightened again, the camera can be controlled to move out of the terminal body again.
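On an Android-style platform the screen-off signal of step S330 could be observed as sketched below; ACTION_SCREEN_OFF is a standard broadcast, while RetractableCamera remains the hypothetical controller from the earlier sketch.

```kotlin
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.content.IntentFilter

// Retracts the camera when the screen is turned off (e.g. via the power key)
// while the camera is outside the terminal body.
class ScreenOffReceiver(private val camera: RetractableCamera) : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        if (intent.action == Intent.ACTION_SCREEN_OFF && camera.isMovedOut()) {
            camera.retract()
        }
    }
}

fun registerScreenOffReceiver(context: Context, camera: RetractableCamera) {
    // ACTION_SCREEN_OFF is only delivered to receivers registered at runtime.
    context.registerReceiver(ScreenOffReceiver(camera), IntentFilter(Intent.ACTION_SCREEN_OFF))
}
```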
In this embodiment of the application, after controlling the camera to move out of the terminal body, the camera control method may further include:
step S340: and when the signal of returning to the main screen is detected, controlling the camera to return to the terminal main body.
It can be understood that after the camera moves out of the terminal body, that is, the camera is in a state of moving out of the terminal body, after the camera detects the signal of returning to the main screen, that is, the key signal of the HOME key (key returning to the main interface) of the mobile terminal is detected, it indicates that the camera is not needed to be used in the current interface, and therefore the camera is controlled to be retracted to the terminal body, so as to protect the camera.
Of course, after detecting the signal of returning to the main screen, the camera is controlled to return to the terminal body for use in any case, that is, as long as the camera moves out of the terminal body, after detecting the signal of returning to the main screen, the camera is controlled to return to the terminal body.
In addition, when the interface of the camera needs to be called before entering through the multitask interface, the camera can be controlled to move out of the terminal body again.
Step S350: when a second-type change in the state of the application is detected, control the camera to retract into the terminal body.
In this embodiment of the application, the second-type change in the state of the application includes: the application moving from an interface that needs to call the camera to an interface that does not; the application finishing the task for which it called the camera; or the application changing from the foreground to the background.
It can be understood that when the application moves from an interface that needs to call the camera to one that does not, no application currently needs to call the camera, that is, the camera should be withdrawn from outside the terminal body, so the camera is controlled to retract into the terminal body, which protects the camera and reduces the impact on the overall appearance of the mobile terminal.
When the application finishes the task for which it called the camera, this likewise indicates that no application currently needs to call the camera, so the camera is controlled to retract into the terminal body, which protects the camera and reduces the impact on the overall appearance of the mobile terminal.
When the application changes from running in the foreground to running in the background, no application currently calls the camera either, so the camera is controlled to retract into the terminal body, which protects the camera and reduces the impact on the overall appearance of the mobile terminal.
Further, the application moving from an interface that needs to call the camera to an interface that does not may include: moving, within the same application, from an interface that needs to call the camera to one that does not; or switching from an interface of a first application that needs to call the camera to an interface of a second application that does not.
That is, the interface that does not need the camera may be reached from the interface that does while the same application keeps running in the foreground, or when switching between different applications. In either case, the camera was in the moved-out state while the interface that needed it was displayed, so when an interface that does not need the camera is entered, the camera needs to be controlled to retract into the terminal body.
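The three kinds of second-type change listed above can be recognized from application events roughly as sketched below; the event model is an assumption about how the platform reports application state, and needsCamera() refers to the classification sketched earlier.

```kotlin
// Events that may be reported for a monitored application; which platform
// callbacks they map to is left open here.
sealed class AppEvent {
    data class InterfaceChanged(val from: AppInterface, val to: AppInterface) : AppEvent()
    object CameraTaskFinished : AppEvent()
    object MovedToBackground : AppEvent()
}

fun isSecondTypeChange(event: AppEvent): Boolean = when (event) {
    is AppEvent.InterfaceChanged ->
        needsCamera(event.from) && !needsCamera(event.to)   // left a camera interface
    AppEvent.CameraTaskFinished -> true                      // task that called the camera ended
    AppEvent.MovedToBackground -> true                       // foreground to background
}
```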
In this embodiment of the application, when the user exits the application that called the camera by using the return key, the camera also needs to be controlled to retract into the terminal body in order to protect it.
In addition, while the camera is outside the terminal body, if it is detected that the user taps the multi-task key, the camera may be left extended; only when the user subsequently selects an interface that does not need to call the camera is the camera controlled to retract into the terminal body. This avoids moving the camera repeatedly and reduces the effect of frequent movement on the camera's service life.
In this embodiment of the application, while the camera is outside the terminal body, if it is detected that the user pulls down the control center, the camera may likewise be left extended; only when the user subsequently chooses to enter an interface that does not need to call the camera is the camera controlled to retract into the terminal body, again avoiding repeated movement and reducing the effect of excessive movement on the camera's service life.
In this embodiment of the application, while the camera is outside the terminal body, if the interface of the application that needs to call the camera is covered by another application (such as an incoming call prompt) so that the application runs in the background, the camera can be controlled to retract into the terminal body. If, however, the interface that needs to call the camera is merely affected by split-screen processing, a pop-up box, a transparent overlay, or the like, and is not actually covered, the moved-out state of the camera is maintained.
In the camera control method provided by this embodiment of the application, moving the camera out and retracting it are determined by monitoring changes of the interfaces that call the camera, so that the camera is automatically controlled to move out and retract according to the state changes of the application. This reduces user operations, improves the user experience while protecting the camera, and reduces the effect of the moved-out camera on the overall appearance of the mobile terminal.
In an embodiment, please refer to fig. 19, and fig. 19 shows a flowchart of a camera control method provided in the embodiment of the present application. As will be described in detail with respect to the flow shown in fig. 19, the camera control method may specifically include the following steps:
step S410: the method comprises the steps that a foreground of the mobile terminal runs a first application and obtains a first request of the first application, wherein the first request is used for calling the camera.
It can be understood that after the camera is arranged in the terminal body, when the camera is called, the camera needs to be in a state of being moved out of the terminal body, so that the camera can be normally used; when the camera is not called, the camera does not need to be moved out of the terminal body, and the camera needs to be retracted to the terminal body so as to protect the terminal body. When the first application runs in the foreground of the mobile terminal, the first application can send a first request for calling the camera when the camera is required to be used for carrying out related tasks, so that the camera can be moved out and opened later, and the first application can use the camera to carry out the required tasks.
Step S420: and controlling the camera to move out of the terminal body according to the first request, and starting the camera.
When the first request for calling the camera is acquired, the terminal can respond to the request by controlling the camera to move out of the terminal body and starting the camera, so that the camera enters a usable state and the first application can use it to perform the required task.
Step S430: when the first application is switched to the background, closing the camera and controlling the camera to retract into the terminal body.
When the first application is switched from running in the foreground to running in the background, it no longer calls the camera, so the camera is controlled to retract into the terminal body. This protects the camera and reduces the effect of the protruding camera on the appearance of the mobile terminal.
In this embodiment of the present application, step S430 may include: when the first application is switched to a third application, closing the camera and controlling the camera to retract into the terminal body. It can be understood that one way for the first application to be switched from the foreground to the background is that the user switches to another application, which leaves the first application in the background; the camera is therefore closed and controlled to retract into the terminal body.
In addition, the first application may also be switched from the foreground to the background in response to a received request to return to the main interface. Of course, the scenarios in which the first application is switched from the foreground to the background are not limited in this embodiment of the present application.
In this embodiment of the application, after the first application is switched to the background in step S430, the method may further include: when the first application is switched from the background back to the foreground, controlling the camera to move out of the terminal body and starting the camera.
It can be understood that, before the first application was switched to the background, it was in the state of calling the camera, and when it is switched from the background back to the foreground it needs to resume that state. The camera therefore needs to be controlled to move out of the terminal body and started, so that the first application can use it to perform the required task.
In this embodiment of the application, after the first application runs in the foreground and calls the camera, and the camera is controlled to move out of the terminal body, the method may further include:
and when the first application is closed, controlling the camera to be retracted to the terminal body.
It can be understood that once the first application, which was running in the foreground and calling the camera, is closed, it no longer calls the camera; the camera is therefore controlled to retract into the terminal body, which protects the camera while reducing the effect of the protruding camera on the appearance of the mobile terminal.
After the above-mentioned first application runs in the foreground and calls the camera, and the camera is controlled to move out of the terminal body, the method may further include:
when the first application is switched to a second application, acquiring a second request of the second application, where the second request is used for calling the camera; and keeping the camera in the state of being moved out of the terminal body according to the second request.
It can be understood that when the first application, which calls the camera, is switched to a second application that also calls the camera, the camera can be kept in the moved-out state. This avoids retracting the camera when the first application goes to the background only to move it out again to meet the needs of the second application, which reduces the number of camera movements and prolongs the camera's service life.
According to the camera control method provided in this embodiment of the application, when the first request for calling the camera is acquired from the first application running in the foreground, the camera is controlled to move out of the terminal body and is started; when the first application is switched to the background, the camera is closed and controlled to retract into the terminal body. The moving out and retraction of the camera are thus controlled automatically according to how applications call the camera, meeting users' needs without requiring them to move the camera manually and improving the user experience.
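As a rough illustration of the flow of steps S410 to S430, and of keeping the camera extended when a second camera-calling application takes over, a Kotlin sketch is given below. Every type in it (SliderMotor, Camera, CameraSession) is an assumption made for illustration, not a name taken from the patent or from a real camera API.

// Minimal sketch of the Fig. 19 flow under the assumptions stated above.
interface SliderMotor {
    fun extend()
    fun retract()
    val isExtended: Boolean
}

interface Camera {
    fun open()
    fun close()
}

class CameraSession(private val slider: SliderMotor, private val camera: Camera) {

    // S410/S420: a foreground application requests the camera, so the camera is
    // moved out of the terminal body and started.
    fun onCameraRequested() {
        if (!slider.isExtended) slider.extend()
        camera.open()
    }

    // S430: the requesting application is switched to the background, so the camera
    // is closed and retracted. keepExtended = true models the case where the next
    // foreground application has also requested the camera, in which case the
    // moved-out state is kept to avoid an extra retract/extend cycle.
    fun onAppBackgrounded(keepExtended: Boolean = false) {
        camera.close()
        if (!keepExtended) slider.retract()
    }

    // Switching back to the foreground restores the previous calling state.
    fun onAppForegrounded() = onCameraRequested()
}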
In one embodiment, referring to fig. 20, fig. 20 is a block diagram illustrating a camera control device 400 according to an embodiment of the present disclosure. The camera control device 400 is applied to a mobile terminal that includes a terminal body and a camera provided in the terminal body, the camera being able to move out of and retract relative to the terminal body. As will be explained below with respect to the block diagram shown in fig. 20, the camera control device 400 includes: an application monitoring module 410, a first control module 420, and a second control module 430. The application monitoring module 410 is configured to monitor, in real time, a running application that needs to call the camera; the first control module 420 is configured to control the camera to move out of the terminal body when it is monitored that the state of the application undergoes a first type of change; and the second control module 430 is configured to control the camera to retract into the terminal body when it is monitored that the state of the application undergoes a second type of change.
In this embodiment, the first control module 420 may be specifically configured to: when it is monitored that the state of the application undergoes the first type of change, detect whether the camera has moved out of the terminal body; and if it has not moved out, control the camera to move out of the terminal body.
In this embodiment, the first type of change in the state of the application may include: the application enters an interface needing to call the camera; or the application is changed from background operation to foreground operation, and the current interface is the interface needing to call the camera.
In this embodiment of the application, the application entering an interface that needs to call the camera may include: switching, within the same application, from an interface that does not need to call the camera to an interface that needs to call the camera; or switching from an interface of a first application that does not need to call the camera to an interface of a second application that needs to call the camera.
In this embodiment of the present application, the application changing from running in the background to running in the foreground with the current interface being an interface that needs to call the camera may include: entering an interface of the application that needs to call the camera through the multi-task interface; or entering an interface of the application that needs to call the camera from the main interface.
In this embodiment, the camera control device 400 may further include: and a third control module. The third control module can be used for controlling the camera to be retracted to the terminal main body after detecting the screen-off signal.
In this embodiment, the camera control device 400 may further include: and a fourth control module. The fourth control module may be configured to control the camera to retract to the terminal body after detecting a signal to retract to a main screen.
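As a minimal illustration of the third and fourth control modules described above, the sketch below routes both system-level signals to the same retraction action. The signal names and the handler type are assumptions of the sketch, not platform APIs.

// Sketch only: system signals that also trigger retraction of the camera.
enum class SystemSignal { SCREEN_OFF, RETURN_TO_HOME }

class SystemSignalHandler(private val retractCamera: () -> Unit) {
    fun onSignal(signal: SystemSignal) = when (signal) {
        SystemSignal.SCREEN_OFF,       // third control module: screen-off signal
        SystemSignal.RETURN_TO_HOME -> // fourth control module: return-to-home signal
            retractCamera()
    }
}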
In this embodiment, the second control module 430 may be specifically configured to: when it is monitored that the state of the application undergoes the second type of change, detect whether the camera has moved out of the terminal body; and if it has moved out, control the camera to retract into the terminal body.
In this embodiment of the application, the second control module 430 controlling the camera to retract into the terminal body may include: if the camera has moved out of the terminal body, monitoring whether the state of the application undergoes the first type of change within a preset time length; if it does, controlling the camera to remain extended, and if it does not, controlling the camera to retract into the terminal body.
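This delayed retraction can be pictured as a cancellable timer: a second-type change schedules retraction after the preset time length, and a first-type change arriving within that window cancels it. The Kotlin sketch below is one possible shape for it; the class name and the 2-second default are assumptions, not values from the patent.

import java.util.Timer
import java.util.TimerTask
import kotlin.concurrent.schedule

// Sketch of a grace period before retraction, under the assumptions stated above.
class RetractionScheduler(
    private val retract: () -> Unit,
    private val graceMillis: Long = 2_000  // assumed preset time length
) {
    private val timer = Timer(true)
    private var pending: TimerTask? = null

    // Called on a second-type change while the camera is extended.
    fun scheduleRetract() {
        pending?.cancel()
        pending = timer.schedule(graceMillis) { retract() }
    }

    // Called on a first-type change within the grace period: keep the camera extended.
    fun cancelRetract() {
        pending?.cancel()
        pending = null
    }
}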
In this embodiment of the present application, the second type of change in the state of the application may include: the application enters an interface that does not need to call the camera from an interface that needs to call the camera; the task for which the application calls the camera is finished; or the application changes from running in the foreground to running in the background.
In this embodiment of the present application, entering an interface that does not need to call the camera from an interface that needs to call the camera includes: switching, within the same application, from an interface that needs to call the camera to an interface that does not need to call the camera; or switching from an interface of a first application that needs to call the camera to an interface of a second application that does not need to call the camera.
In an embodiment, referring to fig. 21, fig. 21 is a block diagram illustrating a camera control device 500 according to an embodiment of the present disclosure. The camera control device 500 is applied to a mobile terminal that includes a terminal body and a camera provided in the terminal body, the camera being able to move out of and retract relative to the terminal body. As will be explained below with respect to the block diagram shown in fig. 21, the camera control device 500 includes: a request acquisition module 510, a removal control module 520, and a retraction control module 530. The request acquisition module 510 is configured to acquire, while a first application runs in the foreground of the mobile terminal, a first request of the first application, where the first request is used for calling the camera; the removal control module 520 is configured to control the camera to move out of the terminal body according to the first request and to start the camera; and the retraction control module 530 is configured to close the camera and control the camera to retract into the terminal body when the first application is switched to the background.
In this embodiment, the camera control device 500 may further include a first removal module. The first removal module is configured to, after the first application has been switched to the background, control the camera to move out of the terminal body and start the camera when the first application is switched from the background back to the foreground.
In this embodiment, the camera control device 500 may further include a retraction module. The retraction module is configured to, after the camera has been controlled to move out of the terminal body, control the camera to retract into the terminal body when the first application is closed.
In this embodiment, the camera control device 500 may further include a call acquisition module and a second removal module. The call acquisition module is configured to acquire a second request of a second application when the first application is switched to the second application, where the second request is used for calling the camera; and the second removal module is configured to keep the camera in the state of being moved out of the terminal body according to the second request.
In this embodiment, the retraction control module 530 may be specifically configured to: and when the first application is switched to a third application, closing the camera and controlling the camera to be retracted to the terminal main body.
In an embodiment, referring to fig. 22, based on the camera control method and apparatus, the present application further provides a mobile terminal 100 capable of executing the camera control method. The mobile terminal 100 includes one or more processors 102 (only one shown), a memory 104, and a camera 220 coupled to each other. The number of the cameras 220 can be set according to the needs, for example, two cameras 220 can be configured, one as a front camera and the other as a rear camera. The memory 104 stores a program that can execute the camera control method provided by the above-mentioned embodiment, and the processor 102 can execute the program stored in the memory 104.
In summary, compared with the prior art, the camera control method, camera control device, mobile terminal and storage medium provided by the present application monitor, in real time, a running application that needs to call the camera; when it is monitored that the state of the application undergoes a first type of change, the camera is controlled to move out of the terminal body, and when it is monitored that the state of the application undergoes a second type of change, the camera is controlled to retract into the terminal body. The movement of the camera is thus controlled automatically according to changes in the state of the application, which reduces user operations and improves the user experience.
It should be noted that, in this specification, the embodiments are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments may be referred to one another. Since the device embodiments are basically similar to the method embodiments, their description is brief; for relevant points, reference may be made to the corresponding parts of the method embodiments. Any processing manner described in a method embodiment may be implemented by a corresponding processing module in a device embodiment and is not described again there.
A mobile terminal provided by the present application will be described with reference to fig. 23.
Referring to fig. 23, based on the camera control method and apparatus, the embodiment of the present application further provides a mobile terminal 100 capable of executing the camera control method. The mobile terminal 100 includes one or more (only one shown) processors 102, memory 104, wireless module 106, audio circuitry 110, sensors 114, input module 118, and power module 132. It will be understood by those of ordinary skill in the art that the present application is not limited to the structure of the mobile terminal 100. For example, the mobile terminal 100 may also include more or fewer components than shown, or have a different configuration than shown.
Those skilled in the art will appreciate that all other components are peripheral devices with respect to the processor 102, and the processor 102 is coupled to the peripheral devices through a plurality of peripheral interfaces 124. The peripheral interface 124 may be implemented based on the following standards: Universal Asynchronous Receiver/Transmitter (UART), General-Purpose Input/Output (GPIO), Serial Peripheral Interface (SPI), and Inter-Integrated Circuit (I2C), although the present application is not limited to these standards. In some examples, the peripheral interface 124 may comprise only a bus; in other examples, the peripheral interface 124 may also include other elements, such as one or more controllers, for example a display controller for interfacing with the display panel 111 or a memory controller for interfacing with a memory. These controllers may also be separate from the peripheral interface 124 and integrated within the processor 102 or a corresponding peripheral.
The memory 104 may be used to store software programs and modules, and the processor 102 executes various functional applications and data processing by executing the software programs and modules stored in the memory 104. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid state memory. In some examples, the memory 104 may further include memory remotely located from the processor 102, which may be connected to the mobile terminal 100 or the first screen 130 over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The wireless module 106 is configured to receive and transmit electromagnetic waves and to convert between electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices. The wireless module 106 may include various existing circuit elements for performing these functions, such as an antenna, a radio-frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, memory, and so forth. The wireless module 106 may communicate with various networks, such as the internet, an intranet or a wireless network, or with other devices via a wireless network. The wireless network may comprise a cellular telephone network, a wireless local area network, or a metropolitan area network. The wireless network may use various communication standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Wireless Fidelity (Wi-Fi) (e.g., the Institute of Electrical and Electronics Engineers (IEEE) standards IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (Wi-MAX), and other short-message communication protocols, as well as any other suitable communication protocol, even including protocols that have not yet been developed.
The camera 121 is used to capture images and pass them to the processor 102 for processing. The camera 121 is driven by a motor (not shown in the figure) to extend out of the terminal body or to retract so as to be hidden in the terminal body. The motor drives the camera 121 to extend or retract in response to a control instruction sent by the processor.
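As an illustration of this motor-driven extension and retraction, the sketch below turns a control instruction from the processor into a fixed number of motor steps. The StepperMotor interface and the step count are assumptions made for the sketch, not details from the patent.

// Sketch only: translating a control instruction into motor motion.
enum class CameraInstruction { EXTEND, RETRACT }

interface StepperMotor {
    fun step(count: Int, clockwise: Boolean)
}

class CameraMotorDriver(
    private val motor: StepperMotor,
    private val travelSteps: Int = 512  // assumed number of steps for full travel
) {
    fun execute(instruction: CameraInstruction) = when (instruction) {
        CameraInstruction.EXTEND -> motor.step(travelSteps, clockwise = true)
        CameraInstruction.RETRACT -> motor.step(travelSteps, clockwise = false)
    }
}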
The audio circuit 110, the speaker 101, the sound jack 103, and the microphone 105 collectively provide an audio interface between a user and the mobile terminal 100 or the first screen 130. Specifically, the audio circuit 110 receives sound data from the processor 102, converts the sound data into an electrical signal, and transmits the electrical signal to the speaker 101. The speaker 101 converts an electric signal into a sound wave audible to the human ear. The audio circuitry 110 also receives electrical signals from the microphone 105, converts the electrical signals to sound data, and transmits the sound data to the processor 102 for further processing. Audio data may be retrieved from the memory 104 or through the wireless module 106. In addition, audio data may also be stored in the memory 104 or transmitted via the wireless module 106.
The sensors 114 are disposed within the mobile terminal 100 or within the first screen 130. Examples of the sensors 114 include, but are not limited to: light sensors, motion sensors, pressure sensors, infrared heat sensors, distance sensors, gravitational acceleration sensors, and other sensors.
Specifically, the sensors 114 may include a light sensor 114F and a pressure sensor 114G. The pressure sensor 114G may detect the pressure generated by pressing on the mobile terminal 100; that is, the pressure sensor 114G detects pressure generated by contact or pressing between the user and the mobile terminal, for example contact or pressing between the user's ear and the mobile terminal. Accordingly, the pressure sensor 114G may be used to determine whether contact or pressing has occurred between the user and the mobile terminal 100, as well as the magnitude of the pressure.
Referring to fig. 23 again, in the embodiment shown in fig. 23, the light sensor 114F and the pressure sensor 114G are disposed adjacent to the display panel 111. When an object comes near the first screen 130, for example when the mobile terminal 100 is moved to the ear, the light sensor 114F may cause the processor 102 to turn off the display output.
As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in various directions (generally along three axes), and can detect the magnitude and direction of gravity when stationary; it can be used for applications that recognize the attitude of the mobile terminal 100 (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tap detection). In addition, the mobile terminal 100 may also be configured with other sensors such as a gyroscope, a barometer, a hygrometer and a thermometer, which are not described here.
In this embodiment, the input module 118 may include the touch screen 109 disposed on the first screen 130. The touch screen 109 may collect a touch operation of the user on or near it (for example, an operation performed by the user on or near the touch screen 109 with a finger, a stylus or any other suitable object or accessory) and drive a corresponding connection device according to a preset program. Optionally, the touch screen 109 may include a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, sends the coordinates to the processor 102, and can receive and execute commands sent by the processor 102. The touch detection function of the touch screen 109 may be implemented using resistive, capacitive, infrared or surface-acoustic-wave technology. In addition to the touch screen 109, in other variations the input module 118 may include other input devices, such as keys. The keys may include, for example, character keys for inputting characters and control keys for triggering control functions. Examples of such control keys include a "back to home" key, a power on/off key, and the like.
The first screen 130 is used to display information input by the user, information provided to the user, and various graphical user interfaces of the mobile terminal 100, which may be composed of graphics, text, icons, numbers, video, and any combination thereof. In one example, the touch screen 109 may be disposed on the display panel 111 so as to be integrated with the display panel 111.
The power module 132 is used to provide power supply to the processor 102 and other components. Specifically, the power module 132 may include a power management system, one or more power sources (e.g., batteries or ac power), a charging circuit, a power failure detection circuit, an inverter, a power status indicator light, and any other components related to the generation, management, and distribution of power in the mobile terminal 100 or the first screen 130.
The mobile terminal 100 further comprises a locator 119, the locator 119 being configured to determine an actual location of the mobile terminal 100. In this embodiment, the locator 119 implements the positioning of the mobile terminal 100 by using a positioning service, which is understood to be a technology or a service for obtaining the position information (e.g., longitude and latitude coordinates) of the mobile terminal 100 by using a specific positioning technology and marking the position of the positioned object on an electronic map.
It should be understood that the mobile terminal 100 described above is not limited to a smartphone; it refers to a computing device that can be used while mobile. Specifically, the mobile terminal 100 refers to a mobile computing device equipped with an intelligent operating system, and the mobile terminal 100 includes, but is not limited to, a smartphone, a smart watch, a tablet computer, and the like.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (mobile terminal) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments. In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not necessarily depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (10)

1. A camera control method is applied to a mobile terminal, the mobile terminal comprises a terminal body and a camera arranged on the terminal body, and the camera can move out and retract relative to the terminal body, and the method comprises the following steps:
monitoring the application needing to call the camera in real time in operation;
when the state of the application is monitored to be changed in a first type, the camera is controlled to move out of the terminal main body, the change in the state of the application in the first type comprises that the application enters an interface needing to call the camera, or the application is changed from background operation to foreground operation, and the current interface is the interface needing to call the camera, wherein the interface needing to call the camera at least comprises a photographing interface, a video call interface or a live broadcast interface;
when the state of the application is monitored to generate a second type of change, determining the position of the camera relative to the terminal body according to the change of the resistance value of the slide rheostat of the mobile terminal, and determining whether the camera moves out of the terminal body according to the position of the camera relative to the terminal body, wherein the second type of change comprises that the application enters an interface which does not need to call the camera from an interface which needs to call the camera, the task of calling the camera by the application is finished, or the application is changed from foreground operation to background operation;
and if the camera is moved out of the terminal main body, monitoring whether the state of the application has the first type of change within a preset time length, if so, controlling the camera to keep the extending state, and if not, controlling the camera to retract to the terminal main body.
2. The method according to claim 1, wherein the controlling the camera to move out of the terminal body when the first type of change in the state of the application is monitored comprises:
when the situation that the state of the application changes in a first type is monitored, whether the camera moves out of the terminal main body or not is detected;
and if the terminal main body is not moved out, controlling the camera to move out of the terminal main body.
3. The method of claim 1, wherein the application entering an interface requiring invocation of the camera comprises:
the interface of the camera is not required to be called from the same application, and the interface of the camera is required to be called; or
And switching from an interface of a first application that does not need to call the camera to an interface of a second application that needs to call the camera.
4. The method of claim 1, wherein the application changes from background running to foreground running and the current interface is the interface that needs to call the camera, comprising:
entering an interface of the application needing to call the camera through a multi-task interface; or
And entering an interface of the application needing to call the camera from a main interface.
5. The method according to any one of claims 1 to 4, wherein after controlling the camera to move out of the terminal body, the method further comprises:
and when the screen extinguishing signal is detected, controlling the camera to withdraw to the terminal main body.
6. The method according to any one of claims 1 to 4, wherein after controlling the camera to move out of the terminal body, the method further comprises:
and when the signal of returning to the main screen is detected, controlling the camera to return to the terminal main body.
7. The method of claim 1, wherein the application entering the interface without the camera call from the interface with the camera call comprises:
switching, within the same application, from an interface that needs to call the camera to an interface that does not need to call the camera; or
And switching from an interface of a first application that needs to call the camera to an interface of a second application that does not need to call the camera.
8. A camera control device, applied to a mobile terminal, wherein the mobile terminal comprises a terminal body and a camera arranged on the terminal body, the camera being able to move out of and retract relative to the terminal body, and the device comprises: an application monitoring module, a first control module, and a second control module, wherein,
the application monitoring module is used for monitoring the application of the camera which needs to be called in operation in real time;
the first control module is used for controlling the camera to move out of the terminal main body when monitoring that the state of the application changes in a first type, wherein the change in the state of the application comprises the fact that the application enters an interface needing to call the camera, or the fact that the application changes from background operation to foreground operation and the current interface is the interface needing to call the camera, and the interface needing to call the camera at least comprises a photographing interface, a video call interface or a live broadcast interface;
the second control module is used for determining the position of the camera relative to the terminal body according to the change of the resistance value of the slide rheostat of the mobile terminal when the state of the application is monitored to generate a second type of change, and determining whether the camera is moved out of the terminal body according to the position of the camera relative to the terminal body, wherein the second type of change comprises that the application enters an interface which does not need to call the camera from an interface which needs to call the camera, the task of calling the camera by the application is finished, or the application is changed from foreground operation to background operation;
the second control module is further used for monitoring whether the state of the application changes in a first type within a preset time period if the camera is moved out of the terminal main body, controlling the camera to keep in an extending state if the state of the application changes in the first type, and controlling the camera to be retracted to the terminal main body if the state of the application does not change in the first type.
9. A mobile terminal, characterized in that the mobile terminal comprises a terminal body and a camera provided to the terminal body, the camera being movable out of and retractable with respect to the terminal body, the terminal body comprising a memory and a processor, the camera and the memory being coupled to the processor, the memory storing instructions that, when executed by the processor, the processor performs the method according to any one of claims 1-7.
10. A computer-readable storage medium having program code executable by a processor, the program code causing the processor to perform the method of any one of claims 1-7.
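Claims 1 and 8 determine the camera's position relative to the terminal body from the change in resistance of a sliding rheostat. The Kotlin sketch below shows one way such a position check could look; the resistance values and the threshold are assumed calibration constants, not figures from the patent.

// Sketch only: inferring the slider position from the rheostat resistance.
class SlidePositionSensor(
    private val readResistanceOhms: () -> Double,
    private val retractedOhms: Double = 100.0,    // assumed resistance when fully retracted
    private val extendedOhms: Double = 10_000.0   // assumed resistance when fully extended
) {
    // Normalized position: 0.0 = fully retracted, 1.0 = fully extended.
    fun position(): Double {
        val r = readResistanceOhms()
        return ((r - retractedOhms) / (extendedOhms - retractedOhms)).coerceIn(0.0, 1.0)
    }

    // The camera counts as moved out of the terminal body once it is mostly extended.
    fun isMovedOut(threshold: Double = 0.9): Boolean = position() >= threshold
}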
CN201810623017.0A 2018-06-15 2018-06-15 Camera control method and device, mobile terminal and storage medium Active CN109167894B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810623017.0A CN109167894B (en) 2018-06-15 2018-06-15 Camera control method and device, mobile terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810623017.0A CN109167894B (en) 2018-06-15 2018-06-15 Camera control method and device, mobile terminal and storage medium

Publications (2)

Publication Number Publication Date
CN109167894A CN109167894A (en) 2019-01-08
CN109167894B true CN109167894B (en) 2019-12-31

Family

ID=64897177

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810623017.0A Active CN109167894B (en) 2018-06-15 2018-06-15 Camera control method and device, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN109167894B (en)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110087012B (en) * 2019-04-17 2020-10-23 华为技术有限公司 Camera control method and electronic equipment
US10911593B2 (en) 2019-05-15 2021-02-02 Asustek Computer Inc. Electronic device having a rotatable camera module
US11240357B2 (en) 2019-05-15 2022-02-01 Asustek Computer Inc. Electronic device
CN111953866B (en) * 2019-05-15 2021-11-26 华硕电脑股份有限公司 Electronic device
US11375125B2 (en) 2019-05-15 2022-06-28 Asustek Computer Inc. Electronic device
US11546451B2 (en) 2019-05-15 2023-01-03 Asustek Computer Inc. Electronic device
US11477385B2 (en) 2019-05-15 2022-10-18 Asustek Computer Inc. Electronic device with rotatable camera for protecting privacy
CN110149427B (en) * 2019-05-16 2021-01-08 Oppo广东移动通信有限公司 Camera control method and related product
CN110445979B (en) * 2019-06-26 2021-03-23 维沃移动通信有限公司 Camera switching method and terminal equipment
CN112929493B (en) * 2019-07-05 2022-07-12 Oppo广东移动通信有限公司 Slider control method and related product
CN110519506A (en) * 2019-07-22 2019-11-29 华为技术有限公司 A kind of camera lift control method and electronic equipment
CN110418000B (en) * 2019-07-24 2021-05-11 RealMe重庆移动通信有限公司 Terminal control method, device, mobile terminal and storage medium
CN112449088A (en) * 2019-08-30 2021-03-05 华为技术有限公司 Camera control method and device and terminal equipment
CN110995988A (en) * 2019-11-25 2020-04-10 深圳传音控股股份有限公司 Intelligent terminal, camera device control method and computer-readable storage medium
CN113190103A (en) * 2020-01-14 2021-07-30 北京小米移动软件有限公司 Terminal device, motion detection method and device of image acquisition module and storage medium
CN113301404A (en) * 2020-02-24 2021-08-24 聚好看科技股份有限公司 Display apparatus and control method
CN111385413A (en) * 2020-02-28 2020-07-07 Oppo(重庆)智能科技有限公司 Telescopic camera control method and device, electronic device and storage medium
CN212486599U (en) * 2020-07-06 2021-02-05 瑞声科技(新加坡)有限公司 Image pickup apparatus
WO2022198392A1 (en) * 2021-03-22 2022-09-29 深圳市大疆创新科技有限公司 Photographing device control method, photographing device, electronic device, and medium
CN114143452A (en) * 2021-11-24 2022-03-04 北京达佳互联信息技术有限公司 Camera control method and device for video session

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104461701B (en) * 2013-09-15 2019-07-26 联想(北京)有限公司 A kind of control method and electronic equipment
CN207304636U (en) * 2017-09-19 2018-05-01 广东欧珀移动通信有限公司 Mobile terminal

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0877468A (en) * 1994-09-08 1996-03-22 Ono Denki Kk Monitor device
CN106101306A (en) * 2016-06-01 2016-11-09 努比亚技术有限公司 A kind of terminal unit and photographic head external member thereof
CN108924290A (en) * 2018-06-12 2018-11-30 Oppo广东移动通信有限公司 Camera control method, device, mobile terminal and computer-readable medium

Also Published As

Publication number Publication date
CN109167894A (en) 2019-01-08

Similar Documents

Publication Publication Date Title
CN109167894B (en) Camera control method and device, mobile terminal and storage medium
CN109639970B (en) Shooting method and terminal equipment
CN108495029B (en) Photographing method and mobile terminal
CN108495045B (en) Image capturing method, image capturing apparatus, electronic apparatus, and storage medium
CN108710456B (en) Application icon processing method and device and mobile terminal
CN108965691B (en) Camera control method and device, mobile terminal and storage medium
CN108664190B (en) Page display method, device, mobile terminal and storage medium
CN109040351B (en) Camera control method and device, mobile terminal and storage medium
CN109120841B (en) Camera control method and device, mobile terminal and storage medium
EP3299946B1 (en) Method and device for switching environment picture
CN108391058B (en) Image capturing method, image capturing apparatus, electronic apparatus, and storage medium
CN110868626A (en) Method and device for preloading content data
CN108848313B (en) Multi-person photographing method, terminal and storage medium
CN107807772A (en) Image processing method, device and mobile terminal
CN107566746B (en) Photographing method and user terminal
CN109618218B (en) Video processing method and mobile terminal
CN108833709A (en) A kind of the starting method and mobile terminal of camera
CN111464746B (en) Photographing method and electronic equipment
CN108132749B (en) Image editing method and mobile terminal
CN111083374B (en) Filter adding method and electronic equipment
CN109542307B (en) Image processing method, device and computer readable storage medium
CN111368114B (en) Information display method, device, equipment and storage medium
CN111064888A (en) Prompting method and electronic equipment
CN112131438A (en) Information generation method, information display method and device
CN112749590B (en) Object detection method, device, computer equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant