CN110213489B - Control method, control device and terminal equipment - Google Patents

Info

Publication number: CN110213489B (granted); also published as CN110213489A
Application number: CN201910536510.3A
Authority: CN (China)
Original language: Chinese (zh)
Legal status: Active
Inventor: 章捷彬
Assignee (current and original): Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd, with priority to CN201910536510.3A
Related PCT application: PCT/CN2020/080689, published as WO2020253295A1

Classifications

    • H04M1/0264: Portable telephone sets, details of the structure or mounting of a camera module assembly
    • H04N23/67: Control of cameras or camera modules, focus control based on electronic image sensor signals
    • H04N23/695: Control of cameras or camera modules, control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04M2250/20: Details of telephonic subscriber devices including a rotatable camera

Abstract

The embodiment of the application discloses a control method, a control device and terminal equipment. The method comprises the following steps: receiving a rotation operation of a user on a rotatable camera in the case that a current scene is a target scene, wherein the rotation operation comprises a rotation angle and/or a rotation direction; determining, based on the target scene, an operation event corresponding to the target scene and the rotation operation; and executing the operation event. By the method, the control efficiency of the terminal can be improved, the accuracy of executing the operation event can be ensured, the actual use requirements of users can be met, and the user experience can be improved.

Description

Control method, control device and terminal equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to a control method, an apparatus, and a terminal device.
Background
With the continuous development of computer technology, terminal devices such as mobile phones have gradually become necessities in people's daily life and work. To meet users' demands on the screen of the terminal device, manufacturers increase the screen-to-body ratio by simplifying physical keys, hiding the front camera, and the like, so as to obtain a better screen use effect, for example in a full-screen mobile phone provided with a telescopic camera.
When a user uses a full-screen mobile phone, a specific use effect can be achieved in a specific scene through a specific operation in a specific area of the screen. For example, when watching a video on a full-screen mobile phone, the user can increase or decrease the volume by sliding up or down in a specific area of the screen (such as the left area of the screen). Or, when the user shoots images with the full-screen mobile phone, the user can increase or decrease the focal length of the camera through a two-finger pinch gesture on the screen in the image preview frame.
However, when a user operates on the screen, the screen is sensitive to the user's operation: if the range of the operation is too small, the terminal device may not react to it, while if the range is too large, the terminal device may over-react to it. For example, when the user watches a video, a large sliding of a finger on the screen may sharply increase the playback volume, which is unpleasant for the user; or, when the user shoots images, an excessively large or small pinch gesture may cause the preview image to be suddenly enlarged or reduced. The user then needs to perform multiple operations to achieve the desired effect, which results in low terminal control efficiency.
Disclosure of Invention
An object of the embodiments of the present application is to provide a control method, a control apparatus and a terminal device, so as to solve the problem in the prior art that a user needs to perform multiple operations on the screen to achieve the desired effect, which results in low terminal control efficiency.
In order to solve the above technical problem, the embodiment of the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a control method, where the method is applied to a terminal device configured with a rotatable camera, and the method includes:
receiving a rotation operation of a user on the rotatable camera under the condition that a current scene is a target scene, wherein the rotation operation comprises a rotation angle and/or a rotation direction;
determining an operation event corresponding to the target scene and the rotation operation based on the target scene;
and executing the operation event.
Optionally, in the case that the current scene is a target scene, receiving a rotation operation of the rotatable camera by a user includes:
starting a preset sensor and the rotatable camera under the condition that the current scene is a target scene;
and monitoring the rotating operation of the rotatable camera by the user through the preset sensor.
Optionally, the determining, based on the target scene, an operation event corresponding to the target scene and the rotation operation includes:
acquiring a scene type to which the target scene belongs;
determining the operation type of the operation event corresponding to the scene type based on the scene type to which the target scene belongs;
determining an operation parameter for executing the operation event based on the rotation operation and the operation type;
the executing the operational event comprises:
executing the operation event based on the operation parameter and the operation type.
Optionally, the determining, based on the rotation operation and the operation type, an operation parameter for executing the operation event includes:
determining a preset conversion relation corresponding to the operation type based on the operation type, wherein the preset conversion relation is determined based on data obtained by monitoring the rotation of the rotatable camera by the preset sensor;
determining a first operation parameter corresponding to the rotation angle based on the preset conversion relation;
determining an operating parameter for performing the operational event based on the first operating parameter and/or the rotational direction.
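Purely as an illustration (not part of the claimed method), such a preset conversion relation can be read as a signed linear map from the monitored rotation angle to the operation parameter; the per-operation-type factor below is an assumed symbol, not a value from this application:

```latex
% Illustrative assumption only: k_type is an assumed per-operation-type scale factor.
p_1 = k_{\mathrm{type}} \, \theta, \qquad p = \operatorname{sign}(d) \, p_1
```

Here, θ is the rotation angle reported by the predetermined sensor, d encodes the rotation direction (for example, +1 for clockwise and -1 for counterclockwise), p_1 is the first operation parameter, and p is the operation parameter used to execute the operation event.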
Optionally, the determining, based on the scene type to which the target scene belongs, an operation type of the operation event corresponding to the scene type includes:
receiving input operation of a user on an operation type of the operation event corresponding to a scene type to which the target scene belongs;
determining an operation type of the operation event based on the input operation.
In a second aspect, an embodiment of the present application provides a control method, where the method is applied to a terminal device configured with a telescopic camera, and the method includes:
receiving a telescopic operation of a user on the telescopic camera under the condition that a current scene is a target scene, wherein the telescopic operation comprises a distance by which the telescopic camera is extended or shortened;
determining an operation event corresponding to the target scene and the telescopic operation based on the target scene;
and executing the operation event.
In a third aspect, an embodiment of the present application provides a control apparatus, where the apparatus is configured with a rotatable camera, and the apparatus includes:
the operation receiving module is used for receiving the rotation operation of a user on the rotatable camera under the condition that the current scene is a target scene, wherein the rotation operation comprises a rotation angle and/or a rotation direction;
a first determination module, configured to determine, based on the target scene, an operation event corresponding to the target scene and the rotation operation;
and the first execution module is used for executing the operation event.
Optionally, the operation receiving module includes:
the starting unit is used for starting a preset sensor and the rotatable camera under the condition that the current scene is a target scene;
and the operation monitoring unit is used for monitoring the rotating operation of the rotatable camera by the user through the preset sensor.
Optionally, the first determining module includes:
the type acquisition unit is used for acquiring the scene type of the target scene;
a type determining unit, configured to determine, based on a scene type to which the target scene belongs, an operation type of the operation event corresponding to the scene type;
a parameter determination unit configured to determine an operation parameter for executing the operation event based on the rotation operation and the operation type;
the first execution module is configured to:
executing the operation event based on the operation parameter and the operation type.
Optionally, the rotatable camera is a front camera, the apparatus is further configured with a rear camera, and the scene type of the target scene is a shooting scene; the executing the operation event based on the operation parameter and the operation type includes:
performing, based on the operation parameter, focus adjustment on a preview image acquired by the rear camera.
Optionally, the parameter determining unit is configured to:
determining a preset conversion relation corresponding to the operation type based on the operation type, wherein the preset conversion relation is determined based on data obtained by monitoring the rotation of the rotatable camera by the preset sensor;
determining a first operation parameter corresponding to the rotation angle based on the preset conversion relation;
determining an operating parameter for performing the operational event based on the first operating parameter and/or the rotational direction.
Optionally, the type determining unit is configured to:
receiving input operation of a user on an operation type of the operation event corresponding to a scene type to which the target scene belongs;
determining an operation type of the operation event based on the input operation.
In a fourth aspect, an embodiment of the present application provides a control apparatus, where the apparatus is configured with a retractable camera, and the apparatus includes:
the telescopic monitoring module is used for receiving telescopic operation of a user on the telescopic camera under the condition that the current scene is a target scene, wherein the telescopic operation comprises the distance for extending or shortening the telescopic camera;
a second determination module, configured to determine, based on the target scene, an operation event corresponding to the target scene and the telescopic operation;
and the second execution module is used for executing the operation event.
In a fifth aspect, an embodiment of the present application provides a terminal device, which includes a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the control method provided in the first aspect.
In a sixth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, implements the steps of the control method provided in the first aspect.
In a seventh aspect, an embodiment of the present application provides a terminal device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the control method provided in the second aspect.
In an eighth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, implements the steps of the control method provided in the second aspect.
According to the technical scheme provided by the embodiment of the application, when the current scene is the target scene, the rotation operation of the user on the rotatable camera is received, wherein the rotation operation comprises the rotation angle and/or the rotation direction, and then the operation event corresponding to the target scene and the rotation operation is determined based on the target scene, and the operation event is executed. Therefore, in a target scene, a user only needs to rotate the rotatable camera to control the terminal device to execute the corresponding operation event without repeatedly performing adjustment operation, and the control efficiency of the terminal is improved. Meanwhile, the terminal equipment can accurately execute the operation event according to the rotation operation of the user, the actual use requirement of the user is met, and the user experience is improved.
Drawings
In order to more clearly illustrate the embodiments of the present specification or the technical solutions in the prior art, the drawings needed for the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments described in the present specification, and for those skilled in the art, other drawings can be obtained from these drawings without any creative effort.
FIG. 1 illustrates an embodiment of a control method of the present application;
FIG. 2 is a schematic diagram of a method of operation based on a rotation angle determination according to the present application;
FIG. 3 illustrates another control method embodiment of the present application;
FIG. 4 is a schematic diagram of another method of operation based on rotation angle determination according to the present application;
FIG. 5 is a diagram of yet another control method embodiment of the present application;
FIG. 6 is a flowchart of yet another control method embodiment of the present application;
FIG. 7 illustrates an embodiment of a control device according to the present application;
FIG. 8 illustrates an exemplary control device of the present application;
fig. 9 is a terminal device according to an embodiment of the present application.
Detailed Description
The embodiment of the application provides a control method, a control device and terminal equipment.
In order to make those skilled in the art better understand the technical solutions in the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present specification, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Example one
As shown in fig. 1, an execution subject of the method may be a terminal device, where the terminal device is configured with a rotatable camera, the terminal device may be a device such as a personal computer, or a mobile terminal device such as a mobile phone or a tablet computer, and the terminal device may be a terminal device used by a user. The method may specifically comprise the steps of:
in step S102, in the case that the current scene is the target scene, a rotation operation of the rotatable camera by the user is received.
the target scene may be a shooting scene (including an image shooting scene or a video shooting scene, etc.), an image viewing scene, a video playing scene, a reading scene, a display screen display parameter setting scene, etc., and the rotation operation may include a rotation angle and/or a rotation direction.
In implementation, with the continuous development of computer technology, terminal devices such as mobile phones have gradually become necessities in people's daily life and work. To meet users' demands on the screen of the terminal device, manufacturers increase the screen-to-body ratio by simplifying physical keys, hiding the front camera, and the like, so as to obtain a better screen use effect, for example in a full-screen mobile phone configured with a retractable camera. When a user uses a full-screen mobile phone, a specific use effect can be achieved in a specific scene through a specific operation in a specific area of the screen. For example, when watching a video on a full-screen mobile phone, the user can increase or decrease the volume by sliding up or down in a specific area of the screen (such as the left area of the screen). Or, when the user shoots images with the full-screen mobile phone, the user can increase or decrease the focal length of the camera through a two-finger pinch gesture on the screen in the image preview frame.
However, when a user operates on the screen, the screen is sensitive to the user's operation: if the range of the operation is too small, the terminal device may not react to it, while if the range is too large, the terminal device may over-react to it. For example, when the user watches a video, a large sliding of a finger on the screen may sharply increase the playback volume, which is unpleasant for the user; or, when the user shoots images, an excessively large or small pinch gesture may cause the preview image to be suddenly enlarged or reduced. The user then needs to perform multiple operations to achieve the desired effect, which results in low terminal control efficiency. Therefore, an embodiment of the present invention provides a technical solution capable of solving the above problems, which may specifically include the following:
after the user starts the terminal equipment, the terminal equipment can monitor the service condition of the user to the terminal equipment, and if the current scene is the target scene, the rotatable camera can be started and the rotation operation of the user to the rotatable camera can be received. For example, after the user starts the terminal device, the user may click the image capturing application, and at this time, the terminal device may monitor that the current scene is an image/video capturing scene (that is, the current scene is a target scene), and the terminal device may start the rotatable camera and receive the rotation operation of the user on the rotatable camera.
The rotatable camera can be hidden in the terminal equipment, and when a starting instruction is received, the terminal equipment can control the rotatable camera to stretch out of the terminal equipment so that a user can rotate the rotatable camera. Or, the rotatable camera can be arranged on one side of the terminal device and can be configured with physical keys, if the terminal device monitors that the current scene is the target scene, the physical keys can be started, at the moment, the user can click or double-click on the physical keys to start the rotatable camera, and then the user can rotate the extended rotatable camera.
In addition, in the case that the current scene is the target scene, if the terminal device does not receive the rotation operation of the user on the rotatable camera within the preset time range, the terminal device may close the rotatable camera (i.e., withdraw the rotatable camera into the terminal device). When receiving the preset use instruction of the user, the rotatable camera can be started again, and the rotation operation of the user on the rotatable camera is received. For example, if the video playing scene is the target scene, the preset time range is 2 minutes, when the terminal device monitors that the user opens the video playing application program and clicks the video 1 to watch, the terminal device may open the rotatable camera (i.e., control the rotatable camera to extend out of the terminal device), and if the terminal device does not receive the rotation operation of the user on the rotatable camera within 2 minutes, the terminal device may close the rotatable camera (i.e., withdraw the rotatable camera into the terminal device). If the user needs to rotate the rotatable camera, the user can control the terminal device to start the rotatable camera through a preset use instruction, wherein the preset use instruction can be to perform a preset operation in a preset area of the screen (for example, to perform a double-click operation on the left side of the screen), or the preset use instruction can also be to perform a preset operation on a physical key configured on the terminal device (for example, to double-click a volume amplification key), and the like.
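As a minimal sketch of the activation and idle-timeout behaviour described above (the class, method names and the two-minute window are illustrative assumptions, not platform APIs or values fixed by this application):

```kotlin
// Hypothetical sketch of the extend / idle-timeout / re-extend behaviour described above.
// All names and timings are assumptions for illustration; they are not platform APIs.
class RotatableCameraController(
    private val idleTimeoutMs: Long = 2 * 60 * 1000L  // e.g. the 2-minute window in the example
) {
    var isExtended = false
        private set
    private var lastInteractionMs = 0L

    fun onTargetSceneEntered(nowMs: Long) {
        isExtended = true                // control the rotatable camera to extend out of the device
        lastInteractionMs = nowMs
    }

    fun onRotationDetected(nowMs: Long) {
        lastInteractionMs = nowMs        // any rotation keeps the camera extended
    }

    fun onTick(nowMs: Long) {
        if (isExtended && nowMs - lastInteractionMs >= idleTimeoutMs) {
            isExtended = false           // withdraw the camera back into the terminal device
        }
    }

    fun onPresetUseInstruction(nowMs: Long) {
        isExtended = true                // e.g. double-click in a preset screen area re-extends it
        lastInteractionMs = nowMs
    }
}

fun main() {
    val controller = RotatableCameraController()
    controller.onTargetSceneEntered(nowMs = 0L)
    controller.onTick(nowMs = 3 * 60 * 1000L)          // past the idle timeout
    println(controller.isExtended)                      // false: camera withdrawn
    controller.onPresetUseInstruction(nowMs = 3 * 60 * 1000L)
    println(controller.isExtended)                      // true: camera re-extended
}
```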
In step S104, based on the target scene, an operation event corresponding to the target scene and the rotation operation is determined.
In implementation, the operation event corresponding to the target scene and the rotation operation may be determined according to a preset correspondence. For example, the target scene may include an image capturing scene, a video playing scene, and a reading scene, the rotation operation may include a first rotation operation performed in a clockwise direction and a second rotation operation performed in a counterclockwise direction, and based on a preset correspondence (e.g., the preset correspondence shown in table 1), operation events corresponding to different rotation operations in different target scenes may be determined.
TABLE 1
Target scene          Rotation operation          Operation event
Image capture scene   First rotation operation    Focal length magnification event
Image capture scene   Second rotation operation   Focal length reduction event
Video playing scene   First rotation operation    Volume up event
Video playing scene   Second rotation operation   Volume down event
Reading scene         First rotation operation    Font magnification event
Reading scene         Second rotation operation   Font reduction event
Based on the preset correspondence in Table 1, the operation events corresponding to different rotation operations in different target scenes can be determined. For example, if the target scene is a video playing scene, the terminal device may determine whether the event is a volume-up event or a volume-down event according to the direction in which the user rotates the rotatable camera. If the user rotates the rotatable camera clockwise (i.e., the rotation operation is the first rotation operation), the corresponding operation event can be a volume-up event. Then, the corresponding volume adjustment amplitude may be determined according to the rotation angle; for example, if the user rotates the rotatable camera from 0 degrees (i.e., the initial position of the rotatable camera when it extends out from the terminal device) to 5 degrees, the corresponding volume adjustment amplitude may be 5. The volume corresponding to the initial position of the rotatable camera may be the real-time volume in the current scene.
In addition, the operation event corresponding to the target scene can be determined according to the rotation angle of the rotatable camera by the user. For example, scales corresponding to different rotation angles may be set based on the initial position of the rotatable camera when it is extended from the terminal device. As shown in fig. 2, rotation from the initial position by 0 to 180 degrees in the clockwise direction may correspond to scale values of 0 to 180 degrees, and rotation by 0 to 180 degrees in the counterclockwise direction may correspond to scale values of 0 to -180 degrees. If the target scene is a reading scene and the user rotates the rotatable camera into the range from 0 degrees to 180 degrees, the corresponding operation event may be a font magnification event, and the specific font magnification amplitude may be determined according to the rotation angle. Conversely, if the user rotates the rotatable camera into the range from 0 degrees to -180 degrees, the corresponding operation event may be a font reduction event, and the specific font reduction amplitude may also be determined according to the rotation angle.
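The lookup of Table 1 and the angle-based amplitude described above can be sketched as follows (the enum names and the one-unit-per-degree amplitude are assumptions made only for illustration):

```kotlin
// Hypothetical sketch of the (target scene, rotation direction) -> operation event lookup,
// and of deriving an adjustment amplitude from the rotation angle.
// Enum names and the 1-unit-per-degree conversion are assumptions, not values from the application.
enum class TargetScene { IMAGE_CAPTURE, VIDEO_PLAYING, READING }
enum class RotationDirection { CLOCKWISE, COUNTERCLOCKWISE }
enum class OperationEvent {
    FOCAL_LENGTH_MAGNIFICATION, FOCAL_LENGTH_REDUCTION,
    VOLUME_UP, VOLUME_DOWN,
    FONT_MAGNIFICATION, FONT_REDUCTION
}

fun operationEventFor(scene: TargetScene, direction: RotationDirection): OperationEvent =
    when (scene) {
        TargetScene.IMAGE_CAPTURE ->
            if (direction == RotationDirection.CLOCKWISE) OperationEvent.FOCAL_LENGTH_MAGNIFICATION
            else OperationEvent.FOCAL_LENGTH_REDUCTION
        TargetScene.VIDEO_PLAYING ->
            if (direction == RotationDirection.CLOCKWISE) OperationEvent.VOLUME_UP
            else OperationEvent.VOLUME_DOWN
        TargetScene.READING ->
            if (direction == RotationDirection.CLOCKWISE) OperationEvent.FONT_MAGNIFICATION
            else OperationEvent.FONT_REDUCTION
    }

// Example from the text: rotating from 0 to 5 degrees clockwise in a video playing scene
// yields a volume-up event with an adjustment amplitude of 5.
fun adjustmentAmplitude(rotationAngleDegrees: Int): Int = rotationAngleDegrees

fun main() {
    val event = operationEventFor(TargetScene.VIDEO_PLAYING, RotationDirection.CLOCKWISE)
    println("$event, amplitude = ${adjustmentAmplitude(5)}")  // VOLUME_UP, amplitude = 5
}
```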
If the target scene is a picture preview switching scene, the preview picture can be switched according to the direction in which the user rotates the rotatable camera. For example, when the user rotates the rotatable camera clockwise by one gear, the terminal device may switch the currently displayed preview image to the next preview image for display, and if the user rotates the rotatable camera counterclockwise by one gear, the terminal device may switch the currently displayed preview image to the previous preview image for display.
In addition, the rotatable camera configured in the terminal device can also have a telescopic function, and in the same target scene, the corresponding operation event can be determined according to the telescopic distance of the rotatable camera and the rotation operation of the user. For example, after the terminal device starts the rotatable camera, the rotatable camera automatically extends out of the terminal device and is fixed at a position 1 cm away from the terminal device. At this time, a user can adjust the rotatable camera from the position 1 cm away from the terminal device to a position 1.5 cm or 0.5 cm away from the terminal device, and the terminal device can determine the corresponding operation event based on a preset telescopic operation relationship according to the finally monitored distance between the rotatable camera and the terminal device. For example, in the case that the target scene is a reading scene, the user may adjust the distance between the rotatable camera and the terminal device from 1 cm to 0.5 cm; it may then be determined that the operation event corresponding to the target scene is a font enlargement event, and the corresponding font enlargement amplitude may be determined according to the rotation operation of the rotatable camera by the user.
In step S106, an operation event is executed.
In implementation, after the operation event is executed, the rotatable camera may be turned off (that is, the rotatable camera may be retracted into the terminal device). When it is monitored that the target scene changes or a user switching instruction is received, the rotatable camera may be turned on again, the operation event corresponding to the target scene at that time may be determined according to the rotation operation of the user, and the operation event may be executed.
The embodiment of the specification provides a control method, which is used for receiving a rotation operation of a user on a rotatable camera in the case that a current scene is a target scene, wherein the rotation operation comprises a rotation angle and/or a rotation direction, then determining an operation event corresponding to the target scene and the rotation operation based on the target scene, and executing the operation event. Therefore, in a target scene, a user only needs to rotate the rotatable camera to control the terminal device to execute the corresponding operation event without repeatedly performing adjustment operation, and the control efficiency of the terminal is improved. Meanwhile, the terminal equipment can accurately execute the operation event according to the rotation operation of the user, the actual use requirement of the user is met, and the user experience is improved.
Example two
As shown in fig. 3, the present specification provides a control method, where an execution subject of the method may be a terminal device, where the terminal device is configured with a rotatable camera, the terminal device may be a device such as a personal computer, or a mobile terminal device such as a mobile phone or a tablet computer, and the terminal device may be a terminal device used by a user. The method may specifically comprise the steps of:
in step S302, in the case where the current scene is the target scene, the predetermined sensor and the rotatable camera are activated.
The predetermined sensor may be an angle sensor, and may be configured to monitor a rotation operation of the rotatable camera by a user.
In step S304, the rotation operation of the rotatable camera by the user is monitored by a predetermined sensor.
In an implementation, the rotation angle and/or the rotation direction of the rotatable camera by the user may be monitored by a predetermined sensor. If the preset sensor does not monitor the rotating operation of the user on the rotatable camera within the preset time range, the preset sensor and the rotatable camera can be closed, and when a preset using instruction of the user is received, the preset sensor and the rotatable camera can be opened again.
In step S306, a scene type to which the target scene belongs is acquired.
The scene type may include a volume adjustment type, a focus adjustment type, a font adjustment type, a brightness adjustment type, a contrast adjustment type, a color temperature adjustment type, a picture/video preview switching type, and the like. The operation event may be an event related to the target scene and may include an event type, an operation parameter, and the like; for example, the operation event may be a volume increase event (including a corresponding volume adjustment amplitude), a font reduction event (including a corresponding font adjustment amplitude), a picture preview switching event, and the like.
In implementation, a user may set the scene types corresponding to different target scenes, where the scene types may include a volume adjustment type, a focus adjustment type, a font adjustment type, a brightness adjustment type, a contrast adjustment type, a color temperature adjustment type, a picture/video preview switching type, and the like. For example, as shown in Table 2, for different target scenes, a user may set different or the same scene types, and when it is monitored that the current scene is the target scene, the scene type to which the target scene belongs may be obtained. If the terminal device monitors that the user is reading text in a reading application program, the current scene can be determined to be a reading scene, and the corresponding scene type can be the font adjustment type, that is, the scene type to which the target scene belongs is the font adjustment type.
TABLE 2
Target scene          Scene type
Video playing scene   Volume adjustment type
Video preview scene   Volume adjustment type
Image capture scene   Focus adjustment type
Video shooting scene  Brightness adjustment type
Reading scene         Font adjustment type
In addition, an input selection box of the scene type can be provided for the user according to a received predetermined switching instruction of the user, so that the user can switch the scene type of the target scene. For example, the predetermined switching instruction may be a double-click on the power key. When the user uses the terminal device to read text, the scene type to which the current scene belongs may be determined to be the font adjustment type according to the scene types corresponding to the target scenes in Table 2. If the user does not need to adjust the font at this time, the user can double-click the power key; after receiving the predetermined switching instruction, the terminal device can provide an input selection box for the user on the screen, and the user can select the corresponding scene type according to the actual requirement (for example, the user can change the scene type to which the target scene belongs to the brightness adjustment type). After the user selects the scene type to be switched to, the terminal device can prompt the user whether to save the selection, and update the scene type to which the target scene belongs accordingly.
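A minimal sketch of the Table 2 lookup together with the user override described above (the map contents, class and method names are illustrative assumptions):

```kotlin
// Hypothetical sketch of mapping a target scene to its scene type (Table 2),
// with a user override as described for the predetermined switching instruction.
// The map contents and names are illustrative assumptions, not values from the application.
enum class SceneType { VOLUME_ADJUSTMENT, FOCUS_ADJUSTMENT, FONT_ADJUSTMENT, BRIGHTNESS_ADJUSTMENT }

class SceneTypeRegistry {
    private val sceneTypes = mutableMapOf(
        "video playing scene" to SceneType.VOLUME_ADJUSTMENT,
        "video preview scene" to SceneType.VOLUME_ADJUSTMENT,
        "image capture scene" to SceneType.FOCUS_ADJUSTMENT,
        "video shooting scene" to SceneType.BRIGHTNESS_ADJUSTMENT,
        "reading scene" to SceneType.FONT_ADJUSTMENT
    )

    fun sceneTypeOf(targetScene: String): SceneType? = sceneTypes[targetScene]

    // Called after the user picks a new scene type in the input selection box
    // and confirms whether the choice should be saved for this target scene.
    fun overrideSceneType(targetScene: String, selected: SceneType, save: Boolean): SceneType {
        if (save) sceneTypes[targetScene] = selected
        return selected
    }
}

fun main() {
    val registry = SceneTypeRegistry()
    println(registry.sceneTypeOf("reading scene"))                                  // FONT_ADJUSTMENT
    registry.overrideSceneType("reading scene", SceneType.BRIGHTNESS_ADJUSTMENT, save = true)
    println(registry.sceneTypeOf("reading scene"))                                  // BRIGHTNESS_ADJUSTMENT
}
```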
In step S308, the operation type of the operation event corresponding to the scene type is determined based on the scene type to which the target scene belongs.
The operation type of the operation event may be a type corresponding to a scene type, for example, the operation type of the operation event may be a volume adjustment type, a focus adjustment type, a font adjustment type, a brightness adjustment type, a contrast adjustment type, a color temperature adjustment type, a picture/video preview switching type, and the like.
The specific processing procedure of step S308 may refer to the related contents in step S104, and is not described again here.
In practical applications, the processing manner of step S308 may be various, and an alternative implementation manner is provided below, which may specifically refer to the following processing of step one and step two.
Step one, receiving input operation of a user on an operation type of an operation event corresponding to a scene type to which a target scene belongs.
In implementation, an input selection box can be provided for a user, and the user can select the operation type of the operation event according to actual requirements. For example, the terminal device monitors that the user opens a video playing application and selects a video to play, before or during playing the video, the terminal device may display an input selection box on the screen, for example, the input selection box may include tags of "volume adjustment", "brightness adjustment", and "color temperature adjustment", and the user may select an operation type of a currently required operation event in the input selection box, for example, the user may select "color temperature adjustment". Meanwhile, in the input selection box, a selection box of "whether to use the operation type as a default operation type of the operation event corresponding to the video playing scene" may be provided for the user, and if the user selects to use the operation type selected this time (e.g., "color temperature adjustment") as the default operation type of the operation event corresponding to the video playing scene, when the user enters the video playing application program again to watch the video, the terminal device may adjust the color temperature as the operation type of the operation event corresponding to the video playing scene without providing the input selection box of the operation type for the user again.
In addition, after the default operation type is saved, if the user needs to change the operation type of the operation event, a preset operation may be performed on the preset key to restart the input function of the operation type of the operation event, that is, the terminal device may receive an input operation of the user on the actual operation type of the operation corresponding to the scene type to which the target scene belongs again.
And step two, determining the operation type of the operation event based on the input operation.
In step S310, operation parameters for executing the operation event are determined based on the rotation operation and the operation type.
The specific processing procedure of step S310 may refer to the related contents in step S104, and is not described again.
In practical applications, the processing manner of step S310 may be various, and an alternative implementation manner is provided below, which may specifically refer to the following processing from step one to step three.
Step one, determining a preset conversion relation corresponding to an operation type based on the operation type.
Wherein the predetermined conversion relationship may be determined based on data obtained by monitoring the rotation of the rotatable camera by a predetermined sensor.
In implementation, different operation types may correspond to different preset conversion relationships, and meanwhile, the preset conversion relationships corresponding to data obtained by different rotation operations may also be different.
For example, the preset conversion relationship corresponding to the volume adjustment type may be a conversion relationship between data corresponding to the rotation angle and the volume adjustment amplitude, and the preset conversion relationship corresponding to the font adjustment type may be a relationship between data corresponding to the rotation angle and the font adjustment amplitude, and the like.
Taking the preset conversion relationship corresponding to the volume adjustment type as an example, as shown in fig. 4: in fig. 4(a), the initial position of the rotatable camera protruding out of the terminal device is taken as the origin, and the scale is arranged from 0 to 360 degrees in the clockwise direction; the corresponding preset conversion relationship may then be preset conversion relationship 1. In fig. 4(b), the initial position of the rotatable camera protruding out of the terminal device is taken as the origin, and the scale is arranged from 0 to 180 degrees over the 180-degree range in the clockwise direction and from 0 to -180 degrees over the 180-degree range in the counterclockwise direction; the corresponding preset conversion relationship may then be preset conversion relationship 2.
And step two, determining a first operation parameter corresponding to the rotation angle based on a preset conversion relation.
In the implementation, taking the preset conversion relationship corresponding to the volume adjustment type as an example, as shown in fig. 4: in fig. 4(a), different rotation angles may be converted into different first operation parameters based on preset conversion relationship 1; for example, if the camera is rotated by 1 degree in the clockwise direction, the corresponding first operation parameter obtained based on preset conversion relationship 1 may be 4 grids. In fig. 4(b), different rotation angles may be converted into different first operation parameters based on preset conversion relationship 2; for example, if the camera is rotated by 1 degree in the counterclockwise direction, the corresponding first operation parameter obtained based on preset conversion relationship 2 may be -1 grid.
And step three, determining the operation parameters for executing the operation event based on the first operation parameters and/or the rotation direction.
In an implementation, the operation parameter for executing the operation event may be determined based on the first operation parameter. For example, in fig. 4(b), if the obtained first operation parameter is -1 grid, the operation parameter for executing the operation event may be determined as -1 grid, that is, the volume may be decreased by one grid.
Further, an operating parameter for performing the operating event may also be determined based on the first operating parameter and the direction of rotation. Taking the preset conversion relationship corresponding to the above volume adjustment type as an example, in fig. 4(a), if the rotation direction of the user is clockwise and the first operation parameter is 4, the operation parameter for executing the operation event may be 4, that is, the volume is increased by 4.
Alternatively, the operation parameter for executing the operation event may also be determined based on the rotation direction. For example, the operation event may be a picture preview switching event, and if the rotation direction is clockwise, the corresponding operation parameter may be to acquire the previous preview image.
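The two preset conversion relations of fig. 4 can be sketched as follows (the factors of 4 grids per degree and 1 grid per degree follow the worked examples above; the function names, rounding and sign handling are assumptions for illustration):

```kotlin
// Hypothetical sketch of preset conversion relationship 1 (fig. 4(a), 0..360 degrees clockwise)
// and preset conversion relationship 2 (fig. 4(b), signed angles in -180..180 degrees).
// The per-degree factors follow the worked examples in the text; everything else is assumed.
import kotlin.math.roundToInt

// Relationship 1: only the clockwise angle is used; the rotation direction is applied separately.
fun conversionRelation1(angleDegrees: Double): Int = (angleDegrees * 4).roundToInt()

// Relationship 2: the angle already carries its sign (counterclockwise angles are negative).
fun conversionRelation2(signedAngleDegrees: Double): Int = signedAngleDegrees.roundToInt()

fun volumeParameter(relation: Int, angleDegrees: Double, clockwise: Boolean): Int =
    when (relation) {
        1 -> {
            val grids = conversionRelation1(angleDegrees)       // e.g. 1 degree -> 4 grids
            if (clockwise) grids else -grids
        }
        else -> conversionRelation2(if (clockwise) angleDegrees else -angleDegrees)
    }

fun main() {
    println(volumeParameter(1, 1.0, clockwise = true))   // 4  (volume up by 4 grids)
    println(volumeParameter(2, 1.0, clockwise = false))  // -1 (volume down by 1 grid)
}
```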
In addition, the rotatable camera configured in the terminal device can be a front camera, and the terminal device can further be configured with a rear camera. If the scene type of the target scene is a shooting scene (including a picture shooting scene or a video shooting scene), the operation type of the operation event corresponding to the shooting scene can be determined to be focal length adjustment, and based on the above step one to step three, the operation parameter for adjusting the focal length of the rear camera can be determined.
Then, based on the operation parameter, focus adjustment can be performed on the preview image acquired by the rear camera.
In step S312, based on the operation parameter and the operation type, the operation event is executed.
The embodiment of the specification provides a control method, which is used for receiving a rotation operation of a user on a rotatable camera in the case that a current scene is a target scene, wherein the rotation operation comprises a rotation angle and/or a rotation direction, then determining an operation event corresponding to the target scene and the rotation operation based on the target scene, and executing the operation event. Therefore, in a target scene, a user only needs to rotate the rotatable camera to control the terminal device to execute the corresponding operation event without repeatedly performing adjustment operation, and the control efficiency of the terminal is improved. Meanwhile, the terminal equipment can accurately execute the operation event according to the rotation operation of the user, the actual use requirement of the user is met, and the user experience is improved.
EXAMPLE III
As shown in fig. 5, an execution subject of the method may be a terminal device, where the terminal device is configured with a retractable camera, the terminal device may be a device such as a personal computer, or a mobile terminal device such as a mobile phone or a tablet computer, and the terminal device may be a terminal device used by a user. The method may specifically comprise the steps of:
in step S502, when the current scene is the target scene, a zoom operation of the user on the retractable camera is received.
Wherein the telescopic operation may include a distance by which the telescopic camera is extended or shortened.
In implementation, when the current scene is the target scene, the retractable camera can be opened and extended out of the terminal device, and the telescopic operation of the user on the retractable camera can be received. The distance by which the retractable camera is extended or shortened may be the distance difference between the final position to which the user moves the retractable camera and the initial position at which the retractable camera extends from the terminal device.
In step S504, based on the target scene, an operation event corresponding to the target scene and the telescopic operation is determined.
In implementation, a user may set the scene types corresponding to different target scenes, and for different scene types, the corresponding operation event may be determined according to the telescopic operation. For example, the target scene may be an image capturing scene, the corresponding scene type may be the focus adjustment type or the brightness adjustment type, and the corresponding operation event may be determined according to the received telescopic operation. For example, if the received telescopic operation is to extend the telescopic camera by one gear (one gear may be 0.5 cm), the corresponding operation event may be to increase the focal length by one gear or to increase the brightness by one gear. Or, if the received telescopic operation is to shorten the telescopic camera by one gear, the corresponding operation event may be to shorten the focal length by one gear, or to turn down the brightness by one gear.
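A minimal sketch of the gear-based mapping in this example (the 0.5 cm gear size follows the text; the names and the focus/brightness wording are illustrative assumptions):

```kotlin
// Hypothetical sketch of mapping the telescopic distance to an operation event in an image capture scene.
// The 0.5 cm gear size follows the example in the text; names and wording are illustrative assumptions.
const val GEAR_SIZE_CM = 0.5

enum class CaptureSceneType { FOCUS_ADJUSTMENT, BRIGHTNESS_ADJUSTMENT }

/** Positive distances mean the camera was extended, negative distances mean it was shortened. */
fun telescopicOperationEvent(distanceCm: Double, sceneType: CaptureSceneType): String {
    val gears = (distanceCm / GEAR_SIZE_CM).toInt()
    return when {
        gears == 0 -> "no change"
        sceneType == CaptureSceneType.FOCUS_ADJUSTMENT ->
            if (gears > 0) "increase focal length by $gears gear(s)"
            else "shorten focal length by ${-gears} gear(s)"
        else ->
            if (gears > 0) "increase brightness by $gears gear(s)"
            else "turn down brightness by ${-gears} gear(s)"
    }
}

fun main() {
    println(telescopicOperationEvent(0.5, CaptureSceneType.FOCUS_ADJUSTMENT))       // extend by one gear
    println(telescopicOperationEvent(-0.5, CaptureSceneType.BRIGHTNESS_ADJUSTMENT)) // shorten by one gear
}
```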
In step S506, an operation event is executed.
In implementation, after the operation event is executed, the telescopic camera may be turned off (that is, the telescopic camera may be retracted into the terminal device). When it is monitored that the target scene changes or a user switching instruction is received, the telescopic camera may be turned on again, the operation event corresponding to the target scene at that time may be determined according to the telescopic operation of the user, and the operation event may be executed.
The embodiment of the specification provides a control method, which includes receiving a telescopic operation of a user on a telescopic camera in a case that a current scene is a target scene, wherein the telescopic operation includes a distance by which the telescopic camera is extended or shortened, determining an operation event corresponding to the target scene and the telescopic operation based on the target scene, and executing the operation event. Therefore, in a target scene, a user only needs to perform a telescopic operation on the telescopic camera to control the terminal equipment to execute the corresponding operation event, multiple repeated adjustment operations are not needed, and the control efficiency of the terminal is improved. Meanwhile, the terminal equipment can accurately execute the operation event according to the telescopic operation of the user, the actual use requirement of the user is met, and the user experience is improved.
Example four
As shown in fig. 6, an execution subject of the method may be a terminal device, where the terminal device is configured with a retractable camera, the terminal device may be a device such as a personal computer, or a mobile terminal device such as a mobile phone or a tablet computer, and the terminal device may be a terminal device used by a user. The method may specifically comprise the steps of:
in step S602, in the case where the current scene is the target scene, the predetermined sensor and the retractable camera are activated.
The preset sensor can be an angle sensor and can be used for monitoring the telescopic operation of the user on the telescopic camera.
In step S604, the user' S operation of extending and retracting the retractable camera is monitored by a predetermined sensor.
Wherein the telescopic operation includes a distance by which the telescopic camera is extended or shortened.
In step S606, a scene type to which the target scene belongs is acquired.
In step S608, the operation type of the operation event corresponding to the scene type is determined based on the scene type to which the target scene belongs.
The specific processing procedures of steps S606-S608 may refer to the related contents of steps S306-S308 in the above embodiments, and are not described again.
In step S610, operation parameters for executing the operation event are determined based on the telescopic operation and the operation type.
The specific processing procedure of step S610 may refer to the related contents in step S104 of the above embodiment, and is not described again here.
In step S612, based on the operation parameter and the operation type, an operation event is executed.
The embodiment of the present specification provides a control method, which receives a telescopic operation of a user on a telescopic camera when it is monitored that the current scene is a target scene, where the telescopic operation includes a distance by which the telescopic camera is extended or shortened, and then determines an operation event corresponding to the target scene and the telescopic operation based on the target scene, and executes the operation event. Therefore, in a target scene, a user only needs to perform a telescopic operation on the telescopic camera to control the terminal equipment to execute the corresponding operation event, multiple repeated adjustment operations are not needed, and the control efficiency of the terminal is improved. Meanwhile, the terminal equipment can accurately execute the operation event according to the telescopic operation of the user, the actual use requirement of the user is met, and the user experience is improved.
EXAMPLE five
Based on the same idea, corresponding to the control method provided above, an embodiment of the present specification further provides a control device, as shown in fig. 7.
The control device includes: operating the receiving module 701, the first determining module 702 and the first executing module 703, wherein:
an operation receiving module 701, configured to receive a rotation operation of a user on the rotatable camera when a current scene is a target scene, where the rotation operation includes a rotation angle and/or a rotation direction;
a first determining module 702, configured to determine, based on the target scene, an operation event corresponding to the target scene and the rotation operation;
a first executing module 703 is configured to execute the operation event.
In this embodiment of the present invention, the operation receiving module 701 includes:
the starting unit is used for starting a preset sensor and the rotatable camera under the condition that the current scene is a target scene;
and the operation monitoring unit is used for monitoring the rotating operation of the rotatable camera by the user through the preset sensor.
In this embodiment of the present invention, the first determining module 702 includes:
the type acquisition unit is used for acquiring the scene type of the target scene;
a type determining unit, configured to determine, based on a scene type to which the current scene belongs, an operation type of the operation event corresponding to the scene type;
a parameter determination unit configured to determine an operation parameter for executing the operation event based on the rotation operation and the operation type;
the first execution module 703 is configured to:
executing the operation event based on the operation parameter and the operation type.
In the embodiment of the present invention, the rotatable camera is a front camera, the terminal device is further configured with a rear camera, the type of the current scene is a shooting scene,
the type determining unit is configured to:
determining that the operation type of the operation event corresponding to the shooting scene is focus adjustment;
the executing the operation event based on the operation parameter and the operation type includes:
and based on the operation parameters, performing focus adjustment on the preview image acquired based on the rear camera.
In an embodiment of the present invention, the parameter determining unit is configured to:
determining a preset conversion relation corresponding to the operation type based on the operation type, wherein the preset conversion relation is determined based on data obtained by monitoring the rotation of the rotatable camera by the preset sensor;
determining a first operation parameter corresponding to the rotation angle based on the preset conversion relation;
determining an operating parameter for performing the operational event based on the first operating parameter and/or the rotational direction.
In an embodiment of the present invention, the type determining unit is configured to:
receiving input operation of a user on the operation type of the operation event corresponding to the scene type of the current scene;
determining an operation type of the operation event based on the input operation.
The control apparatus according to the embodiment of the present invention may further execute the method executed by the terminal device in fig. 1 to 4, and implement the functions of the terminal device in the embodiments shown in fig. 1 to 4, which are not described herein again.
The embodiment of the specification provides a control device, which receives a rotation operation of a user on a rotatable camera in the case that a current scene is a target scene, wherein the rotation operation comprises a rotation angle and/or a rotation direction, and then determines an operation event corresponding to the target scene and the rotation operation based on the target scene, and executes the operation event. Therefore, in a target scene, a user only needs to rotate the rotatable camera to control the terminal device to execute the corresponding operation event without repeatedly performing adjustment operation, and the control efficiency of the terminal is improved. Meanwhile, the terminal equipment can accurately execute the operation event according to the rotation operation of the user, the actual use requirement of the user is met, and the user experience is improved.
EXAMPLE six
Based on the same idea, corresponding to the control method provided above, an embodiment of the present specification further provides a control device configured with a retractable camera, as shown in fig. 8.
The control device includes: a stretch monitoring module 801, a second determining module 802, and a second executing module 803, wherein:
a telescopic monitoring module 801, configured to receive a telescopic operation of a user on the telescopic camera when a current scene is a target scene, where the telescopic operation includes a distance that the telescopic camera extends or shortens;
a second determining module 802, configured to determine, based on the target scene, an operation event corresponding to the target scene and the telescopic operation;
a second executing module 803, configured to execute the operation event.
The control apparatus in the embodiment of the present invention may further execute the method executed by the terminal device in fig. 5 to 6, and implement the functions of the terminal device in the embodiments shown in fig. 5 to 6, which are not described herein again.
The embodiment of the specification provides a control device, which receives a telescopic operation of a user on a telescopic camera in the case that a current scene is a target scene, wherein the telescopic operation comprises a distance by which the telescopic camera is extended or shortened, and then determines an operation event corresponding to the target scene and the telescopic operation based on the target scene, and executes the operation event. Therefore, in a target scene, a user only needs to perform a telescopic operation on the telescopic camera to control the terminal equipment to execute the corresponding operation event, multiple repeated adjustment operations are not needed, and the control efficiency of the terminal is improved. Meanwhile, the terminal equipment can accurately execute the operation event according to the telescopic operation of the user, the actual use requirement of the user is met, and the user experience is improved.
EXAMPLE seven
Fig. 9 is a schematic diagram of the hardware structure of a terminal device implementing various embodiments of the present invention.
The terminal device 900 includes, but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, a processor 910, and a power supply 911. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 9 does not constitute a limitation of the terminal device, and that the terminal device may include more or fewer components than shown, combine certain components, or have a different arrangement of components. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 910 is configured to receive, when a current scene is a target scene, a rotation operation of a user on the rotatable camera, where the rotation operation includes a rotation angle and/or a rotation direction;
a processor 910, further configured to determine, based on the target scene, an operation event corresponding to the target scene and the rotation operation;
and the processor 910 is further configured to execute the operation event.
In addition, the processor 910 is further configured to activate a predetermined sensor and the rotatable camera if the current scene is a target scene;
in addition, the processor 910 is further configured to monitor a rotation operation of the rotatable camera by the user through the predetermined sensor.
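As one hedged illustration of starting the predetermined sensor and monitoring the rotation through it, the Kotlin sketch below wires a sensor listener to a handler only when the target scene is active; the patent does not name a concrete sensor or listener API, so RotationSensor, RotationOperation, and FakeSensor are hypothetical names introduced here.

data class RotationOperation(val angleDeg: Float, val clockwise: Boolean)

interface RotationSensor {
    fun start()
    fun stop()
    fun setListener(onRotation: (RotationOperation) -> Unit)
}

class RotationMonitor(private val sensor: RotationSensor) {
    // Activates the sensor only when the target scene is entered and forwards each
    // detected rotation of the rotatable camera to the handler.
    fun beginMonitoring(isTargetScene: Boolean, handler: (RotationOperation) -> Unit) {
        if (!isTargetScene) return
        sensor.setListener(handler)
        sensor.start()
    }

    fun endMonitoring() = sensor.stop()
}

// A trivial in-memory sensor used only to exercise the sketch.
class FakeSensor : RotationSensor {
    private var listener: ((RotationOperation) -> Unit)? = null
    override fun start() { listener?.invoke(RotationOperation(angleDeg = 15f, clockwise = true)) }
    override fun stop() {}
    override fun setListener(onRotation: (RotationOperation) -> Unit) { listener = onRotation }
}

fun main() {
    RotationMonitor(FakeSensor()).beginMonitoring(isTargetScene = true) { op ->
        println("rotated ${op.angleDeg} degrees, clockwise = ${op.clockwise}")
    }
}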
In addition, the processor 910 is further configured to obtain a scene type to which the target scene belongs;
in addition, the processor 910 is further configured to determine, based on a scene type to which the target scene belongs, an operation type of the operation event corresponding to the scene type;
additionally, the processor 910 is further configured to determine an operation parameter for executing the operation event based on the rotation operation and the operation type;
in addition, the processor 910 is further configured to execute the operation event based on the operation parameter and the operation type.
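The chain just described (scene type → operation type → operation parameter → execution) can be summarized, purely as a sketch, by the following Kotlin; the scene names, operation types, and degrees-to-parameter factors are illustrative assumptions, not values specified by the patent.

enum class OperationType { FOCUS_ADJUSTMENT, VOLUME_ADJUSTMENT, PAGE_SCROLL }

// One possible mapping from scene type to the operation type it triggers.
val operationTypeByScene = mapOf(
    "shooting" to OperationType.FOCUS_ADJUSTMENT,
    "audio_playback" to OperationType.VOLUME_ADJUSTMENT,
    "reading" to OperationType.PAGE_SCROLL
)

// One possible per-type conversion relation: parameter units produced per degree of rotation.
val unitsPerDegree = mapOf(
    OperationType.FOCUS_ADJUSTMENT to 0.5f,
    OperationType.VOLUME_ADJUSTMENT to 0.25f,
    OperationType.PAGE_SCROLL to 1.0f
)

fun handleRotation(sceneType: String, angleDeg: Float, clockwise: Boolean) {
    val type = operationTypeByScene[sceneType] ?: return            // not a target scene
    val magnitude = angleDeg * (unitsPerDegree[type] ?: return)     // operation parameter
    val signed = if (clockwise) magnitude else -magnitude           // direction gives the sign
    println("execute $type with parameter $signed")
}

fun main() {
    handleRotation("shooting", angleDeg = 30f, clockwise = true)
    handleRotation("audio_playback", angleDeg = 40f, clockwise = false)
}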
In addition, the processor 910 is further configured to obtain the light intensity of the current environment through the photosensitive sensor when a preset light monitoring period is reached.
In addition, in the case where the rotatable camera is a front camera, the terminal device is further provided with a rear camera, and the scene type of the current scene is a shooting scene,
the processor 910 is further configured to determine that an operation type of the operation event corresponding to the shooting scene is focus adjustment;
the processor 910 is further configured to perform focus adjustment on a preview image acquired based on the rear camera based on the operation parameter.
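As a hedged example of this focus-adjustment case, the sketch below converts the front camera's rotation into a focus change applied to the rear camera's preview, clamped to a valid range; the RearCameraPreview interface and the per-degree factor are assumptions of the sketch, not parts of the patent.

interface RearCameraPreview {
    var focusPosition: Float   // normalized focus: 0.0 (near) .. 1.0 (far)
}

class FocusController(
    private val preview: RearCameraPreview,
    private val focusPerDegree: Float = 0.01f
) {
    // Converts the rotation into a focus delta and applies it to the rear camera's
    // preview, clamped to the valid range.
    fun adjust(angleDeg: Float, clockwise: Boolean) {
        val delta = angleDeg * focusPerDegree * (if (clockwise) 1f else -1f)
        preview.focusPosition = (preview.focusPosition + delta).coerceIn(0f, 1f)
    }
}

fun main() {
    val preview = object : RearCameraPreview { override var focusPosition = 0.5f }
    FocusController(preview).adjust(angleDeg = 30f, clockwise = true)
    println(preview.focusPosition)   // approximately 0.8
}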
In addition, the processor 910 is further configured to determine, based on the operation type, a preset conversion relationship corresponding to the operation type, where the preset conversion relationship is determined based on data obtained by monitoring rotation of the rotatable camera by the predetermined sensor;
in addition, the processor 910 is further configured to determine a first operating parameter corresponding to the rotation angle based on the preset conversion relationship;
additionally, the processor 910 is further configured to determine an operating parameter for executing the operating event based on the first operating parameter and/or the rotation direction.
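One possible reading of the preset conversion relation is a per-operation-type factor applied to the rotation angle, with the rotation direction deciding increase or decrease. The Kotlin sketch below illustrates this; the calibration values are placeholders, not data obtained from the predetermined sensor.

data class ConversionRelation(val unitsPerDegree: Float)

class ParameterResolver(private val relations: Map<String, ConversionRelation>) {
    // Returns the operation parameter, or null if no conversion relation is preset
    // for this operation type.
    fun resolve(operationType: String, angleDeg: Float?, clockwise: Boolean?): Float? {
        val relation = relations[operationType] ?: return null
        // First operation parameter from the rotation angle, when an angle is present;
        // otherwise fall back to a unit step.
        val first = angleDeg?.let { it * relation.unitsPerDegree } ?: 1f
        // Rotation direction, when present, decides increase (+) versus decrease (-).
        val sign = if (clockwise == false) -1f else 1f
        return first * sign
    }
}

fun main() {
    val resolver = ParameterResolver(
        mapOf("focus" to ConversionRelation(0.5f), "volume" to ConversionRelation(0.25f))
    )
    println(resolver.resolve("focus", angleDeg = 20f, clockwise = true))    // 10.0
    println(resolver.resolve("volume", angleDeg = null, clockwise = false)) // -1.0
}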
In addition, the processor 910 is further configured to receive an input operation of a user on an operation type of the operation event corresponding to a scene type to which the target scene belongs;
in addition, the processor 910 is further configured to determine an operation type of the operation event based on the input operation.
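A minimal sketch of this user-configurable branch, assuming a simple in-memory mapping (the class and method names are hypothetical), might look as follows in Kotlin.

class OperationTypeSettings {
    private val chosenType = mutableMapOf<String, String>()

    // Called when the user's input operation selects an operation type for a scene type.
    fun onUserSelection(sceneType: String, operationType: String) {
        chosenType[sceneType] = operationType
    }

    // The determining step later reads back the user's choice for the target scene.
    fun operationTypeFor(sceneType: String): String? = chosenType[sceneType]
}

fun main() {
    val settings = OperationTypeSettings()
    settings.onUserSelection("shooting", "focus_adjustment")
    println(settings.operationTypeFor("shooting"))   // focus_adjustment
}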
Further, in the case where the terminal device 900 is configured with a telescopic camera, it may also perform the following processing procedure:
the processor 910 is configured to receive a telescopic operation of a user on the telescopic camera when a current scene is a target scene, where the telescopic operation includes a distance by which the telescopic camera extends or shortens;
in addition, the processor 910 is further configured to determine, based on the target scene, an operation event corresponding to the target scene and the telescopic operation;
in addition, the processor 910 is further configured to execute the operation event.
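For the telescopic variant, the extension or shortening distance plays the role that the rotation angle plays above, and the extend/shorten direction plays the role of the rotation direction. The following Kotlin sketch illustrates this; the scene names and millimetre-to-parameter factors are assumptions of the sketch.

// Per-scene conversion factors from millimetres of travel to the operation parameter
// (illustrative values only).
val unitsPerMillimetre = mapOf(
    "shooting" to 1.0f,        // e.g. focus/zoom units per millimetre of travel
    "audio_playback" to 2.0f   // e.g. volume steps per millimetre of travel
)

// Returns the operation parameter derived from the telescopic operation, or null if
// the current scene is not a target scene.
fun handleTelescopic(sceneType: String, distanceMm: Float, extending: Boolean): Float? {
    val factor = unitsPerMillimetre[sceneType] ?: return null
    val magnitude = distanceMm * factor                 // second operation parameter
    return if (extending) magnitude else -magnitude     // extend = increase, shorten = decrease
}

fun main() {
    println(handleTelescopic("shooting", distanceMm = 5f, extending = true))         // 5.0
    println(handleTelescopic("audio_playback", distanceMm = 2f, extending = false))  // -4.0
}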
The embodiment of the application provides a terminal device, which receives a rotation operation of a user on a rotatable camera in the case that a current scene is a target scene, where the rotation operation includes a rotation angle and/or a rotation direction, then determines, based on the target scene, an operation event corresponding to the target scene and the rotation operation, and executes the operation event. Therefore, in a target scene, a user only needs to rotate the rotatable camera to control the terminal device to execute the corresponding operation event, without repeatedly performing adjustment operations, which improves the control efficiency of the terminal. Meanwhile, the terminal device can accurately execute the operation event according to the rotation operation of the user, meeting the user's actual usage needs and improving the user experience.
It should be understood that, in the embodiment of the present application, the radio frequency unit 901 may be used for receiving and sending signals during a message sending and receiving process or a call process; specifically, downlink data received from a base station is forwarded to the processor 910 for processing, and uplink data is transmitted to the base station. Generally, the radio frequency unit 901 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 901 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user through the network module 902, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 903 may convert audio data received by the radio frequency unit 901 or the network module 902 or stored in the memory 909 into an audio signal and output as sound. Also, the audio output unit 903 may also provide audio output related to a specific function performed by the terminal apparatus 900 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 903 includes a speaker, a buzzer, a receiver, and the like.
The input unit 904 is used to receive audio or video signals. The input unit 904 may include a graphics processing unit (GPU) 9041 and a microphone 9042; the graphics processor 9041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 906. The image frames processed by the graphics processor 9041 may be stored in the memory 909 (or other storage medium) or transmitted via the radio frequency unit 901 or the network module 902. The microphone 9042 can receive sounds and process them into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 901.
The terminal device 900 also includes at least one sensor 905, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 9061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 9061 and/or backlight when the terminal device 900 is moved to the ear. As one of the motion sensors, the accelerometer sensor can monitor the magnitude of acceleration in each direction (generally, three axes), can monitor the magnitude and direction of gravity when stationary, and can be used for identifying the attitude of the terminal device (such as horizontal and vertical screen switching, related games, magnetometer attitude calibration), and vibration identification related functions (such as pedometer and tapping); the sensors 905 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described in detail herein.
The display unit 906 is used to display information input by the user or information provided to the user. The Display unit 906 may include a Display panel 9061, and the Display panel 9061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 907 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 907 includes a touch panel 9071 and other input devices 9072. The touch panel 9071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations performed on or near the touch panel 9071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 9071 may include two parts: a touch monitoring device and a touch controller. The touch monitoring device detects the position touched by the user, monitors the signals generated by the touch operation, and transmits the signals to the touch controller; the touch controller receives the touch information from the touch monitoring device, converts it into touch point coordinates, sends the coordinates to the processor 910, and receives and executes commands sent by the processor 910. In addition, the touch panel 9071 may be implemented as a resistive, capacitive, infrared, or surface acoustic wave panel. In addition to the touch panel 9071, the user input unit 907 may include other input devices 9072. Specifically, the other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (such as a volume control key, a switch key, and the like), a track ball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 9071 may be overlaid on the display panel 9061. When the touch panel 9071 detects a touch operation on or near it, the touch operation is transmitted to the processor 910 to determine the type of the touch event, and the processor 910 then provides a corresponding visual output on the display panel 9061 according to the type of the touch event. Although in fig. 9 the touch panel 9071 and the display panel 9061 are implemented as two independent components to implement the input and output functions of the terminal device, in some embodiments the touch panel 9071 and the display panel 9061 may be integrated to implement the input and output functions of the terminal device, which is not limited herein.
The interface unit 908 is an interface for connecting an external device to the terminal apparatus 900. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 908 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the terminal apparatus 900 or may be used to transmit data between the terminal apparatus 900 and external devices.
The memory 909 may be used to store software programs as well as various data. The memory 909 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phonebook), and the like. Further, the memory 909 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 910 is a control center of the terminal device, connects various parts of the entire terminal device with various interfaces and lines, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 909 and calling data stored in the memory 909, thereby performing overall monitoring of the terminal device. Processor 910 may include one or more processing units; preferably, the processor 910 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It is to be appreciated that the modem processor described above may not be integrated into processor 910.
The terminal device 900 may further include a power supply 911 (e.g., a battery) for supplying power to the various components. Preferably, the power supply 911 may be logically connected to the processor 910 through a power management system, so as to manage charging, discharging, and power consumption through the power management system.
Preferably, an embodiment of the present invention further provides a terminal device, which includes a processor 910, a memory 909, and a computer program that is stored in the memory 909 and can be run on the processor 910. When the computer program is executed by the processor 910, it implements each process of the above control method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
Example eight
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
The embodiment of the application provides a computer-readable storage medium storing a program that, when executed, receives a rotation operation of a user on a rotatable camera in the case that a current scene is a target scene, where the rotation operation includes a rotation angle and/or a rotation direction, then determines, based on the target scene, an operation event corresponding to the target scene and the rotation operation, and executes the operation event. Therefore, in a target scene, a user only needs to rotate the rotatable camera to control the terminal device to execute the corresponding operation event, without repeatedly performing adjustment operations, which improves the control efficiency of the terminal. Meanwhile, the terminal device can accurately execute the operation event according to the rotation operation of the user, meeting the user's actual usage needs and improving the user experience.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM), in a computer-readable medium. Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (7)

1. A control method applied to a terminal device provided with a rotatable camera, the method comprising:
receiving a rotation operation of a user on the rotatable camera under the condition that a current scene is a target scene, wherein the rotation operation comprises a rotation angle and/or a rotation direction;
determining an operation event corresponding to the target scene and the rotation operation based on the target scene;
executing the operation event;
wherein the determining, based on the target scene, an operational event corresponding to the target scene and the rotation operation comprises:
acquiring a scene type to which the target scene belongs;
determining the operation type of the operation event corresponding to the scene type based on the scene type to which the target scene belongs;
determining an operation parameter for executing the operation event based on the rotation operation and the operation type;
the executing the operational event comprises:
executing the operation event based on the operation parameter and the operation type;
wherein the determining of the operation parameters for executing the operation event based on the rotation operation and the operation type comprises:
determining a preset conversion relation corresponding to the operation type based on the operation type, wherein the preset conversion relation is determined based on data obtained by monitoring the rotation of the rotatable camera by a preset sensor; wherein, different operation types correspond to different preset conversion relations;
determining a first operation parameter corresponding to the rotation angle based on the preset conversion relation under the condition that the rotation operation comprises the rotation angle; determining an operating parameter for executing the operating event based on the first operating parameter;
in a case where the rotation operation includes a rotation direction, determining an operation parameter for performing the operation event based on the rotation direction;
under the condition that the rotation operation comprises a rotation angle and a rotation direction, determining a first operation parameter corresponding to the rotation angle based on the preset conversion relation; determining an operating parameter for performing the operating event based on the first operating parameter and/or the rotational direction;
the receiving, when the current scene is a target scene, a rotation operation of the rotatable camera by a user includes:
starting the preset sensor and the rotatable camera under the condition that the current scene is a target scene;
and monitoring the rotating operation of the rotatable camera by the user through the preset sensor.
2. The method according to claim 1, wherein the determining an operation type of the operation event corresponding to the scene type based on the scene type to which the target scene belongs comprises:
receiving input operation of a user on an operation type of the operation event corresponding to a scene type to which the target scene belongs;
determining an operation type of the operation event based on the input operation.
3. A control method is applied to a terminal device configured with a telescopic camera, and comprises the following steps:
receiving a telescopic operation of a user on the telescopic camera under the condition that a current scene is a target scene, wherein the telescopic operation comprises a distance for the telescopic camera to extend or shorten;
determining an operation event corresponding to the target scene and the telescopic operation based on the target scene;
executing the operation event;
wherein the determining, based on the target scene, an operation event corresponding to the target scene and the telescopic operation comprises:
acquiring a scene type to which the target scene belongs;
determining the operation type of the operation event corresponding to the scene type based on the scene type to which the target scene belongs;
determining an operation parameter for executing the operation event based on the telescopic operation and the operation type;
the executing the operational event comprises:
executing the operation event based on the operation parameter and the operation type;
wherein the determining of the operation parameters for executing the operation event based on the telescopic operation and the operation type comprises:
determining a preset conversion relation corresponding to the operation type based on the operation type, wherein the preset conversion relation is determined by data obtained by monitoring the expansion and contraction of the telescopic camera based on a preset sensor; wherein, different operation types correspond to different preset conversion relations;
under the condition that the telescopic operation comprises a telescopic distance, determining a second operation parameter corresponding to the telescopic distance based on the preset conversion relation; determining an operating parameter for executing the operating event based on the second operating parameter;
determining an operation parameter for executing the operation event based on the telescopic direction in the case that the telescopic operation includes the telescopic direction;
under the condition that the telescopic operation comprises a telescopic distance and a telescopic direction, determining a second operation parameter corresponding to the telescopic distance based on the preset conversion relation; determining an operating parameter for executing the operating event based on the second operating parameter and/or the telescoping direction;
wherein the receiving, under the condition that the current scene is the target scene, a telescopic operation of the user on the telescopic camera comprises:
starting the preset sensor and the telescopic camera under the condition that the current scene is a target scene;
and monitoring the telescopic operation of the user on the telescopic camera through the preset sensor.
4. A control device, characterized in that the device is provided with a rotatable camera, the device comprising:
the operation receiving module is used for receiving the rotation operation of a user on the rotatable camera under the condition that the current scene is a target scene, wherein the rotation operation comprises a rotation angle and/or a rotation direction;
a first determination module, configured to determine, based on the target scene, an operation event corresponding to the target scene and the rotation operation;
the first execution module is used for executing the operation event;
wherein the first determining module comprises:
the type acquisition unit is used for acquiring the scene type of the target scene;
a type determining unit, configured to determine, based on a scene type to which the target scene belongs, an operation type of the operation event corresponding to the scene type;
a parameter determination unit configured to determine an operation parameter for executing the operation event based on the rotation operation and the operation type;
wherein the first execution module is configured to:
executing the operation event based on the operation parameter and the operation type;
wherein the parameter determination unit is configured to:
determining a preset conversion relation corresponding to the operation type based on the operation type, wherein the preset conversion relation is determined based on data obtained by monitoring the rotation of the rotatable camera by a preset sensor; wherein, different operation types correspond to different preset conversion relations;
determining a first operation parameter corresponding to the rotation angle based on the preset conversion relation under the condition that the rotation operation comprises the rotation angle; determining an operating parameter for executing the operating event based on the first operating parameter;
in a case where the rotation operation includes a rotation direction, determining an operation parameter for performing the operation event based on the rotation direction;
under the condition that the rotation operation comprises a rotation angle and a rotation direction, determining a first operation parameter corresponding to the rotation angle based on the preset conversion relation; determining an operating parameter for performing the operating event based on the first operating parameter and/or the rotational direction;
wherein the operation receiving module comprises:
the starting unit is used for starting the preset sensor and the rotatable camera under the condition that the current scene is a target scene;
and the operation monitoring unit is used for monitoring the rotating operation of the rotatable camera by the user through the preset sensor.
5. A control device, wherein the device is provided with a telescopic camera, the device comprising:
the telescopic monitoring module is used for receiving telescopic operation of a user on the telescopic camera under the condition that the current scene is a target scene, wherein the telescopic operation comprises the distance for extending or shortening the telescopic camera;
a second determination module, configured to determine, based on the target scene, an operation event corresponding to the target scene and the telescopic operation;
the second execution module is used for executing the operation event;
wherein the second determining module comprises:
the type acquisition unit is used for acquiring the scene type of the target scene;
a type determining unit, configured to determine, based on a scene type to which the target scene belongs, an operation type of the operation event corresponding to the scene type;
a parameter determination unit configured to determine an operation parameter for executing the operation event based on the telescopic operation and the operation type;
wherein the second execution module is configured to:
executing the operation event based on the operation parameter and the operation type;
wherein the parameter determination unit is configured to:
determining a preset conversion relation corresponding to the operation type based on the operation type, wherein the preset conversion relation is determined by data obtained by monitoring the expansion and contraction of the telescopic camera based on a preset sensor; wherein different operation types correspond to different preset conversion relations;
under the condition that the telescopic operation comprises a telescopic distance, determining a second operation parameter corresponding to the telescopic distance based on the preset conversion relation; determining an operating parameter for executing the operating event based on the second operating parameter;
determining an operation parameter for executing the operation event based on the telescopic direction in the case that the telescopic operation includes the telescopic direction;
under the condition that the telescopic operation comprises a telescopic distance and a telescopic direction, determining a second operation parameter corresponding to the telescopic distance based on the preset conversion relation; determining an operating parameter for executing the operating event based on the second operating parameter and/or the telescoping direction;
wherein the telescopic monitoring module is configured to:
starting the preset sensor and the telescopic camera under the condition that the current scene is a target scene;
and monitoring the telescopic operation of the user on the telescopic camera through the preset sensor.
6. A terminal device, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, implements the steps of the control method according to any one of claims 1 to 2.
7. A terminal device, characterized in that it comprises a processor, a memory and a computer program stored on said memory and executable on said processor, said computer program, when executed by said processor, implementing the steps of the control method as claimed in claim 3.
CN201910536510.3A 2019-06-20 2019-06-20 Control method, control device and terminal equipment Active CN110213489B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910536510.3A CN110213489B (en) 2019-06-20 2019-06-20 Control method, control device and terminal equipment
PCT/CN2020/080689 WO2020253295A1 (en) 2019-06-20 2020-03-23 Control method and apparatus, and terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910536510.3A CN110213489B (en) 2019-06-20 2019-06-20 Control method, control device and terminal equipment

Publications (2)

Publication Number Publication Date
CN110213489A CN110213489A (en) 2019-09-06
CN110213489B true CN110213489B (en) 2021-06-15

Family

ID=67793678

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910536510.3A Active CN110213489B (en) 2019-06-20 2019-06-20 Control method, control device and terminal equipment

Country Status (2)

Country Link
CN (1) CN110213489B (en)
WO (1) WO2020253295A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110213489B (en) * 2019-06-20 2021-06-15 维沃移动通信有限公司 Control method, control device and terminal equipment
CN110930878B (en) * 2019-11-11 2021-06-18 维沃移动通信有限公司 Control method of flexible screen and electronic equipment
CN111372047B (en) * 2020-03-02 2021-02-26 杭州普维云技术有限公司 Intelligent head shaking machine camera
WO2022028060A1 (en) * 2020-08-07 2022-02-10 海信视像科技股份有限公司 Display device and display method
CN115291781B (en) * 2022-07-01 2023-05-26 厦门立林科技有限公司 Parameter adjusting method, device and system of equipment

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1319356C (en) * 2003-04-24 2007-05-30 三星电子株式会社 Device and method for using a rotating key and controlling a display in a mobile terminal
KR20060065344A (en) * 2004-12-10 2006-06-14 엘지전자 주식회사 Means for jog-dial of mobile phone using camera and method thereof
TWI349845B (en) * 2008-07-23 2011-10-01 Asustek Comp Inc Portable electronic device
CN202601139U (en) * 2012-06-05 2012-12-12 西安诺瓦电子科技有限公司 Control input component of light-emitting diode (LED) display screen portable control device
CN104461313A (en) * 2013-09-23 2015-03-25 深圳桑菲消费通信有限公司 Control method and device for mobile terminal and mobile terminal
CN104679412B (en) * 2015-02-10 2018-01-23 广东欧珀移动通信有限公司 A kind of incoming call processing method and mobile terminal
JP2018508076A (en) * 2015-03-08 2018-03-22 アップル インコーポレイテッド User interface with rotatable input mechanism
CN105808077A (en) * 2016-03-09 2016-07-27 广东欧珀移动通信有限公司 Method for controlling page turning, and mobile terminal
JP6484761B2 (en) * 2016-06-27 2019-03-13 富士フイルム株式会社 Camera and setting method thereof
CN106060369B (en) * 2016-08-04 2019-03-26 Oppo广东移动通信有限公司 Camera module and terminal
CN206004725U (en) * 2016-08-26 2017-03-08 深迪半导体(上海)有限公司 A kind of adjustable mobile phone of photographic head
CN108388314A (en) * 2018-04-04 2018-08-10 深圳天珑无线科技有限公司 A kind of electronic equipment and its CCD camera assembly
CN110213489B (en) * 2019-06-20 2021-06-15 维沃移动通信有限公司 Control method, control device and terminal equipment

Also Published As

Publication number Publication date
CN110213489A (en) 2019-09-06
WO2020253295A1 (en) 2020-12-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant