CN116074615B - Camera control method and device and terminal equipment - Google Patents

Camera control method and device and terminal equipment Download PDF

Info

Publication number
CN116074615B
CN116074615B CN202310208838.9A CN202310208838A CN116074615B CN 116074615 B CN116074615 B CN 116074615B CN 202310208838 A CN202310208838 A CN 202310208838A CN 116074615 B CN116074615 B CN 116074615B
Authority
CN
China
Prior art keywords
terminal equipment
message
driving chip
application processor
application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310208838.9A
Other languages
Chinese (zh)
Other versions
CN116074615A (en
Inventor
吴斌
张辉迪
刘炎南
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310208838.9A priority Critical patent/CN116074615B/en
Publication of CN116074615A publication Critical patent/CN116074615A/en
Application granted granted Critical
Publication of CN116074615B publication Critical patent/CN116074615B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Telephone Function (AREA)

Abstract

The embodiment of the application discloses a camera control method, a camera control device and terminal equipment, which are applicable to the technical field of electronics, and the method comprises the following steps: the terminal device comprises: the lens module comprises a motor and a driving chip for driving the motor, and the method comprises the following steps: and monitoring the motion data of the terminal equipment. And identifying the motion state of the terminal equipment according to the motion data. And when the identified motion state is shaking, controlling the driving chip to be electrified. And when the identified motion state is the shaking, controlling the driving chip to be powered down. The embodiment of the application can reduce the power consumption brought to the terminal equipment while avoiding the shaking of the motor.

Description

Camera control method and device and terminal equipment
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a method and an apparatus for controlling a camera, and a terminal device.
Background
The shooting function has become a standard function of many terminal devices, and with popularization and development of the terminal devices, the types of cameras supported by the terminal devices are more and more abundant. For example, some smartphones and tablet computers may support cameras of the wide angle, ultra wide angle, tele, depth, etc. type.
In order to adapt to various shooting scene requirements possibly occurring in practical application, functions and structures of cameras in terminal equipment are also more and more complex. Among them, cameras supporting an Auto Focus (AF) lens and an Optical anti-shake (Optical ImageStabilization, OIS) lens are also a conventional configuration of various manufacturer terminal devices.
The hardware structure of the AF lens module and OIS lens module (hereinafter referred to as the modules) is large in size and mass, and includes a motor and a Driver IC. Under the condition that a driving chip of the module is not electrified, the motor controlled by the driving chip is easy to move freely along with the shaking of the terminal equipment, so that the motor is collided, and abnormal impact sound is caused. And power consumption of the terminal equipment is increased when the driving chip is electrified.
Therefore, a scheme is needed for avoiding the abnormal vibration of the motor and controlling the power consumption of the terminal equipment.
Disclosure of Invention
In view of the above, the embodiments of the present application provide a method, an apparatus, and a terminal device for controlling a camera, which can solve the problem of controlling power consumption of the terminal device while the motor is in abnormal vibration.
A first aspect of an embodiment of the present application provides a camera control method, which is applied to a terminal device, where the terminal device includes: the lens module comprises a motor and a driving chip for driving the motor, and the method comprises the following steps:
and monitoring the motion data of the terminal equipment and identifying the motion state of the terminal equipment according to the motion data.
And when the identified motion state is shaking, controlling the driving chip to be electrified. And when the identified motion state is the end of shaking, controlling the driving chip to be powered down.
The embodiment of the application can timely electrify the terminal equipment when the terminal equipment starts to shake, can realize real-time response to the shake condition, and can furthest avoid abnormal sound caused by wall collision due to shake of the mobile phone. And when the terminal equipment just finishes shaking, the power is timely turned off, so that the condition that the driving chip is powered on for a long time in a non-use way can be effectively avoided, the excessive power consumption of the driving chip is avoided, and the power consumption of the terminal equipment is reduced. On the basis, the power-off when shaking is finished and the power-on when shaking is started are combined, so that the embodiment of the application can realize accurate power-on and power-off control of the driving chip during shaking of the terminal equipment. On the basis of avoiding abnormal motor vibration, the power consumption situation brought by the embodiment of the application is controlled as much as possible, so that balance between abnormal motor vibration and power consumption is realized.
In a first possible implementation manner of the first aspect, identifying a motion state of the terminal device according to the motion data includes:
and when the motion data is higher than a threshold value, carrying out change trend analysis on the data collected in the time length of the latest window in the motion data. When the change trend is from low to high or from weak to strong, the motion state is judged to be the starting of shaking.
And when the motion data is lower than a threshold value, carrying out change trend analysis on the data collected in the time length of the latest window in the motion data. When the change trend is from high to low or from strong to weak, the movement state is judged to be the end of shaking.
The embodiment of the application initially distinguishes whether the shaking starts or the shaking ends possible conditions through the threshold value. On the basis, the movement data is utilized to carry out targeted analysis and identification according to different change trend characteristics when the movement data starts to shake and when the movement data ends to shake, so that the accurate identification of starting to shake and ending to shake can be realized.
In a second possible implementation manner of the first aspect, the lens module further includes: a target lens.
Before identifying the motion state of the terminal device according to the motion data, the method further comprises:
detecting whether the target lens is opened or not, and executing an operation of identifying the motion state of the terminal device according to the motion data when the target lens is not opened.
According to the embodiment of the application, through the follow-up operation of the embodiment of the application when the target lens is detected not to be opened, the targeted monitoring of the target lens module can be realized so as to prevent the abnormal shaking sound.
As one embodiment of the present application, the target lens is an auto-focus lens or an optical anti-shake lens.
As one embodiment of the present application, when the target lens is an auto-focus lens, the motion data is acceleration data.
In the embodiment of the application, the detection of the transverse movement (Shift) condition in the Z-axis direction of the terminal equipment can be realized through the acceleration sensor, so that the analysis of the motion condition related to the target lens is realized.
In a third possible implementation manner of the first aspect, the terminal device further includes: an intelligent sensor hub and an application processor.
And monitoring the motion data of the terminal equipment. Identifying a motion state of the terminal device according to the motion data, including:
the intelligent sensing hub monitors the motion data of the terminal equipment and identifies the motion state of the terminal equipment according to the motion data.
Controlling the powering up of the driving chip, comprising:
the intelligent sensor hub generates trigger information, encapsulates the trigger information into virtual sensor information and sends the virtual sensor information to the application processor.
And the application processor powers on the driving chip when the received virtual sensor message is a trigger message.
In a fourth possible implementation manner of the first aspect, controlling the driving chip to power down includes:
and the intelligent sensor hub in the terminal equipment generates release information, encapsulates the release information into a virtual sensor message and sends the virtual sensor message to the application processor.
And the application processor powers down the driving chip when the received virtual sensor message is a trigger message.
In the embodiment of the application, the real-time monitoring of the motion state is realized by using the characteristic of low power consumption of the sensor hub, so that the long-time work of an application processor is avoided, and the power consumption monitored by the embodiment of the application can be greatly reduced. Meanwhile, the embodiment of the application can realize the data transmission of the Sensorhub and the application processor on the original data link by carrying out virtual sensor message encapsulation without additionally adding or modifying the data link. Therefore, the embodiment of the application can realize motor control of more types of terminal equipment, thereby realizing compatibility of more different types of terminal equipment.
In a fifth possible implementation manner of the first aspect, the terminal device further includes: and driving a camera.
The application processor powers on the driver chip when the received virtual sensor message is a trigger message, including:
and when the received virtual sensor message is a trigger message, the application processor sends a power-on instruction to the camera driver.
When the camera driver receives a power-on instruction, the power-on of the driving chip is controlled.
When the received virtual sensor message is a trigger message, the application processor powers down the driving chip, and the method comprises the following steps:
and when the received virtual sensor message is a release message, the application processor sends a power-down instruction to the camera driver.
When the camera driver receives a power-down instruction, the camera driver controls the driving chip to power down.
The embodiment of the application can realize the power on and off of the driving chip by utilizing the camera drive of the terminal equipment.
In a sixth possible implementation manner of the first aspect, before sending the trigger message encapsulated as the virtual sensor message to the application processor, the smart sensor hub further includes:
and detecting whether the module to which the application processor belongs is in a dormant state. And when the module of the application processor is in the dormant state, the intelligent sensor hub wakes up the module of the application processor, and performs the operation of packaging the trigger message into a virtual sensor message and then sending the virtual sensor message to the application processor.
Before sending the virtual sensor message to the application processor, the sensor hub performs dormancy detection on the application processor module and performs dormancy wakeup, so that the embodiment of the application can ensure normal communication between the sensor hub and the application processor. Therefore, the embodiment of the application can timely and accurately control the power on and off of the driving chip, and effectively avoid the shaking abnormal sound of the motor.
In a seventh possible implementation manner of the first aspect, when the identified motion state is shaking, the driving chip is controlled to power up.
On the basis of power-on when starting to shake and power-off when ending to shake, the method introduces detection in the shake of the terminal equipment, and powers on the driving chip when detecting in the shake of the terminal equipment. The embodiment of the application can effectively ensure the timeliness of the power-on of the driving chip, thereby improving the timeliness of motor motion control and improving the control effect of motor shaking abnormal sound.
In an eighth possible implementation manner of the first aspect, after the controlling the driving chip to power up, the method further includes:
and (3) timing the power-on time, and controlling the driving chip to power down when the timing time reaches the preset time upper limit value.
The embodiment of the application introduces a timeout forced power-down mode, which can avoid the influence of excessively high power consumption on the terminal equipment caused by long-time power-up of the driving chip due to various possible factors, thereby realizing effective control of the power consumption of the terminal equipment.
In a ninth possible implementation manner of the first aspect, before identifying a motion state of the terminal device according to the motion data, the method further includes:
and acquiring the scene of the terminal equipment.
And when the scene in which the terminal equipment is positioned is a first target scene, executing the operation of identifying the motion state of the terminal equipment according to the motion data. The first target scene is a scene in which a user easily perceives motor shaking abnormal sound and the motor shaking abnormal sound is short in duration.
As an alternative embodiment of the present application, when the terminal device is the second target scene, the operation of identifying the motion state of the terminal device from the motion data is not performed, and the flow is ended. The second target scene is a scene in which the user is hard to perceive motor shaking abnormal sound and the motor shaking abnormal sound lasts longer.
The embodiment of the application can balance the influence of the motor shaking abnormal sound on the user perception (namely user experience) and solve the power consumption brought to the terminal equipment during shaking abnormal sound, thereby effectively improving the comprehensive use experience of the user.
In a tenth possible implementation manner of the first aspect, acquiring a scene in which the terminal device is located includes:
and acquiring the sound intensity of the environment in which the terminal equipment is positioned, the vibration intensity of the terminal equipment and the moving speed of the terminal equipment.
And when the sound intensity is lower than the volume threshold value, the vibration intensity is lower than the vibration threshold value and the moving speed is lower than the speed threshold value, judging that the scene where the terminal equipment is positioned is a first target scene.
As an embodiment of the present application, when the sound intensity is higher than the volume threshold and/or the vibration intensity is higher than the vibration threshold and the moving speed is higher than the speed threshold, it is determined that the terminal device is in the second target scene.
The embodiment of the application can provide a simple and effective scene recognition method, and can realize efficient and accurate recognition of scenes.
In an eleventh possible implementation manner of the first aspect, the hardware abstraction layer of the terminal device includes a camera abstraction layer and an activity recognition abstraction layer corresponding to the application processor.
The intelligent sensor hub generates trigger information, encapsulates the trigger information into virtual sensor information and sends the virtual sensor information to the application processor. The application processor powers on the driver chip when the received virtual sensor message is a trigger message, including:
the intelligent sensor hub generates trigger information, packages the trigger information into virtual sensor information and sends the virtual sensor information to the activity recognition abstraction layer.
The dynamic recognition abstraction layer sends the virtual sensor message to the camera abstraction layer.
And the camera abstract layer powers on the driving chip when the received virtual sensor message is a trigger message.
The embodiment of the application can realize multiplexing of the exception handling mechanism between the ARHal and the sensor hub, and reduce the workload of a developer on the exception handling development of the CameraHal and the sensor hub while avoiding or improving the problem of transmission exception in the message transmission process of the virtual sensor.
A second aspect of an embodiment of the present application provides a camera control apparatus for controlling a terminal device, the terminal device including: the lens module comprises a motor and a driving chip for driving the motor, and the camera control device is as follows:
and the monitoring module is used for monitoring the motion data of the terminal equipment.
And the monitoring module is also used for identifying the motion state of the terminal equipment according to the motion data.
And the control module is used for controlling the driving chip to be electrified when the identified motion state starts to shake.
And the control module is also used for controlling the driving chip to be powered down when the identified motion state is shaking.
As an embodiment of the present application, the camera control apparatus may implement the method according to any one of the first aspect described above.
In a third aspect, an embodiment of the present application provides a terminal device, including a camera, a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing a method according to any one of the first aspects when the computer program is executed by the processor.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program which when executed by a processor performs a method as in any of the first aspects above.
In a fifth aspect, an embodiment of the present application provides a chip system, the chip system including a processor, the processor being coupled to a memory, the processor executing a computer program stored in the memory to implement a method as described in any one of the first aspects. Wherein the processor may comprise an application processor and an intelligent sensor hub, and the application processor and the co-processor are each responsible for the corresponding steps of the method according to any one of the first aspects above.
In a sixth aspect, an embodiment of the application provides a computer program product for, when run on a terminal device, causing the terminal device to perform the method of any of the first aspects above.
It will be appreciated that the advantages of the second to sixth aspects may be found in the relevant description of the first aspect, and are not described here again.
Drawings
Fig. 1 is a schematic flow chart of a camera control method according to an embodiment of the present application;
Fig. 2A is a schematic diagram of a control flow of a motor in a lens module when a terminal device provided by an embodiment of the present application normally starts a camera;
fig. 2B is a schematic flow chart of motion state recognition according to an embodiment of the present application;
fig. 3 is a schematic flow chart of a method for controlling a camera based on a scene where a terminal device is located according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a camera control method according to an embodiment of the present application;
fig. 5 is a flowchart of another camera control method according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a camera control device according to an embodiment of the present application;
fig. 7A is a schematic structural diagram of a mobile phone according to an embodiment of the present application;
fig. 7B is a software architecture block diagram of a terminal device according to an embodiment of the present application;
fig. 8 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
Some concepts that may be involved in embodiments of the present application are described below:
module and motor: in the embodiment of the present application, the module is a generic term for the AF lens module and the OIS lens module, and the motor refers to a motor in the module, so the motor may be a motor in the AF lens module or a motor in the OIS lens module.
And (3) a driving chip: unless otherwise stated, the driving chips in the embodiments of the present application are all driving chips for driving a motor, and after the driving chips are powered on, the motion state of the motor can be controlled.
Along with the continuous iterative updating of the AF lens module and the OIS lens module designs, the chip target surface and the lens quality on the module hardware design are gradually increased, so that the expansion on the hardware structure is caused, namely the volume and the mass of the hardware structure are increased. A motor is arranged in the module and a driving chip corresponding to the motor is arranged in the module. When a user opens the camera, the driving chip is electrified so as to control the motion state of the motor. But when the camera is in a closed state, the driving chip is in a power-down state, and the control of the motor is lost. At this time, the motor is easy to freely move along with the shaking of the terminal equipment, so that the motor is caused to collide, and the abnormal sound is caused. For example, when the terminal device is severely rocked, the motor rotor may collide with the mechanical limiting device.
In order to avoid abnormal vibration of the motor in the module, there are several alternative schemes:
scheme 1: and after the terminal equipment is started, the continuous power-on state of the driving chip is kept, so that the motion state of the motor is continuously controlled.
At this time, although the motor can be prevented from shaking abnormal sound, the power consumption of the driving chip is larger, and the power consumption of the terminal equipment is seriously increased due to the fact that the driving chip is continuously kept powered on, so that the cruising of the terminal equipment is greatly influenced.
Scheme 2: the module is improved from the hardware layer, enhancing the stability of the motor within the module.
From the hardware level improvement, the cost of the hardware of the terminal equipment is increased excessively, and the cost pressure of terminal equipment manufacturers is increased excessively.
In order to avoid the abnormal vibration of the motor, the power consumption of the terminal equipment is controlled. In the embodiment of the application, the terminal equipment can monitor whether the terminal equipment starts shaking or not and timely control the driving chip to be electrified when the terminal equipment monitors the starting shaking. Therefore, the embodiment of the application can timely realize the control of the motor motion before the motor has the abnormal vibration sound, thereby avoiding the abnormal vibration sound of the motor. On the basis, when the terminal equipment monitors that the terminal equipment finishes shaking, the terminal equipment timely controls the driving chip to power down. Therefore, the embodiment of the application can effectively avoid the condition that the driving chip is not electrified for a long time, and avoid the overlarge power consumption of the driving chip, thereby reducing the power consumption of the terminal equipment.
The embodiment of the application combines the power-on when the shaking starts and the power-off when the shaking ends, so that the embodiment of the application can realize accurate power-on and power-off control of the driving chip during the shaking of the terminal equipment. On the basis of avoiding abnormal motor vibration, the power consumption problem brought by the embodiment of the application is controlled as much as possible, so that balance between abnormal motor vibration and power consumption is realized.
In addition, the embodiment of the application does not increase the cost of hardware, so that the whole application cost is controllable.
The camera control method provided by the embodiment of the application can be applied to terminal equipment such as mobile phones, tablet computers and wearable equipment, and the terminal equipment is the execution main body of the camera control method provided by the embodiment of the application, and the embodiment of the application does not limit the specific type of the terminal equipment.
The following describes an application scenario of the embodiment of the present application:
the embodiment of the application can be applied to any use scene of the terminal equipment. Meanwhile, the embodiment of the application can be suitable for controlling the motor under the condition that the terminal equipment is randomly swayed, so that the swaying abnormal sound of the motor is avoided. For example, when the user uses a shake-to-shake function to rob a red pack or play a game, the terminal device may power up the driver chip when detecting the shake itself. For another example, when the terminal device is subjected to severe shaking due to impact, falling or the like, the driving chip may be powered on when the terminal device detects the severe shaking.
In order to illustrate the technical scheme of the application, the following description is made by specific examples.
Fig. 1 shows a flowchart of an implementation of a camera control method according to an embodiment of the present application, which is described in detail below:
s100, the application processor detects whether the target lens is started.
In the embodiment of the application, the lens to be processed is collectively called a target lens, and the target lens can be an AF lens and/or an OIS lens, and can be specifically set by a technician according to actual requirements. In practical applications, the terminal device may include multiple lenses, and when the terminal device does not turn on the camera, all the lenses are in an unopened state. When the terminal device turns on the camera, the lens used by the camera in different shooting modes may be different. In the case that the target shot is determined, the terminal device mainly detects whether the target shot is on or not. And when the target lens is detected not to be opened, the follow-up operation of the embodiment of the application is carried out so as to realize targeted monitoring of the target lens module and prevent abnormal shaking sound.
When the target lens is started, the driving chip of the target lens module is electrified, so that the motion state of the motor in the target lens module can be controlled, and the follow-up operation of the embodiment of the application can be omitted. For example, refer to fig. 2A, which is a schematic diagram of a control flow of starting a motor in a lens module when a terminal device normally starts a camera according to an embodiment of the present application. In the embodiment of the application, after the terminal equipment is started, a user turns on the camera, at the moment, the camera turns on the lens A, the driving chip of the lens A is electrified, and the motor in the lens A module is controlled in motion.
S101, when the target lens is not started, the application processor subscribes to the virtual sensor message of the intelligent sensor hub.
In the embodiment of the application, in order to realize timely power-up of the driving chip, the motion state of the terminal equipment needs to be monitored under the condition that the target lens is not started. In the embodiment of the application, the application processor (application process, AP) of the terminal equipment can be adopted to actively monitor the motion state of the terminal equipment. For example, a camera abstraction layer (CameraHal) with an AP in a hardware abstraction layer may be employed to actively monitor the motion state of the terminal device. However, the AP is directly used to actively monitor the power consumption of the terminal device, so in order to reduce the power consumption of the terminal device, the embodiment of the application can use a low-power-consumption intelligent sensor hub (sensor hub) to monitor the motion state of the terminal device. Wherein the intelligent sensor hub may also be referred to as a coprocessor.
Considering that the native data link between Sensor hub and AP in part of the terminal device only supports the native Sensor (Sensor) type, such as sensors of gyroscopes (Gyro), acceleration sensors (ACC), and Gravity sensors (Gravity). Therefore, in order to be compatible with the situation of more different terminal devices, the embodiment of the application adopts a virtual sensor mode, and the detection result of the sensor hub on the motion state is packaged into a virtual sensor message type. Thereby realizing that the sensor hub transmits the corresponding motion state data to the AP. Therefore, in the embodiment of the application, after the terminal equipment is started, the AP can actively subscribe to the virtual sensor message of the Sensorhub. The following description will take the active monitoring of the motion state of the terminal device by using the camelhal of the AP as an example.
S102, the intelligent sensor hub monitors motion data detected by a target sensor, wherein the target sensor comprises a motion sensor.
In the embodiment of the application, the sensor capable of detecting the motion data of the terminal equipment is collectively called a target sensor. The target sensor is a motion sensor, and the number and types of the sensors specifically included in the motion sensor can be set by a technician, and are not limited in any way. For example, when the target lens includes an AF lens, the target sensor may include an acceleration sensor in order to detect a lateral movement (Shift) condition in the terminal device Z-axis direction. When the target lens includes OIS lenses, in order to detect the longitudinal movement in the X-axis and Y-axis directions of the terminal device, the target sensor may include a gyroscope or the like. Correspondingly, the specific data type contained in the motion data needs to be determined according to the type of the target sensor. For example, when the object sensor comprises an acceleration sensor, the motion data comprises acceleration data.
After the subscription to the sensor is completed, the sensor starts to continuously monitor the motion data detected by the target sensor, so that the motion data of the terminal equipment can be effectively acquired in real time.
It should be noted that the step sequences of S100 and S101 may be interchanged or performed synchronously in practical applications. For example, in some embodiments, after the terminal device is powered on, the AP may first actively subscribe to the virtual sensor message of the sensor hub, and perform a monitoring operation of the motion data (i.e. S102 operation). And detecting whether the target lens is started or not in the running process of the terminal equipment, and if the target lens is started, performing motion state identification operation (namely S103 operation).
S103, the intelligent sensor hub identifies the motion state of the terminal equipment according to the motion data output by the target sensor.
In the embodiment of the application, according to the time sequence of the movement of the terminal equipment in the shaking process, the movement state of the terminal equipment in the shaking process can be divided into: three conditions of starting shaking, shaking neutralization and ending shaking are adopted. The ending of the shake may also be referred to as the disconnecting shake, which means that the terminal device is ending the shake or just ending the shake, and the starting of the shake means that the terminal device is starting the shake. Both the start of shaking and the end of shaking are critical states for a short time. Meanwhile, the motion state of the terminal equipment outside the shaking process can be collectively called as non-shaking state.
In order to prevent abnormal vibration of the motor, when the terminal equipment starts to shake and is in shake, the embodiment of the application can be selected to power on the driving chip. However, another problem arises, if only the power-on operation is controlled to be performed on the driving chip, the abnormal sound caused by the shaking of the motor can be avoided, but the situation that the terminal equipment continues to have high power consumption after the power-on operation is also caused. Therefore, in order to control the actual power consumption of the terminal equipment, the embodiment of the application can also timely perform the power-down operation on the driving chip when the terminal equipment is finished shaking. Therefore, the driving chip is kept electrified when the terminal equipment shakes, the driving chip is timely electrified when the terminal equipment does not shake, and the power consumption brought to the terminal equipment is effectively controlled on the basis of avoiding abnormal sound caused by motor shake. Specifically:
after the sensor hub acquires the motion data, the motion data is used for identifying the motion state of the terminal equipment. The embodiment of the application can be used for carrying out targeted identification on starting shaking and ending shaking, and can also select whether to identify shaking or not. The embodiment of the application does not limit the identification method of the specific motion state too much, and can be set by a technician. For example, in some embodiments, a threshold may be set for the motion data to distinguish the extent of the motion state. As an alternative embodiment, taking the case that the target sensor is an acceleration sensor, the acceleration value is taken as an example when the data is motion, a threshold value for acceleration may be set. And when the acceleration value is higher than the threshold value, judging that the vibration starts. And on the basis that the previous motion state recognition result is that the shaking is started or the shaking is in progress, judging that the shaking is ended when the acceleration value is lower than the threshold value. Wherein the specific value of the threshold value may be set by the skilled person.
As an alternative embodiment of the present application for identifying the motion state, a threshold value may be set, as well as a window duration. Reference may be made to fig. 2B, which is a schematic flow chart of motion state recognition according to an embodiment of the present application. Based on the acquired motion data, the embodiment of the application can judge whether the motion data exceeds the threshold value or not to primarily distinguish whether the motion data starts to shake or ends to shake.
When the motion data is higher than the threshold value, the probability that the terminal equipment belongs to starting shaking is larger. And at the moment, utilizing a window mechanism to analyze the variation trend of the sampled motion data in the latest window duration. If the trend of the change gradually increases, for example, from low to high or from weak to strong, it is indicated that the trend of the movement of the terminal device is increasing, and the terminal device is in a state of starting shaking. If the change value is stable or gradually becomes smaller (for example, from high to low or from strong to weak), the movement trend of the terminal device is stable at this time, or the terminal device is weakened but not weakened to a state of ending shaking (the absolute value of the movement data is still larger at this time). I.e. not in a state of starting shaking.
When the motion data is lower than the threshold value, the probability that the terminal equipment is in the end of shaking is larger. And at the moment, utilizing a window mechanism to analyze the variation trend of the sampled motion data in the latest window duration. If the change value is gradually smaller, the terminal equipment movement trend is weakened. Meanwhile, the absolute value of the motion data is smaller at the moment, so that the terminal equipment is in a shaking ending stage, namely the shaking ending stage in the embodiment of the application. If the change trend is gradually increased, the movement trend of the terminal equipment is increased, and the terminal equipment is not in a state of ending shaking.
As another alternative embodiment of identifying the motion state in the embodiment of the present application, it is also possible to choose to train some identification models for the motion state of the terminal device in advance. On the basis, the motion data can be processed by utilizing the identification model, so that the specific motion state of the terminal equipment can be identified.
As an embodiment of the present application, it is considered that in practical application, there may be a certain error in identifying the start shake, which may cause difficulty in timely powering up the driving chip. In order to reduce the influence of this situation as much as possible, in an embodiment of the present application, the sensor hub may also periodically identify whether the terminal device is in shake. That is, in S103, it may include: and when the previous motion state identification result is that the motion state is not the motion state except the shaking, periodically identifying whether the motion state of the terminal equipment is in the shaking state. Alternatively, the previous movement state may be ignored, and in S103, the method may include: and periodically identifying whether the motion state of the terminal equipment is in shaking.
And S104, when the motion state of the terminal equipment is identified to start shaking, the intelligent sensor hub generates trigger information, packages the trigger information into a virtual sensor message and sends the virtual sensor message to the application processor.
When the sensor hub recognizes that the terminal equipment starts shaking, trigger information can be generated and the CameraHal is informed to power on the driving chip. However, considering that the native data link between the sensor hub and the camel in part of the terminal equipment only supports the native sensor type, the sensor hub encapsulates the trigger information into a virtual sensor message after generating the trigger information, and then sends the virtual sensor message to the camel through the native data link.
As an alternative embodiment of the present application, when the motion state of the terminal device is identified as shaking, the intelligent sensor hub generates trigger information, encapsulates the trigger information into a virtual sensor message, and sends the virtual sensor message to the application processor.
In order to improve timeliness and accuracy of motor motion state control, the embodiment of the application can also generate trigger information and package and send virtual sensor messages when the terminal equipment is identified to shake.
And S105, when the motion state of the terminal equipment is recognized to be the end of shaking, the intelligent sensor hub generates release information, encapsulates the release information into a virtual sensor message and sends the virtual sensor message to the application processor.
When the sensor hub recognizes that the terminal equipment is finished shaking, the sensor hub can generate release information and package virtual sensor information, and then send the release information to the CameraHal through a primary data link to inform the CameraHal to power down the driving chip.
As an alternative embodiment of the present application, consider that in practical application, the camelhal side cannot normally receive the virtual sensor message due to some known or unknown factors. For example, during the process of transmitting a message to the camelhal, the sensor hub may lose the message, or the connection between the sensor hub and the camelhal is abnormal. In order to cope with these situations, some escape mechanisms and exception handling mechanisms may be provided in embodiments of the present application. For example, a single virtual sensor message may be repeatedly sent to the camelhal multiple times in case of a problem with information transmission. For another example, upon detecting an abnormal connection with a camelhal, the camel is actively reported to reconnect the camel to the sensor hub.
As another alternative embodiment of the present application, when the sensor hub recognizes that the terminal device is finished shaking, the release information may be generated and encapsulated with a virtual sensor message, and then sent to the active recognition abstraction layer (Activity Recognition Hardware AbstractionLayer, ARHal) through the native data link. The encapsulated virtual sensor message is sent by ARHal to CameraHal. Because of an exception handling mechanism between ARHal and Sensorhub, transmission exception problems in the data transmission process can be avoided or improved to a certain extent. Therefore, the embodiment of the application can realize multiplexing of the exception handling mechanism between the ARHal and the sensor, and reduce the workload of a developer on the exception handling development of the CameraHal and the sensor while avoiding or improving the problem of transmission exception in the message transmission process of the virtual sensor.
Accordingly, in this case, in S104, "encapsulate trigger message into a virtual sensor message and send it to the application processor", it may be modified to "encapsulate trigger message into a virtual sensor message and send it to the ARHal". ARHal sends the received virtual sensor message to CameraHal. In S105, "encapsulate release message as virtual sensor message and send to application processor", it may be modified to "encapsulate release message as virtual sensor message and send to ARHal". ARHal sends the received virtual sensor message to CameraHal.
S106, after receiving the virtual sensor message, the application processor sends a power-on instruction to the camera driver if the virtual sensor message is a trigger message; and if the virtual sensor message is a release message, sending a power-down instruction to the camera driver.
After receiving the virtual sensor message, the CameraHal analyzes the virtual sensor message to determine whether to trigger the message or release the message. And when the trigger message is the trigger message, generating a corresponding power-on instruction and sending the power-on instruction to the camera driver. Otherwise, if the message is released, a corresponding power-down instruction is generated and sent to the camera driver. The virtual sensor message may be directly sent by the sensor hub, or may be forwarded by ARHal, etc., which is not limited herein.
As an alternative embodiment of the application, it is contemplated that in practice the duration of the shaking state is short in most cases of the terminal device. Particularly, when the definition requirement on the shake is higher, for example, when the shake particularly refers to severe shake, the corresponding detection threshold value is also set higher. In real life, the terminal equipment is rarely in a violent shaking state for a long time. Therefore, in order to prevent the recognition omission of the end shake, the problem of high power consumption caused by long-time power-up of the driving chip is caused. The embodiment of the application provides a timeout detection scheme. Namely, after the driving chip is electrified, the sensor hub or the CameraHal starts to count time, and when the count time reaches the preset time upper limit value, the driving chip is electrified. If the sensor hub is responsible for timing, generating release information when the timing duration reaches a preset duration upper limit value, packaging the release information into a virtual sensor message, and sending the virtual sensor message to the CameraHal.
And S107, when the camera driver receives a power-on instruction, the camera driver controls the power-on of the driving chip, and when the camera driver receives a power-off instruction, the camera driver controls the power-off of the driving chip.
In the embodiment of the application, the camera driving is responsible for power-on and power-off control of the driving chip. And powering up the driving chip when the power-up instruction is received, and powering down the driving chip when the power-down instruction is received. The driving chip starts to control the motion state of the motor in the module after being electrified, so that the conditions of wall collision and the like when the motor shakes the terminal equipment are avoided, and shaking abnormal sound occurs.
As an embodiment of the present application, consider that when an application processor module in a terminal device is in a sleep state (for example, a terminal device screen-off state), normal communication between the camel hal and the intelligent sensor hub is not possible. Therefore, in order to avoid that the driver chip cannot be timely controlled to be powered on or powered off in the sleep state of the terminal equipment, on the basis of the above embodiments, the embodiment of the application can also detect whether the application processor module enters the sleep state and wake up when the application processor module is in the sleep state. Thereby realizing the normal communication with the CameraHal and the normal power-on and power-off control of the CameraHal on the driving chip. The application processor module is generally called as related software and hardware including an application processor. Specifically, the embodiment of the application comprises the following steps:
the sensor hub identifies whether the application processor module is in a dormant state.
And if the application processor module is in the dormant state, the sensor hub wakes up the application processor module.
Specifically, the embodiment of the application can be executed at any time before the sensor hub sends the virtual sensor message to the CameraHal. For example, as a step at any position before S104 or S105, for example, any position between steps S100 to S104 may be inserted. For example, the method may further include inserting into S104 or S105, that is, before or after generating the virtual sensor message, the method first identifies and wakes up the sleep state, and then sends the virtual sensor message to the application processor. In the embodiment of the application, the normal communication between the sensor hub and the application processor can be ensured by detecting and waking up the sleep state of the terminal equipment, so that the power-on and power-off of the driving chip can be controlled timely and accurately, and the abnormal shaking noise of the motor can be effectively avoided.
As another alternative embodiment of the present application, on one hand, it is considered that in real life, the requirement of the user for the terminal device to run up is higher, so that the requirement of the terminal device for controlling the power consumption is also higher. And the motor is controlled in motion, so that the power consumption brought to the terminal equipment in a short time is generally controllable. However, if the motor is controlled in motion for a long time, larger power consumption is brought to the terminal equipment, so that the endurance of the terminal equipment is greatly affected. On the other hand, under different scenes, the perception degree of the motor shaking abnormal sound by the user can be greatly different. In a perceptible scene, such as in some quiet environments or in situations where the user's own state is calm, the sloshing abnormal sound of the motor is easily perceived by the user. In a scene difficult to sense, such as some noisy environments or a situation that the state of the user is changed severely, the abnormal shaking sound of the motor is difficult to sense. By combining the two aspects, the influence of the motor shaking abnormal sound on the user perception (namely user experience) is balanced, and the power consumption brought to the terminal equipment during shaking abnormal sound is solved. In the embodiment of the application, whether the various embodiments for controlling the motor motion state need to be started or not can be determined according to the actual scene situation of the user. The details are as follows:
For some scenes (hereinafter referred to as first target scenes) where the user is easy to perceive and the motor is shaking for a short duration. For example, when the user uses a shake-shake function to rob a red envelope or play a game, the motor shake time is short, and thus the duration is short even if a shake abnormal sound occurs. Meanwhile, the user holds the terminal equipment at the moment, so that the user can easily perceive abnormal motor shaking sound. Under the first target scene, the power consumption brought by controlling the motion state of the motor is controllable, and poor experience of shaking abnormal sound can be well avoided for a user, so that the user experience is improved. Thus, in the embodiment of the present application, control of the motor motion state may be selected for the target scene.
For some scenes (hereinafter referred to as second target scenes) where it is difficult for the user to perceive and the motor is swaying for a long duration. For example, when the vehicle on which the user sits shakes more severely, the terminal device on the user may shake for a long time, and meanwhile, the user's perception of abnormal shaking sound of the terminal device may be reduced due to the influence of the vehicle shaking on the user. For another example, when the user runs in the long distance, the terminal device on the user may shake for a long time, and the sensing of the user on the shake abnormal sound of the terminal device is reduced due to the severe long distance running. On the one hand, if the motor motion state is controlled by continuously powering on the driving chip in the second target scene, the power consumption of the terminal device is overlarge, and the problem of serious endurance occurs. On the other hand, in the second target scene, because the motor shakes abnormal sound perception to decline by the user, even if the motor appears and shakes abnormal sound, still can not cause too big influence for user experience. In summary, in the embodiment of the present application, control of the motor motion state may be selected not to be performed for the second target scene.
The determination of the scene according to the embodiments of the present application can be applied in combination with the above-described embodiments. In the embodiment, the determining step of the scene may occur at any position before the step of sending the trigger information encapsulated as the virtual sensor message. For example, when combined with the embodiment shown in fig. 1, the scene determination in which the terminal device is located may be performed at any step before the sensor hub of S104 sends the virtual sensor message to the application processor. And when the terminal equipment is determined to be in the first target scene, normally executing S104 and subsequent steps. And when the terminal equipment is determined to be in the second target scene, the sensor hub does not send the virtual sensor message to the application processor, so that the subsequent application processor does not power on the driving chip.
As an alternative embodiment of the present application, scene recognition may be performed before the terminal device motion state is recognized in S103. And determining whether the motion state identification is needed or not and performing subsequent motor motion state control operation according to the scene identification result. Fig. 3 may be referred to, which is a schematic flow chart of a method for controlling a camera based on a scene where a terminal device is located in an embodiment of the present application. In the embodiment of the present application, S103 further includes:
S201, the intelligent sensor hub acquires the scene where the terminal equipment is located.
Before the motion state of the terminal equipment is identified, the sensor hub firstly acquires the current scene of the terminal equipment so as to distinguish whether the scene is a first target scene or a second target scene. The identification of the scene where the terminal equipment is located can be realized by the sensor hub, or the sensor hub can identify and provide a query interface for the sensor hub by other software and hardware modules, and the sensor hub can determine the scene by means of query and the like. Meanwhile, the embodiment of the application does not limit the specific method of scene recognition too much, and can be set by technicians. For example, in some alternative embodiments, some scene recognition algorithms, or scene recognition models, may be preset to implement scene recognition. Whether the terminal device is in the first target scene or the second target scene is identified, for example, by analyzing data such as sound, light, heat (i.e., temperature), force (i.e., force-related variables such as speed, acceleration, etc.), etc., which are detectable by the terminal device.
As an alternative scene recognition embodiment of the present application, consider that in practical applications the user's perception of changes in the surroundings is more sensitive when the environment is generally quiet. Therefore, by detecting the sound intensity of the environment where the terminal device is located, the vibration intensity of the terminal device, and the like, it can be recognized whether the environment is quiet and thus whether the user will easily perceive the abnormal shaking sound of the motor. Meanwhile, in scenes with a long duration such as running or riding, the moving speed of the terminal device is relatively high, so the moving speed can be used to recognize whether the abnormal motor shaking sound will last for a long time. Therefore, in the embodiment of the present application, the sensor hub can acquire the sound intensity of the environment detected by the terminal device, the vibration intensity of the terminal device, and the moving speed of the terminal device, and then recognize whether the terminal device is in the first target scene or the second target scene according to these data. For example, when the sound intensity is higher than a volume threshold and/or the vibration intensity is higher than a vibration threshold, and the moving speed is higher than a speed threshold, the terminal device is determined to be in the second target scene. When the sound intensity is lower than the volume threshold, the vibration intensity is lower than the vibration threshold, and the moving speed is lower than the speed threshold, the terminal device is determined to be in the first target scene; a sketch of this comparison follows. The embodiment of the present application thus provides a simple and effective scene recognition method that can recognize scenes efficiently and accurately.
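By way of illustration only, the following Python sketch shows one possible form of the threshold comparison described above; the threshold values, the variable names, and the handling of boundary cases are assumptions of the sketch and are not part of the embodiment.

```python
# Illustrative sketch of the scene classification described above.
# All threshold values and names are assumptions for illustration only.
VOLUME_THRESHOLD = 60.0      # ambient sound intensity, e.g. in dB (assumed)
VIBRATION_THRESHOLD = 2.0    # vibration intensity, arbitrary units (assumed)
SPEED_THRESHOLD = 2.5        # moving speed, e.g. in m/s (assumed)

def classify_scene(sound_intensity, vibration_intensity, moving_speed):
    """Return 'second_target' when the user is unlikely to notice motor
    rattle and the shaking is expected to last long; otherwise 'first_target'."""
    noisy = (sound_intensity > VOLUME_THRESHOLD
             or vibration_intensity > VIBRATION_THRESHOLD)
    if noisy and moving_speed > SPEED_THRESHOLD:
        return "second_target"
    quiet_and_slow = (sound_intensity < VOLUME_THRESHOLD
                      and vibration_intensity < VIBRATION_THRESHOLD
                      and moving_speed < SPEED_THRESHOLD)
    if quiet_and_slow:
        return "first_target"
    # Boundary cases are not specified above; defaulting to the first target
    # scene keeps motor control enabled (an assumption of this sketch).
    return "first_target"

if __name__ == "__main__":
    print(classify_scene(30.0, 0.5, 1.0))   # quiet walk -> first_target
    print(classify_scene(80.0, 3.0, 4.0))   # bumpy bus ride -> second_target
```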
S202, when the scene where the terminal device is located is determined to be the first target scene, the operation of S103 is executed, i.e., the intelligent sensor hub continues to recognize the motion state of the terminal device.
When the terminal device is in the first target scene, in order to balance user experience against the power consumption of the terminal device, S103 and the subsequent operations can continue to be executed so that the motor motion state is effectively controlled, improving user experience as much as possible while affecting the power consumption of the terminal device as little as possible.
S203, when the scene where the terminal device is located is determined to be the second target scene, the motor motion state control flow ends at this point, i.e., operations such as S103 to S107 are no longer needed.
When the terminal device is in the second target scene, continuing to control the motor motion state would bring a serious power consumption problem to the terminal device and thus a battery life problem. Therefore, the embodiment of the present application ends the motor motion state control flow at this point, avoiding the power consumption problem of the terminal device without affecting user experience.
As an alternative embodiment of the present application, steps S201 to S203 of the embodiment shown in fig. 3 may be performed after S100 and before S103. For example, the steps may be executed in the order of S101, S102, S100 and then the embodiment shown in fig. 3, or in the order of S100, S101, S102 and then the embodiment shown in fig. 3.
As a specific embodiment of the present application, referring to fig. 4, fig. 4 is a flow chart of a camera control method that performs motor motion control while the terminal device is in a normal wake-up state (e.g., a screen-on state), taking as an example that the terminal device is a mobile phone, the target lens is an AF lens that is not turned on, the target sensor is an acceleration sensor, and the camera abstraction layer actively monitors the motion state of the terminal device and controls the power on/off. The details are as follows:
After the mobile phone is started, the camera abstraction layer subscribes to the virtual sensor messages on the intelligent sensor hub side.
The intelligent sensor hub monitors acceleration data of the acceleration sensor.
The intelligent sensing hub determines whether the acceleration data exceeds a threshold value.
When the acceleration data exceeds the threshold, the intelligent sensor hub further determines, according to the acceleration data, whether the terminal device has just entered the over-threshold state (i.e., the motion state of starting to shake).
If the terminal device has just entered the over-threshold state, the intelligent sensor hub generates a trigger message, encapsulates it into a virtual sensor message, and sends it to the activity recognition abstraction layer, which forwards the virtual sensor message to the camera abstraction layer. If the terminal device has not just entered the over-threshold state, the control flow ends.
When the acceleration data does not exceed the threshold, the intelligent sensor hub further determines, according to the acceleration data, whether the terminal device has just left the over-threshold state (i.e., the shaking motion state has just ended).
If the terminal device has just left the over-threshold state, the intelligent sensor hub generates a release message, encapsulates it into a virtual sensor message, and sends it to the activity recognition abstraction layer, which forwards the virtual sensor message to the camera abstraction layer. If the terminal device has not just left the over-threshold state, the control flow ends.
After receiving the virtual sensor message, the camera abstraction layer parses it; when the parsed message is a trigger message, it sends a power-on instruction to the camera driver, and when the parsed message is a release message, it sends a power-down instruction to the camera driver.
When the camera driver receives the power-on instruction, it powers on the AF driving chip, and the driving chip then controls the motion state of the motor in the AF lens module.
When the camera driver receives the power-down instruction, it powers down the AF driving chip.
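The awake-state flow of fig. 4 may be pictured with the following minimal Python sketch. The class and method names, the acceleration threshold, and the message format are assumptions made for illustration; forwarding through the activity recognition abstraction layer is collapsed into a direct call for brevity, so this is not the actual interface of any HAL.

```python
# Minimal sketch of the awake-state flow of Fig. 4; all names and values are
# assumptions for illustration, not real platform interfaces.
ACCEL_THRESHOLD = 12.0  # m/s^2, assumed value used to detect shaking

class SensorHub:
    """Monitors acceleration and reports edges of the over-threshold state."""
    def __init__(self, downstream):
        self.downstream = downstream       # stands in for ARHal -> CameraHal
        self.over_threshold = False        # current over-threshold state

    def on_acceleration(self, accel):
        if accel > ACCEL_THRESHOLD and not self.over_threshold:
            self.over_threshold = True     # just entered the over-threshold state
            self.downstream.deliver({"type": "virtual_sensor", "payload": "trigger"})
        elif accel <= ACCEL_THRESHOLD and self.over_threshold:
            self.over_threshold = False    # just left the over-threshold state
            self.downstream.deliver({"type": "virtual_sensor", "payload": "release"})
        # otherwise: no edge was crossed, nothing is sent and the flow ends

class CameraHal:
    """Parses virtual sensor messages and issues power instructions."""
    def __init__(self, camera_driver):
        self.camera_driver = camera_driver

    def deliver(self, message):
        if message["payload"] == "trigger":
            self.camera_driver.power_on_af_driver_chip()
        elif message["payload"] == "release":
            self.camera_driver.power_down_af_driver_chip()

class CameraDriver:
    def power_on_af_driver_chip(self):
        print("AF driving chip powered on; motor motion is now controlled")

    def power_down_af_driver_chip(self):
        print("AF driving chip powered down")

if __name__ == "__main__":
    hub = SensorHub(CameraHal(CameraDriver()))
    for sample in (1.0, 15.0, 14.0, 3.0):   # shaking starts, continues, ends
        hub.on_acceleration(sample)
```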
Specific implementation details, implementation principles, beneficial effects and the like of the embodiment of the present application may refer to the related descriptions of the embodiments shown in fig. 1 to 3, which are not described herein.
As a specific embodiment of the present application, referring to fig. 5, fig. 5 is a flow chart of a camera control method that performs motor motion control while the terminal device is in a dormant state (e.g., a screen-off state), taking as an example that the terminal device is a mobile phone, the target lens is an AF lens that is not turned on, and the target sensor is an acceleration sensor. The details are as follows:
After the mobile phone is started, the application processor subscribes to the virtual sensor messages on the intelligent sensor hub side.
The intelligent sensor hub monitors acceleration data of the acceleration sensor.
The intelligent sensing hub determines whether the acceleration data exceeds a threshold value.
When the acceleration data exceeds the threshold, the intelligent sensor hub further determines, according to the acceleration data, whether the terminal device has just entered the over-threshold state (i.e., the motion state of starting to shake).
If the terminal device has just entered the over-threshold state, the intelligent sensor hub generates a trigger message. If the terminal device has not just entered the over-threshold state, the control flow ends.
The intelligent sensor hub then detects whether the application processor module is in a dormant state and wakes it up if so. It then encapsulates the trigger message into a virtual sensor message and sends it to the activity recognition abstraction layer, which forwards the virtual sensor message to the camera abstraction layer.
When the acceleration data does not exceed the threshold, the intelligent sensor hub further determines, according to the acceleration data, whether the terminal device has just left the over-threshold state (i.e., the shaking motion state has just ended).
If the terminal device has just left the over-threshold state, the intelligent sensor hub generates a release message. If the terminal device has not just left the over-threshold state, the control flow ends.
The intelligent sensor hub then detects whether the application processor module is in a dormant state and wakes it up if so. It then encapsulates the release message into a virtual sensor message and sends it to the activity recognition abstraction layer, which forwards the virtual sensor message to the camera abstraction layer.
After receiving the virtual sensor message, the camera abstraction layer parses it; when the parsed message is a trigger message, it sends a power-on instruction to the camera driver, and when the parsed message is a release message, it sends a power-down instruction to the camera driver.
When the camera driver receives the power-on instruction, it powers on the AF driving chip, and the driving chip then controls the motion state of the motor in the AF lens module.
When the camera driver receives the power-down instruction, it powers down the AF driving chip.
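Compared with the awake-state flow, the additional step in fig. 5 is the dormancy check and wake-up of the application processor module before the virtual sensor message is sent. A minimal sketch of this step follows; all names and the boolean dormancy flag are assumptions of the sketch.

```python
# Sketch of the extra step in the sleep-state flow of Fig. 5: the sensor hub
# checks whether the application processor module is dormant and wakes it
# before sending the virtual sensor message. Names are illustrative assumptions.
class ApplicationProcessorModule:
    def __init__(self):
        self.dormant = True            # e.g. the terminal device is screen-off

    def wake(self):
        self.dormant = False
        print("application processor module woken up")

    def receive_virtual_sensor_message(self, message):
        assert not self.dormant, "message would be lost while dormant"
        print("received:", message)

def send_with_wakeup(ap_module, message):
    # Dormancy detection and wake-up precede every virtual sensor message.
    if ap_module.dormant:
        ap_module.wake()
    ap_module.receive_virtual_sensor_message(message)

if __name__ == "__main__":
    ap = ApplicationProcessorModule()
    send_with_wakeup(ap, {"type": "virtual_sensor", "payload": "trigger"})
```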
Specific implementation details, implementation principles, beneficial effects and the like of the embodiment of the present application may refer to the related descriptions of the embodiments shown in fig. 1 to 3, which are not described herein.
Compared with the scheme of directly keeping the power-on state of the driving chip for a long time to avoid abnormal motor shaking, the embodiment of the application has at least the following beneficial effects:
1. In the embodiment of the present application, real-time monitoring of the motion state is implemented by exploiting the low power consumption of the sensor hub, so the application processor does not need to work for long periods, and the power consumption of the monitoring itself can be greatly reduced.
2. The driving chip is powered on promptly when the terminal device starts shaking, so the shaking condition can be responded to in real time, and abnormal sound caused by the motor colliding with the inner wall as the mobile phone shakes is avoided to the greatest extent.
3. The driving chip is powered down promptly when the shaking has just ended, which effectively prevents the driving chip from staying powered on when it is not needed, avoiding excessive power consumption by the driving chip and reducing the power consumption of the terminal device.
Meanwhile, by combining powering down when shaking ends with powering on when shaking starts, the embodiment of the present application can achieve precise power on/off control of the driving chip while the terminal device shakes. On the basis of avoiding abnormal motor sound, the added power consumption is kept as low as possible, achieving a balance between suppressing abnormal motor sound and power consumption.
4. On the basis of powering on when shaking starts and powering down when shaking ends, periodic shaking detection of the terminal device is introduced, and the driving chip is powered on whenever shaking of the terminal device is detected. The embodiment of the present application can thereby effectively ensure that the driving chip is powered on in time, improving the timeliness of motor motion control and the suppression of abnormal motor shaking sound.
5. By encapsulating virtual sensor messages, the embodiment of the present application realizes data transmission between the sensor hub and the application processor over the original data link, without adding or modifying data links. Therefore, the embodiment of the present application can realize motor control on more types of terminal devices, achieving compatibility with more different types of terminal devices.
6. The embodiment of the present application introduces a timeout forced power-down mechanism, which prevents the driving chip from remaining powered on for a long time due to various possible factors and thereby causing excessive power consumption of the terminal device, achieving effective control of the power consumption of the terminal device (a small sketch of such a watchdog follows this list).
7. The embodiment of the present application can recognize the scene where the terminal device is located before recognizing its motion state, and set different execution logic for different scenes: motor motion state control is performed in scenes where the user easily perceives abnormal motor shaking sound and the shaking duration is short, and is not performed in scenes where the user hardly perceives the abnormal sound and the shaking duration is long. Therefore, the embodiment of the present application can balance the impact of abnormal motor shaking sound on user perception (i.e., user experience) against the power consumption brought to the terminal device during the abnormal sound, effectively improving the overall user experience.
8. Before sending the virtual sensor message to the application processor, the sensor hub performs dormancy detection on the application processor module and wakes it up when necessary, so the embodiment of the present application can ensure normal communication between the sensor hub and the application processor. Therefore, the embodiment of the present application can control the power on/off of the driving chip in a timely and accurate manner, effectively avoiding abnormal motor shaking sound.
9. When the sensor hub recognizes that the terminal device has finished shaking, it can generate a release message, encapsulate it into a virtual sensor message, and send it to ARHal over the original data link; ARHal then forwards the encapsulated virtual sensor message to CameraHal. Because of the exception handling mechanism between ARHal and the sensor hub, transmission exceptions during data transmission can be avoided or mitigated to a certain extent. Therefore, the embodiment of the present application can reuse the exception handling mechanism between ARHal and the sensor hub, reducing developers' workload on exception handling between CameraHal and the sensor hub while avoiding or mitigating transmission exceptions during virtual sensor message transmission.
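For item 6 above, the timeout forced power-down can be pictured as a watchdog that is armed whenever the driving chip is powered on. A minimal sketch follows, in which the timeout length, the class name, and the use of a timer thread are assumptions of the sketch rather than part of the embodiment.

```python
# Illustrative sketch of the timeout forced power-down mentioned in item 6.
# The timeout length and the use of threading.Timer are assumptions.
import threading

POWER_ON_TIMEOUT_S = 30.0   # assumed upper bound on continuous power-on time

class DriverChipPowerGuard:
    def __init__(self, power_down_fn):
        self.power_down_fn = power_down_fn
        self.timer = None

    def on_power_up(self):
        # (Re)start the watchdog whenever the driving chip is powered on.
        self.cancel()
        self.timer = threading.Timer(POWER_ON_TIMEOUT_S, self.power_down_fn)
        self.timer.daemon = True
        self.timer.start()

    def on_power_down(self):
        # A normal power-down (e.g. shaking ended) cancels the forced power-down.
        self.cancel()

    def cancel(self):
        if self.timer is not None:
            self.timer.cancel()
            self.timer = None

if __name__ == "__main__":
    guard = DriverChipPowerGuard(lambda: print("timeout: forcing power-down"))
    guard.on_power_up()      # would force power-down after 30 s if never cancelled
    guard.on_power_down()    # shaking ended in time, so the watchdog is cancelled
```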
Corresponding to the camera control method described in the above embodiments, fig. 6 shows a schematic structural diagram of the camera control device provided in the embodiment of the present application, and for convenience of explanation, only the portions related to the embodiment of the present application are shown.
Referring to fig. 6, the camera control apparatus includes:
a monitoring module 61, configured to monitor the motion data of the terminal device;
the monitoring module 61 is further configured to identify the motion state of the terminal device according to the motion data;
a control module 62, configured to control the driving chip to power up when the identified motion state is shaking;
the control module 62 is further configured to control the driving chip to power down when the identified motion state is the end of shaking.
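A minimal sketch of this module split is given below; the method names and the placeholder data are assumptions made only to illustrate how the monitoring module 61 and the control module 62 cooperate.

```python
# Minimal sketch of the module split in Fig. 6; the internal logic and the
# placeholder data are assumptions for illustration only.
class MonitoringModule:
    def monitor_motion_data(self):
        return [0.2, 15.3, 0.1]          # placeholder motion data (assumed)

    def identify_motion_state(self, motion_data):
        return "shaking" if max(motion_data) > 12.0 else "still"

class ControlModule:
    def control_driver_chip(self, motion_state):
        if motion_state == "shaking":
            print("power up driving chip")
        elif motion_state == "shaking_ended":
            print("power down driving chip")

if __name__ == "__main__":
    monitor, control = MonitoringModule(), ControlModule()
    state = monitor.identify_motion_state(monitor.monitor_motion_data())
    control.control_driver_chip(state)
```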
The process of implementing respective functions by each module in the camera control device provided in the embodiment of the present application may refer to the foregoing description of the embodiments shown in fig. 1 to 5 and other related method embodiments, which are not repeated herein.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as meaning "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, the terms "first," "second," "third," and the like in the description of the present specification and in the appended claims, are used for distinguishing between descriptions and not necessarily for indicating or implying a relative importance. It will also be understood that, although the terms "first," "second," etc. may be used herein in some embodiments of the application to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another element. For example, a first table may be named a second table, and similarly, a second table may be named a first table without departing from the scope of the various described embodiments. The first table and the second table are both tables, but they are not the same table.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The camera control method provided by the embodiment of the application can be applied to terminal equipment with cameras such as mobile phones, tablet computers, wearable equipment, vehicle-mounted equipment, augmented reality (AR)/virtual reality (VR) equipment, notebook computers, ultra-mobile personal computers (ultra-mobile personal computer, UMPC), netbooks, personal digital assistants (personal digital assistant, PDA) and the like, and the embodiment of the application does not limit the specific types of the terminal equipment.
For example, the terminal device may be a cellular telephone, a cordless telephone, a session initiation protocol (Session Initiation Protocol, SIP) telephone, a wireless local loop (WLL) station, a personal digital assistant (Personal Digital Assistant, PDA) device, a handheld device with wireless communication capabilities, a computing device or other processing device connected to a wireless modem, a vehicle-mounted device, an Internet-of-Vehicles terminal, a computer, a laptop computer, a handheld communication device, a handheld computing device, a satellite radio, a television set top box (STB), customer premise equipment (customer premise equipment, CPE), and/or other devices for communicating over a wireless system, as well as a terminal device in a next generation communication system, such as a terminal device in a 5G network or a terminal device in a future evolved public land mobile network (Public Land Mobile Network, PLMN), etc.
By way of example, but not limitation, when the terminal device is a wearable device, the wearable device may also be a general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not only a hardware device, but can also provide powerful functions through software support, data interaction, and cloud interaction. In a broad sense, wearable intelligent devices include full-featured, large-sized devices that can realize complete or partial functions without relying on a smart phone, such as smart watches or smart glasses, as well as devices that focus only on a certain type of application function and need to be used together with other devices such as a smart phone, for example various smart bracelets and smart jewelry for monitoring physical signs.
In the following, taking an example that the terminal device is a mobile phone, fig. 7A shows a schematic structural diagram of the mobile phone 100.
The handset 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a SIM card interface 195, etc. The sensor module 180 may include a gyroscope sensor 180A, an acceleration sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an ambient light sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, and a touch sensor 180K (of course, the mobile phone 100 may also include other sensors such as a temperature sensor, a pressure sensor, an air pressure sensor, a bone conduction sensor, etc., which are not shown).
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a Neural network processor (Neural-network Processing Unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. The controller may be a neural center or a command center of the mobile phone 100. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
The processor 110 may operate the camera control method provided by the embodiment of the present application, so as to avoid abnormal vibration of the motor in the lens module, and improve user experience. The processor 110 may include different devices, for example, when the CPU and the GPU are integrated, the CPU and the GPU may cooperate to execute the camera control method provided in the embodiment of the present application, for example, a part of algorithms in the camera control method are executed by the CPU, and another part of algorithms are executed by the GPU, so as to obtain a faster processing efficiency.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the mobile phone 100 may include 1 or N display screens 194, N being a positive integer greater than 1. The display 194 may be used to display information entered by or provided to the user as well as various graphical user interfaces (graphical user interface, GUI). For example, the display 194 may display photographs, videos, web pages, or files. For another example, the display 194 may display a graphical user interface that includes a status bar, a hideable navigation bar, time and weather widgets, and icons of applications such as a browser icon. The status bar includes the name of the operator (e.g., China Mobile), the mobile network (e.g., 4G), the time, and the remaining battery power. The navigation bar includes a back key icon, a home screen key icon, and a forward key icon. Further, it is to be appreciated that in some embodiments, the status bar may also include a Bluetooth icon, a Wi-Fi icon, an external device icon, and the like. It will also be appreciated that in other embodiments, the graphical user interface may include a Dock, and the Dock may include icons of commonly used applications, and the like. When the processor detects a touch event of a user's finger (or a stylus or the like) on an application icon, it opens, in response to the touch event, the user interface of the application corresponding to the icon, and the user interface of the application is displayed on the display screen 194.
In the embodiment of the present application, the display 194 may be an integral flexible display, or a tiled display composed of two rigid screens and a flexible screen located between the two rigid screens may be used. After the processor 110 runs the camera control method provided by the embodiment of the application, the processor 110 can control the external audio output device to switch the output audio signals.
The camera 193 (front camera or rear camera, or one camera may be used as both front camera and rear camera) is used to capture still images or video. In general, the camera 193 may include a photosensitive element such as a lens group including a plurality of lenses (convex lenses or concave lenses) for collecting optical signals reflected by an object to be photographed and transmitting the collected optical signals to an image sensor. The image sensor generates an original image of the object to be photographed according to the optical signal.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the cellular phone 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store, among other things, code for an operating system, an application program (e.g., a camera application, a WeChat application, etc.), and so on. The storage data area may store data created during use of the handset 100 (e.g., images, video, etc. acquired by the camera application), etc.
The internal memory 121 may also store one or more computer programs corresponding to the camera control method provided in the embodiment of the present application. The one or more computer programs are stored in the memory 121 and configured to be executed by the one or more processors 110, and include instructions that may be used to perform the steps in the embodiments corresponding to fig. 1 to 5; the one or more computer programs may include an account verification module, a priority comparison module, and a state synchronization module. The account verification module is used for authenticating the system authentication accounts of other terminal devices in the local area network; the priority comparison module can be used for comparing the priority of an audio output request service with the priority of the current output service of the audio output device; and the state synchronization module can be used for synchronizing the device state of the audio output device currently accessed by the terminal device to other terminal devices, or synchronizing the device state of the audio output device currently accessed by another device to the local device. When the code of the camera control method stored in the internal memory 121 is executed by the processor 110, the processor 110 may control the terminal device to perform the motion data processing.
In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
Of course, the codes of the camera control method provided by the embodiment of the application can also be stored in an external memory. In this case, the processor 110 may run codes of a camera control method stored in the external memory through the external memory interface 120, and the processor 110 may control the terminal device to perform motion data processing.
The function of the sensor module 180 is described below.
The gyro sensor 180A may be used to determine the motion gesture of the handset 100. In some embodiments, the angular velocity of the handset 100 about three axes (i.e., x, y, and z axes) may be determined by the gyro sensor 180A. I.e., the gyro sensor 180A may be used to detect the current motion state of the handset 100, such as shaking or being stationary.
When the display screen in the embodiment of the present application is a foldable screen, the gyro sensor 180A may be used to detect a folding or unfolding operation acting on the display screen 194. The gyro sensor 180A may report the detected folding operation or unfolding operation to the processor 110 as an event to determine the folding state or unfolding state of the display screen 194.
The acceleration sensor 180B can detect the magnitude of the acceleration of the mobile phone 100 in various directions (typically along three axes), i.e., the acceleration sensor 180B may be used to detect the current motion state of the mobile phone 100, such as shaking or stationary. When the display screen in the embodiment of the present application is a foldable screen, the acceleration sensor 180B may be used to detect a folding or unfolding operation acting on the display screen 194. The acceleration sensor 180B may report the detected folding or unfolding operation as an event to the processor 110 to determine the folded or unfolded state of the display screen 194.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The mobile phone emits infrared light outwards through the light emitting diode. The cell phone uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object in the vicinity of the handset. When insufficient reflected light is detected, the handset may determine that there is no object in the vicinity of the handset. When the display screen in the embodiment of the present application is a foldable screen, the proximity light sensor 180G may be disposed on a first screen of the foldable display screen 194, and the proximity light sensor 180G may detect a folding angle or an unfolding angle of the first screen and the second screen according to an optical path difference of the infrared signal.
The gyro sensor 180A (or the acceleration sensor 180B) may transmit the detected motion state information (such as angular velocity) to the processor 110. The processor 110 determines, based on the motion state information, whether the mobile phone is currently in a handheld state or a tripod-mounted state (for example, when the angular velocity is not 0, it indicates that the mobile phone 100 is in a handheld state).
The fingerprint sensor 180H is used to collect a fingerprint. The mobile phone 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the mobile phone 100 at a location different from that of the display 194.
Illustratively, the display 194 of the handset 100 displays a main interface that includes icons of a plurality of applications (e.g., camera applications, weChat applications, etc.). The user clicks on an icon of the camera application in the main interface by touching the sensor 180K, triggering the processor 110 to launch the camera application, opening the camera 193. The display 194 displays an interface for the camera application, such as a viewfinder interface.
The wireless communication function of the mobile phone 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied to the handset 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110. In the embodiment of the present application, the mobile communication module 150 may also be used for information interaction with other terminal devices.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, Wi-Fi) network), Bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared (IR) technology, etc. applied to the handset 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2. In an embodiment of the present application, the wireless communication module 160 may be configured to access an access point device, and send and receive messages to other terminal devices.
In addition, the mobile phone 100 may implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor, etc. Such as music playing, recording, etc. The handset 100 may receive key 190 inputs, generating key signal inputs related to user settings and function control of the handset 100. The cell phone 100 may generate a vibration alert (such as an incoming call vibration alert) using the motor 191. The indicator 192 in the mobile phone 100 may be an indicator light, which may be used to indicate a state of charge, a change in power, an indication message, a missed call, a notification, etc. The SIM card interface 195 in the handset 100 is used to connect to a SIM card. The SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195 to enable contact and separation with the handset 100.
It should be understood that in practical applications, the mobile phone 100 may include more or fewer components than shown in fig. 7A, and embodiments of the present application are not limited. The illustrated handset 100 is only one example, and the handset 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The software system of the terminal device can adopt a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture or a cloud architecture. In the embodiment of the application, an Android system with a layered architecture is taken as an example, and the software structure of terminal equipment is illustrated. Fig. 7B is a software configuration block diagram of a terminal device according to an embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime (Android Runtime) and system libraries, the hardware abstraction layer (Hardware Abstraction Layer, HAL), and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 7B, the application package may include applications such as phone, camera, gallery, calendar, talk, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 7B, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is arranged to provide communication functions for the terminal device. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar and can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the form of charts or scroll-bar text in the system top status bar, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the terminal device vibrates, or an indicator light blinks.
The Android runtime includes a core library and virtual machines. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: a surface manager (surface manager), media libraries (Media Libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files, etc. The media libraries may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The hardware abstraction layer is a hardware interface layer abstracted on a concrete hardware platform, and is responsible for realizing the functions and control of the concrete hardware platform and providing a unified API interface for other software modules. The hardware abstraction layer can abstract the commonality of hardware operation and control, and provides a unified control interface for upper-layer software so as to realize isolation of other software modules from the bottom-layer hardware. A number of modules may be contained within the hardware abstraction layer, such as CameraHal, ARHal and AudioHal, etc.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
Meanwhile, in the embodiment of the application, the hardware layer at least comprises an application processor, an intelligent sensor hub, a motion sensor (such as an acceleration sensor) and a lens module (such as an AF lens module and an OIS lens module), wherein the lens module comprises a motor and a driving chip for controlling the motion state of the motor.
The following illustrates the workflow of the software and hardware of the mobile phone 100 in conjunction with the scenario in which the mobile phone 100 performs camera control.
When the terminal device shakes, the acceleration sensor sends the detected acceleration data to the sensor hub. When the sensor hub recognizes that power-on or power-down is needed, it transmits a trigger message or a release message to the CameraHal of the application processor in the hardware abstraction layer. CameraHal then generates a corresponding power-on or power-down instruction and sends it to the camera driver, and the camera driver powers the driving chip on or off accordingly. Finally, when the driving chip is powered on, it controls the motor motion state, preventing the motor from shaking and producing abnormal sound; when the driving chip is powered down, it releases the motion control of the motor, so the power consumption brought to the terminal device is contained.
Fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 8, the terminal device 8 of this embodiment includes: at least one processor 80 (only one is shown in fig. 8), a memory 81, said memory 81 having stored therein a computer program 82 executable on said processor 80. The processor 80, when executing the computer program 82, implements the steps of the various camera control method embodiments described above, such as steps 101 through 107 shown in fig. 1. Alternatively, the processor 80, when executing the computer program 82, performs the functions of the modules/units of the apparatus embodiments described above, such as the functions of the modules 61 to 62 shown in fig. 6.
The terminal device 8 may be a computing device such as a desktop computer, a notebook computer, a palm computer, a cloud server, etc. The terminal device may include, but is not limited to, a processor 80, a memory 81. It will be appreciated by those skilled in the art that fig. 8 is merely an example of a terminal device 8 and does not constitute a limitation of the terminal device 8, and may include more or less components than illustrated, or may combine certain components, or different components, e.g., the terminal device may also include an input transmitting device, a network access device, a bus, etc.
The processor 80 may be a central processing unit (Central Processing Unit, CPU), another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 81 may in some embodiments be an internal storage unit of the terminal device 8, such as a hard disk or a memory of the terminal device 8. The memory 81 may be an external storage device of the terminal device 8, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the terminal device 8. Further, the memory 81 may also include both an internal storage unit and an external storage device of the terminal device 8. The memory 81 is used for storing an operating system, application programs, boot loader (BootLoader), data, other programs etc., such as program codes of the computer program etc. The memory 81 may also be used for temporarily storing data that has been transmitted or is to be transmitted.
In addition, it will be clearly understood by those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The embodiment of the application also provides a terminal device, which comprises at least one memory, at least one processor and a computer program stored in the at least one memory and capable of running on the at least one processor, wherein the processor executes the computer program to enable the terminal device to realize the steps in any of the method embodiments.
Embodiments of the present application also provide a computer readable storage medium storing a computer program which, when executed by a processor, implements steps for implementing the various method embodiments described above.
The embodiments of the present application provide a computer program product enabling a terminal device to carry out the steps of the method embodiments described above when the computer program product is run on the terminal device.
The embodiment of the application also provides a chip system, which comprises a processor, wherein the processor is coupled with a memory, and the processor executes a computer program stored in the memory to realize the steps in the embodiments of the method. The processor may include an application processor and a coprocessor, and the application processor and the coprocessor are respectively responsible for the corresponding steps in the above-described method embodiments.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the method of the above embodiment, or may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the computer program may implement the steps of each of the method embodiments described above. Wherein the computer program comprises computer program code which may be in source code form, object code form, executable file or some intermediate form etc. The computer readable storage medium may include: any entity or device capable of carrying the computer program code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or detailed in a certain embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (14)

1. A camera control method, characterized in that the method is applied to a terminal device, the terminal device comprising a lens module, an intelligent sensor hub, and an application processor, wherein the lens module comprises a motor and a driving chip for driving the motor, and the method comprises the following steps:
monitoring motion data of the terminal equipment;
identifying the motion state of the terminal equipment according to the motion data;
when the identified motion state is that shaking has started, controlling the driving chip to be powered on;
when the identified motion state is that shaking has ended, controlling the driving chip to be powered down;
wherein the monitoring motion data of the terminal equipment and the identifying the motion state of the terminal equipment according to the motion data comprise:
the smart sensor hub monitors the motion data of the terminal equipment and identifies the motion state of the terminal equipment according to the motion data;
the controlling the driving chip to be powered on includes:
the smart sensor hub generates a trigger message, encapsulates the trigger message into a virtual sensor message and then sends the virtual sensor message to the application processor through a primary data link;
and the application processor powers on the driving chip when the received virtual sensor message is the trigger message.
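Illustrative note (not part of the claims): one way the trigger message of claim 1 could be carried over an existing sensor-event channel is sketched below; the structure fields and identifiers are hypothetical assumptions and are not features recited by the claim.

    #include <cstdio>

    // Hypothetical wire format: the sensor hub reuses its existing sensor-event
    // channel, so control messages are wrapped as samples of a "virtual sensor".
    enum class Payload { Trigger, Release };

    struct VirtualSensorMessage {
        int     sensor_id;   // identifier reserved for the virtual (non-physical) sensor
        Payload payload;     // Trigger = request power-on, Release = request power-down
    };

    // Sensor-hub side: build a trigger message for the data link to the application processor.
    VirtualSensorMessage make_trigger() {
        return VirtualSensorMessage{/*sensor_id=*/0x7F, Payload::Trigger};
    }

    // Application-processor side: power the driving chip up only for trigger messages.
    void handle(const VirtualSensorMessage& msg) {
        if (msg.payload == Payload::Trigger) {
            std::puts("AP: powering up the motor driving chip");
        }
    }

    int main() { handle(make_trigger()); }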
2. The camera control method according to claim 1, wherein the lens module further comprises: a target lens;
before identifying the motion state of the terminal device according to the motion data, the method further comprises:
detecting whether the target lens is opened;
and when the target lens is not opened, executing the operation of identifying the motion state of the terminal equipment according to the motion data.
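Illustrative note (not part of the claims): the gating of claim 2 can be expressed as a simple check, for example because an opened lens is normally powered along the ordinary camera path anyway; the state flag and helper names below are hypothetical.

    #include <cstdio>

    // Hypothetical state and helpers; in a real system these would query the
    // camera service and run the recognition algorithm on the sensor hub.
    static bool target_lens_open = false;

    static void identify_motion_state() { std::puts("running motion-state recognition"); }

    // Recognition is only performed while the target lens is not opened.
    static void on_motion_data() {
        if (!target_lens_open) identify_motion_state();
    }

    int main() { on_motion_data(); }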
3. The camera control method according to claim 1, wherein the controlling the driving chip to be powered down includes:
the smart sensor hub in the terminal equipment generates a release message, encapsulates the release message into a virtual sensor message and sends the virtual sensor message to the application processor;
and the application processor powers down the driving chip when the received virtual sensor message is the release message.
4. The camera control method according to claim 3, wherein the terminal equipment further comprises: a camera driver;
the application processor powering on the driving chip when the received virtual sensor message is the trigger message includes:
the application processor sends a power-on instruction to the camera driver when the received virtual sensor message is the trigger message;
when the camera driver receives the power-on instruction, the camera driver controls the driving chip to be powered on;
and the application processor powering down the driving chip when the received virtual sensor message is the release message includes:
the application processor sends a power-down instruction to the camera driver when the received virtual sensor message is the release message;
and when the camera driver receives the power-down instruction, the camera driver controls the driving chip to be powered down.
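Illustrative note (not part of the claims): the power-on and power-down instructions of claim 4 could map onto a small command set handled by the camera driver; the enum and function names below are hypothetical, and the real kernel interface is not specified here.

    #include <cstdio>

    // Hypothetical instruction set sent from the application processor to the
    // camera driver; the real kernel interface (ioctl, sysfs node, ...) is not
    // specified here.
    enum class DriverCmd { PowerOn, PowerOff };

    // Camera-driver side: translate the instruction into a supply toggle for the
    // motor driving chip (modelled here as a simple print).
    void camera_driver_handle(DriverCmd cmd) {
        if (cmd == DriverCmd::PowerOn) std::puts("driver: enable supply to driving chip");
        else                           std::puts("driver: disable supply to driving chip");
    }

    // Application-processor side: map the received message type onto a command.
    void ap_dispatch(bool is_trigger) {
        camera_driver_handle(is_trigger ? DriverCmd::PowerOn : DriverCmd::PowerOff);
    }

    int main() {
        ap_dispatch(true);   // trigger message -> power-on instruction
        ap_dispatch(false);  // release message -> power-down instruction
    }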
5. The camera control method according to claim 1, further comprising, before the smart sensor hub encapsulates the trigger message into a virtual sensor message and sends the virtual sensor message to the application processor:
detecting whether a module to which the application processor belongs is in a dormant state;
and when the module to which the application processor belongs is in the dormant state, the smart sensor hub wakes up the module to which the application processor belongs, and then executes the operation of encapsulating the trigger message into a virtual sensor message and sending the virtual sensor message to the application processor.
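Illustrative note (not part of the claims): the ordering required by claim 5, waking the module to which the application processor belongs first and only then sending the virtual sensor message, might be sketched as follows; the wake-up mechanism shown is a placeholder assumption.

    #include <cstdio>

    // Hypothetical platform hooks; the real wake-up path (mailbox interrupt,
    // IPC doorbell, ...) depends on the SoC and is not specified here.
    static bool ap_module_sleeping = true;

    static void wake_ap_module() {
        ap_module_sleeping = false;
        std::puts("waking the module to which the application processor belongs");
    }

    static void send_virtual_sensor_message() {
        std::puts("sending the encapsulated virtual sensor message");
    }

    // Wake first, send second, so the message is not lost while the module sleeps.
    void notify_application_processor() {
        if (ap_module_sleeping) wake_ap_module();
        send_virtual_sensor_message();
    }

    int main() { notify_application_processor(); }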
6. The camera control method according to claim 1, characterized by further comprising:
and when the identified motion state is that shaking is in progress, controlling the driving chip to be powered on.
7. The camera control method according to any one of claims 1 to 6, characterized by further comprising, after the controlling the driving chip to be powered on:
timing the power-on duration;
and when the timed duration reaches a preset duration upper limit, controlling the driving chip to be powered down.
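Illustrative note (not part of the claims): the timing guard of claim 7 caps how long the driving chip stays powered after a trigger, so a missed release cannot keep it on indefinitely; the 500 ms limit below is an invented value used only for the sketch.

    #include <chrono>
    #include <cstdio>
    #include <thread>

    int main() {
        using namespace std::chrono;
        // Invented upper limit; the claim only requires some preset value.
        const auto kMaxPoweredTime = milliseconds(500);

        std::puts("driving chip powered up, timer started");
        const auto start = steady_clock::now();

        // Placeholder loop standing in for the rest of the system's work.
        while (steady_clock::now() - start < kMaxPoweredTime) {
            std::this_thread::sleep_for(milliseconds(50));
        }
        std::puts("preset upper limit reached, powering the driving chip down");
    }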
8. The camera control method according to claim 1, characterized by further comprising, before the identifying the motion state of the terminal equipment according to the motion data:
acquiring a scene where the terminal equipment is located;
and when the scene where the terminal equipment is located is a first target scene, executing the operation of identifying the motion state of the terminal equipment according to the motion data.
9. The camera control method according to claim 8, wherein the acquiring the scene in which the terminal equipment is located includes:
acquiring sound intensity of the environment where the terminal equipment is located, vibration intensity of the terminal equipment and moving speed of the terminal equipment;
and when the sound intensity is lower than a volume threshold value, the vibration intensity is lower than a vibration threshold value and the moving speed is lower than a speed threshold value, judging that the scene where the terminal equipment is positioned is the first target scene.
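Illustrative note (not part of the claims): the three-threshold test of claim 9 can be written as a single predicate; the units and threshold values below are invented for the sketch and are not taken from the embodiments.

    #include <cstdio>

    // Hypothetical readings; units and threshold values are invented for the sketch.
    struct Environment {
        float sound_db;    // ambient sound intensity
        float vibration;   // vibration intensity of the device
        float speed_mps;   // moving speed of the device
    };

    // The first target scene is recognised only when all three readings fall
    // below their respective thresholds.
    bool is_first_target_scene(const Environment& e) {
        const float kVolumeThreshold    = 40.0f;  // dB
        const float kVibrationThreshold = 0.2f;   // arbitrary units
        const float kSpeedThreshold     = 0.5f;   // m/s
        return e.sound_db  < kVolumeThreshold &&
               e.vibration < kVibrationThreshold &&
               e.speed_mps < kSpeedThreshold;
    }

    int main() {
        Environment quiet_and_still{35.0f, 0.05f, 0.0f};
        std::printf("first target scene: %s\n",
                    is_first_target_scene(quiet_and_still) ? "yes" : "no");
    }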
10. The camera control method according to claim 1, wherein the lens module further comprises: a target lens;
the target lens is an automatic focusing lens or an optical anti-shake lens.
11. The camera control method according to claim 10, wherein when the target lens is an auto-focus lens, the motion data is acceleration data.
12. The camera control method according to claim 1, wherein a hardware abstraction layer of the terminal equipment includes a camera abstraction layer and an activity recognition abstraction layer corresponding to the application processor;
the smart sensor hub generating a trigger message, encapsulating the trigger message into a virtual sensor message and then sending the virtual sensor message to the application processor, and the application processor powering on the driving chip when the received virtual sensor message is the trigger message, include:
the smart sensor hub generates a trigger message, encapsulates the trigger message into a virtual sensor message and then sends the virtual sensor message to the activity recognition abstraction layer;
the activity recognition abstraction layer sends the virtual sensor message to the camera abstraction layer;
and the camera abstraction layer powers on the driving chip when the received virtual sensor message is the trigger message.
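Illustrative note (not part of the claims): the message path of claim 12, from the smart sensor hub through the activity recognition abstraction layer to the camera abstraction layer, could be modelled as a simple forwarding chain; the types and the binding mechanism below are hypothetical.

    #include <cstdio>
    #include <functional>

    struct VirtualSensorMsg { bool is_trigger; };

    // Camera abstraction layer: decides whether to power the driving chip.
    struct CameraAbstractionLayer {
        void on_message(const VirtualSensorMsg& m) const {
            if (m.is_trigger) std::puts("camera HAL: power up the motor driving chip");
        }
    };

    // Activity recognition abstraction layer: only forwards the message.
    struct ActivityRecognitionAbstractionLayer {
        std::function<void(const VirtualSensorMsg&)> forward;  // installed at start-up
        void on_sensorhub_message(const VirtualSensorMsg& m) const { forward(m); }
    };

    int main() {
        CameraAbstractionLayer cam;
        ActivityRecognitionAbstractionLayer ar;
        ar.forward = [&cam](const VirtualSensorMsg& m) { cam.on_message(m); };
        ar.on_sensorhub_message(VirtualSensorMsg{true});  // trigger from the sensor hub
    }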
13. A terminal device, characterized in that the terminal device comprises a camera, a memory, a smart sensor hub and an application processor, wherein the memory stores a computer program, and the smart sensor hub and the application processor implement the camera control method according to any one of claims 1 to 12 when executing the computer program.
14. A chip system comprising a smart sensor hub and an application processor, both coupled to a memory, the smart sensor hub and the application processor executing a computer program stored in the memory to implement the camera control method of any one of claims 1 to 12.
CN202310208838.9A 2023-03-07 2023-03-07 Camera control method and device and terminal equipment Active CN116074615B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310208838.9A CN116074615B (en) 2023-03-07 2023-03-07 Camera control method and device and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310208838.9A CN116074615B (en) 2023-03-07 2023-03-07 Camera control method and device and terminal equipment

Publications (2)

Publication Number Publication Date
CN116074615A CN116074615A (en) 2023-05-05
CN116074615B true CN116074615B (en) 2023-09-08

Family

ID=86175075

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310208838.9A Active CN116074615B (en) 2023-03-07 2023-03-07 Camera control method and device and terminal equipment

Country Status (1)

Country Link
CN (1) CN116074615B (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9752892B2 (en) * 2014-02-20 2017-09-05 Google Inc. Methods and systems for acquiring sensor data on a device using multiple acquisition modes
US20150281409A1 (en) * 2014-03-25 2015-10-01 Rupesh Tatiya Using usb over ip to share a non-usb sensor with another device
CN105759935B (en) * 2016-01-29 2019-01-18 华为技术有限公司 A kind of terminal control method and terminal
CN108108007B (en) * 2017-12-21 2019-11-19 维沃移动通信有限公司 A kind of processing method and mobile terminal reducing power consumption
US20220214435A1 (en) * 2019-06-05 2022-07-07 Sony Semiconductor Solutions Corporation Distance measuring sensor, signal processing method, and distance measuring module
KR20210056632A (en) * 2019-11-11 2021-05-20 삼성전자주식회사 Method for image processing based on message and electronic device implementing the same

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011193185A (en) * 2010-03-15 2011-09-29 Sony Corp Imaging device, imaging system, method of controlling interchangeable lens, and program
CN202276353U (en) * 2011-09-23 2012-06-13 无锡中科智能信息处理研发中心有限公司 Intelligent building information sharing platform based on multi-sensor fusion
JP2015050812A (en) * 2013-08-30 2015-03-16 株式会社ニコン Vibration actuator, lens barrel, and camera
CN105869353A (en) * 2015-12-08 2016-08-17 乐视移动智能信息技术(北京)有限公司 Human-body falling down event detection method, apparatus and mobile terminal thereof
CN107302661A (en) * 2017-06-26 2017-10-27 维沃移动通信有限公司 A kind of camera control method and mobile terminal
CN107734228A (en) * 2017-10-31 2018-02-23 广东欧珀移动通信有限公司 A kind of camera module and its control method, electronic equipment
CN109981938A (en) * 2017-12-28 2019-07-05 广东欧珀移动通信有限公司 A kind of camera module and its control method, electronic equipment, storage medium
CN111328020A (en) * 2018-12-17 2020-06-23 华为技术有限公司 Service processing method and device based on indoor positioning system
CN114721108A (en) * 2020-12-18 2022-07-08 北京小米移动软件有限公司 Driving device, method, apparatus and medium for eliminating lens impact noise
WO2022152069A1 (en) * 2021-01-13 2022-07-21 宁波舜宇光电信息有限公司 Camera module and terminal device
CN112839177A (en) * 2021-01-20 2021-05-25 北京小米移动软件有限公司 Lens control method, lens control device and storage medium
CN113014763A (en) * 2021-02-20 2021-06-22 维沃移动通信有限公司 Camera module and electronic equipment
CN115132224A (en) * 2021-03-25 2022-09-30 北京小米移动软件有限公司 Abnormal sound processing method, device, terminal and storage medium
CN113766127A (en) * 2021-09-07 2021-12-07 Oppo广东移动通信有限公司 Control method and device of mobile terminal, storage medium and electronic equipment
CN114666682A (en) * 2022-03-25 2022-06-24 陈同中 Multi-sensor Internet of things resource self-adaptive deployment management and control middleware

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Progress in digital camera anti-shake technology and detection methods; Wang Xiaoxiao; Tan Shan; Huang Feng; Electronic Test, Issue 21; full text *

Also Published As

Publication number Publication date
CN116074615A (en) 2023-05-05

Similar Documents

Publication Publication Date Title
CN110708086B (en) Split-screen display method and electronic equipment
CN114217699B (en) Method for detecting pen point direction of handwriting pen, electronic equipment and handwriting pen
CN110839096A (en) Touch method of equipment with folding screen and folding screen equipment
CN114513847B (en) Positioning method, device, system, electronic equipment and storage medium
CN115297405A (en) Audio output method and terminal equipment
CN110780929B (en) Method for calling hardware interface and electronic equipment
WO2023005282A9 (en) Message pushing method and apparatus
WO2022170856A1 (en) Method for establishing connection, and electronic device
CN116048833B (en) Thread processing method, terminal equipment and chip system
CN110647731A (en) Display method and electronic equipment
CN114915996A (en) Communication exception handling method and related device
WO2022135157A1 (en) Page display method and apparatus, and electronic device and readable storage medium
CN113438366A (en) Information notification interaction method, electronic device and storage medium
WO2022048453A1 (en) Unlocking method and electronic device
CN111381996B (en) Memory exception handling method and device
CN116028148B (en) Interface processing method and device and electronic equipment
CN116074615B (en) Camera control method and device and terminal equipment
CN114691248B (en) Method, device, equipment and readable storage medium for displaying virtual reality interface
CN116048831A (en) Target signal processing method and electronic equipment
CN115729431A (en) Control content dragging method, electronic device and system
CN114398108A (en) Electronic device, drive loading method thereof, and medium
CN115083400A (en) Voice assistant awakening method and device
WO2024159996A1 (en) Network search device management method and electronic device
CN115514840B (en) Method, system, equipment and readable storage medium for notifying message prompt
CN113542315B (en) Communication framework, business event processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant