CN116996765B - Control chip management method, electronic device, and computer-readable storage medium

Info

Publication number
CN116996765B
CN116996765B
Authority
CN
China
Prior art keywords
camera
cameras
aon
instruction
control chip
Prior art date
Legal status
Active
Application number
CN202311271728.3A
Other languages
Chinese (zh)
Other versions
CN116996765A (en)
Inventor
Zhao Yuxia (赵玉霞)
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202311271728.3A
Publication of CN116996765A
Application granted
Publication of CN116996765B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/65 Control of camera operation in relation to power supply
    • H04N 23/651 Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera
    • H04N 23/61 Control of cameras or camera modules based on recognised objects
    • H04N 23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H04N 23/67 Focus control based on electronic image sensor signals

Abstract

The application relates to the field of electronic technology and provides a control chip management method, an electronic device, and a computer-readable storage medium. The method is applied to an electronic device that includes at least a target camera and other cameras, the target camera being different from the other cameras, where the control chips of the target camera and of the other cameras are connected to the same power port. The method comprises: acquiring a shooting instruction, the shooting instruction instructing the target camera to shoot; and, in response to the shooting instruction, sending a sleep instruction to the control chips of the other cameras, the sleep instruction instructing those control chips to enter a sleep state. The method reduces the power consumption of the electronic device.

Description

Control chip management method, electronic device, and computer-readable storage medium
Technical Field
The present application relates to the field of electronic technology, and in particular, to a control chip management method, an electronic device, and a computer-readable storage medium.
Background
As terminal devices offer richer shooting functions and better shooting effects, the functions and scenes supported by their cameras keep growing.
In some scenarios, for example when functions such as air gestures and face detection are supported, the front camera often needs to be always on so that external gestures or faces can be recognized at any time. The front camera is also often equipped with an auto-focus integrated circuit (AF IC) to provide auto-focus and improve the shooting effect. Specifically, while the front camera is always on, the terminal device must power the front camera's AF IC, and the AF IC drives the front camera's motor to achieve auto-focus. In current terminal devices, the AF IC of the front camera is powered by a power management unit (PMU); for example, the power port of the front camera's AF IC is connected to the power port of a low-dropout regulator (LDO) of the PMU. Typically, the cameras on a terminal device include not only a front camera but also one or more rear cameras. A rear camera can likewise achieve auto-focus under the control of its own AF IC, and some rear cameras are also equipped with an optical image stabilizer (OIS) to prevent shake. In current terminal devices, to save cost, the same LDO supply is generally used to power the AF ICs, OIS, and similar devices of all cameras.
In a terminal device whose multiple cameras' AF ICs, OIS, and similar devices share the same supply, powering the AF IC of the front camera also powers the AF ICs and OIS of the rear cameras. Although these rear-camera devices are not in use, they draw a certain substrate power once powered, which wastes power.
Disclosure of Invention
The application provides a control chip management method and apparatus, an electronic device, a computer-readable storage medium, and a computer program product, which can reduce power consumption.
In a first aspect, a control chip management method is provided. The method is applied to an electronic device that includes at least a target camera and other cameras, the target camera being different from the other cameras, where the control chips of the target camera and of the other cameras are all connected to the same power port. The method comprises: acquiring a shooting instruction, the shooting instruction instructing the target camera to shoot; and, in response to the shooting instruction, sending a sleep instruction to the control chips of the other cameras, the sleep instruction instructing those control chips to enter a sleep state.
In general, an electronic device has multiple cameras: a front camera on the screen side and rear cameras on the rear-cover side. The front camera is commonly used for face detection, gesture detection, and the like, and needs to stay in an always-on state. The rear cameras typically include a main camera and may also include cameras supporting other shooting functions, such as a wide-angle camera for wide scenes, a telephoto camera for distant views, and sometimes a macro camera for close-up subjects. Each camera has a corresponding auto-focus chip for auto-focus, and some cameras are also equipped with OIS for anti-shake.
To save cost, the same LDO supply of the PMU is typically used to power devices such as the AF ICs and OIS of all cameras.
For an electronic device whose multiple cameras' AF ICs, OIS, and similar devices share the same supply, applying power (powering up) to the front camera's AF IC also powers the AF ICs and OIS of the rear cameras. Devices such as the rear cameras' AF ICs and OIS, although not needed in the current state, draw a certain substrate power once powered, which wastes power. For example, an OIS enters standby mode once power is applied to it, and in standby mode it draws a substrate current of 3-5 mA even with the screen off, so some power is consumed.
When the PMU applies power to the control chip of the target camera, the control chips of the other cameras, which share the same PMU power port with the target camera's control chip, are powered at the same time and enter a standby state. In response to the shooting instruction, the electronic device can send a sleep instruction to the control chips of the cameras other than the target camera. The sleep instruction may be transmitted as an I2C signal and input through a mode port on the control chip that is different from the power port, instructing the control chip to move from the standby state into a sleep state. A control chip in the sleep state incurs no additional power consumption.
The shooting instruction may be an instruction the user inputs by manually tapping the screen of the electronic device to open the camera APP and enter a shooting state, or an instruction the user inputs by voice to open the camera APP; this is not limited in the embodiments of the present application. Optionally, the shooting instruction may also be an instruction that sets the front camera to an always-on state, for example an instruction by which the user enables real-time gesture detection or real-time face detection. Optionally, the shooting instruction may also be a user instruction that turns on a camera's always-on (AON) mode. If the front camera is in AON mode by default when the electronic device boots, the shooting instruction is the instruction that loads the camera after boot. The application does not limit the specific form of the shooting instruction.
When the target camera is a front camera in the always-on (real-time online) mode, the electronic device can send a sleep instruction to the control chips of each rear camera in turn, so that the rear cameras' AF ICs, OIS, and other control chips enter the sleep state and no additional power is consumed.
Alternatively, the target camera may be one of the rear cameras. For example, when the rear main camera is in AON mode, the electronic device may output a sleep instruction to the control chips (such as the AF ICs) of the front camera and of the rear cameras other than the main camera, instructing those control chips to enter the sleep state and avoiding additional power consumption.
With the control chips of all cameras powered up, the electronic device sends a sleep instruction to the control chips (AF ICs, OIS, and the like) of the other cameras that are not needed, instructing them to enter a sleep state and thereby saving power. The method requires neither redesigning how each camera's control chip is powered nor introducing new power ports to supply different control chips separately, so production and design costs do not increase. At the same time, the control chips of the other cameras are prevented from drawing extra power while unused, reducing the overall camera power consumption and extending standby time.
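As a minimal sketch of these two steps (not the patent's actual code), the loop below sends a sleep instruction to every control chip except the target camera's; the type and helper names (camera_id, send_sleep_cmd, on_shooting_instruction) are hypothetical stand-ins:

```c
/* Hypothetical illustration of the claimed two-step method. */
enum camera_id { CAM_FRONT, CAM_MAIN, CAM_TELE, CAM_WIDE, CAM_COUNT };

/* Assumed helper: delivers a sleep instruction to one camera's
 * control chip (AF IC and/or OIS). */
void send_sleep_cmd(enum camera_id cam);

/* Step 1: a shooting instruction names the target camera.
 * Step 2: every other camera sharing the same LDO rail gets a sleep
 * instruction so its control chip leaves standby for sleep mode. */
void on_shooting_instruction(enum camera_id target)
{
    for (int cam = 0; cam < CAM_COUNT; cam++) {
        if ((enum camera_id)cam != target)
            send_sleep_cmd((enum camera_id)cam);
    }
}
```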
In some possible implementations, the target camera is a front camera and the other cameras are rear cameras.
In general, the front camera needs to monitor faces or gestures in real time and can be set to an always-on state; the rear cameras can be opened or closed as needed.
In some possible implementations, when the front camera is in the always-on (AON) mode, sending the sleep instruction to the control chips of the other cameras includes: sending the sleep instruction to the other cameras' control chips through the AON interface of the hardware abstraction layer. The method further comprises: controlling the front camera to shoot through the smart sensor hub (Sensor Hub).
It should be noted that in current electronic devices the camera image-output strategy falls into two cases: output on the AP side in non-AON mode, and output on the Sensor Hub side in AON mode. When the front camera is in AON mode and the other cameras are not in use, the electronic device sends the sleep instruction to the other cameras' control chips through the AON interface of the hardware abstraction layer, instructing the unused cameras' control chips to enter the sleep state to save power.
In some possible implementations, when the front camera is not in AON mode, sending the sleep instruction to the control chips of the other cameras includes: sending the sleep instruction to the other cameras' control chips through the camera hardware abstraction layer. The method further comprises: controlling the front camera to shoot through the front-camera driver of the kernel layer.
When the front camera is not in AON mode and the other cameras are not in use, the image-output strategy of the electronic device is executed on the AP side: the electronic device sends the sleep instruction to the other cameras' control chips through the camera hardware abstraction layer, instructing the unused cameras' control chips to enter the sleep state to save power. Meanwhile, the electronic device controls the front camera's image output through the front-camera driver of the kernel layer.
In some possible implementations, the target camera includes the front camera and the main camera among the rear cameras, the front camera being in AON mode, and sending the sleep instruction to the control chips of the other cameras includes: sending the sleep instruction to the other cameras' control chips through the AON interface of the hardware abstraction layer. The method further comprises: controlling, through the AON interface, the Sensor Hub to control the front camera to shoot.
When the user opens the main camera among the rear cameras to shoot while the front camera is in an always-on state, the target camera includes both the front camera and the main camera. If the front camera is in AON mode, the electronic device can send the sleep instruction to the other cameras' control chips through the AON interface of the hardware abstraction layer, instructing the unused cameras' control chips to enter the sleep state to save power. Meanwhile, the electronic device controls the front camera's image output through the Sensor Hub and the main camera's image output through the main-camera driver of the kernel layer.
In some possible implementations, the target camera includes the front camera and the main camera among the rear cameras, the front camera not being in AON mode, and sending the sleep instruction to the control chips of the other cameras includes: sending the sleep instruction to the other cameras' control chips through the camera hardware abstraction layer. The method further comprises: controlling the front camera to shoot through the front-camera driver of the kernel layer.
When the user opens the main camera among the rear cameras to shoot while the front camera is in an always-on state, the target camera includes both the front camera and the main camera. If the front camera is not in AON mode, the electronic device can send the sleep instruction to the other cameras' control chips through the camera hardware abstraction layer, instructing the unused cameras' control chips to enter the sleep state to save power. Meanwhile, the electronic device controls the main camera's image output through the main-camera driver of the kernel layer and the front camera's image output through the front-camera driver of the kernel layer.
In some possible implementations, the target camera is the main camera among the rear cameras, the main camera being in AON mode, and sending the sleep instruction to the control chips of the other cameras includes: sending the sleep instruction to the other cameras' control chips through the AON interface of the hardware abstraction layer. The method further comprises: controlling, through the AON interface, the Sensor Hub to control the main camera to shoot.
When the user opens the main camera among the rear cameras to shoot and the front camera is not opened, if the main camera is in AON mode the electronic device can send the sleep instruction to the other cameras' control chips through the AON interface of the hardware abstraction layer, instructing the unused cameras' control chips to enter the sleep state to save power. Meanwhile, the electronic device controls the main camera's image output through the Sensor Hub.
In some possible implementations, the target camera is the main camera among the rear cameras, the main camera being in AON mode, and sending the sleep instruction to the control chips of the other cameras includes: sending the sleep instruction, through the AON interface of the hardware abstraction layer, to the control chips of the rear cameras other than the main camera. The method further comprises: controlling, through the AON interface, the Sensor Hub to control the main camera to shoot, and controlling the front camera to shoot through the front-camera driver of the kernel layer.
When the user opens the main camera among the rear cameras to shoot and the front camera is always-on by default, if the main camera is in AON mode the electronic device can send the sleep instruction, through the AON interface of the hardware abstraction layer, to the control chips of the cameras other than the main camera and the front camera, instructing those control chips to enter the sleep state to save power. Meanwhile, the electronic device controls the main camera's image output through the Sensor Hub and the front camera's image output through the front-camera driver of the kernel layer.
In some possible implementations, the control chip includes an auto-focus chip and/or an image stabilizer.
Typically, the main camera is equipped with an image stabilizer (OIS), and each camera is provided with an auto-focus chip to achieve auto-focus. These control chips consume a certain amount of power in the standby state; when a control chip enters the sleep state as instructed by the sleep instruction, that power is saved.
In some possible implementations, the shooting instruction is an instruction input by the user to turn on the camera APP.
The shooting instruction may be an instruction the user inputs by manually tapping the screen of the electronic device to open the camera APP and enter a shooting state, or an instruction the user inputs by voice to open the camera APP; this is not limited in the embodiments of the present application. Optionally, the shooting instruction may also be an instruction that sets the front camera to an always-on state, for example an instruction by which the user enables real-time gesture detection or real-time face detection. Optionally, the shooting instruction may also be a user instruction that turns on the camera's AON mode. If the front camera is in AON mode by default when the electronic device boots, the shooting instruction is the instruction that loads the camera after boot. The application does not limit the specific form of the shooting instruction.
In a second aspect, a control chip management apparatus is provided, comprising units composed of software and/or hardware, the units being configured to perform any one of the methods of the first aspect.
In a third aspect, an electronic device is provided, comprising a processor, a memory, and an interface, which cooperate so that the electronic device performs any one of the methods of the first aspect.
In a fourth aspect, there is provided a computer readable storage medium having stored therein a computer program which, when executed by a processor, causes the processor to perform any one of the methods of the first aspect.
In a fifth aspect, there is provided a computer program product comprising: computer program code which, when run on an electronic device, causes the electronic device to perform any one of the methods of the solutions of the first aspect.
Drawings
Fig. 1 is a schematic structural diagram of an example of a terminal device 100 according to an embodiment of the present application;
Fig. 2 is a software architecture block diagram of a terminal device 100 provided in an embodiment of the present application;
Fig. 3 is a flowchart illustrating an exemplary control chip management method according to an embodiment of the present application;
Fig. 4 is a software architecture diagram of an exemplary control chip management method according to an embodiment of the present application;
Fig. 5 is a flowchart illustrating a control chip management method according to another embodiment of the present application;
Fig. 6 is a flowchart illustrating a control chip management method according to another embodiment of the present application;
Fig. 7 is a schematic structural diagram of a management device of a control chip according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings. In the description of the embodiments, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments, "plurality" means two or more.
The terms "first," "second," "third," and the like, are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first", "a second", or a third "may explicitly or implicitly include one or more such feature.
The control chip management method provided by the embodiments of the present application can be applied to terminal devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, and personal digital assistants (PDA); the embodiments of the present application do not limit the specific type of terminal device.
Fig. 1 is a schematic structural diagram of an exemplary terminal device 100 according to an embodiment of the present application. The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal device 100. In other embodiments of the application, the terminal device 100 may include more or fewer components than illustrated, some components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The software system of the terminal device 100 may employ a layered architecture, an event driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. In the embodiment of the application, taking an Android system with a layered architecture as an example, a software structure of the terminal device 100 is illustrated.
Fig. 2 is a software architecture block diagram of the terminal device 100 according to an embodiment of the present application. The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer. The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. It can obtain the display size, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager provides the communication functions of the terminal device 100, such as management of call states (including connected, hung up, and the like).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar and can be used to convey notification-type messages, which may disappear automatically after a brief stay without user interaction.
The Android runtime includes core libraries and a virtual machine, and is responsible for scheduling and managing the Android system.
The core library consists of two parts: one part consists of functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine, which executes their Java files as binary files. The virtual machine performs object lifecycle management, stack management, thread management, security and exception management, garbage collection, and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of common audio and video formats, as well as still image files. It supports multiple audio and video encoding formats such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software and includes at least the display driver, camera driver, audio driver, and sensor driver.
For easy understanding, the following embodiments of the present application will take a terminal device having a structure shown in fig. 1 and fig. 2 as an example, and specifically describe a management method of a control chip provided by the embodiments of the present application in conjunction with the accompanying drawings and application scenarios.
As terminal devices offer richer shooting functions and better shooting effects, the functions and scenes supported by their cameras keep growing. In some scenarios, for example when functions such as air gestures and face detection are supported, the front camera often needs to be always on so that external gestures or faces can be recognized at any time. The front camera is also often equipped with an AF IC to provide auto-focus and improve the shooting effect. Specifically, while the front camera is always on, the terminal device must power the front camera's AF IC, and the AF IC drives the front camera's motor to achieve auto-focus. In current terminal devices, the front camera's AF IC is powered through a power port of the PMU; for example, the power port of the front camera's AF IC is connected to one LDO power port of the PMU. Typically, the cameras on a terminal device include not only a front camera but also one or more rear cameras. A rear camera can likewise achieve auto-focus under the control of its own AF IC, and some rear cameras are also equipped with OIS to prevent shake. To save cost, the same LDO supply is generally used to power the AF ICs, OIS, and similar devices of all cameras.
For a terminal device whose multiple cameras' AF ICs, OIS, and similar devices share the same supply, applying power (powering up) to the front camera's AF IC also powers the AF ICs and OIS of the rear cameras. Devices such as the rear cameras' AF ICs and OIS, although not needed in the current state, draw a certain substrate power once powered, which wastes power. For example, an OIS enters standby mode once power is applied to it, and in standby mode it draws a substrate current of 3-5 mA even with the screen off, so some power is consumed.
The embodiments of the present application provide a control chip management method: when one camera is opened, the terminal device sends a sleep instruction to the AF ICs, OIS, and similar devices of the other, unused cameras, instructing them to enter a sleep state (sleep mode). A device in the sleep state consumes less power than in the post-power-up standby mode, so power is saved.
The control chip management method provided by the embodiments of the present application can be applied to a terminal device that includes multiple cameras. In general, a camera on the screen side of the terminal device is called a front camera, and a camera on the rear-cover side is called a rear camera. The rear cameras typically comprise more than one camera, for example a main camera, and may include one or more of a telephoto camera, a wide-angle camera, and a macro camera. The terminal device can invoke different cameras or camera combinations under different shooting modes or shooting requirements. Each camera is provided with a corresponding control chip and implements its shooting functions under that chip's control; for example, a camera achieves auto-focus under the control of its AF IC and optical image stabilization under the control of its OIS.
Fig. 3 is a schematic diagram of a management flow of a control chip according to an embodiment of the present application. Specifically, the method can comprise the following steps:
S301, acquiring a shooting instruction, wherein the shooting instruction is used for instructing a target camera to shoot.
When a user wants to take a picture with the terminal device, the user can open an application such as the camera APP to turn on a camera and shoot. The terminal device then receives and recognizes the shooting instruction input by the user's operation of opening the camera APP or enabling a particular shooting mode. In response to the shooting instruction, the terminal device opens the target camera corresponding to the current shooting mode and powers that camera's control chip so that shooting can proceed. For example, when the user opens the camera APP and switches to self-timer mode, the terminal device opens the front camera. The shooting instruction may be input by manually tapping the screen of the terminal device or by voice; this is not limited in the embodiments of the present application.
Optionally, the shooting instruction may also be an instruction that sets the front camera to an always-on state, for example an instruction by which the user enables real-time gesture detection or real-time face detection. Optionally, the shooting instruction may also be a user instruction that turns on a camera's always-on (AON) mode. If the front camera is in AON mode by default when the terminal device boots, the shooting instruction is the instruction that loads the camera after boot. The application does not limit the specific form of the shooting instruction.
S302, in response to the shooting instruction, sending a sleep instruction to the control chips of the other cameras to instruct them to enter a sleep state, the target camera being different from the other cameras.
When the PMU applies power to the control chip of the target camera, the control chips of the other cameras, which share the same PMU power port with the target camera's control chip, are powered at the same time and enter a standby state. In response to the shooting instruction, the terminal device can send a sleep instruction to the control chips of the cameras other than the target camera. The sleep instruction may be transmitted as an I2C signal and input through a mode port on the control chip that is different from the power port, instructing the control chip to move from the standby state into a sleep state. A control chip in the sleep state incurs no additional power consumption.
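If the sleep instruction indeed travels over I2C as described, a user-space sketch using the standard Linux i2c-dev interface could look as follows; the bus path, chip address, register, and value are invented placeholders, since the patent does not disclose the control chip's register map:

```c
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

/* Placeholder values: the patent does not disclose the actual bus,
 * chip address, or register layout of the AF IC / OIS controller. */
#define OIS_I2C_BUS   "/dev/i2c-3"
#define OIS_I2C_ADDR  0x1C
#define REG_MODE      0x02   /* hypothetical "mode port" register */
#define MODE_SLEEP    0x01   /* hypothetical sleep-mode value */

int ois_enter_sleep(void)
{
    int fd = open(OIS_I2C_BUS, O_RDWR);
    if (fd < 0)
        return -1;
    if (ioctl(fd, I2C_SLAVE, OIS_I2C_ADDR) < 0) {
        close(fd);
        return -1;
    }
    /* One register write: standby mode -> sleep mode. */
    unsigned char msg[2] = { REG_MODE, MODE_SLEEP };
    int ret = (write(fd, msg, 2) == 2) ? 0 : -1;
    close(fd);
    return ret;
}
```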
When the target camera is a front camera in the always-on mode, the terminal device can send a sleep instruction to the control chips of each rear camera in turn, so that the rear cameras' AF ICs, OIS, and other control chips enter the sleep state and no additional power is consumed.
Alternatively, the target camera may be one of the rear cameras. For example, when the rear main camera is in AON mode, the terminal device may output a sleep instruction to the control chips (such as the AF ICs) of the front camera and of the rear cameras other than the main camera, instructing those control chips to enter the sleep state and avoiding additional power consumption.
In the embodiment shown in fig. 3, with the control chips of all cameras powered up, the terminal device sends a sleep instruction to the control chips (AF ICs, OIS, and the like) of the other cameras that are not needed, instructing them to enter a sleep state and thereby saving power. The method requires neither redesigning how each camera's control chip is powered nor introducing new power ports to supply different control chips separately, so production and design costs do not increase. At the same time, the control chips of the other cameras are prevented from drawing extra power while unused, reducing the overall camera power consumption and extending standby time.
It should be noted that in current terminal devices the camera image-output strategy falls into two cases.
One is the common mode (the non-AON mode), in which the execution flow runs on the AP (application processor, the CPU of the terminal device) side. Specifically, referring to fig. 4, the image-output flow is described taking the currently opened camera as the front camera, with the front camera in always-on mode for gesture or face detection. When the user opens the front camera by operating the camera APP in the APP layer of the terminal device and the camera is in always-on mode, the camera APP responds to the user's operation and passes the front camera's image-output instruction down layer by layer. Specifically, the camera APP delivers the image-output instruction through the Java Native Interface (JNI) of the application framework (FWK) layer to the camera system library layer (also called the camera hardware abstraction layer, Camera HAL). The Camera HAL in turn passes the instruction down through the camera service layer interface (CSL interface) to the front-camera driver of the kernel layer (for example the AON camera driver, which may also be referred to as a V4L2 node). Under this instruction, the front-camera driver directs the front camera to collect images and shoot. The image-output instruction may be described as: msgid-aon_camera_is_work. Meanwhile, in the technical solution of the embodiments of the present application, nodes corresponding to the control chips of the other, non-front cameras are added to the kernel layer. Taking as an example a new node for the OIS of the rear main camera (named /dev/hicam_ois/), called the main-camera OIS driver, the Camera HAL, while sending the image-output instruction down, also sends a sleep instruction to the node /dev/hicam_ois/ using an input/output control (ioctl) call. For example, the sleep instruction may be described as: HICAM_OIS_SLEEP. The OIS node may be implemented in the kernel layer as a main-camera OIS driver (e.g., a custom camera OIS driver); under the sleep instruction, the main-camera OIS driver controls the corresponding OIS to enter the sleep state. An OIS in the sleep state incurs no additional power consumption.
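As a concrete illustration of the ioctl path just described, a HAL-side call to the new node might look like this; the node path and command name come from the text, while the _IO() encoding (magic 'H', number 0) is an assumption:

```c
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>

/* HICAM_OIS_SLEEP is named in the text; its encoding is assumed. */
#define HICAM_OIS_SLEEP  _IO('H', 0)

int main_camera_ois_sleep(void)
{
    int fd = open("/dev/hicam_ois", O_RDWR);
    if (fd < 0)
        return -1;
    /* Ask the main-camera OIS driver to put its chip to sleep. */
    int ret = ioctl(fd, HICAM_OIS_SLEEP);
    close(fd);
    return ret;
}
```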
In general, the terminal device also provides a corresponding AF IC for the main camera to achieve its auto-focus. When the front camera is in an always-on state, the terminal device can likewise send a sleep instruction to a newly added kernel-layer driver for the main camera's AF IC, so that the main camera's AF IC also enters the sleep state.
The second image-output strategy applies in AON mode. First, the AON mode: it is an energy-efficient, ultra-low-power real-time operating mode that runs on the Sensor Hub side (as opposed to the AP side). Typically, the hardware layer of the terminal device provides a Sensor Hub, which can connect sensors such as the ambient light sensor and the proximity sensor to implement low-power strategies. The AON mode is a low-power strategy for keeping a camera online in real time. The Sensor Hub carries an AON control module and AON-related firmware (AON FW). The image-output flow is again described taking the currently opened camera as the front camera, in AON mode for gesture or face detection. Continuing with the flow shown in fig. 4: when the user opens the front camera by operating the camera APP in the APP layer and the camera is in AON mode, the camera APP responds to the user's operation and passes the AON-mode image-output instruction down layer by layer. Specifically, the camera APP delivers the instruction through the Java Native Interface of the application framework layer to the AON interface of the system library (AON HIDL). The AON service receives the instruction through the AON HIDL and, with the support of the AON service, passes it down to the Sensor Hub through the system library's lower-layer-facing AON interface (HSEEAON interface). The message identity (msgid) of the image-output instruction may be described as: msgid-aon_camera_is_work. The Sensor Hub receives the instruction through its AON control module (AON control) and then, supported by the AON FW, controls the AON camera (i.e., the front camera) to output images in AON mode. Meanwhile, in the technical solution of the embodiments of the present application, the corresponding kernel-layer node is operated through the HSEEAON interface of the system library; for example, a sleep instruction is sent to the main camera's OIS node /dev/hicam_ois. For example, the sleep instruction may be described as: HICAM_OIS_SLEEP. As the image-output instruction is passed downward, each lower-layer module, on receiving it, replies to the layer above to indicate that delivery succeeded.
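The AON-mode sequence can be summarized in a short hedged sketch; aon_send_to_sensorhub() is a hypothetical wrapper for the HSEEAON interface, the msgid value is assumed (only the name appears in the text), and the two sleep helpers refer to the ioctl sketch above:

```c
#define MSGID_AON_CAMERA_IS_WORK  0x01   /* value assumed */

int aon_send_to_sensorhub(int msgid);    /* assumed HSEEAON wrapper */
int main_camera_ois_sleep(void);         /* see the ioctl sketch above */
int main_camera_af_sleep(void);          /* hypothetical AF IC node */

/* Front camera in AON mode: the Sensor Hub drives the image output,
 * while the unused rear control chips are put to sleep via kernel nodes. */
int aon_front_camera_start(void)
{
    if (aon_send_to_sensorhub(MSGID_AON_CAMERA_IS_WORK) != 0)
        return -1;   /* the lower layer replies when delivery succeeds */
    main_camera_ois_sleep();
    main_camera_af_sleep();
    return 0;
}
```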
In the foregoing, the process of putting the rear main camera's OIS into the sleep state was described taking the front camera as an example. Since the main camera's AF IC is likewise not needed, it can be configured to enter the sleep state at the same time. Of course, if the main camera has other control chips that share the same supply with the front camera's control chip, they may be configured to enter the sleep state as well.
It should be noted that even when the control chips of cameras that are not needed are forcibly powered up, the terminal device can send a sleep instruction to the driver corresponding to each such control chip, instructing it to enter the sleep state to save power. The specific manner may follow the way the sleep instruction is sent to the main-camera OIS driver, differing only in the target of the instruction, and is not repeated here. After the image-output instruction is successfully passed down, a delivery-success feedback message is returned to the upper layer.
In actual use, corresponding drivers (misc devices) can be created in the kernel layer for the different control chips of each camera, for example drivers for the AF ICs and OIS of the telephoto camera, the wide-angle camera, and so on. When a camera is not needed, even if it is forcibly powered up, the terminal device can send a sleep instruction to the driver of that unused camera's control chip through the image processing module or the HSEEAON interface, instructing the control chip to enter the sleep state to save power. The manner of sending the sleep instruction to the drivers of the unused cameras' AF ICs, OIS, and similar chips may be the same as that used for the main-camera OIS driver, and is not repeated here.
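For illustration, a condensed Linux misc-device skeleton for such a node is sketched below; the hardware access is stubbed out and the ioctl encoding repeats the earlier assumption, so this is a sketch of the pattern rather than the patent's actual driver:

```c
#include <linux/module.h>
#include <linux/miscdevice.h>
#include <linux/fs.h>
#include <linux/errno.h>

#define HICAM_OIS_SLEEP  _IO('H', 0)   /* assumed encoding, as above */

/* Stub: would issue the I2C mode-register write to the OIS chip. */
static int ois_hw_enter_sleep(void)
{
    return 0;
}

static long hicam_ois_ioctl(struct file *f, unsigned int cmd,
                            unsigned long arg)
{
    switch (cmd) {
    case HICAM_OIS_SLEEP:
        return ois_hw_enter_sleep();
    default:
        return -ENOTTY;
    }
}

static const struct file_operations hicam_ois_fops = {
    .owner          = THIS_MODULE,
    .unlocked_ioctl = hicam_ois_ioctl,
};

static struct miscdevice hicam_ois_dev = {
    .minor = MISC_DYNAMIC_MINOR,
    .name  = "hicam_ois",              /* creates /dev/hicam_ois */
    .fops  = &hicam_ois_fops,
};

module_misc_device(hicam_ois_dev);
MODULE_LICENSE("GPL");
```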
In one case, while the front camera is always-on or in AON mode, the rear main camera may also be capturing and displaying the preview interface; that is, the front camera and the rear main camera output images simultaneously. The image processing module or the HSEEAON interface can then send a sleep instruction to the drivers of the control chips of the rear telephoto and wide-angle cameras, instructing those chips to enter the sleep state. This reduces wasted power as much as possible while ensuring normal image output, making the power configuration more reasonable.
The foregoing described how the sleep instruction is sent while the front camera outputs images; next, the output of the sleep instruction when the user turns on a camera to shoot is described.
Fig. 5 is a flowchart of a method for controlling a chip to enter a sleep state according to an embodiment of the present application. The method specifically comprises the following steps:
S501, starting the target camera.
When the user inputs a shooting instruction, for example by opening the camera APP, the corresponding target camera is started.
Specifically, the target camera depends on the current shooting scene. For example, when the user opens the camera APP to shoot a landscape, the target camera is one or more of the rear cameras, such as the main camera and the telephoto camera; when the user opens the camera APP for a selfie or to detect a face or gesture, the target camera is the front camera. Correspondingly, the shooting instruction is the instruction input by opening the camera APP and entering the shooting scene. Optionally, the shooting instruction may also be an instruction that sets the front camera to an always-on state, for example an instruction by which the user enables real-time gesture detection or real-time face detection. Optionally, the shooting instruction may also be a user instruction that turns on the camera's AON mode. If the front camera is in AON mode by default when the terminal device boots, the shooting instruction is the instruction that loads the front camera at boot. Alternatively, the shooting instruction may be an instruction that turns on a rear camera's AON mode. The application does not limit the specific form of the shooting instruction.
S502, judging whether the currently started camera is the front camera. If yes, execute S503A; if not, execute S503B.
The terminal device judges whether the currently started target camera is the front camera. If it is, the terminal device issues an image-output instruction downward from the application layer to the front camera so that the front camera shoots.
Optionally, the front camera may also be in AON mode; if so, the image-output flow can be executed on the Sensor Hub side. Alternatively, if the front camera is not in AON mode, the image-output flow can be executed on the AP side. If the currently started target camera is not the front camera but a rear camera, the image-output instruction is issued to the rear camera so that it shoots. The camera's image-output flow may refer to the description of the embodiment of fig. 4 and is not repeated here.
S503A, outputting a sleep instruction to the control chips of the other, non-front cameras.
If the currently started target camera is the front camera, the other, non-front cameras do not need to work. The terminal device can output a sleep instruction to the control chips of the other cameras, such as the rear main camera, the telephoto camera, and the wide-angle camera, so that those control chips enter the sleep state to save power.
If the rear main camera is currently started while the front camera still needs to stay always-on to detect faces or gestures, the target cameras are the front camera and the main camera, and the other cameras are the rear cameras other than the main camera. The terminal device outputs a sleep instruction to the control chips of the cameras other than the front camera and the main camera so that those control chips enter the sleep state. Meanwhile, the terminal device may also perform step S504.
S503B, judging whether the target camera is a camera operating in AON mode (an AON camera for short). If yes, execute S505A; if not, execute S505B.
If the currently started target camera is not the front camera, the target camera is one or more of the rear cameras. For example, if the target camera is the main camera among the rear cameras, the front camera and the rear cameras other than the main camera do not need to work, and the terminal device can output a sleep instruction to their control chips so that each of those control chips enters the sleep state to save power.
For another example, when the target cameras are the main camera and the wide-angle camera among the rear cameras, the front camera and the rear cameras other than the main and wide-angle cameras do not need to work. The terminal device can output a sleep instruction to the control chips of the front camera and of the rear cameras other than the main and wide-angle cameras, so that those control chips enter the sleep state to save power.
S504, controlling the target camera to execute the image-output flow.
The camera's image-output flow may refer to the description of the embodiment of fig. 4 and is not repeated here.
S505A, sending a sleep instruction to the other, non-target cameras.
For example, if the currently started target camera is not the front camera but the main camera among the rear cameras, and the main camera is in AON mode, the terminal device may send a sleep instruction to the control chips of the other rear cameras (such as the telephoto and wide-angle cameras), instructing those control chips to enter the sleep state to save power.
When the target camera is not the front camera but is a camera in AON mode, the terminal device may execute the target camera's image-output flow on the Sensor Hub side.
S505B, executing the target camera's image-output flow.
When the target camera is neither the front camera nor a camera in AON mode, the terminal device may execute the target camera's image-output flow on the AP side.
The specific implementation of S505B may likewise refer to the description of the embodiment of fig. 4 and is not repeated here.
In the embodiment shown in fig. 5, when the currently started target camera is the front camera, the terminal device outputs a sleep instruction to the control chips of the other, rear cameras, instructing the unused rear cameras' control chips to enter the sleep state to save power. When the target camera is not the front camera, the terminal device uses the appropriate module, depending on whether the target camera is in AON mode, to output the sleep instruction and instruct the unused cameras' control chips to enter the sleep state. The method can flexibly direct sleep instructions to different targets according to each camera's usage state, making the power-saving scheme more reasonable.
Alternatively, if only the front camera on the terminal device supports AON mode and none of the other cameras do, the flow of the embodiment shown in fig. 5 becomes: first judge whether the target camera is the front camera. If yes, output the sleep instruction to the other cameras' control chips through the image processing module (camX) and execute the front camera's image-output flow on the AP side. If not, judge whether the target camera is an AON camera: if it is, output the sleep instruction to the other cameras' control chips through the HSEEAON interface and execute the AON camera's image-output flow on the Sensor Hub side; if it is not, execute the target camera's image-output flow on the AP side and, meanwhile, output the sleep instruction to the other cameras' control chips through the image processing module (camX).
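This simplified branching can be mirrored in a short dispatch sketch; every function name here is an invented stand-in for the camX and HSEEAON paths named above:

```c
#include <stdbool.h>

void send_sleep_via_camx(void);      /* image processing module (camX) */
void send_sleep_via_hseeaon(void);   /* HSEEAON interface */
void run_output_on_ap(void);
void run_output_on_sensor_hub(void);

void start_target_camera(bool is_front, bool is_aon_camera)
{
    if (is_front) {
        /* Front camera: sleep the others via camX, output on the AP side. */
        send_sleep_via_camx();
        run_output_on_ap();
    } else if (is_aon_camera) {
        /* AON camera: sleep the others via the HSEEAON interface,
         * output on the Sensor Hub side. */
        send_sleep_via_hseeaon();
        run_output_on_sensor_hub();
    } else {
        /* Neither: output on the AP side, sleep the others via camX. */
        run_output_on_ap();
        send_sleep_via_camx();
    }
}
```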
Optionally, the terminal device may instead directly determine whether the currently started target camera is in the AON mode in order to decide how to output the sleep instruction. Refer to the flow shown in fig. 6, which includes:
S601: start the target camera.
For S601, refer to the description of S501 above, which is not repeated here.
S602: determine whether the target camera is in the AON mode. If it is, execute S603A; if not, execute S603B.
S603A: the terminal device outputs a sleep instruction to the control chips of the other cameras through the AON interface of the hardware abstraction layer.
If the target camera is in the AON mode, the terminal device can control the target camera to execute the image output flow on the sensor hub side and, at the same time, output a sleep instruction to the control chips of the other cameras through the AON interface of the hardware abstraction layer.
S603B: the terminal device outputs a sleep instruction to the control chips of the other cameras through the image processing module.
If the target camera is not in the AON mode, the terminal device can control the target camera to execute the image output flow on the AP side and, at the same time, output a sleep instruction to the control chips of the other cameras through the image processing module (camX).
S604: execute the image output flow of the target camera.
For the specific process of S604, refer to the embodiment shown in fig. 4, which is not repeated here.
Optionally, when the target camera is not in the AON mode, the terminal device may further determine whether the target camera is the front camera. If it is, the terminal device executes the image output flow on the AP side and, at the same time, sends a sleep instruction to the control chips of the rear cameras through the image processing module (camX). If it is not, the terminal device executes the image output flow on the sensor hub side and, at the same time, outputs a sleep instruction to the control chips of the front camera and of the unused rear cameras through the AON interface of the hardware abstraction layer. Optionally, if only the front camera of the terminal device supports the AON mode, the target camera is not in the AON mode, and the target camera is not the front camera, the sleep instruction can be sent only to the control chips of the other rear cameras without being sent to the front camera, so that the front camera remains in the always-on state and its real-time detection function is preserved; the front camera then keeps working normally, ensuring that face or gesture detection continues to operate.
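The variant above, in which the front camera is deliberately kept awake so that face or gesture detection keeps running, amounts to excluding the front camera from the sleep fan-out. A minimal sketch, assuming illustrative camera names:

```python
# Sketch of the keep-front-awake variant; camera names are hypothetical.
def cameras_to_sleep(all_cameras: list[str], target: str,
                     keep_front_awake: bool = True) -> list[str]:
    """List the cameras whose control chips receive the sleep instruction."""
    skip = {target}
    if keep_front_awake:
        skip.add("front")  # front camera stays awake for AON detection
    return [c for c in all_cameras if c not in skip]

print(cameras_to_sleep(["front", "rear_main", "rear_tele", "rear_wide"],
                       target="rear_main"))
# ['rear_tele', 'rear_wide'] -- the front camera keeps detecting
```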
Examples of the methods provided by the present application are described in detail above. It is to be understood that, in order to implement the functions described above, the corresponding apparatus includes corresponding hardware structures and/or software modules for performing each function. Those skilled in the art will readily appreciate that the units and algorithm steps of the examples described in the embodiments disclosed herein can be implemented as hardware or as a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementations should not be considered to be beyond the scope of the present application.
The present application may divide the management device of the control chip into functional modules according to the above method examples; for example, each function may be assigned its own functional module, or two or more functions may be integrated into one module. The integrated module may be implemented in hardware or as a software functional module. It should be noted that the division of modules in the present application is illustrative and is merely a logical function division; other division manners may be used in practice.
Fig. 7 shows a schematic structural diagram of a management device for a control chip according to the present application. The apparatus 700 is applied to an electronic device that includes at least two cameras, where the control chips of the at least two cameras are all connected to the same power port. The apparatus 700 includes:
the acquiring module 701, configured to acquire a shooting instruction, where the shooting instruction is used to instruct a target camera to shoot; and
the processing module 702, configured to send, in response to the shooting instruction, a sleep instruction to the control chips of the other cameras, where the sleep instruction is used to instruct the control chips of the other cameras to enter a sleep state; the target camera and the other cameras are cameras among the at least two cameras, and the target camera and the other cameras are different.
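The shape of apparatus 700 can be sketched as follows. The class and method names are hypothetical; the real modules 701 and 702 may be implemented in hardware, software, or a mix of both:

```python
# A structural sketch of apparatus 700; all identifiers are illustrative.
class ControlChipManager:
    def __init__(self, cameras: dict[str, str]):
        # name -> handle of that camera's control chip (all on one power port)
        self.cameras = cameras

    def acquire(self, instruction: dict) -> str:
        """Acquiring module 701: read the target camera from the
        shooting instruction."""
        return instruction["target"]

    def on_shoot(self, instruction: dict) -> None:
        """Processing module 702: in response to the shooting instruction,
        send a sleep instruction to every other camera's control chip."""
        target = self.acquire(instruction)
        for name, chip in self.cameras.items():
            if name != target:
                self.sleep(chip)

    def sleep(self, chip: str) -> None:
        print(f"sleep instruction -> {chip}")  # placeholder for the real command

mgr = ControlChipManager({"front": "AF0", "rear_main": "AF1+OIS", "rear_tele": "AF2"})
mgr.on_shoot({"target": "front"})  # puts AF1+OIS and AF2 to sleep
```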
In some embodiments, the target camera is a front camera and the other cameras are rear cameras.
In some embodiments, when the front camera is in the always-on (AON) mode, the processing module 702 is specifically configured to send a sleep instruction to the control chips corresponding to the other cameras through the AON interface of the hardware abstraction layer;
the processing module 702 is further configured to control the front camera to shoot through the intelligent sensor hub.
In some embodiments, when the front camera is not in the AON mode, the processing module 702 is specifically configured to send a sleep instruction to the control chips corresponding to the other cameras through the camera hardware abstraction layer;
the processing module 702 is further configured to control the front camera to shoot through the front camera driver of the kernel layer.
In some embodiments, the target cameras include the front camera and the main camera among the rear cameras; when the front camera is in the AON mode, the processing module 702 is specifically configured to send a sleep instruction to the control chips corresponding to the other cameras through the AON interface of the hardware abstraction layer;
the processing module 702 is further configured to control, through the AON interface, the intelligent sensor hub to control the front camera to shoot.
In some embodiments, the target cameras include the front camera and the main camera among the rear cameras; when the front camera is not in the AON mode, the processing module 702 is specifically configured to send a sleep instruction to the control chips corresponding to the other cameras through the camera hardware abstraction layer;
the processing module 702 is further configured to control the front camera to shoot through the front camera driver of the kernel layer.
In some embodiments, the target camera is the main camera among the rear cameras, the main camera is in the AON mode, and the processing module 702 is specifically configured to send a sleep instruction to the control chips corresponding to the other cameras through the AON interface of the hardware abstraction layer;
the processing module 702 is further configured to control, through the AON interface, the intelligent sensor hub to control the main camera to shoot.
In some embodiments, the target camera is the main camera among the rear cameras, the main camera is in the AON mode, and the processing module 702 is specifically configured to send a sleep instruction to the control chips corresponding to the rear cameras other than the main camera through the AON interface of the hardware abstraction layer;
the processing module 702 is further configured to control, through the AON interface, the intelligent sensor hub to control the main camera to shoot, and to control the front camera to shoot through the front camera driver of the kernel layer.
In some embodiments, the control chip includes an autofocus chip and/or an image stabilizer.
In some embodiments, the shooting instruction is an instruction input by the user to turn on the camera APP.
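For a sense of what the sleep instruction to one such control chip (an autofocus chip or image stabilizer) might reduce to at the bus level, the sketch below writes a power-down value over I2C. The device addresses, register, and value are invented for illustration only; real AF/OIS chips each define their own power-down sequence:

```python
# A purely illustrative sketch; no real chip's registers are referenced.
def write_i2c(addr: int, reg: int, value: int) -> None:
    # Stand-in for a real bus write, e.g. through a kernel-layer i2c driver.
    print(f"i2c write: dev=0x{addr:02x} reg=0x{reg:02x} val=0x{value:02x}")

SLEEP_REG = 0x00    # hypothetical power-control register
SLEEP_VALUE = 0x01  # hypothetical "enter sleep" bit

def sleep_control_chip(i2c_addr: int) -> None:
    """Put one AF/OIS control chip into its sleep state."""
    write_i2c(i2c_addr, SLEEP_REG, SLEEP_VALUE)

for addr in (0x0C, 0x0E):  # hypothetical addresses of unused cameras' chips
    sleep_control_chip(addr)
```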
The specific manner in which the apparatus 700 performs the management method of the control chip and the resulting beneficial effects may be referred to the relevant description in the method embodiments, and will not be repeated here.
The embodiment of the application also provides an electronic device, which includes a processor. The electronic device provided in this embodiment may be the terminal device 100 shown in fig. 1, configured to execute the above management method of the control chip. In the case of an integrated unit, the terminal device may include a processing module, a storage module, and a communication module. The processing module may be configured to control and manage the actions of the terminal device; for example, it may be configured to support the terminal device in executing the steps performed by the display unit, the detection unit, and the processing unit. The storage module may be configured to support the terminal device in storing program code, data, and the like. The communication module may be configured to support communication between the terminal device and other devices.
The processing module may be a processor or a controller, which may implement or execute the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. A processor may also be a combination that implements computing functions, for example a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The storage module may be a memory. The communication module may be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other terminal devices.
In an embodiment, when the processing module is a processor and the storage module is a memory, the terminal device according to this embodiment may be a device having the structure shown in fig. 1.
The embodiment of the application also provides a computer readable storage medium, in which a computer program is stored, which when executed by a processor, causes the processor to execute the method for managing a control chip according to any one of the embodiments.
The embodiment of the application also provides a computer program product which, when run on a computer, causes the computer to execute the above related steps to implement the management method of the control chip in the above embodiments.
The electronic device, the computer readable storage medium, the computer program product, or the chip provided in this embodiment are each configured to execute the corresponding method provided above; for their beneficial effects, refer to the beneficial effects of the corresponding method, which are not repeated here.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; the division of modules or units is merely a logical function division, and other divisions are possible in practice; for example, multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, apparatuses, or units. Units described as separate components may or may not be physically separate, and components shown as units may be one physical unit or multiple physical units; that is, they may be located in one place or distributed across multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as standalone products, may be stored in a readable storage medium. Based on this understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto; any person skilled in the art will readily recognize that variations or substitutions fall within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A control chip management method, applied to an electronic device, wherein the electronic device includes at least a target camera and other cameras, the target camera and the other cameras are different, and the control chips of the target camera and the other cameras are all connected to the same power port, the method comprising:
acquiring a shooting instruction, wherein the shooting instruction is used to instruct the target camera to shoot; and
in response to the shooting instruction, sending a sleep instruction to the control chips of the other cameras, wherein the sleep instruction is used to instruct the control chips of the other cameras to enter a sleep state.
2. The method of claim 1, wherein the target camera is a front camera and the other cameras are rear cameras.
3. The method of claim 2, wherein, when the front camera is in the always-on (AON) mode, the sending of the sleep instruction to the control chips corresponding to the other cameras comprises:
sending a sleep instruction to the control chips of the other cameras through the AON interface of the hardware abstraction layer;
and the method further comprises:
controlling, through the AON interface, the intelligent sensor hub to control the front camera to shoot.
4. The method of claim 2, wherein, when the front camera is not in the AON mode, the sending of the sleep instruction to the control chips corresponding to the other cameras comprises:
sending a sleep instruction to the control chips corresponding to the other cameras through the camera hardware abstraction layer;
and the method further comprises:
controlling the front camera to shoot through the front camera driver of the kernel layer.
5. The method of claim 1, wherein the target cameras include a front camera and a main camera among rear cameras, and, when the front camera is in the AON mode, the sending of the sleep instruction to the control chips corresponding to the other cameras comprises:
sending a sleep instruction to the control chips of the other cameras through the AON interface of the hardware abstraction layer;
and the method further comprises:
controlling, through the AON interface, the intelligent sensor hub to control the front camera to shoot.
6. The method of claim 5, wherein the target cameras include a front camera and a main camera among rear cameras, and, when the front camera is not in the AON mode, the sending of the sleep instruction to the control chips corresponding to the other cameras comprises:
sending a sleep instruction to the control chips corresponding to the other cameras through the camera hardware abstraction layer;
and the method further comprises:
controlling the front camera to shoot through the front camera driver of the kernel layer.
7. The method of claim 1, wherein the target camera is a main camera among rear cameras, the main camera is in the AON mode, and the sending of the sleep instruction to the control chips corresponding to the other cameras comprises:
sending a sleep instruction to the control chips corresponding to the other cameras through the AON interface of the hardware abstraction layer;
and the method further comprises:
controlling, through the AON interface, the intelligent sensor hub to control the main camera to shoot.
8. The method of claim 1, wherein the target camera is a main camera among rear cameras, the main camera is in the AON mode, and the sending of the sleep instruction to the control chips corresponding to the other cameras comprises:
sending a sleep instruction to the control chips corresponding to the rear cameras other than the main camera through the AON interface of the hardware abstraction layer;
and the method further comprises:
controlling, through the AON interface, the intelligent sensor hub to control the main camera to shoot, and controlling the front camera to shoot through the front camera driver of the kernel layer.
9. The method according to any one of claims 1 to 8, wherein the control chip comprises an autofocus chip and/or an image stabilizer.
10. The method according to any one of claims 1 to 8, wherein the shooting instruction is an instruction input by a user to turn on a camera APP.
11. An electronic device, comprising: a processor, a memory, and an interface;
The processor, the memory and the interface cooperate to cause the electronic device to perform the method of any of claims 1-10.
12. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when executed by a processor, causes the processor to perform the method of any of claims 1 to 10.