CN115514881B - Method for controlling camera and electronic equipment - Google Patents


Info

Publication number
CN115514881B
CN115514881B (application CN202210954724.4A)
Authority
CN
China
Prior art keywords
electronic device
request
camera
physical camera
parameter information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210954724.4A
Other languages
Chinese (zh)
Other versions
CN115514881A (en)
Inventor
李明 (Li Ming)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210954724.4A priority Critical patent/CN115514881B/en
Publication of CN115514881A publication Critical patent/CN115514881A/en
Application granted granted Critical
Publication of CN115514881B publication Critical patent/CN115514881B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661 Transmitting camera control signals through networks, e.g. control via the Internet

Abstract

The application provides a method for controlling a camera, and an electronic device. When a second electronic device detects a request from a first electronic device for the usage rights of the second electronic device's physical camera, the second electronic device generates a first UI (user interface) and displays it to the user. Through the first UI, the user can control the physical camera on the second electronic device: when the user performs an operation on the first UI, the second electronic device detects the request from the first UI and performs the corresponding operation on the physical camera. This meets the user's need to control the physical camera of the second electronic device while that camera is controlled by the first electronic device.

Description

Method for controlling camera and electronic equipment
Technical Field
The present application relates to the field of terminals, and more particularly, to a method of controlling a camera and an electronic device in the field of terminals.
Background
When a virtual camera corresponding to the physical camera of another electronic device is created on one electronic device, a user can control that physical camera through the virtual camera. The former device may be called the master device and the latter the remote device. For example, a user may turn the physical camera of a remote device on or off through the virtual camera on a master device.
However, the user cannot directly control the physical camera of the remote device while it is controlled by the master device. The above technical solution therefore cannot meet the user's need to control the physical camera of the remote device during that period, which degrades the user experience.
Disclosure of Invention
The embodiments of the present application provide a method for controlling a camera, which can meet the user's need to control the physical camera of a remote device while that camera is controlled by a master device.
In a first aspect, a method of controlling a camera is provided, the method being performed by a second electronic device and comprising: when the second electronic device detects a first request from a first electronic device, starting a physical camera and displaying a first UI, where the first request requests from the second electronic device the usage rights of the physical camera; and when the second electronic device detects a second request from the first UI, performing a first operation on the physical camera, where the first operation is the operation that the second request asks the second electronic device to perform.
In the above scheme, when the second electronic device detects a request from the first electronic device for the usage rights of the second electronic device's physical camera, the second electronic device generates a first UI and displays it to the user. Through the first UI, the user can control the physical camera on the second electronic device: when the user performs an operation on the first UI, the second electronic device detects the request from the first UI and performs the corresponding operation on the physical camera. This meets the user's need to control the physical camera of the second electronic device while that camera is controlled by the first electronic device.
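The two steps of the first aspect can be sketched as follows. This is an illustrative Python sketch; the `Request` and `RemoteDevice` names, their fields, and the operation strings are assumptions for illustration, not taken from the patent.

```python
from dataclasses import dataclass

# Request and RemoteDevice are hypothetical names; the operation strings
# are placeholders for whatever concrete camera operations are requested.
@dataclass
class Request:
    source: str     # "first_device" (master) or "first_ui" (local UI)
    operation: str  # e.g. "open_camera", "set_resolution"

class RemoteDevice:
    def __init__(self) -> None:
        self.camera_on = False
        self.ui_shown = False

    def handle(self, req: Request) -> str:
        # First request: the master device asks for the camera's usage rights;
        # the remote device starts the physical camera and shows the first UI.
        if req.source == "first_device" and req.operation == "open_camera":
            self.camera_on = True
            self.ui_shown = True
            return "camera started, first UI displayed"
        # Second request: an operation raised from the first UI is carried
        # out directly on the physical camera.
        if req.source == "first_ui":
            return f"performed {req.operation} on physical camera"
        return "ignored"
```

The key point the sketch illustrates is that, once the first UI exists, requests originating locally on the remote device are handled by the same device that holds the physical camera.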
With reference to the first aspect, in certain implementation manners of the first aspect, the method further includes: the second electronic device displaying, on the first UI, the current picture acquired by the physical camera after the first operation is performed.
In the above scheme, the second electronic device displays, on the first UI, the current picture acquired by the physical camera after the first operation is performed. The user can thus control the physical camera of the second electronic device through the first UI and see, on the second electronic device, the current picture acquired by the physical camera, improving the user experience.
With reference to the first aspect and the foregoing implementation manners of the first aspect, in certain implementation manners of the first aspect, the method further includes: the second electronic device sending data information of a first picture and first parameter information to the first electronic device, where the first parameter information is the parameter information of the physical camera when the first picture was shot.
In the above scheme, after the user causes the second electronic device to perform the first operation on the physical camera through the first UI, the second electronic device may send to the first electronic device the data information of the current picture acquired by the physical camera after the first operation, together with the parameter information of the physical camera when that picture was shot. The first electronic device may then display the current picture to the user according to the received data information and parameter information, improving the user experience.
With reference to the first aspect and the foregoing implementation manners of the first aspect, in certain implementation manners of the first aspect, the method further includes: when the second electronic device detects a third request from the first electronic device, performing a second operation on the physical camera, the second operation being the operation that the third request asks the second electronic device to perform; and the second electronic device sending data information of a second picture and second parameter information to the first electronic device, where the second picture is the current picture acquired by the physical camera after the second operation is performed, and the second parameter information is the parameter information of the physical camera when the second picture was shot.
In the above scheme, the second electronic device can process requests from the first UI as well as requests from the first electronic device. In other words, the user can control the physical camera of the second electronic device not only through the first electronic device but also through the first UI, realizing multi-terminal control of the physical camera of the second electronic device.
With reference to the first aspect and the foregoing implementation manners of the first aspect, in some implementation manners of the first aspect, performing the first operation includes: the second electronic device performing the first operation on the physical camera when a control policy is satisfied, where the control policy includes: executing the second request when the receiving time of the second request is earlier than the receiving time of the third request, or refusing to execute the third request when the second request conflicts with the third request.
In the above scheme, since the second electronic device can process requests not only from the first UI but also from the first electronic device, an operation requested from the first UI may conflict with an operation requested by the first electronic device. To avoid such conflicts, the second electronic device may determine, according to the control policy, whether a request is allowed to be executed before executing it; only requests allowed by the control policy are eventually processed, thereby avoiding conflicts.
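The control policy above (the earlier-received request wins, and a later conflicting request is refused) can be sketched as follows. This is an illustrative Python sketch under the assumption that two requests conflict when they set the same camera parameter to different values; the dictionary field names are hypothetical.

```python
def conflicts(a: dict, b: dict) -> bool:
    # Illustrative conflict test (an assumption, not the patent's definition):
    # two requests conflict when they set the same camera parameter
    # to different values.
    return a["param"] == b["param"] and a["value"] != b["value"]

def arbitrate(req_a: dict, req_b: dict) -> list:
    """Apply the control policy: the earlier-received request is executed;
    a later request that conflicts with it is refused."""
    earlier, later = sorted([req_a, req_b], key=lambda r: r["recv_time"])
    allowed = [earlier]
    if not conflicts(earlier, later):
        allowed.append(later)  # no conflict: the later request also runs
    return allowed
```

For example, a UI request received first to set the resolution would be executed, while a later request from the master device to set a different resolution would be refused; a later request touching an unrelated parameter would still be allowed.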
With reference to the first aspect and the foregoing implementation manners of the first aspect, in some implementation manners of the first aspect, performing the second operation includes: the second electronic device performing the second operation on the physical camera when a control policy is satisfied, where the control policy includes: the second electronic device executing the second request when the receiving time of the second request is earlier than the receiving time of the third request, or the second electronic device refusing to execute the third request when the second request conflicts with the third request.
In the above scheme, since the second electronic device can process requests not only from the first UI but also from the first electronic device, an operation requested from the first UI may conflict with an operation requested by the first electronic device. To avoid such conflicts, the second electronic device may determine, according to the control policy, whether a request is allowed to be executed before executing it; only requests allowed by the control policy are eventually processed, thereby avoiding conflicts.
With reference to the first aspect and the foregoing implementation manners of the first aspect, in certain implementation manners of the first aspect, the method further includes: the second electronic device receiving user input data from the first UI; and the second electronic device acquiring the control policy according to the user input data.
In the above scheme, the second electronic device acquires the control policy according to user input data from the first UI, so the user can set the control policy through the first UI. In other words, the user decides which control policy to use, improving the user experience.
In a second aspect, the present application provides an apparatus included in an electronic device, the apparatus having the function of implementing the behavior of the electronic device in the above aspect and its possible implementation manners. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the functions described above.
In a third aspect, the present application provides an electronic device, including: a touch display screen, wherein the touch display screen comprises a touch-sensitive surface and a display; a camera; one or more processors; a memory; a plurality of applications; and one or more computer programs. Wherein one or more computer programs are stored in the memory, the one or more computer programs comprising instructions. The instructions, when executed by the electronic device, cause the electronic device to perform the method of controlling a camera in any of the possible implementations of any of the above aspects.
In a fourth aspect, the present application provides an electronic device comprising one or more processors and one or more memories. The one or more memories are coupled to the one or more processors, the one or more memories being operable to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of controlling a camera in any of the possible implementations of the above.
In a fifth aspect, the present application provides a computer-readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of controlling a camera in any of the possible implementations of any of the above aspects.
In a sixth aspect, the present application provides a computer program product which, when run on an electronic device, causes the electronic device to perform the method of controlling a camera in any of the possible implementations of any of the above aspects.
Drawings
FIG. 1 is a system framework diagram provided by an embodiment of the present application;
FIG. 2 is a schematic block diagram of an electronic device provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of a method of controlling a camera according to an embodiment of the present application;
FIG. 4(a) is a schematic diagram of an example of a user interface for controlling a camera according to an embodiment of the present application;
FIG. 4(b) is a schematic diagram of a user interface for controlling a camera according to another embodiment of the present application;
FIG. 5(a) is a schematic diagram of another example of a user interface for controlling a camera according to an embodiment of the present application;
FIG. 5(b) is a schematic diagram of another example of a user interface for controlling a camera according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
For ease of understanding, terms used in the embodiments of the present application are briefly described below.
Main control equipment
The master device may be an electronic device that requests from another electronic device the usage rights of that device's physical camera.
Remote device
The remote device may be an electronic device whose physical camera is controlled by a master device.
Virtual camera
The virtual camera may be a camera created on the master device, corresponding to the physical camera of the remote device, according to the parameter information of that physical camera; through the virtual camera, the master device can control the physical camera of the remote device.
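As an illustrative sketch of this relationship, assuming a hypothetical `VirtualCamera` class and control-message format (none of which are specified by the patent):

```python
# VirtualCamera, its fields, and the control-message format are hypothetical;
# the patent only says the virtual camera is created from the remote camera's
# parameter information and is used to control that camera.
class VirtualCamera:
    def __init__(self, remote_id: str, params: dict) -> None:
        self.remote_id = remote_id  # identifies the remote device
        self.params = params        # capabilities of the remote physical camera

    def send_control(self, operation: str) -> dict:
        # Controlling the virtual camera forwards the operation to the
        # remote device's physical camera.
        return {"target": self.remote_id, "operation": operation}

vc = VirtualCamera("second_device", {"resolution": (1280, 720)})
msg = vc.send_control("open_front_camera")
```

The virtual camera itself captures nothing; it is a local stand-in whose operations are forwarded to the remote physical camera.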
Fig. 1 shows a framework diagram of a system provided in an embodiment of the present application, where a master device may control a physical camera of a remote device through a virtual camera corresponding to that physical camera.
Fig. 2 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application. The electronic device 100 may include a processor 110, an internal memory 120, a universal serial bus (universal serial bus, USB) interface 130, a camera 140, a display 150, and a touch sensor 160, among others.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. It can also be used to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present invention is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The electronic device 100 implements display functions through a GPU, a display screen 150, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 150 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 150 is used to display images, videos, and the like. The display 150 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 150, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 140, a video codec, a GPU, a display screen 150, an application processor, and the like.
The ISP is used to process the data fed back by the camera 140. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 140.
The camera 140 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the electronic device 100 may include 1 or N cameras 140, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor that processes input information rapidly by drawing on the structure of biological neural networks, for example the transfer mode between human brain neurons, and can also learn continuously. Applications such as intelligent cognition of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The internal memory 120 may be used to store computer executable program code including instructions. The internal memory 120 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, etc.), and so on. In addition, the internal memory 120 may include a high-speed random access memory, and may also include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 120 and/or instructions stored in a memory provided in the processor.
The touch sensor 160 is also referred to as a "touch device". The touch sensor 160 may be disposed on the display screen 150, and the touch sensor 160 and the display screen 150 together form a touch screen. The touch sensor 160 is used to detect a touch operation acting on or near it, and may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 150. In other embodiments, the touch sensor 160 may also be disposed on the surface of the electronic device 100 at a location different from the display 150.
For example, the electronic device 100 may be a first electronic device or a second electronic device.
The method of controlling a camera provided herein is described in detail below in conjunction with the system shown in fig. 1, and fig. 3 shows a schematic interactive flow chart of a method 300 of controlling a camera.
Step 301, when the second electronic device detects a first request from the first electronic device, starting a physical camera of the second electronic device and displaying a first User Interface (UI), wherein the first request requests the second electronic device for the use authority of the physical camera.
In step 302, when the second electronic device detects a second request from the first UI, a first operation is performed on the physical camera, the first operation being the operation that the second request asks the second electronic device to perform.
For example, the first electronic device may discover the second electronic device through near-field broadcast. The first electronic device may then establish a connection with the second electronic device and acquire the parameter information of the second electronic device's physical camera. According to that parameter information, the first electronic device may create on itself a virtual camera corresponding to the physical camera of the second electronic device, and may then control the physical camera of the second electronic device through the virtual camera. Here the first electronic device is the master device and the second electronic device is the remote device.
It should be noted that, in the embodiments of the present application, the parameter information of a physical camera indicates the capabilities of that camera; for example, it may include the resolution, frame rate, and color format supported by the physical camera.
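A minimal sketch of such parameter information as a data structure, with assumed field names and example values (the patent does not define a concrete format):

```python
from dataclasses import dataclass
from typing import Tuple

# Illustrative container for the camera parameter information mentioned in
# the text (resolution, frame rate, color format). Field names and the
# example values are assumptions.
@dataclass
class CameraParams:
    resolution: Tuple[int, int]  # (width, height) in pixels
    frame_rate: int              # frames per second
    color_format: str            # e.g. "YUV420"

params = CameraParams(resolution=(1920, 1080), frame_rate=30, color_format="YUV420")
```

This is the kind of record the master device would receive before creating the virtual camera, since the virtual camera must mirror the physical camera's capabilities.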
After the first electronic device creates the virtual camera corresponding to the physical camera of the second electronic device, a user may invoke the virtual camera through an application (APP) installed on the first electronic device. When the virtual camera is invoked, the second electronic device detects a first request from the first electronic device, where the first request asks the second electronic device to start its physical camera, and the second electronic device starts the physical camera according to the first request.
For example, the APP on the first electronic device may display a UI on its display interface. This UI lets the user select the remote device from which to request the usage rights of the physical camera, and may further let the user choose between the front camera and the rear camera of that physical camera. To distinguish it from UIs that appear later, and since it is used to select the remote device, this UI may be referred to as the UI corresponding to the remote device.
Assume the user selects the front-facing camera of the second electronic device's physical camera. After the user finishes selecting, the APP on the first electronic device invokes the virtual camera corresponding to the physical camera of the second electronic device. Once the virtual camera is invoked, the second electronic device detects a first request from the first electronic device; the first request asks the second electronic device to start the front-facing camera of its physical camera, and the second electronic device calls a camera interface to start the front-facing camera according to the first request. The physical camera of the second electronic device captures the current picture in the actual environment, and the second electronic device may send the data information of the current picture, together with the parameter information of the physical camera when that picture was shot, to the first electronic device, so that the first electronic device displays the corresponding current picture to the user according to that parameter information.
It is worth mentioning that, when the virtual camera corresponding to the physical camera of the second electronic device is invoked, the first electronic device may display the UI corresponding to the virtual camera based on the invocation, and the UI corresponding to the virtual camera may be used for the user to control the physical camera of the second electronic device.
Further, the second electronic device may generate a UI (e.g., a first UI) based on the first request from the first electronic device and display the first UI on the second electronic device. The user may enable control of the physical camera of the second electronic device on the second electronic device through interaction with the first UI. When the user completes a certain operation on the first UI, the second electronic device detects a second request from the first UI, and the second electronic device performs the first operation on the physical camera according to the second request.
For example, the user requests to increase the resolution of the physical camera from the second electronic device through the first UI, at which time the second electronic device performs a first operation to increase the resolution of the physical camera in response to the user's request.
Illustratively, the first UI may display a screen captured by the physical camera of the second electronic device in real time to the user in addition to providing an interaction portal between the user and the second electronic device, in which case the method 300 may further include:
In step 303, the second electronic device displays a first picture on the first UI, where the first picture is the current picture acquired by the physical camera after the first operation is performed.
For example, the second electronic device may further send the data information of the first picture and the parameter information of the physical camera of the second electronic device when the first picture was shot to the first electronic device, in which case the method 300 may further include:
In step 304, the second electronic device sends the data information and the first parameter information of the first picture to the first electronic device, where the first parameter information is the parameter information of the physical camera when the first picture was shot.
For example, suppose that when the second electronic device performs the first operation, the resolution of the physical camera is increased from 1280×720 to 1920×1080. In this case, the resolution used when the physical camera shoots the first picture is 1920×1080. After the first picture is acquired, the second electronic device sends the data information of the first picture and the parameter information of the physical camera when the first picture was shot to the first electronic device, so that the first electronic device displays the first picture corresponding to that data information to the user according to that parameter information. Since the parameter information of the physical camera when the first picture was shot includes the resolution, 1920×1080, the resolution of the first picture displayed to the user by the first electronic device is 1920×1080, i.e., the resolution used when the physical camera shot the first picture.
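The resolution example above can be illustrated with a short sketch. The class name and dictionary layout below are assumptions for illustration, not part of the patent; the point is only that each frame's metadata snapshots the parameters in effect at shooting time.

```python
# Illustrative sketch: the first operation updates the physical camera's
# resolution, and every frame captured afterwards carries the parameter
# information that was in effect when that frame was shot.
class PhysicalCamera:
    def __init__(self):
        self.resolution = (1280, 720)

    def capture(self):
        # The frame's metadata records the parameters used at shooting time.
        return {"data": b"<frame>", "params": {"resolution": self.resolution}}


cam = PhysicalCamera()
cam.resolution = (1920, 1080)   # first operation: raise the resolution
first_frame = cam.capture()
# The first electronic device would display this frame at the resolution
# recorded in the parameter information, i.e. 1920x1080.
```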
Illustratively, the second electronic device may process the request from the first electronic device in addition to the request from the first UI, in which case the method 300 may further include:
In step 305, when the second electronic device detects a third request from the first electronic device, it performs a second operation on the physical camera, where the second operation is the operation that the third request requests the second electronic device to perform.
In step 306, the second electronic device sends the data information and the second parameter information of the second picture to the first electronic device, where the second picture is the current picture acquired by the physical camera after the second operation is performed, and the second parameter information is the parameter information of the physical camera when the second picture was shot.
When a user performs a certain operation, on the UI corresponding to the virtual camera, on the virtual camera corresponding to the physical camera of the second electronic device, the second electronic device may detect a third request from the first electronic device and execute the second operation according to the third request. After executing the second operation, the second electronic device may send the data information of the second picture acquired by the physical camera and the parameter information of the physical camera when the second picture was shot to the first electronic device, so that the first electronic device displays the second picture corresponding to that data information to the user according to that parameter information.
For example, for a detected request, whether from the UI corresponding to the virtual camera or from the first UI, the second electronic device may first determine whether the control policy is satisfied, and execute the corresponding request only when the control policy is satisfied.
The control policy may include: a sequential priority control policy, a local priority control policy, and a peer priority control policy. The sequential priority control policy means that the second electronic device executes the corresponding requests one by one in the order in which they are received. The local priority control policy means that when an operation requested by a request from the first electronic device conflicts with an operation requested by a request from the second electronic device, the second electronic device refuses to execute the request from the first electronic device. The peer priority control policy means that when an operation requested by a request from the first electronic device conflicts with an operation requested by a request from the second electronic device, the second electronic device refuses to execute the request from the second electronic device.
For example, assume that the control policy is the sequential priority control policy. In this case, the second electronic device executes all detected requests in the order received, so it executes both the second request and the third request.
As another example, assume that the control policy is the local priority control policy. When the second electronic device detects both the second request and the third request, and the first operation requested by the second request conflicts with the second operation requested by the third request, the second electronic device refuses, based on the local priority control policy, to perform the second operation corresponding to the third request.
For example, the second electronic device detects both the second request and the third request, where the first operation corresponding to the second request is to set the flash of the physical camera to an automatic mode, and the second operation corresponding to the third request is to set the flash to a normally-on mode. In this case, the second electronic device may refuse to execute the third request; in other words, it executes only the second request, i.e., sets the flash of the physical camera to the automatic mode.
As another example, assume that the control policy is the peer priority control policy. When the second electronic device detects both the second request and the third request, and the first operation requested by the second request conflicts with the second operation requested by the third request, the second electronic device refuses, based on the peer priority control policy, to perform the first operation corresponding to the second request.
For example, the second electronic device detects both the second request and the third request, where the first operation corresponding to the second request is to turn on a night-scene photographing mode, and the second operation corresponding to the third request is to turn on a large-aperture photographing mode. In this case, the second electronic device may refuse to execute the second request; in other words, it executes only the third request, i.e., turns on the large-aperture photographing mode.
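The three control policies and the conflict examples above can be sketched as a single dispatch function. This is a minimal illustration under stated assumptions: the function and constant names are invented for the sketch, "local" marks a request originating on the second (camera-owning) device's first UI, and "peer" marks a request from the first electronic device.

```python
# Minimal sketch of the three control policies described in the patent text.
SEQUENTIAL, LOCAL_PRIORITY, PEER_PRIORITY = "sequential", "local", "peer"


def resolve(requests, policy, conflict):
    """Return the operations the second electronic device will execute.

    requests: list of (origin, operation) tuples in arrival order,
              where origin is "local" or "peer".
    conflict: True when the requested operations conflict with each other.
    """
    if policy == SEQUENTIAL or not conflict:
        return [op for _, op in requests]          # execute all, in order
    if policy == LOCAL_PRIORITY:                   # local (second device) wins
        return [op for origin, op in requests if origin == "local"]
    if policy == PEER_PRIORITY:                    # peer (first device) wins
        return [op for origin, op in requests if origin == "peer"]
    raise ValueError(f"unknown policy: {policy}")


# The flash example: second request (local) vs third request (peer) conflict.
reqs = [("local", "flash=auto"), ("peer", "flash=on")]
assert resolve(reqs, SEQUENTIAL, conflict=True) == ["flash=auto", "flash=on"]
assert resolve(reqs, LOCAL_PRIORITY, conflict=True) == ["flash=auto"]
assert resolve(reqs, PEER_PRIORITY, conflict=True) == ["flash=on"]
```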
Illustratively, the control policy may be preconfigured on the second electronic device, or may be set by the user through the first UI. For example, the user interacts with the second electronic device through the first UI and sets the control policy there; the second electronic device detects the user input data from the first UI and obtains the control policy according to that data.
In this embodiment of the present application, the user who operates the first electronic device and the user who operates the second electronic device may be the same user, or may be different users, which is not limited in this embodiment of the present application.
The method 300 is described in detail below with reference to fig. 4-5, taking a live scenario as an example.
When the first electronic device discovers the second electronic device through a near-field discovery mode, the first electronic device can create, on the first electronic device, a virtual camera corresponding to the physical camera of the second electronic device, and can then control the physical camera of the second electronic device through the virtual camera; here, the first electronic device is the master control device and the second electronic device is the remote device. For the manner in which the first electronic device discovers the second electronic device and creates the virtual camera, please refer to the related description in the method 300; for brevity, details are not repeated here.
Assume that an anchor uses a first APP installed on the first electronic device for live streaming, where the first APP has a live broadcast function and the capability of calling the virtual camera corresponding to the physical camera of the second electronic device.
Assume that, during the live broadcast, the anchor uses both the physical camera of the first electronic device and the virtual camera corresponding to the physical camera of the second electronic device. For the physical camera of the first electronic device: when the user turns on the live broadcast function, the first electronic device correspondingly turns on its physical camera; assuming the front-facing camera is currently opened, users watching the live stream can see the anchor on their electronic devices. For the virtual camera corresponding to the physical camera of the second electronic device: the anchor can make a selection on the UI, corresponding to the remote device, displayed by the first APP; assume the anchor selects the rear camera of the virtual camera corresponding to the second electronic device. After the anchor completes the selection, the first APP calls, on the first electronic device, the virtual camera corresponding to the physical camera of the second electronic device.
When the virtual camera corresponding to the second electronic device is called, the virtual camera proxy module on the second electronic device detects a first request from the first APP. Because the anchor selected the rear camera of the virtual camera corresponding to the second electronic device, the first request is used to request the second electronic device to start the rear camera of its physical camera, and the virtual camera proxy module calls the camera interface to start the rear camera of the physical camera according to the first request.
After the virtual camera proxy module calls the camera interface to start the rear camera of the physical camera according to the first request, the physical camera captures a current picture in the actual environment. The virtual camera proxy module may call the camera interface to send the data information of the current picture and the parameter information of the physical camera when the current picture was shot to the first APP on the first electronic device, and the first APP may display the corresponding current picture to users watching the live stream according to that parameter information. At this time, the first electronic device may display both the picture shot by its own physical camera and the picture shot by the virtual camera corresponding to the physical camera of the second electronic device; likewise, users watching the live stream can see both pictures on their electronic devices.
For example, during a live broadcast, the user of the first electronic device is the anchor and the user of the second electronic device is the anchor's assistant. When the anchor starts the live broadcast, the front-facing camera of the physical camera of the first electronic device shoots the anchor. When the anchor needs to show goods to users watching the live stream, the anchor can call, through the first APP on the first electronic device, the rear camera of the virtual camera corresponding to the second electronic device. Once the rear camera on the second electronic device is turned on, the assistant can shoot the goods to be shown, and users watching the live stream can then see both the anchor and the corresponding goods on their electronic devices.
In addition, the assistant may also control the physical camera of the second electronic device on the second electronic device. For example, after detecting the first request from the first electronic device, the virtual camera proxy module may generate a UI (e.g., the first UI) and display the first UI 401 on the second electronic device. The assistant can control the physical camera of the second electronic device through the first UI 401: the assistant may set the flash mode of the physical camera through option 4011, set the tone mode of the physical camera through option 4012, adjust the resolution of the physical camera through option 4013, and set other options of the physical camera through option 4014.
It should be noted that the options shown in the first UI 401 are merely examples; in a specific implementation, the first UI may include more options than illustrated, for example, a focusing option, which is not limited in this embodiment of the present application.
For example, the assistant adjusts the resolution of the physical camera via option 4013. At this point, the virtual camera proxy module detects a resolution-adjustment request (e.g., the second request) from the first UI, and calls the camera interface to adjust the resolution of the physical camera according to the second request.
After executing the first operation corresponding to the second request, the virtual camera proxy module may call the camera interface to display, through the first UI, the current picture (e.g., the first picture) acquired by the physical camera. In this case, as shown in fig. 4 (b), the first UI 402 may display, in dashed-line box 4025, the first picture acquired by the physical camera. The picture in dashed-line box 4025 shown in fig. 4 (b) may be in a non-full-screen display mode, and the assistant may set it to a full-screen display mode through option 4026.
In addition, the virtual camera proxy module can call the camera interface to send the data information of the first picture acquired by the physical camera after the first operation is performed, together with the parameter information of the physical camera when the first picture was shot, to the first APP on the first electronic device; the first APP then displays the first picture corresponding to that data information to users watching the live stream according to that parameter information.
For example, the assistant increases the resolution of the physical camera from 1280×720 to 1920×1080 through option 4023. In this case, the resolution used when the physical camera shoots the first picture is 1920×1080. After the first picture is acquired, the second electronic device sends the data information of the first picture and the parameter information of the physical camera when the first picture was shot to the first APP on the first electronic device, and the first APP displays the first picture to the user on the first electronic device according to that parameter information. Since the parameter information includes the resolution, 1920×1080, the resolution of the first picture displayed to the user by the first electronic device is 1920×1080.
When the anchor performs a certain operation, on the UI corresponding to the virtual camera, on the virtual camera corresponding to the physical camera of the second electronic device, the virtual camera proxy module detects a third request from the first electronic device and calls the camera interface to perform a second operation according to the third request. The virtual camera proxy module may then call the camera interface to send the data information of the current picture (e.g., a second picture) acquired by the physical camera after the second operation, together with the parameter information of the physical camera when the second picture was shot, to the first APP on the first electronic device, so that the first APP displays the second picture to users watching the live stream according to that parameter information.
For example, the anchor selects the soft option for tone on the UI corresponding to the virtual camera. The virtual camera proxy module then detects a third request for adjusting the tone of the physical camera to a soft mode and calls the camera interface to adjust the physical camera to the soft mode according to the third request. Afterwards, the virtual camera proxy module may call the camera interface to send the data information of the second picture acquired after the mode adjustment, together with the parameter information of the physical camera when the second picture was shot, to the first APP on the first electronic device, so that the first APP displays the second picture on the first electronic device to users watching the live stream according to that parameter information.
As can be seen from the foregoing description, the virtual camera proxy module may detect both requests from the UI corresponding to the virtual camera and requests from the first UI. In a specific implementation, the virtual camera proxy module may first send each request to the control policy management module, which stores the control policy. The control policy management module determines whether the request can be executed while ensuring that the control policy is satisfied, and notifies the virtual camera proxy module of the result; the virtual camera proxy module then decides, according to that result, whether to execute the corresponding request. For a detailed description of the control policy, please refer to the related description in the method 300; for brevity, details are not repeated here.
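The proxy/policy-manager interaction just described can be sketched as two cooperating classes. This is a minimal illustration; all class and method names are assumptions for the sketch, and the policy check is reduced to the simplest case (the sequential priority policy allows everything, while a priority policy would reject one side of a conflict).

```python
# Hypothetical sketch of the division of labour: the virtual camera proxy
# forwards every detected request to the control policy manager, which checks
# it against the stored policy; the proxy calls the camera interface only
# when execution is allowed.
class ControlPolicyManager:
    def __init__(self, policy="sequential"):
        self.policy = policy   # stored control policy

    def allows(self, request) -> bool:
        # Sequential priority: every request is allowed in arrival order.
        # A local/peer priority policy would reject one side of a conflict.
        return self.policy == "sequential" or not request.get("conflict", False)


class VirtualCameraProxy:
    def __init__(self, manager):
        self.manager = manager
        self.executed = []     # operations actually performed on the camera

    def on_request(self, request) -> bool:
        if self.manager.allows(request):
            # Stand-in for calling the camera interface.
            self.executed.append(request["op"])
            return True
        return False           # manager returned a refusal result


proxy = VirtualCameraProxy(ControlPolicyManager())
proxy.on_request({"op": "set_resolution", "source": "first_ui"})
```

Keeping the policy decision in a separate module means the proxy never needs to know which policy is active; it only acts on allow/refuse results, matching the notification flow described above.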
For example, assume that the control policy is the sequential priority control policy. The control policy management module then returns an allow-execution result for every detected request and notifies the virtual camera proxy module, which calls the camera interface to execute the corresponding requests; in this case, the virtual camera proxy module calls the camera interface to execute both the second request and the third request.
As another example, assume that the control policy is the local priority control policy. When the virtual camera proxy module detects both the second request and the third request, and the first operation requested by the second request conflicts with the second operation requested by the third request, the control policy management module refuses, based on the local priority control policy, to allow the second operation corresponding to the third request, and notifies the virtual camera proxy module of the refusal result.
For example, the virtual camera proxy module detects both the second request and the third request, where the first operation corresponding to the second request is to set the flash to an automatic mode, and the second operation corresponding to the third request is to set the flash to a normally-on mode. In this case, the control policy management module refuses to allow the third request based on the local priority control policy; in other words, the virtual camera proxy module calls the camera interface to execute only the second request, setting the flash to the automatic mode.
As another example, assume that the control policy is the peer priority control policy. When the virtual camera proxy module detects both the second request and the third request, and the first operation requested by the second request conflicts with the second operation requested by the third request, the control policy management module refuses, based on the peer priority control policy, to allow the first operation corresponding to the second request, and notifies the virtual camera proxy module of the refusal result.
For example, the virtual camera proxy module detects both the second request and the third request, where the first operation corresponding to the second request is to turn on a night-scene photographing mode, and the second operation corresponding to the third request is to turn on a large-aperture photographing mode. In this case, the control policy management module refuses to allow the second request based on the peer priority control policy; in other words, the virtual camera proxy module calls the camera interface to execute only the third request, i.e., turns on the large-aperture photographing mode.
For example, the control policy may be configured in advance in the control policy management module, or may be set by the user through the first UI. For instance, the assistant may select an option on the first UI that represents a certain control policy; after the assistant makes the selection, the second electronic device obtains the control policy according to the assistant's selection (i.e., the user input data) and stores it in the control policy management module, so that the control policy management module can determine, according to the stored control policy, whether a request is allowed to be executed.
For example, when the assistant sets the control policy through the first UI 402, the assistant may select a setting option in the first UI 402. In response to the assistant's operation, the second electronic device may display the first UI 501 as shown in fig. 5 (a), which may include a control policy option 5011. The assistant may select the control policy option 5011 in the first UI 501, and in response, the second electronic device may display the first UI 502 as shown in fig. 5 (b), which may include a sequential priority control policy option 5021, a local priority control policy option 5022, and a peer priority control policy option 5023. The assistant may select one of these as the control policy; for example, selecting option 5021 means the assistant chooses the sequential priority control policy.
It should be noted that, in the embodiment of the present application, the first UI may be suspended on any interface and may be dragged at will, for example, the first UI may be suspended on a screen locking interface of the second electronic device or other interfaces after the second electronic device is unlocked.
It should be noted that the foregoing description takes the assistant setting the control policy as an example, but this does not limit the embodiments of the present application; in a specific implementation, the control policy may also be set by the anchor or other users.
It should be further noted that, in the embodiment of the present application, the time when the control policy is set by the user is not limited, and the user may set the control policy through the first UI at any time after the display of the first UI, and the control policy management module stores the control policy or updates the stored control policy in real time according to the setting of the user.
In the embodiment of the present application, the virtual camera agent module and the control policy management module are used to execute different functions, but this is not limited to the embodiment of the present application, and in a specific implementation, the functions executed by the virtual camera agent module and the control policy management module may be implemented by one module, or may be implemented by a plurality of modules, which is not particularly limited in the embodiment of the present application.
In this embodiment, the electronic device may be divided into functional modules according to the above method example, for example, each functional module may be divided into functional modules corresponding to each function, or two or more functions may be integrated into one processing module, for example, the above virtual camera agent module and the control policy management module may be integrated into one processing module. The integrated modules described above may be implemented in hardware. It should be noted that, in this embodiment, the division of the modules is schematic, only one logic function is divided, and another division manner may be implemented in actual implementation.
It should be noted that, all relevant contents of each step related to the above method embodiment may be cited to the functional description of the corresponding functional module, which is not described herein.
The electronic device provided in this embodiment is configured to execute the method for controlling a camera, so that the same effects as those of the implementation method can be achieved. In case an integrated unit is employed, the electronic device may comprise a processing module, a storage module and a communication module. The processing module may be configured to control and manage actions of the electronic device, for example, may be configured to support the electronic device to execute steps executed by the processing unit. The memory module may be used to support the electronic device in storing program code, data, etc. And the communication module can be used for supporting the communication between the electronic device and other devices.
Wherein the processing module may be a processor or a controller, which may implement or perform the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination that performs computing functions, for example, a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor, and the like. The storage module may be a memory. The communication module may be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic devices.
In one embodiment, when the processing module is a processor and the storage module is a memory, the electronic device according to this embodiment may be a device having the structure shown in fig. 2.
The present embodiment also provides a computer-readable storage medium having stored therein computer instructions which, when executed on an electronic device, cause the electronic device to perform the above-described related method steps to implement the method of controlling a camera in the above-described embodiments.
The present embodiment also provides a computer program product which, when run on a computer, causes the computer to perform the above-described related steps to implement the method of controlling a camera in the above-described embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component, or a module, and may include a processor and a memory connected to each other; the memory is configured to store computer-executable instructions, and when the device is running, the processor may execute the computer-executable instructions stored in the memory, so that the chip performs the method for controlling the camera in the above method embodiments.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are used to execute the corresponding methods provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding methods provided above, and will not be described herein.
It will be appreciated by those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into modules or units is only a division by logical function, and other divisions are possible in actual implementation; for example, multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and a part shown as a unit may be one physical unit or multiple physical units; it may be located in one place or distributed across multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a readable storage medium. Based on this understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or some of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing is merely specific embodiments of the present application, but the scope of protection of the present application is not limited thereto. Any change or substitution that a person skilled in the art could readily conceive of within the technical scope disclosed by the present application shall be covered by the scope of protection of the present application. Therefore, the scope of protection of the present application shall be subject to the scope of protection of the claims.

Claims (9)

1. A method of controlling a camera, the method being performed by a second electronic device that has established a connection with a first electronic device, the method comprising:
the second electronic device, when detecting a first request from the first electronic device, starts a physical camera of the second electronic device and displays a first user interface (UI), wherein the first request requests, from the second electronic device, permission to use the physical camera, the first UI comprises a first picture, and the first picture is a current picture acquired by the physical camera;
the second electronic device sends data information of the first picture and first parameter information to the first electronic device, wherein the first parameter information serves as input for the first electronic device to adjust parameter information, the first parameter information is parameter information of the physical camera when the first picture is captured, and the first parameter information comprises at least one of a resolution, a frame rate, or a color format supported by the physical camera;
the second electronic device, when detecting a second request generated by one or more operations of a user acting on the first UI, performs a first operation on the physical camera, wherein the first operation is the operation that the second request requests the second electronic device to perform, the first operation is used to adjust the parameter information of the physical camera, and the parameter information of the physical camera comprises at least one of a resolution, a frame rate, or a color format supported by the physical camera.
2. The method according to claim 1, wherein the first UI further comprises a second picture, and the second picture is a current picture acquired by the physical camera after the first operation is performed.
3. The method according to claim 2, wherein the method further comprises:
the second electronic device sends data information of the second picture and second parameter information to the first electronic device, wherein the second parameter information is parameter information of the physical camera when the second picture is captured, and the second parameter information comprises at least one of a resolution, a frame rate, or a color format supported by the physical camera.
4. The method according to any one of claims 1 to 3, wherein the method further comprises:
the second electronic device, when detecting a third request from the first electronic device, performs a second operation on the physical camera, wherein the second operation is the operation that the third request requests the second electronic device to perform;
the second electronic device sends data information of a third picture and third parameter information to the first electronic device, wherein the third picture is a current picture acquired by the physical camera after the second operation is performed, the third parameter information is parameter information of the physical camera when the third picture is captured, and the third parameter information comprises at least one of a resolution, a frame rate, or a color format supported by the physical camera.
5. The method according to claim 4, wherein the performing a first operation comprises:
performing the first operation on the physical camera when a control strategy is satisfied, wherein the control strategy comprises:
executing the second request when the receiving time of the second request is earlier than the receiving time of the third request, or refusing to execute the third request when the second request conflicts with the third request.
6. The method according to claim 4, wherein the performing a second operation comprises:
performing the second operation on the physical camera when a control strategy is satisfied, wherein the control strategy comprises:
executing the second request when the receiving time of the second request is earlier than the receiving time of the third request, or refusing to execute the third request when the second request conflicts with the third request.
7. The method according to claim 5 or 6, wherein the method further comprises:
the second electronic device receives user input data via the first UI;
the second electronic device obtains the control strategy according to the user input data.
8. An electronic device, comprising:
one or more processors, a memory, and one or more programs, wherein the one or more programs are stored in the memory and, when executed by the one or more processors, cause the electronic device to perform the method of controlling a camera according to any one of claims 1 to 7.
9. A readable storage medium storing instructions that, when executed on an electronic device, cause the electronic device to perform the method of controlling a camera according to any one of claims 1 to 7.
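The request-arbitration control strategy recited in claims 5 to 7 — execute the request received earlier, and refuse a later request that conflicts with it — can be sketched as follows. This is a minimal illustration only: the `CameraRequest` class, its fields, and the `conflicts`/`arbitrate` helpers are assumptions made for the sketch, not part of the claimed method.

```python
from dataclasses import dataclass

@dataclass
class CameraRequest:
    source: str        # e.g. "local_ui" (the second request) or "remote" (the third request)
    received_at: float # time at which the second electronic device received the request
    params: dict       # requested camera parameters (resolution, frame rate, color format)

def conflicts(a: CameraRequest, b: CameraRequest) -> bool:
    # Two requests conflict when they set the same camera parameter to different values.
    shared = set(a.params) & set(b.params)
    return any(a.params[k] != b.params[k] for k in shared)

def arbitrate(second: CameraRequest, third: CameraRequest):
    """Apply the control strategy: execute the earlier-received request,
    then refuse the later request only if it conflicts with the earlier one."""
    executed, refused = [], []
    earlier, later = (second, third) if second.received_at < third.received_at else (third, second)
    executed.append(earlier)
    if conflicts(earlier, later):
        refused.append(later)
    else:
        executed.append(later)
    return executed, refused
```

For example, if the local UI request (received first) sets the frame rate to 30 while a later remote request sets it to 60, the two conflict and the remote request is refused; a later request touching only other parameters would be executed as well.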
CN202210954724.4A 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment Active CN115514881B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210954724.4A CN115514881B (en) 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210954724.4A CN115514881B (en) 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment
CN202011257123.5A CN114500822B (en) 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202011257123.5A Division CN114500822B (en) 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment

Publications (2)

Publication Number Publication Date
CN115514881A CN115514881A (en) 2022-12-23
CN115514881B true CN115514881B (en) 2023-07-11

Family

ID=81489859

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202011257123.5A Active CN114500822B (en) 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment
CN202210954724.4A Active CN115514881B (en) 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202011257123.5A Active CN114500822B (en) 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment

Country Status (1)

Country Link
CN (2) CN114500822B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103634524A (en) * 2013-11-15 2014-03-12 北京智谷睿拓技术服务有限公司 Control method and control equipment of camera system and camera system
CN105516507A (en) * 2015-12-25 2016-04-20 联想(北京)有限公司 Information processing method and electronic equipment

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08237635A (en) * 1995-02-28 1996-09-13 Canon Inc Image pickup controller and control method thereof
JP3934521B2 (en) * 2002-10-04 2007-06-20 日本電信電話株式会社 Video remote control device, video remote control method, video remote control program, and recording medium recording video remote control program
CN104349032A (en) * 2013-07-23 2015-02-11 中兴通讯股份有限公司 Method for photographing and mobile terminal
CN104796610B (en) * 2015-04-20 2018-05-29 广东欧珀移动通信有限公司 Camera sharing method, device, system and the mobile terminal of a kind of mobile terminal
WO2019059954A1 (en) * 2017-09-19 2019-03-28 Rovi Guides, Inc. System and methods for navigating internet appliances using a media guidance application
CN110944109B (en) * 2018-09-21 2022-01-14 华为技术有限公司 Photographing method, device and equipment
CN114666435B (en) * 2019-04-19 2023-03-28 华为技术有限公司 Method for using enhanced function of electronic device, chip and storage medium
CN110971823B (en) * 2019-11-29 2021-06-29 维沃移动通信(杭州)有限公司 Parameter adjusting method and terminal equipment
CN115103106A (en) * 2019-12-18 2022-09-23 荣耀终端有限公司 Control method, electronic equipment, computer readable storage medium and chip
CN111010588B (en) * 2019-12-25 2022-05-17 成都酷狗创业孵化器管理有限公司 Live broadcast processing method and device, storage medium and equipment
CN111083379A (en) * 2019-12-31 2020-04-28 维沃移动通信(杭州)有限公司 Shooting method and electronic equipment


Also Published As

Publication number Publication date
CN115514881A (en) 2022-12-23
CN114500822A (en) 2022-05-13
CN114500822B (en) 2024-03-05

Similar Documents

Publication Publication Date Title
US20220342475A1 (en) Terminal control method and terminal
CN109167931B (en) Image processing method, device, storage medium and mobile terminal
CN113824873B (en) Image processing method and related electronic equipment
EP4198812A1 (en) Service processing method and device
CN113778663B (en) Scheduling method of multi-core processor and electronic equipment
CN111953848B (en) System, method, related device and storage medium for realizing application function through context awareness
KR20230133970A (en) Photography methods, devices and electronics
WO2022007854A1 (en) Screen recording method and screen recording system
WO2023226612A1 (en) Exposure parameter determining method and apparatus
CN113099146B (en) Video generation method and device and related equipment
CN112437235B (en) Night scene picture generation method and device and mobile terminal
CN115514881B (en) Method for controlling camera and electronic equipment
CN111259441B (en) Device control method, device, storage medium and electronic device
CN115589539B (en) Image adjustment method, device and storage medium
WO2023077939A1 (en) Camera switching method and apparatus, and electronic device and storage medium
EP4336857A1 (en) Photographing method and related apparatus
CN116744106B (en) Control method of camera application and terminal equipment
CN117119295B (en) Camera control method and electronic device
CN117156293A (en) Photographing method and related device
CN116755748B (en) Card updating method, electronic device, and computer-readable storage medium
EP4262226A1 (en) Photographing method and related device
CN115484384B (en) Method and device for controlling exposure and electronic equipment
WO2024061036A1 (en) Inter-engine communication method and related device
WO2022111701A1 (en) Screen projection method and system
CN117692753A (en) Photographing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant