CN114500822A - Method for controlling camera and electronic equipment - Google Patents


Info

Publication number
CN114500822A
Authority
CN
China
Prior art keywords
electronic device
request
camera
physical camera
picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011257123.5A
Other languages
Chinese (zh)
Other versions
CN114500822B (en)
Inventor
Li Ming (李明)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202011257123.5A (granted as CN114500822B)
Priority to CN202210954724.4A (granted as CN115514881B)
Publication of CN114500822A
Application granted
Publication of CN114500822B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/62 Control of parameters via user interfaces
    • H04N 23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N 23/661 Transmitting camera control signals through networks, e.g. control via the Internet

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

When the second electronic device detects that the first electronic device has requested the usage right of the physical camera of the second electronic device, the second electronic device generates a first user interface (UI) and displays it to the user. The user can control the physical camera of the second electronic device through the first UI: when the user performs an operation on the first UI, the second electronic device detects the corresponding request from the first UI and performs the corresponding operation on the physical camera. In this way, the user's need to control the physical camera of the second electronic device can still be met while that physical camera is being controlled by the first electronic device.

Description

Method for controlling camera and electronic equipment
Technical Field
The present application relates to the field of terminals, and more particularly, to a method of controlling a camera and to an electronic device.
Background
After a virtual camera corresponding to a physical camera of another electronic device is created on one electronic device, a user can control the physical camera of the other electronic device through the virtual camera. The former device may be referred to as a master device and the latter as a remote device. For example, a user may turn the physical camera of a remote device on or off through the virtual camera on the master device.
However, while the physical camera of the remote device is controlled by the master device, the user cannot control that physical camera on the remote device itself. The above technical solution therefore cannot meet the user's need to control the physical camera of the remote device during this period, which degrades the user experience.
Disclosure of Invention
Embodiments of the present application provide a method for controlling a camera that can meet a user's need to control the physical camera of a remote device while that camera is being controlled by a master device.
In a first aspect, a method of controlling a camera is provided. The method is performed by a second electronic device and includes: when the second electronic device detects a first request from a first electronic device, starting a physical camera and displaying a first UI, where the first request requests the usage right of the physical camera from the second electronic device; and when the second electronic device detects a second request from the first UI, performing a first operation on the physical camera, where the first operation is the operation that the second request asks the second electronic device to perform.
In the above scheme, when the second electronic device detects that the first electronic device has requested the usage right of its physical camera, it generates a first UI and displays it to the user. The user can control the physical camera of the second electronic device through the first UI: when the user performs an operation on the first UI, the second electronic device detects the corresponding request from the first UI and performs the corresponding operation on the physical camera. The user's need to control the physical camera of the second electronic device can therefore be met even while that camera is being controlled by the first electronic device.
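The following is a minimal Kotlin sketch of this first-aspect flow on the second electronic device. All type and function names (CameraOperation, PhysicalCamera, ControlUi, SecondDeviceController) are hypothetical illustrations and do not come from the patent or any real camera API.

```kotlin
// Hypothetical operations a request may ask the second electronic device to perform.
enum class CameraOperation { SET_RESOLUTION_1080P, SET_FLASH_AUTO, SET_TONE_SOFT }

// Stand-ins for the device's real camera interface and for the first UI.
interface PhysicalCamera {
    fun open()
    fun apply(operation: CameraOperation)
}
interface ControlUi {
    fun show()
}

class SecondDeviceController(
    private val camera: PhysicalCamera,
    private val firstUi: ControlUi
) {
    // Step 1: the first request from the first electronic device asks for the usage
    // right of the physical camera -> start the camera and display the first UI.
    fun onFirstRequestFromFirstDevice() {
        camera.open()
        firstUi.show()
    }

    // Step 2: a second request arrives from the first UI -> perform the first
    // operation (e.g. raising the resolution) on the physical camera.
    fun onSecondRequestFromFirstUi(operation: CameraOperation) {
        camera.apply(operation)
    }
}
```

The point of the sketch is only the split of responsibilities: the remote (peer) request opens the camera and brings up the local UI, while subsequent local UI requests act on the same camera object.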
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: the second electronic device displays, on the first UI, the current picture acquired by the physical camera after the first operation is performed.
In the above scheme, because the second electronic device displays on the first UI the current picture acquired by the physical camera after the first operation, the user can not only control the physical camera of the second electronic device through the first UI but also see, on the second electronic device, the current picture acquired by the physical camera, which improves the user experience.
With reference to the first aspect and the foregoing implementations of the first aspect, in certain implementations of the first aspect, the method further includes: the second electronic device sends data information of the first picture and first parameter information to the first electronic device, where the first parameter information is the parameter information of the physical camera when the first picture was shot.
In the above scheme, after the user has made the second electronic device perform the first operation on the physical camera through the first UI, the second electronic device can send the data information of the current picture acquired after the first operation, together with the parameter information of the physical camera when that picture was shot, to the first electronic device. The first electronic device can then display the current picture to the user according to the received data information and parameter information, which improves the user experience.
With reference to the first aspect and the foregoing implementations of the first aspect, in certain implementations of the first aspect, the method further includes: when the second electronic device detects a third request from the first electronic device, performing a second operation on the physical camera, where the second operation is the operation that the third request asks the second electronic device to perform; and the second electronic device sends data information of a second picture and second parameter information to the first electronic device, where the second picture is the current picture acquired by the physical camera after the second operation is performed, and the second parameter information is the parameter information of the physical camera when the second picture was shot.
In the above solution, the second electronic device can process not only requests from the first UI but also requests from the first electronic device. In other words, the user can control the physical camera of the second electronic device both through the first electronic device and through the first UI, which enables multi-terminal control of the physical camera of the second electronic device.
With reference to the first aspect and the foregoing implementation manners of the first aspect, in some implementation manners of the first aspect, the performing a first operation includes: the second electronic device performs the first operation on the physical camera when a control policy is satisfied, where the control policy includes: executing the second request first when the second request is received earlier than the third request; or refusing to execute the second request when the second request conflicts with the third request; or refusing to execute the third request when the second request conflicts with the third request.
In the above aspect, because the second electronic device can process both requests from the first UI and requests from the first electronic device, the operation requested by a request from the first UI may conflict with the operation requested by a request from the first electronic device. To avoid such conflicts, the second electronic device may determine, according to the control policy, whether a request is allowed to be executed before executing it, and finally process only the requests that the control policy allows.
With reference to the first aspect and the foregoing implementation manners of the first aspect, in some implementation manners of the first aspect, the performing a second operation includes: the second electronic device performs the second operation on the physical camera when a control policy is satisfied, where the control policy includes: the second electronic device executes the second request first when the second request is received earlier than the third request, or the second electronic device refuses to execute the third request when the second request conflicts with the third request.
As in the previous implementation, checking the control policy before executing a request allows the second electronic device to resolve conflicts between requests from the first UI and requests from the first electronic device, and to process only the requests that the control policy allows.
With reference to the first aspect and the foregoing implementations of the first aspect, in certain implementations of the first aspect, the method further includes: the second electronic device receives user input data from the first UI, and the second electronic device obtains the control policy according to the user input data.
In the above scheme, the second electronic device obtains the control policy from the user input data entered on the first UI, so the user can set the control policy through the first UI. In other words, the user decides which control policy to use, which improves the user experience.
In a second aspect, the present application provides an apparatus, included in an electronic device, having functionality to implement the above aspects and possible implementations of the above aspects. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the above-described functions.
In a third aspect, the present application provides an electronic device, comprising: a touch display screen, wherein the touch display screen comprises a touch sensitive surface and a display; a camera; one or more processors; a memory; a plurality of application programs; and one or more computer programs. Wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions. The instructions, when executed by the electronic device, cause the electronic device to perform a method of controlling a camera in any possible implementation of any of the above aspects.
In a fourth aspect, the present application provides an electronic device comprising one or more processors and one or more memories. The one or more memories are coupled to the one or more processors for storing computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of controlling the camera in any of the possible implementations of any of the aspects described above.
In a fifth aspect, the present application provides a computer-readable storage medium comprising computer instructions that, when run on an electronic device, cause the electronic device to perform the method of controlling a camera in any possible implementation of any of the above aspects.
In a sixth aspect, the present application provides a computer program product for causing an electronic device to perform any one of the possible methods of controlling a camera of the above aspects when the computer program product is run on the electronic device.
Drawings
FIG. 1 is a system framework diagram provided by an embodiment of the present application;
FIG. 2 is a schematic block diagram of an electronic device provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of a method for controlling a camera provided by an embodiment of the present application;
FIG. 4 (a) is a schematic diagram of an example of a user interface for controlling a camera provided by an embodiment of the present application;
FIG. 4 (b) is a schematic diagram of another example of a user interface for controlling a camera provided by an embodiment of the present application;
FIG. 5 (a) is a schematic diagram of another example of a user interface for controlling a camera provided by an embodiment of the present application;
FIG. 5 (b) is a schematic diagram of another example of a user interface for controlling a camera provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
For ease of understanding, the matters referred to in the embodiments of the present application will be briefly described first.
Master control equipment
The master device may be an electronic device that initiates a request to other electronic devices for usage rights to use physical cameras of the other electronic devices.
Remote device
The remote device may be an electronic device with a physical camera controlled by the master device.
Virtual camera
A virtual camera is defined relative to a physical camera. The master device creates, according to the parameter information of the physical camera of the remote device, a so-called virtual camera corresponding to that physical camera, and the master device can then control the physical camera of the remote device through the virtual camera.
Fig. 1 illustrates a framework diagram of a system provided in an embodiment of the present application, where a master device may control a physical camera of a remote device through a virtual camera corresponding to the physical camera of the remote device.
For example, fig. 2 is a schematic structural diagram of an example of the electronic device 100 according to the embodiment of the present application. The electronic device 100 may include a processor 110, an internal memory 120, a Universal Serial Bus (USB) interface 130, a camera 140, a display 150, and a touch sensor 160, among others.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to finish the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It may also be used to connect an earphone and play audio through the earphone, or to connect other electronic devices such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative and is not limited to the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The electronic device 100 implements display functions via the GPU, the display screen 150, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 150 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 150 is used to display images, video, and the like. The display screen 150 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 150, where N is a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 140, the video codec, the GPU, the display 150, the application processor, and the like.
The ISP is used to process the data fed back by the camera 140. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 140.
The camera 140 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or N cameras 140, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
Internal memory 120 may be used to store computer-executable program code, including instructions. The internal memory 120 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data) created during the use of the electronic device 100, and the like. In addition, the internal memory 120 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 120 and/or instructions stored in a memory provided in the processor.
The touch sensor 160 is also referred to as a "touch device". The touch sensor 160 may be disposed on the display screen 150, and the touch sensor 160 and the display screen 150 form a touch screen, which is also called a "touch screen". The touch sensor 160 is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 150. In other embodiments, the touch sensor 160 may be disposed on a surface of the electronic device 100, different from the position of the display screen 150.
Illustratively, the electronic device 100 may be a first electronic device or a second electronic device.
The method for controlling a camera provided by the present application is described in detail below with reference to the system shown in fig. 1, and fig. 3 shows a schematic interaction flow diagram of a method 300 for controlling a camera.
Step 301, when the second electronic device detects a first request from the first electronic device, starting a physical camera of the second electronic device and displaying a first User Interface (UI), where the first request requests a use permission of the physical camera from the second electronic device.
Step 302, when the second electronic device detects the second request from the first UI, performing a first operation on the physical camera, where the first operation is the operation that the second request asks the second electronic device to perform.
Illustratively, the first electronic device discovers other nearby electronic devices by means of near-field broadcast; for example, the first electronic device discovers a second electronic device by means of Bluetooth broadcast. The first electronic device may then establish a connection with the second electronic device and acquire the parameter information of the physical camera of the second electronic device. According to this parameter information, the first electronic device may create on itself a virtual camera corresponding to the physical camera of the second electronic device, and may then control the physical camera of the second electronic device through the virtual camera. In this case, the first electronic device is the master device and the second electronic device is the remote device.
It should be noted that the parameter information of the physical camera in the embodiments of the present application indicates the capability of the physical camera; for example, it may include the resolution, the frame rate, the color format, and the like supported by the physical camera.
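As a concrete illustration, the parameter information and the virtual camera built from it could be modeled as small Kotlin types. This is only a sketch under the assumptions stated in its comments; all field and class names are hypothetical.

```kotlin
// Hypothetical representation of a physical camera's capability (parameter) set,
// exchanged when the master device creates the corresponding virtual camera.
data class CameraParameters(
    val supportedResolutions: List<Pair<Int, Int>>, // e.g. 1280 x 720, 1920 x 1080
    val supportedFrameRates: List<Int>,             // e.g. 30, 60 (fps)
    val supportedColorFormats: List<String>,        // e.g. "NV21", "YUV_420_888"
    val hasFrontCamera: Boolean,
    val hasRearCamera: Boolean
)

// The master device could hold a virtual camera as a lightweight local object that
// pairs the remote device's identity with the capabilities reported above.
class VirtualCamera(
    val remoteDeviceId: String,
    val parameters: CameraParameters
)
```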
After the first electronic device has created the virtual camera corresponding to the physical camera of the second electronic device, a user may call the virtual camera through an application (APP) installed on the first electronic device. When the virtual camera is called, the second electronic device detects a first request from the first electronic device; the first request asks the second electronic device to start its physical camera, and the second electronic device starts the physical camera according to the first request.
For example, the APP on the first electronic device may display a UI on its display interface. This UI is used for the user to select the remote device from which the usage right of the physical camera is to be requested, and may also be used for the user to further select the usage right of the front camera or the rear camera of that physical camera. To distinguish it from UIs that appear later, and because it is used for selecting the remote device, this UI may be referred to as the UI corresponding to the remote device.
Assume that the user selects the front camera of the physical camera of the second electronic device. After the user completes the selection, the APP on the first electronic device calls the virtual camera corresponding to the physical camera of the second electronic device. After the virtual camera is called, the second electronic device detects a first request from the first electronic device; the first request asks the second electronic device to start the front camera of the physical camera, and the second electronic device calls the camera interface to start the front camera according to the first request. The physical camera of the second electronic device then captures the current picture in the actual environment, and the second electronic device can send the data information of the current picture and the parameter information of the physical camera when the current picture was shot to the first electronic device, so that the first electronic device can display to the user the current picture corresponding to that data information according to that parameter information.
It is worth mentioning that, when the virtual camera corresponding to the physical camera of the second electronic device is called, the first electronic device may display a UI corresponding to the virtual camera based on that call; this UI can be used by the user to control the physical camera of the second electronic device.
In addition, the second electronic device may generate a UI (for example, the first UI) based on the first request from the first electronic device and display the first UI on the second electronic device. Through interaction with the first UI, the user can control the physical camera of the second electronic device on the second electronic device itself. After the user completes an operation on the first UI, the second electronic device detects a second request from the first UI and performs the first operation on the physical camera according to the second request.
Illustratively, the user requests the second electronic device to increase the resolution of the physical camera through the first UI, and at this time, the second electronic device performs a first operation of increasing the resolution of the physical camera in response to the request of the user.
For example, in addition to providing an interaction entry between the user and the second electronic device, the first UI may display to the user the picture captured in real time by the physical camera of the second electronic device. In this case, the method 300 may further include:
in step 303, the second electronic device displays a first screen on the first UI, where the first screen is a current screen acquired by the physical camera after the first operation is performed.
For example, the second electronic device may further send the data information of the first picture and the parameter information of its physical camera when the first picture was shot to the first electronic device. In this case, the method 300 may further include:
Step 304, the second electronic device sends the data information of the first picture and first parameter information to the first electronic device, where the first parameter information is the parameter information of the physical camera when the first picture was shot.
For example, suppose the first operation performed by the second electronic device adjusts the resolution of the physical camera from 1280 × 720 up to 1920 × 1080. In this case, the resolution used when the physical camera shoots the first picture is 1920 × 1080. After acquiring the first picture, the second electronic device sends the data information of the first picture and the parameter information of the physical camera when the first picture was shot to the first electronic device, so that the first electronic device can display to the user the first picture corresponding to that data information according to that parameter information. The parameter information of the physical camera when the first picture was shot includes the resolution, which is 1920 × 1080; in other words, the resolution of the first picture displayed to the user by the first electronic device is the resolution used when the physical camera shot the first picture, namely 1920 × 1080.
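A per-picture payload of this kind could look like the Kotlin sketch below: the picture data travels together with the parameters (here, the resolution) in force when it was shot, so the first electronic device can render it correctly. The type and field names, and the "NV21" color format, are assumptions for illustration only.

```kotlin
// Hypothetical payload the second electronic device sends back after an operation
// (step 304): picture data plus the camera parameters used when it was shot.
class FramePacket(
    val frameData: ByteArray,   // data information of the captured picture
    val width: Int,             // parameter information of the physical camera
    val height: Int,            //   at the moment the picture was shot
    val colorFormat: String
)

// After the first operation raised the resolution to 1920 x 1080, the packet carries
// that resolution so the first electronic device displays the picture at 1920 x 1080.
fun buildPacketAfterResolutionChange(frame: ByteArray): FramePacket =
    FramePacket(frameData = frame, width = 1920, height = 1080, colorFormat = "NV21")
```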
For example, the second electronic device may process the request from the first electronic device in addition to the request from the first UI, in which case the method 300 may further include:
Step 305, when the second electronic device detects the third request from the first electronic device, performing a second operation on the physical camera, where the second operation is the operation that the third request asks the second electronic device to perform.
Step 306, the second electronic device sends data information and second parameter information of a second picture to the first electronic device, where the second picture is a current picture acquired by the physical camera after the second operation is performed, and the second parameter information is parameter information of the physical camera when the second picture is shot.
When a user performs an operation, on the UI corresponding to the virtual camera, on the virtual camera corresponding to the physical camera of the second electronic device, the second electronic device may detect a third request from the first electronic device and performs the second operation according to the third request. After performing the second operation, the second electronic device may send the data information of the second picture acquired by the physical camera and the parameter information of the physical camera when the second picture was shot to the first electronic device, so that the first electronic device can display to the user the second picture corresponding to that data information according to that parameter information.
For example, for a detected request, whether from the UI corresponding to the virtual camera or from the first UI, the second electronic device may first determine whether the control policy is satisfied, and executes the corresponding request only when the control policy is satisfied.
The control policy may include an order-priority control policy, a local-priority control policy, and a peer-priority control policy. The order-priority control policy means that the second electronic device executes requests in the order in which they are received. The local-priority control policy means that when the operation requested by a request from the first electronic device conflicts with the operation requested by a local request (a request from the first UI on the second electronic device), the second electronic device refuses to execute the request from the first electronic device. The peer-priority control policy means that, in such a conflict, the second electronic device refuses to execute the local request.
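A minimal Kotlin sketch of how these three policies could resolve a conflict between two pending requests is shown below. All names (ControlPolicy, RequestSource, PendingRequest, resolveConflict) are hypothetical and not taken from the patent.

```kotlin
// Hypothetical modeling of the three control policies described above.
enum class ControlPolicy { ORDER_PRIORITY, LOCAL_PRIORITY, PEER_PRIORITY }
enum class RequestSource { FIRST_DEVICE, FIRST_UI }   // peer request vs. local request

data class PendingRequest(
    val source: RequestSource,
    val operation: String,      // e.g. "SET_FLASH_AUTO"
    val receivedAtMs: Long
)

// Returns the request to execute when two requests conflict. Under the
// order-priority policy the earlier request simply runs first (the other one
// still runs afterwards); under the other two policies only the winner runs.
fun resolveConflict(a: PendingRequest, b: PendingRequest, policy: ControlPolicy): PendingRequest =
    when (policy) {
        ControlPolicy.ORDER_PRIORITY -> if (a.receivedAtMs <= b.receivedAtMs) a else b
        ControlPolicy.LOCAL_PRIORITY -> if (a.source == RequestSource.FIRST_UI) a else b
        ControlPolicy.PEER_PRIORITY  -> if (a.source == RequestSource.FIRST_DEVICE) a else b
    }
```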
For example, assume the control policy is the order-priority control policy. In this case the second electronic device executes every request it detects, in the order the requests were received, so it executes both the second request and the third request.
For example, assume the control policy is the local-priority control policy, and that the second electronic device detects both the second request and the third request, where the first operation requested by the second request conflicts with the second operation requested by the third request. In this case, the second electronic device refuses to perform the second operation corresponding to the third request, based on the local-priority control policy.
For instance, the second electronic device detects both a second request whose first operation is to set the flash of the physical camera to automatic mode and a third request whose second operation is to set the flash to always-on mode. The second electronic device may then refuse to execute the third request; in other words, it executes only the second request and sets the flash of the physical camera to automatic mode.
For example, assume the control policy is the peer-priority control policy, and that the second electronic device detects both the second request and the third request, where the first operation requested by the second request conflicts with the second operation requested by the third request. In this case, the second electronic device refuses to perform the first operation corresponding to the second request, based on the peer-priority control policy.
For instance, the second electronic device detects both a second request whose first operation is to turn on the night-scene shooting mode and a third request whose second operation is to turn on the large-aperture shooting mode. The second electronic device may then refuse to execute the second request; in other words, it executes only the third request and turns on the large-aperture shooting mode.
For example, the control policy may be set by the user through the first UI: the second electronic device detects user input data from the first UI and obtains the control policy according to that user input data.
In this embodiment, a user operating the first electronic device and a user operating the second electronic device may be the same user or different users, and this is not limited in this embodiment of the present application.
The method 300 is described in detail below with reference to fig. 4 to 5, taking a live scene as an example.
When the first electronic device discovers the second electronic device by means of near-field broadcast, the first electronic device may create on itself a virtual camera corresponding to the physical camera of the second electronic device, and may then control the physical camera of the second electronic device through the virtual camera; here the first electronic device is the master device and the second electronic device is the remote device. For the way in which the first electronic device discovers the second electronic device and creates the virtual camera, refer to the related description in the method 300; for brevity, details are not repeated here.
Suppose the anchor (the live streamer) currently uses a first APP installed on the first electronic device for live broadcasting, where the first APP has a live broadcast function and is capable of calling the virtual camera corresponding to the physical camera of the second electronic device.
Suppose the first APP uses both the physical camera of the first electronic device and the virtual camera corresponding to the physical camera of the second electronic device. For the physical camera of the first electronic device: when the anchor turns on the live broadcast function, the first electronic device opens its physical camera accordingly; assuming the front camera is currently opened, the users watching the live broadcast can see the anchor on their own electronic devices. For the virtual camera corresponding to the physical camera of the second electronic device: the anchor can make a selection on the UI, displayed by the first APP, that corresponds to the remote device; assuming the anchor selects the rear camera of the virtual camera corresponding to the second electronic device, the first APP calls that virtual camera on the first electronic device after the anchor completes the selection.
After the virtual camera corresponding to the second electronic device is called, the virtual camera agent module on the second electronic device detects a first request from the first APP. Because the anchor selected the rear camera of the virtual camera corresponding to the second electronic device, the first request asks the second electronic device to start the rear camera of its physical camera, and the virtual camera agent module calls the camera interface to start the rear camera according to the first request.
After the rear camera is started, the physical camera can capture the current picture in the actual environment. The virtual camera agent module can call the camera interface to send the data information of the current picture and the parameter information of the physical camera when the current picture was shot to the first APP on the first electronic device, and the first APP can display the current picture corresponding to that data information, according to that parameter information, to the users watching the live broadcast. At this point, the first electronic device can display both the picture shot by its own physical camera and the picture shot by the virtual camera corresponding to the physical camera of the second electronic device; likewise, a user watching the live broadcast can view both pictures on his or her own electronic device.
For example, during a live broadcast, the user of the first electronic device is the anchor and the user of the second electronic device is the anchor's assistant. When the anchor starts the live broadcast, the front camera of the physical camera of the first electronic device shoots the anchor. When the anchor needs to show a product to the users watching the live broadcast, the anchor can call, through the first APP, the rear camera of the virtual camera corresponding to the second electronic device. After the rear camera on the second electronic device is opened, the assistant can shoot the product to be shown; the users watching the live broadcast can then see both the anchor and the product on their own electronic devices.
In addition, the assistant may also control the physical camera of the second electronic device on the second electronic device itself. For example, upon detecting the first request from the first electronic device, the virtual camera agent module may generate a UI (for example, the first UI) and display the first UI 401 on the second electronic device. The first UI 401 may be as illustrated in (a) of fig. 4. The assistant can control the physical camera of the second electronic device through the first UI 401: the assistant may set the flash mode of the physical camera through option 4011, set the tone mode through option 4012, adjust the resolution through option 4013, and set other options of the physical camera through option 4014.
It should be noted that the options shown in the first UI 401 are only an exemplary illustration; in a specific implementation, the first UI may include more options than those shown in the figure, for example a focusing option, which is not limited in the embodiments of the present application.
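One way such UI selections could be turned into requests for the virtual camera agent module is sketched below in Kotlin. The class, the callback signature, and the request strings are assumptions for illustration; only the option numbers (4011-4014) come from the description of fig. 4 (a).

```kotlin
// Hypothetical dispatcher: each selection on the first UI becomes a request string
// that is handed to the virtual camera agent module for execution.
class FirstUiController(private val sendToAgent: (String) -> Unit) {
    fun onFlashModeSelected(mode: String) = sendToAgent("SET_FLASH:$mode")            // option 4011
    fun onToneModeSelected(mode: String) = sendToAgent("SET_TONE:$mode")              // option 4012
    fun onResolutionSelected(width: Int, height: Int) =
        sendToAgent("SET_RESOLUTION:${width}x$height")                                // option 4013
    fun onOtherOptionSelected(key: String, value: String) =
        sendToAgent("SET:$key=$value")                                                // option 4014
}
```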
For example, suppose the assistant adjusts the resolution of the physical camera through option 4013. The virtual camera agent module then detects a request (for example, the second request) from the first UI to adjust the resolution and calls the camera interface to adjust the resolution of the physical camera according to the second request.
After performing the first operation corresponding to the second request, the virtual camera agent module may call the camera interface to display, through the first UI, the current picture (for example, the first picture) acquired by the physical camera. In this case, the first UI 402 may be as shown in (b) of fig. 4, where the person in the dashed box 4025 is the first picture acquired by the physical camera. The picture in the dashed box in (b) of fig. 4 is in a non-full-screen display mode, and the assistant may switch it to full-screen display mode through option 4026.
In addition, the virtual camera agent module can call the camera interface to send the data information of the first picture acquired by the physical camera after the first operation, together with the parameter information of the physical camera when the first picture was shot, to the first APP on the first electronic device. The first APP then displays the first picture corresponding to that data information, according to that parameter information, to the users watching the live broadcast.
For example, suppose the assistant adjusts the resolution of the physical camera from 1280 × 720 to 1920 × 1080 through option 4023. In this case, the resolution used when the physical camera shoots the first picture is 1920 × 1080. After the first picture is acquired, the second electronic device sends the data information of the first picture and the parameter information of the physical camera when the first picture was shot to the first APP on the first electronic device. The first APP displays the first picture corresponding to that data information to the user according to that parameter information. The parameter information includes the resolution, which is 1920 × 1080; in other words, the resolution of the first picture displayed to the user by the first electronic device is the resolution used when the physical camera shot the first picture, namely 1920 × 1080.
When the anchor performs an operation, on the UI corresponding to the virtual camera, on the virtual camera corresponding to the physical camera of the second electronic device, the virtual camera agent module may detect a third request from the first electronic device and calls the camera interface to perform the second operation according to the third request. The virtual camera agent module may then call the camera interface to send the data information of the current picture (for example, the second picture) acquired by the physical camera after the second operation, together with the parameter information of the physical camera when the second picture was shot, to the first APP on the first electronic device, so that the first APP can display the second picture corresponding to that data information, according to that parameter information, to the users watching the live broadcast.
For example, the anchor selects the "soft" option for the tone on the UI corresponding to the virtual camera. The virtual camera agent module then detects a third request asking to adjust the tone of the physical camera to soft mode and calls the camera interface to adjust the physical camera to soft mode according to the third request. The virtual camera agent module may then call the camera interface to send the data information of the second picture acquired after the adjustment, together with the parameter information of the physical camera when the second picture was shot, to the first APP on the first electronic device, so that the first APP can display the second picture to the users watching the live broadcast according to that parameter information.
As can be seen from the foregoing description, the virtual camera agent module may detect both requests from the UI corresponding to the virtual camera and requests from the first UI. In a specific implementation, the virtual camera agent module may first send a request to the control policy management module, which stores the control policy and determines whether the request may be executed while ensuring that the control policy is satisfied. The control policy management module notifies the virtual camera agent module of the result, and the virtual camera agent module decides, according to that result, whether to execute the corresponding request. For a detailed description of the control policy, refer to the related description of the method 300; for brevity, details are not repeated here.
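The interaction between the two modules could be sketched in Kotlin as follows. The module names mirror the description above, but the interfaces, the request type, and the predicate used to stand in for the stored control policy are all assumptions.

```kotlin
// A request detected by the agent, tagged with where it came from.
data class Request(
    val source: String,     // "FIRST_UI" (local) or "FIRST_DEVICE" (peer)
    val operation: String   // e.g. "SET_TONE:soft"
)

// Stores the control policy (abstracted here as a predicate) and answers the
// agent's query about whether a request may be executed.
class ControlPolicyManagementModule(private val isAllowed: (Request) -> Boolean) {
    fun check(request: Request): Boolean = isAllowed(request)
}

// Forwards every detected request to the policy manager and only calls the
// camera interface when the manager allows execution.
class VirtualCameraAgentModule(
    private val policyManager: ControlPolicyManagementModule,
    private val callCameraInterface: (String) -> Unit
) {
    fun onRequestDetected(request: Request) {
        if (policyManager.check(request)) {
            callCameraInterface(request.operation)   // execute the corresponding request
        }
        // otherwise the request is refused, per the stored control policy
    }
}
```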
For example, assume the control policy is the order-priority control policy. In this case, for every detected request, the control policy management module returns a result of "allowed" and notifies the virtual camera agent module, which calls the camera interface to execute the corresponding request; thus, for both the second request and the third request, the virtual camera agent module calls the camera interface to execute them.
For example, assume the control policy is the local-priority control policy, and the virtual camera agent module detects both the second request and the third request, where the first operation requested by the second request conflicts with the second operation requested by the third request. In this case, the control policy management module refuses to execute the second operation corresponding to the third request, based on the local-priority control policy, and notifies the virtual camera agent module of the refusal.
For instance, the virtual camera agent module detects both a second request whose first operation is to set the flash to automatic mode and a third request whose second operation is to set the flash to always-on mode. The control policy management module refuses to execute the third request based on the local-priority control policy; in other words, the virtual camera agent module calls the camera interface to execute only the second request, that is, to set the flash to automatic mode.
For example, assume the control policy is the peer-priority control policy, and the virtual camera agent module detects both the second request and the third request, where the first operation requested by the second request conflicts with the second operation requested by the third request. In this case, the control policy management module refuses to execute the first operation corresponding to the second request, based on the peer-priority control policy, and notifies the virtual camera agent module of the refusal.
For instance, the virtual camera agent module detects both a second request whose first operation is to turn on the night-scene shooting mode and a third request whose second operation is to turn on the large-aperture shooting mode. The control policy management module refuses to execute the second request based on the peer-priority control policy; in other words, the virtual camera agent module calls the camera interface to execute only the third request, that is, to turn on the large-aperture shooting mode.
For example, the control policy may be configured in the control policy management module in advance, or it may be set by the user through the first UI. For instance, the assistant may select an option on the first UI that represents a certain control policy; after the assistant makes the selection, the second electronic device obtains the control policy from that selection (that is, from the user input data) and saves it in the control policy management module, so that the control policy management module can determine whether a request is allowed to be executed according to the saved control policy.
Illustratively, when the assistant sets a control policy through the first UI 402, the assistant may select the settings option in the first UI 402, and the second electronic device displays the first UI 501 shown in (a) of fig. 5 in response to that operation. The first UI 501 may include a control policy option 5011. When the assistant selects the control policy option 5011, the second electronic device displays the first UI 502 shown in (b) of fig. 5 in response. The first UI 502 may include an order-priority control policy option 5021, a local-priority control policy option 5022, and a peer-priority control policy option 5023, and the assistant may select one of them as the control policy. For example, if the assistant selects option 5021, this represents that the assistant has chosen the order-priority control policy.
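Mapping the assistant's selection into the stored policy could look like the Kotlin sketch below, which reuses the same hypothetical ControlPolicy names as the earlier policy sketch. Only the option numbers 5021-5023 come from the description of fig. 5 (b); everything else is an assumption.

```kotlin
enum class ControlPolicy { ORDER_PRIORITY, LOCAL_PRIORITY, PEER_PRIORITY }

// Translate the option selected on the first UI 502 into a control policy.
fun policyFromUiSelection(optionId: Int): ControlPolicy? = when (optionId) {
    5021 -> ControlPolicy.ORDER_PRIORITY   // order-priority control policy option
    5022 -> ControlPolicy.LOCAL_PRIORITY   // local-priority control policy option
    5023 -> ControlPolicy.PEER_PRIORITY    // peer-priority control policy option
    else -> null                           // unrelated option; leave the stored policy unchanged
}

// Stand-in for the control policy management module's stored policy.
class PolicyStore {
    var current: ControlPolicy = ControlPolicy.ORDER_PRIORITY
}

// Save or update the stored control policy according to the user input data.
fun onUserInput(optionId: Int, store: PolicyStore) {
    policyFromUiSelection(optionId)?.let { store.current = it }
}
```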
It should be noted that the first UI in the embodiment of the present application may be suspended above any interface, and may be dragged at will, for example, the first UI may be suspended on a screen lock interface of the second electronic device or other interfaces after the second electronic device is unlocked.
It should be noted that the above description uses the assistant setting the control policy only as an exemplary illustration; this does not limit the embodiments of the present application, and in a specific implementation the anchor or another user may also set the control policy.
It should be further noted that, in the embodiment of the present application, a time when the user sets the control policy is not limited, the user may set the control policy through the first UI at any time after the first UI is displayed, and the control policy management module stores the control policy or updates the stored control policy in real time according to the setting of the user.
It should be noted that, in the embodiments of the present application, having the virtual camera agent module and the control policy management module perform different functions is only an exemplary description and does not limit the embodiments of the present application in any way. In a specific implementation, the functions performed by the virtual camera agent module and the control policy management module may be implemented by one module or by multiple modules, which is not particularly limited in the embodiments of the present application.
In this embodiment, the electronic device may be divided into functional modules according to the above method example. For example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module; for example, the virtual camera agent module and the control policy management module may be integrated into one processing module. The integrated module may be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is schematic and is only a logical function division; other division manners are possible in actual implementation.
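As a hedged illustration of this point, the Kotlin sketch below shows how the functions of the virtual camera agent module and the control policy management module could be kept in two separate modules or merged into one processing module; every name here is an illustrative placeholder and does not come from the patent.

```kotlin
// Hypothetical sketch of the module-division point above; all names are placeholders.

data class Request(val id: Int, val operation: String)

// Function of the virtual camera agent module: detect requests and drive the physical camera.
interface RequestHandling {
    fun onRequestDetected(request: Request)
}

// Function of the control policy management module: decide whether a request may be executed.
interface PolicyChecking {
    fun isAllowed(request: Request): Boolean
}

// Division A: two separate modules, the agent consulting the policy module before acting.
class PolicyModule : PolicyChecking {
    override fun isAllowed(request: Request) = true                 // placeholder decision
}

class AgentModule(private val policy: PolicyChecking) : RequestHandling {
    override fun onRequestDetected(request: Request) {
        if (policy.isAllowed(request)) println("perform ${request.operation} on the physical camera")
    }
}

// Division B: both functions integrated into a single processing module.
class ProcessingModule : RequestHandling, PolicyChecking {
    override fun isAllowed(request: Request) = true                 // placeholder decision
    override fun onRequestDetected(request: Request) {
        if (isAllowed(request)) println("perform ${request.operation} on the physical camera")
    }
}

fun main() {
    AgentModule(PolicyModule()).onRequestDetected(Request(1, "NIGHT_MODE"))   // separate modules
    ProcessingModule().onRequestDetected(Request(2, "LARGE_APERTURE"))        // integrated module
}
```

Either division yields the same external behavior, which is why the embodiment treats the choice as a matter of implementation rather than a limitation.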
It should be noted that for all relevant details of each step in the above method embodiment, reference may be made to the functional description of the corresponding functional module; details are not repeated here.
The electronic device provided in this embodiment is configured to execute the above method for controlling a camera, and can therefore achieve the same effects as the implementation method described above. When an integrated unit is used, the electronic device may include a processing module, a storage module, and a communication module. The processing module may be configured to control and manage actions of the electronic device, for example, to support the electronic device in executing the steps executed by the processing unit. The storage module may be configured to support the electronic device in storing program code, data, and the like. The communication module may be configured to support communication between the electronic device and other devices.
The processing module may be a processor or a controller, which may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination that implements computing functions, for example, a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The storage module may be a memory. The communication module may specifically be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic devices.
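A minimal sketch, assuming purely illustrative class names, of how the processing, storage, and communication modules described above could be composed into a single electronic device object:

```kotlin
// Hypothetical sketch of the processing/storage/communication split described above;
// the class names are illustrative placeholders rather than real hardware abstractions.

class ProcessingModule {                                   // processor or controller
    fun handle(action: String) = println("processing: $action")
}

class StorageModule {                                      // memory for program code and data
    private val store = mutableMapOf<String, ByteArray>()
    fun save(key: String, data: ByteArray) { store[key] = data }
}

class CommunicationModule {                                // RF circuit, Bluetooth chip, Wi-Fi chip, ...
    fun send(peer: String, data: ByteArray) = println("sending ${data.size} bytes to $peer")
}

class ElectronicDevice(
    val processing: ProcessingModule = ProcessingModule(),
    val storage: StorageModule = StorageModule(),
    val communication: CommunicationModule = CommunicationModule()
)

fun main() {
    val device = ElectronicDevice()
    device.processing.handle("first operation on the physical camera")
    device.storage.save("first picture", ByteArray(0))                     // placeholder picture data
    device.communication.send("first electronic device", ByteArray(0))     // plus first parameter information
}
```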
In an embodiment, when the processing module is a processor and the storage module is a memory, the electronic device according to this embodiment may be a device having the structure shown in fig. 2.
The present embodiment also provides a computer-readable storage medium, in which computer instructions are stored, and when the computer instructions are executed on an electronic device, the electronic device executes the above related method steps to implement the method for controlling a camera in the above embodiment.
The present embodiment also provides a computer program product, which when running on a computer, causes the computer to execute the relevant steps described above, so as to implement the method for controlling a camera in the above embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component or a module, and may include a processor and a memory connected to each other; the memory is used for storing computer execution instructions, and when the device runs, the processor can execute the computer execution instructions stored in the memory, so that the chip can execute the method for controlling the camera in the above method embodiments.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are all configured to execute the corresponding method provided above, so that the beneficial effects achieved by the electronic device, the computer storage medium, the computer program product, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Through the description of the above embodiments, those skilled in the art will understand that, for convenience and simplicity of description, only the division of the above functional modules is used as an example, and in practical applications, the above function distribution may be completed by different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into modules or units is only a logical function division, and there may be other division manners in actual implementation; for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (16)

1. A method of controlling a camera, the method being performed by a second electronic device and comprising:
when a first request from a first electronic device is detected, starting a physical camera of the second electronic device and displaying a first UI, wherein the first request requests, from the second electronic device, permission to use the physical camera;
when a second request from the first UI is detected, performing a first operation on the physical camera, wherein the first operation is the operation that the second request requests the second electronic device to perform.
2. The method of claim 1, further comprising:
displaying a first picture on the first UI, wherein the first picture is a current picture acquired by the physical camera after the first operation is executed.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
sending data information of the first picture and first parameter information to the first electronic device, wherein the first parameter information is parameter information of the physical camera when the first picture is shot.
4. The method according to any one of claims 1 to 3, further comprising:
when a third request from the first electronic device is detected, performing a second operation on the physical camera, wherein the second operation is the operation that the third request requests the second electronic device to perform; and
sending data information of a second picture and second parameter information to the first electronic device, wherein the second picture is a current picture acquired by the physical camera after the second operation is executed, and the second parameter information is parameter information of the physical camera when the second picture is shot.
5. The method of claim 4, wherein the performing the first operation comprises:
performing the first operation on the physical camera when a control policy is satisfied, wherein the control policy includes: executing the second request first when the receiving time of the second request is earlier than the receiving time of the third request, or refusing to execute the second request when the second request conflicts with the third request, or refusing to execute the third request when the second request conflicts with the third request.
6. The method of claim 4, wherein the performing the second operation comprises:
performing the second operation on the physical camera when a control policy is satisfied, wherein the control policy comprises: executing the second request first when the receiving time of the second request is earlier than the receiving time of the third request, or refusing to execute the second request when the second request conflicts with the third request, or refusing to execute the third request when the second request conflicts with the third request.
7. The method of claim 5 or 6, further comprising:
receiving user input data from the first UI;
and acquiring the control policy according to the user input data.
8. An electronic device, characterized in that the electronic device comprises:
one or more processors, memory, and one or more programs, wherein the one or more programs are stored in the memory, which when executed by the processors, cause the electronic device to perform the steps of:
when a first request from a first electronic device is detected, starting a physical camera of the electronic device and displaying a first UI, wherein the first request requests, from the electronic device, permission to use the physical camera;
when a second request from the first UI is detected, performing a first operation on the physical camera, wherein the first operation is the operation that the second request requests the electronic device to perform.
9. The electronic device of claim 8, wherein the one or more programs, when executed by the processor, cause the electronic device to perform the steps of:
displaying a first picture on the first UI, wherein the first picture is a current picture acquired by the physical camera after the first operation is executed.
10. The electronic device of claim 8 or 9, wherein the one or more programs, when executed by the processor, cause the electronic device to perform the steps of:
sending data information of the first picture and first parameter information to the first electronic device, wherein the first parameter information is parameter information of the physical camera when the first picture is shot.
11. The electronic device of any of claims 8-10, wherein the one or more programs, when executed by the processor, cause the electronic device to perform the steps of:
when a third request from the first electronic device is detected, performing a second operation on the physical camera, wherein the second operation is the operation that the third request requests the electronic device to perform; and
sending data information of a second picture and second parameter information to the first electronic device, wherein the second picture is a current picture acquired by the physical camera after the second operation is executed, and the second parameter information is parameter information of the physical camera when the second picture is shot.
12. The electronic device of claim 11, wherein the one or more programs, when executed by the processor, cause the electronic device to perform the steps of:
performing the first operation on the physical camera when a control policy is satisfied, wherein the control policy includes: executing the second request first when the receiving time of the second request is earlier than the receiving time of the third request, or refusing to execute the second request when the second request conflicts with the third request, or refusing to execute the third request when the second request conflicts with the third request.
13. The electronic device of claim 11, wherein the one or more programs, when executed by the processor, cause the electronic device to perform the steps of:
performing the second operation on the physical camera when a control policy is satisfied, wherein the control policy includes: executing the second request first when the receiving time of the second request is earlier than the receiving time of the third request, or refusing to execute the second request when the second request conflicts with the third request, or refusing to execute the third request when the second request conflicts with the third request.
14. The electronic device of claim 12 or 13, wherein the one or more programs, when executed by the processor, cause the electronic device to perform the steps of:
receiving user input data from the first UI;
and acquiring the control policy according to the user input data.
15. A computer-readable storage medium storing computer instructions which, when run on an electronic device, cause the electronic device to perform the method of controlling a camera of any one of claims 1 to 7.
16. A computer program product, characterized in that, when the computer program product is run on a computer, the computer is caused to carry out the method of controlling a camera according to any one of claims 1 to 7.
CN202011257123.5A 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment Active CN114500822B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011257123.5A CN114500822B (en) 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment
CN202210954724.4A CN115514881B (en) 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011257123.5A CN114500822B (en) 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN202210954724.4A Division CN115514881B (en) 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment
CN202410140219.5A Division CN118158518A (en) 2020-11-11 Method for controlling camera and electronic equipment

Publications (2)

Publication Number Publication Date
CN114500822A true CN114500822A (en) 2022-05-13
CN114500822B CN114500822B (en) 2024-03-05

Family

ID=81489859

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202011257123.5A Active CN114500822B (en) 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment
CN202210954724.4A Active CN115514881B (en) 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202210954724.4A Active CN115514881B (en) 2020-11-11 2020-11-11 Method for controlling camera and electronic equipment

Country Status (1)

Country Link
CN (2) CN114500822B (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104349032A (en) * 2013-07-23 2015-02-11 中兴通讯股份有限公司 Method for photographing and mobile terminal
CN103634524A (en) * 2013-11-15 2014-03-12 北京智谷睿拓技术服务有限公司 Control method and control equipment of camera system and camera system
CN105516507A (en) * 2015-12-25 2016-04-20 联想(北京)有限公司 Information processing method and electronic equipment

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08237635A (en) * 1995-02-28 1996-09-13 Canon Inc Image pickup controller and control method thereof
JP2004128997A (en) * 2002-10-04 2004-04-22 Nippon Telegr & Teleph Corp <Ntt> Device, method and program for video remote control, and recording medium with the program recorded thereon
CN104796610A (en) * 2015-04-20 2015-07-22 广东欧珀移动通信有限公司 Mobile terminal and camera sharing method, device and system thereof
US20190090014A1 (en) * 2017-09-19 2019-03-21 Rovi Guides, Inc. Systems and methods for navigating internet appliances using a media guidance application
CN110944109A (en) * 2018-09-21 2020-03-31 华为技术有限公司 Photographing method, device and equipment
WO2020211735A1 (en) * 2019-04-19 2020-10-22 华为技术有限公司 Method for using enhanced function of electronic device and related apparatus
CN110971823A (en) * 2019-11-29 2020-04-07 维沃移动通信(杭州)有限公司 Parameter adjusting method and terminal equipment
CN111083364A (en) * 2019-12-18 2020-04-28 华为技术有限公司 Control method, electronic equipment, computer readable storage medium and chip
CN111010588A (en) * 2019-12-25 2020-04-14 广州酷狗计算机科技有限公司 Live broadcast processing method and device, storage medium and equipment
CN111083379A (en) * 2019-12-31 2020-04-28 维沃移动通信(杭州)有限公司 Shooting method and electronic equipment

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024087900A1 (en) * 2022-10-27 2024-05-02 荣耀终端有限公司 Camera switching method and related electronic device

Also Published As

Publication number Publication date
CN115514881B (en) 2023-07-11
CN115514881A (en) 2022-12-23
CN114500822B (en) 2024-03-05

Similar Documents

Publication Publication Date Title
US20220342475A1 (en) Terminal control method and terminal
KR102149187B1 (en) Electronic device and control method of the same
CN109167931B (en) Image processing method, device, storage medium and mobile terminal
WO2018036164A1 (en) Screen flash photographing method, device and mobile terminal
WO2022042211A1 (en) Service processing method and device
CN111860530B (en) Electronic equipment, data processing method and related device
CN109040523B (en) Artifact eliminating method and device, storage medium and terminal
EP3609175B1 (en) Apparatus and method for generating moving image data including multiple section images in electronic device
CN113824873A (en) Image processing method and related electronic equipment
CN108513069B (en) Image processing method, image processing device, storage medium and electronic equipment
KR20230133970A (en) Photography methods, devices and electronics
EP4376433A1 (en) Camera switching method and electronic device
US20180368074A1 (en) Terminal device, network device, frame format configuration method, and system
KR20200094500A (en) Electronic device and method for processing line data included in image frame data into multiple intervals
CN115604572A (en) Image acquisition method and device
WO2018161568A1 (en) Photographing method and device based on two cameras
CN113923461A (en) Screen recording method and screen recording system
WO2022083325A1 (en) Photographic preview method, electronic device, and storage medium
CN114500822B (en) Method for controlling camera and electronic equipment
CN108259767B (en) Image processing method, image processing device, storage medium and electronic equipment
EP4262226A1 (en) Photographing method and related device
CN113891008B (en) Exposure intensity adjusting method and related equipment
CN118158518A (en) Method for controlling camera and electronic equipment
CN117119316B (en) Image processing method, electronic device, and readable storage medium
EP4336857A1 (en) Photographing method and related apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant