CN117707242A - Temperature control method and related device


Info

Publication number
CN117707242A
Authority
CN
China
Prior art keywords
temperature
frame rate
electronic device
camera
code rate
Prior art date
Legal status
Pending
Application number
CN202310848706.2A
Other languages
Chinese (zh)
Inventor
张伟
李鹏飞
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310848706.2A
Publication of CN117707242A
Status: Pending

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D23/00Control of temperature
    • G05D23/19Control of temperature characterised by the use of electric means
    • G05D23/20Control of temperature characterised by the use of electric means with sensing elements having variation of electric or magnetic properties with change of temperature


Abstract

Embodiments of the application provide a temperature control method and a related device, relating to the field of terminal technologies. The method includes: when the temperature of the electronic device that is using the camera rises, the frame rate at which the camera collects images and the code rate at which the electronic device transmits images can be reduced, so that the temperature of the electronic device does not become too high.

Description

Temperature control method and related device
Technical Field
The application relates to the technical field of terminals, in particular to a temperature control method and a related device.
Background
In scenes such as a video conference or video lecture, a user may use the camera of electronic device B on electronic device A, which may be referred to as a shared camera scene.
However, while electronic device A is using the camera of electronic device B, the system load of electronic device B may increase, which may cause the temperature of electronic device B to rise and degrade the user experience.
Disclosure of Invention
According to the temperature control method and related device provided in the embodiments of the application, when the temperature of the electronic device that is using the camera rises, the frame rate at which the camera collects images and the code rate at which the electronic device transmits images can be reduced, so that the temperature of the electronic device does not become too high.
In a first aspect, a temperature control method provided in an embodiment of the present application includes: the first electronic device invokes a camera device of the second electronic device; the second electronic device collects images by using the camera device and transmits the images to the first electronic device; when the temperature of the second electronic device rises to a first preset value while the second electronic device is collecting images with the camera device, if the first frame rate at which the camera device collects images is greater than the target frame rate, the second electronic device reduces the frame rate at which the camera device collects images; otherwise, if the first frame rate is less than or equal to the target frame rate, the second electronic device reduces the code rate at which it transmits images. In this way, the frame rate at which the camera collects images and the code rate at which the electronic device transmits images can be adaptively adjusted, based on the reference frame rate, the reference code rate, the temperature level, and the like, essentially without affecting the user experience, so that the temperature of the electronic device is regulated and the probability of the electronic device becoming too hot is reduced.
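As an illustration of the decision rule in this first aspect, the sketch below shows, in Java, how the second electronic device might choose between lowering the frame rate and lowering the code rate once its temperature reaches the first preset value. It is a minimal sketch under assumed names and step sizes; none of the classes, methods, or numeric values are taken from the patent.

```java
// Hypothetical sketch of the first-aspect decision logic; names and values are illustrative only.
public class ThermalCameraPolicy {

    private final double firstPresetTemperature; // temperature at which throttling begins
    private final int targetFrameRate;           // lowest frame rate considered acceptable

    public ThermalCameraPolicy(double firstPresetTemperature, int targetFrameRate) {
        this.firstPresetTemperature = firstPresetTemperature;
        this.targetFrameRate = targetFrameRate;
    }

    /**
     * Called whenever a new temperature reading is available.
     *
     * @param temperature      current temperature of the second electronic device
     * @param currentFrameRate frame rate at which the camera device collects images
     * @param currentBitrate   code rate at which images are transmitted, in bits per second
     * @param control          hypothetical handle used to apply the adjustment
     */
    public void onTemperature(double temperature, int currentFrameRate,
                              long currentBitrate, CameraControl control) {
        if (temperature < firstPresetTemperature) {
            return; // below the first preset value: no throttling needed
        }
        if (currentFrameRate > targetFrameRate) {
            // Frame rate still has headroom: reduce it first.
            control.setCaptureFrameRate(reduceFrameRate(currentFrameRate));
        } else {
            // Frame rate is already at or below the target: reduce the code rate instead.
            control.setTransmitBitrate(reduceBitrate(currentBitrate));
        }
    }

    private int reduceFrameRate(int frameRate) {
        // Step the frame rate down gradually, never below the target frame rate.
        return Math.max(targetFrameRate, frameRate - 5);
    }

    private long reduceBitrate(long bitrate) {
        // Reduce the code rate by a modest fraction to avoid a visibly blurred image.
        return (long) (bitrate * 0.8);
    }

    /** Hypothetical interface through which the adjustments are applied. */
    public interface CameraControl {
        void setCaptureFrameRate(int framesPerSecond);
        void setTransmitBitrate(long bitsPerSecond);
    }
}
```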
In a possible implementation, in the process of the second electronic device collecting images with the camera device: at a first moment, the temperature of the second electronic device is a first temperature, the frame rate at which the camera device collects images is the first frame rate, and the first frame rate is greater than the target frame rate; at a second moment, the temperature of the second electronic device is a second temperature and the frame rate at which the camera device collects images is a second frame rate, where the second moment is later than the first moment, the second temperature is higher than the first temperature, the second frame rate is lower than the first frame rate, and the second frame rate is higher than the target frame rate; at a third moment, the temperature of the second electronic device is a third temperature and the frame rate at which the camera device collects images is a third frame rate, where the third moment is later than the second moment, the third temperature is higher than the second temperature, the third frame rate is lower than the second frame rate, and the third frame rate is higher than or equal to the target frame rate. In this way, the frame rate is reduced step by step; each reduction is modest, the displayed interface does not become too choppy, and the user experience is affected as little as possible.
In a possible implementation, between the second moment and the third moment, the temperature of the second electronic device is a fourth temperature and the frame rate at which the camera device collects images is the second frame rate, where the fourth temperature is higher than the second temperature, the fourth temperature is lower than the third temperature, and both the fourth temperature and the second temperature fall within a first preset temperature interval. In this way, the frame rate does not need to be adjusted frequently with every change in temperature, so the second electronic device can transmit images to the first electronic device at a relatively stable frame rate.
In one possible implementation, the temperature level corresponding to the first temperature is a first level, the temperature level corresponding to the second temperature is a second level, the temperature level corresponding to the third temperature is a third level, and the temperature level corresponding to the fourth temperature is also the second level. In this way, fewer frame rate adjustment procedures are executed, and the occupation of memory resources is reduced.
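Reading these implementations together, the frame rate appears to be keyed to a temperature level rather than to each raw reading, so that readings falling within the same preset interval do not trigger a new adjustment. The following hedged sketch assumes invented level boundaries and frame-rate steps purely for illustration.

```java
// Illustrative only: maps a temperature reading to a level, and a level to a frame rate.
// The concrete thresholds and steps are assumptions, not values from the patent.
public final class FrameRateByLevel {

    private int lastLevel = -1; // last temperature level that was acted on

    /** Derives a temperature level from the raw reading (boundaries are hypothetical). */
    static int levelOf(double temperature) {
        if (temperature < 37.0) return 1;  // first level
        if (temperature < 39.0) return 2;  // second level (first preset temperature interval)
        return 3;                          // third level
    }

    /** Frame rate associated with each level; steps down gradually toward the target. */
    static int frameRateFor(int level, int firstFrameRate, int targetFrameRate) {
        switch (level) {
            case 1:  return firstFrameRate;                                 // e.g. 30 fps
            case 2:  return Math.max(targetFrameRate, firstFrameRate - 5);  // second frame rate
            default: return targetFrameRate;                                // third frame rate
        }
    }

    /** Returns a new frame rate only when the temperature level actually changes. */
    public Integer onTemperature(double temperature, int firstFrameRate, int targetFrameRate) {
        int level = levelOf(temperature);
        if (level == lastLevel) {
            return null; // same interval (e.g. the second and fourth temperature): keep the frame rate stable
        }
        lastLevel = level;
        return frameRateFor(level, firstFrameRate, targetFrameRate);
    }
}
```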
In a possible implementation, in the process of the second electronic device collecting images with the camera device: at a fourth moment, the temperature of the second electronic device is a fifth temperature, the frame rate at which the camera device collects images is the first frame rate, the code rate at which the second electronic device transmits images is a first code rate, and the first frame rate is greater than the target frame rate; at a fifth moment, the temperature of the second electronic device is a sixth temperature, the frame rate at which the camera device collects images is the target frame rate, and the code rate at which the second electronic device transmits images is a second code rate, where the fifth moment is later than the fourth moment, the sixth temperature is higher than the fifth temperature, and the second code rate is lower than the first code rate; at a sixth moment, the temperature of the second electronic device is a seventh temperature, the frame rate at which the camera device collects images is the target frame rate, and the code rate at which the second electronic device transmits images is a third code rate, where the sixth moment is later than the fifth moment, the seventh temperature is higher than the sixth temperature, and the third code rate is lower than the second code rate. In this way, the second electronic device first reduces the frame rate and only then reduces the code rate, which can reduce the probability of the interface becoming stuck and/or blurred.
In a possible implementation, between the fifth moment and the sixth moment, the temperature of the second electronic device is an eighth temperature, the frame rate at which the camera device collects images is the target frame rate, and the code rate at which the second electronic device transmits images is the second code rate, where the eighth temperature is higher than the sixth temperature, the eighth temperature is lower than the seventh temperature, and both the eighth temperature and the sixth temperature fall within a second preset temperature interval. In this way, the code rate does not need to be adjusted frequently with every change in temperature, so the second electronic device can transmit images to the first electronic device at a relatively stable code rate.
In a possible implementation, in the process of the second electronic device collecting images with the camera device: at a seventh moment, the temperature of the second electronic device is a ninth temperature, the frame rate at which the camera device collects images is the first frame rate, the code rate at which the second electronic device transmits images is the first code rate, and the first frame rate is less than or equal to the target frame rate; at an eighth moment, the temperature of the second electronic device is a tenth temperature, the frame rate at which the camera device collects images is the target frame rate, and the code rate at which the second electronic device transmits images is a fourth code rate, where the eighth moment is later than the seventh moment, the tenth temperature is higher than the ninth temperature, and the fourth code rate is lower than the first code rate; at a ninth moment, the temperature of the second electronic device is an eleventh temperature, the frame rate at which the camera device collects images is the target frame rate, and the code rate at which the second electronic device transmits images is a fifth code rate, where the ninth moment is later than the eighth moment, the eleventh temperature is higher than the tenth temperature, and the fifth code rate is lower than the fourth code rate. In this way, each reduction of the code rate is modest, the displayed interface does not become too blurred, and the user experience is affected as little as possible.
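Once the frame rate has reached the target frame rate, the same pattern applies to the code rate: further temperature increases step the code rate down by one interval at a time. The sketch below is again an assumption, with invented thresholds and reduction factors.

```java
// Hypothetical sketch: once the frame rate is at or below the target, further temperature
// increases are answered by stepping the code rate down instead. Values are illustrative.
public final class BitrateByLevel {

    private int lastLevel = -1;

    /** Temperature level derived from the second preset temperature interval (assumed bounds). */
    static int levelOf(double temperature) {
        if (temperature < 39.0) return 1;
        if (temperature < 41.0) return 2;
        return 3;
    }

    /**
     * Returns the code rate to apply, or null if the reading stays inside the same interval
     * (e.g. the sixth and eighth temperature), so transmission continues at a stable code rate.
     */
    public Long onTemperature(double temperature, long firstBitrate) {
        int level = levelOf(temperature);
        if (level == lastLevel) {
            return null;
        }
        lastLevel = level;
        switch (level) {
            case 1:  return firstBitrate;                 // e.g. the 5 Mbps reference code rate
            case 2:  return (long) (firstBitrate * 0.8);  // second / fourth code rate
            default: return (long) (firstBitrate * 0.6);  // third / fifth code rate
        }
    }
}
```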
In a possible implementation, when the temperature of the second electronic device drops to a second preset value in the process of collecting images with the camera device, if the frame rate at which the camera device collects images is lower than the first frame rate, the second electronic device increases the frame rate. In this way, the frame rate at which the camera collects images can be raised again, improving the picture quality.
In one possible implementation, the second electronic device increasing the frame rate includes: the second electronic device increases the frame rate to the first frame rate; or the second electronic device increases the frame rate to a fourth frame rate, where the fourth frame rate is lower than the first frame rate, and when the temperature of the second electronic device drops to a third preset value, the second electronic device increases the fourth frame rate to the first frame rate, where the third preset value and the second preset value belong to different temperature intervals; or the second electronic device increases the frame rate to the fourth frame rate, and when the temperature of the second electronic device remains less than or equal to the second preset value throughout a first preset period, the second electronic device increases the fourth frame rate to the first frame rate. In this way, after the second electronic device raises the frame rate, the frame rate of the image data received by the first electronic device also rises, improving the user experience.
In a possible implementation, in the process of the second electronic device collecting images with the camera device, when the temperature of the second electronic device drops to a fourth preset value, if the code rate at which the second electronic device transmits images is lower than the first code rate, the second electronic device increases the code rate. In this way, the code rate at which images are transmitted can be raised again, improving the picture quality.
In one possible implementation, the second electronic device increasing the code rate includes: the second electronic device increases the code rate to the first code rate; or the second electronic device increases the code rate to a sixth code rate, where the sixth code rate is lower than the first code rate, and when the temperature of the second electronic device drops to a fifth preset value, the second electronic device increases the sixth code rate to the first code rate, where the fifth preset value and the fourth preset value belong to different temperature intervals; or the second electronic device increases the code rate to the sixth code rate, and when the temperature of the second electronic device remains less than or equal to the fourth preset value throughout a second preset period, the second electronic device increases the sixth code rate to the first code rate. In this way, after the second electronic device raises the code rate, the code rate of the image data received by the first electronic device also rises, improving the user experience.
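The recovery path mirrors the throttling path. The sketch below illustrates the staged variant for the frame rate (raise it part-way to a fourth frame rate, then restore the first frame rate once the device has stayed at or below the second preset value for a first preset period); the code-rate recovery would be analogous. All thresholds, durations, and names are assumptions.

```java
// Illustrative recovery logic for the frame rate; the code-rate recovery is analogous.
public final class FrameRateRecovery {

    private final double secondPresetValue;   // temperature below which recovery starts
    private final long firstPresetDurationMs; // how long it must stay cool before full restore
    private final int firstFrameRate;         // original frame rate, e.g. 30 fps
    private final int fourthFrameRate;        // intermediate frame rate, less than firstFrameRate

    private long coolSinceMs = -1;            // when the temperature first dropped below the preset

    public FrameRateRecovery(double secondPresetValue, long firstPresetDurationMs,
                             int firstFrameRate, int fourthFrameRate) {
        this.secondPresetValue = secondPresetValue;
        this.firstPresetDurationMs = firstPresetDurationMs;
        this.firstFrameRate = firstFrameRate;
        this.fourthFrameRate = fourthFrameRate;
    }

    /** Returns the frame rate to apply for this reading. */
    public int onTemperature(double temperature, int currentFrameRate, long nowMs) {
        if (temperature > secondPresetValue) {
            coolSinceMs = -1;            // still warm: keep the throttled frame rate
            return currentFrameRate;
        }
        if (coolSinceMs < 0) {
            coolSinceMs = nowMs;         // first reading at or below the second preset value
        }
        if (currentFrameRate >= firstFrameRate) {
            return firstFrameRate;       // nothing left to recover
        }
        if (nowMs - coolSinceMs >= firstPresetDurationMs) {
            return firstFrameRate;       // cool for the whole preset period: restore fully
        }
        return Math.max(currentFrameRate, fourthFrameRate); // raise part-way first
    }
}
```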
In a possible implementation, before the first electronic device invokes the camera device of the second electronic device, the method further includes: the first electronic device displays an interface of a target application, where the interface includes identifications of one or more cameras, and the identifications of the one or more cameras include the identification of the camera device of the second electronic device. The first electronic device invoking the camera device of the second electronic device includes: the first electronic device invokes the camera device of the second electronic device in response to an operation of selecting the identification of the camera device of the second electronic device. In this way, the user can use the camera device of the second electronic device on the first electronic device for a video conference, video teaching, or the like, which improves the user experience.
In a second aspect, embodiments of the present application provide a communication system that includes a first electronic device and a second electronic device. Wherein the first electronic device is configured to perform the method performed by the first electronic device in the first aspect or any one of the possible implementation manners of the first aspect. The second electronic device is configured to perform the method performed by the second electronic device in the first aspect or any one of the possible implementations of the first aspect.
The first electronic device is configured to invoke the camera device of the second electronic device and to display an interface of the target application; specifically, the first electronic device invokes the camera device of the second electronic device in response to an operation of selecting the identification of the camera device of the second electronic device.
The second electronic device is configured to collect images by using the camera device, transmit the images to the first electronic device, reduce the frame rate at which the camera device collects images, and reduce the code rate at which the images are transmitted.
The second electronic device is also configured to increase the frame rate and also to increase the code rate.
In a third aspect, embodiments of the present application provide an electronic device, including a processor and a memory, where the memory is configured to store code instructions, and where the processor is configured to execute the code instructions to perform a method performed by the first electronic device in the first aspect or any one of the possible implementations of the first aspect, or to perform a method performed by the second electronic device in the first aspect or any one of the possible implementations of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, in which a computer program or instructions are stored which, when run on a computer, cause the computer to perform the method performed by the first electronic device in the first aspect or any one of the possible implementations of the first aspect, or perform the method performed by the second electronic device in the first aspect or any one of the possible implementations of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method performed by the first electronic device in the first aspect or any one of the possible implementations of the first aspect, or to perform the method performed by the second electronic device in the first aspect or any one of the possible implementations of the first aspect.
In a sixth aspect, the present application provides a chip or chip system, including at least one processor and a communication interface, where the communication interface and the at least one processor are interconnected by a line, and the at least one processor is configured to run a computer program or instructions to perform the method performed by the first electronic device, or the method performed by the second electronic device, in the first aspect or any one of the possible implementations of the first aspect. The communication interface in the chip may be an input/output interface, a pin, a circuit, or the like.
In one possible implementation, the chip or chip system described above in the present application further includes at least one memory, where the at least one memory stores instructions. The memory may be a storage unit inside the chip, such as a register or a cache, or may be a storage unit of the chip or chip system (for example, a read-only memory or a random access memory).
It should be understood that, the second aspect to the sixth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the beneficial effects obtained by each aspect and the corresponding possible embodiments are similar, and are not repeated.
Drawings
Fig. 1 is a schematic diagram of a shared camera scene provided in an embodiment of the present application;
fig. 2 is a schematic diagram of a camera function provided in an embodiment of the present application;
fig. 3 is a schematic structural diagram of a first electronic device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a second electronic device according to an embodiment of the present application;
fig. 5 is a schematic software structure of an electronic device according to an embodiment of the present application;
FIG. 6 is a diagram of interaction between software modules of a computer and a mobile phone according to an embodiment of the present application;
FIG. 7 is an interactive schematic diagram of a temperature control method according to an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of a temperature control method according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of a chip according to an embodiment of the present application.
Detailed Description
In order to clearly describe the technical solutions of the embodiments of the present application, the following briefly describes some terms and techniques involved in the embodiments of the present application:
1. Terminology
In the embodiments of the present application, the words "first," "second," and the like are used to distinguish between identical or similar items that have substantially the same function and effect. For example, the first chip and the second chip are merely distinguished as different chips, and their order is not limited. Those skilled in the art will appreciate that the words "first," "second," and the like do not limit a quantity or an execution order, and do not indicate that the items they modify are necessarily different.
It should be noted that, in the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following" or similar expressions refer to any combination of these items, including any combination of a single item or plural items. For example, at least one of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may be singular or plural.
In scenes such as a video conference or video lecture, a user may use the camera of electronic device B on electronic device A, which may be referred to as a shared camera scene. In the following, electronic device A is taken as a computer and electronic device B as a mobile phone, and a video conference scene in which a user uses the camera of the mobile phone on the computer is described.
As shown in fig. 1, the computer 100 may be connected to the mobile phone 101, and the computer 100 may display an interface 1001 of an application through which a video conference can be held. The interface 1001 of the application may include functions related to the video conference, such as a mute function, a camera function 1002, an interaction function, a "more" function, and a sharing function, and the interface 1001 may further include an end sharing button, an end live button, and the like. It is understood that the application may provide functions such as turning the camera on and off, but the content displayed in the interface 1001 may differ between applications, which is not limited in the embodiments of the present application.
The camera function 1002 may include turning the camera on and turning the camera off, and the camera function 1002 may also have a corresponding camera selection menu 1003. For example, as shown in fig. 2, the application interface may display an icon and text for turning on the camera 200, and in response to the user's operation of turning on the camera, the application interface may display an icon and text for turning off the camera 201.
In addition, in response to the user clicking the drop-down arrow 202, the application interface may display a camera selection menu 203. The camera selection menu 203 may include different types of cameras, such as a local camera, the front-facing camera 204 of the mobile phone, and the rear-facing camera 205 of the mobile phone, and the camera selection menu 203 may also provide functions such as beautification, virtual background, and video settings. It will be appreciated that the content displayed and the functions provided in the camera selection menu 203 may vary from application to application, and the embodiments of the present application are not limited thereto.
In response to the user's operation of selecting the front-facing camera 204 of the mobile phone, the selected front-facing camera 206 of the mobile phone may be displayed in the application interface. The interface display manner of the selected front-facing camera 206 may be customized by the application, which is not limited in the embodiments of the present application.
As shown in fig. 1, when the computer 100 responds to the user selecting the front camera of the mobile phone, the mobile phone 101 can open the front camera and use the front camera to perform a video conference with the computer 100.
However, while the computer is using the camera of the mobile phone, the system load of the mobile phone increases, which may cause the temperature of the mobile phone to rise and degrade the user experience.
In view of this, the temperature control method provided in the embodiments of the present application may, when the temperature of the electronic device whose camera is being used rises, reduce the frame rate at which the camera collects images and the code rate at which the electronic device transmits images, so that the temperature of the electronic device does not become too high.
The temperature control method provided by the embodiment of the application can be applied to a communication system, and the communication system can comprise a first electronic device and a second electronic device.
The first electronic device and the second electronic device may be devices of the same type or of different types. By way of example, the first electronic device and the second electronic device may each be any one of the following electronic devices: a mobile phone, a tablet, a palmtop computer, a notebook computer, a personal computer (PC), a mobile internet device (MID), a wearable device, a virtual reality (VR) device, an augmented reality (AR) device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, a cellular phone, a cordless phone, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA), a handheld device with a wireless communication function, a computing device or another processing device connected to a wireless modem, or an electronic device in a 5G network or in a future evolved public land mobile network (PLMN); this is not limited in the embodiments of the present application.
By way of example and not limitation, in the embodiments of the present application, the electronic device may also be a wearable device. A wearable device, also called a wearable intelligent device, is a general term for devices that apply wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not merely a hardware device; it can also deliver powerful functions through software support, data interaction, and cloud interaction. In a broad sense, wearable intelligent devices include devices that are full-featured and large-sized and can implement complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on a certain type of application function and need to be used together with other devices such as smartphones, for example, various smart bracelets and smart jewelry for vital sign monitoring.
In addition, in the embodiments of the present application, the electronic device may also be an electronic device in an internet of things (internet of things, IoT) system. IoT is an important component of future information technology development; its main technical feature is connecting things to a network through communication technology, thereby realizing an intelligent network of human-machine interconnection and interconnection of things.
The electronic device in the embodiment of the application may also be referred to as: a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, a user agent, or a user equipment, etc.
In the embodiments of the present application, the electronic device or each network device includes a hardware layer, an operating system layer running above the hardware layer, and an application layer running above the operating system layer. The hardware layer includes hardware such as a central processing unit (central processing unit, CPU), a memory management unit (memory management unit, MMU), and a memory (also referred to as a main memory). The operating system may be any one or more computer operating systems that implement service processing through processes, such as a Linux operating system, a Unix operating system, an Android operating system, an iOS operating system, or a Windows operating system. The application layer includes applications such as a browser, an address book, word processing software, and instant messaging software.
The first electronic device may include the computer 100 in fig. 1, and the second electronic device may include the mobile phone 101 in fig. 1, and the second electronic device may also be a tablet or other electronic device, which is not limited in this embodiment. It will be appreciated that this example does not constitute a limitation on the first electronic device and the second electronic device.
Fig. 3 is a schematic structural diagram of a first electronic device according to an embodiment of the present application.
The first electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, a wireless communication module 150, an audio module 160, a speaker 160A, a receiver 160B, a microphone 160C, an earphone interface 160D, keys 170, an indicator 171, a camera 172, a display 173, and the like.
It should be understood that the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the first electronic device. In other embodiments of the present application, the first electronic device may include more or fewer components than shown, or may combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. The controller can be a neural center and a command center of the first electronic device, and can generate operation control signals according to the instruction operation codes and the time sequence signals to complete instruction fetching and instruction execution control.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a USB interface, among others.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present invention is only illustrative, and does not limit the structure of the first electronic device. In other embodiments of the present application, the first electronic device may also use different interfacing manners in the foregoing embodiments, or a combination of multiple interfacing manners.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the first electronic device (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the first electronic device and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The camera 172 is used to capture still images or videos. In some embodiments, the first electronic device may include 1 or N cameras 172, where N is a positive integer greater than 1.
The first electronic device implements display functions through the GPU, the display 173, the application processor, and the like. The GPU is a microprocessor for image processing and is connected to the display 173 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information. The first electronic device may implement a photographing function through the ISP, the camera 172, the video codec, the GPU, the display 173, the application processor, and the like.
By way of example, fig. 4 shows a schematic structural diagram of a second electronic device.
The second electronic device may include a processor 210, an external memory interface 220, an internal memory 221, a usb interface 230, a charge management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, keys 290, a motor 291, an indicator 292, a camera 293, a display 294, and a subscriber identity module (subscriber identification module, SIM) card interface 295, etc. The sensor module 280 may include a pressure sensor 280A, a gyroscope sensor 280B, a barometric sensor 280C, a magnetic sensor 280D, an acceleration sensor 280E, a distance sensor 280F, a proximity sensor 280G, a fingerprint sensor 280H, a temperature sensor 280J, a touch sensor 280K, an ambient light sensor 280L, a bone conduction sensor 280M, and the like.
It should be understood that the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the second electronic device. In other embodiments of the present application, the second electronic device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units such as, for example: processor 210 may include an AP, a modem processor, a GPU, an ISP, a controller, a video codec, a DSP, a baseband processor, and/or an NPU, etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache memory. The memory may hold instructions or data that the processor 210 has just used or recycled. If the processor 210 needs to reuse the instruction or data, it may be called directly from the memory. Repeated accesses are avoided and the latency of the processor 210 is reduced, thereby improving the efficiency of the system.
In some embodiments, processor 210 may include one or more interfaces. The interfaces may include an I2C interface, an I2S interface, a PCM interface, a UART interface, MIPI, a GPIO interface, a SIM card interface, and/or a USB interface, among others.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present invention is only illustrative, and does not limit the structure of the second electronic device. In other embodiments of the present application, the second electronic device may also use different interfacing manners in the foregoing embodiments, or a combination of multiple interfacing manners.
The internal memory 221 may be used to store computer executable program code that includes instructions. The internal memory 221 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the second electronic device (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, UFS, and the like. The processor 210 performs various functional applications of the second electronic device and data processing by executing instructions stored in the internal memory 221 and/or instructions stored in a memory provided in the processor. For example, the methods of embodiments of the present application may be performed.
The camera 293 is used to capture still images or video. In some embodiments, the second electronic device may include 1 or N cameras 293, N being a positive integer greater than 1.
The display 294 is used to display images, videos, and the like. The display 294 includes a display panel. In some embodiments, the second electronic device may include 1 or N displays 294, where N is a positive integer greater than 1.
The second electronic device implements display functions through the GPU, the display 294, the application processor, and the like. The GPU is a microprocessor for image processing and is connected to the display 294 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 210 may include one or more GPUs that execute program instructions to generate or change display information. The second electronic device may implement a photographing function through the ISP, the camera 293, the video codec, the GPU, the display 294, the application processor, and the like.
It will be appreciated that some of the specific details presented above with respect to the first electronic device and the second electronic device may not be required, and that the specific structures of the first electronic device and the second electronic device are not limited by the embodiments of the present application.
In this embodiment of the present application, the first electronic device and the second electronic device may use the same operating system, or may use different operating systems, where the operating systems may include any of the following operating systems, for example: linux operating system, unix operating system, android operating system, iOS operating system or windows operating system, etc.
Fig. 5 illustrates a part of the software architecture of the electronic device by using the Android system as the software system of the second electronic device. The software system may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
The layered architecture divides the software into several layers, each with a clear role and division of labour. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, which are, from top to bottom, an application layer, an application framework layer, the Android runtime and system libraries, a hardware abstraction layer, and a kernel layer.
The application layer may include a series of application packages. The application packages may include applications such as phone, music, calendar, camera, games, memo, and video. Applications may include system applications and three-party applications.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
The application framework layer may include a window manager, resource manager, notification manager, content provider, and view system, among others. In the embodiment of the application program framework layer, the application program framework layer can further comprise a shared camera service, a temperature control service, a virtualization service, a communication service and the like.
The window manager is used for managing window programs. The window manager may obtain the display screen size, determine if there is a status bar, lock screen, touch screen, drag screen, intercept screen, etc.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The content provider is used for realizing the function of data sharing among different application programs, allowing one program to access the data in the other program, and simultaneously ensuring the safety of the accessed data.
The view system may be responsible for interface rendering and event handling for the application.
The shared camera service may be used to perform camera related business processes, e.g., the shared camera service may interact with the communication service in signaling, e.g., signaling to start or shut down the camera; the shared camera service may also perform camera enablement through the virtualization service, synchronize camera status, and so on.
The temperature control service may provide a temperature change interface, through which the temperature and/or temperature level of the electronic device can be called back when the temperature level of the electronic device changes.
The virtualization service can be used for realizing the call of cameras among devices, and can also be used for adjusting the frame rate of images acquired by the cameras and the code rate of images transmitted by the electronic device according to temperature adjustment information issued by the shared camera service. The virtualization service may also communicate camera data and signaling with the communication service.
The communication service can transmit camera data between two devices, for example, the camera data can comprise a frame rate of images acquired by a camera, a code rate of images transmitted by electronic devices, contents shot by the camera and the like; the communication service may also interact with the shared camera service and/or the virtualization service in signaling, e.g., the communication service may interact with the shared camera service in signaling including camera startup or shutdown, etc., and the communication service may interact with the virtualization service in signaling including parameter negotiation, etc.
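The division of labour among these services can be pictured with a small sketch: the shared camera service decides what should change and hands temperature adjustment information to the virtualization service, which applies it to the camera pipeline and forwards camera data to the communication service. The interfaces and fields below are hypothetical and are not an actual Android or vendor API.

```java
// Hypothetical shape of the temperature adjustment information passed from the shared
// camera service to the virtualization service; not an actual Android or vendor API.
public final class TemperatureAdjustment {
    public final Integer newFrameRate;   // frames per second, or null to leave unchanged
    public final Long newBitrate;        // bits per second, or null to leave unchanged

    public TemperatureAdjustment(Integer newFrameRate, Long newBitrate) {
        this.newFrameRate = newFrameRate;
        this.newBitrate = newBitrate;
    }
}

/** Hypothetical interface exposed by the virtualization service. */
interface VirtualizationService {
    /** Applies the adjustment to the capture frame rate and/or the transmission code rate. */
    void applyTemperatureAdjustment(TemperatureAdjustment adjustment);

    /** Forwards camera data (frames plus current frame rate and code rate) to the communication service. */
    void sendCameraData(byte[] encodedFrame, int frameRate, long bitrate);
}
```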
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that need to be invoked by the Java language, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like. For example, in the embodiment of the present application, the virtual machine may be used to perform functions of turning on or off the camera, detecting the temperature of the electronic device, and adjusting the frame rate of the image acquired by the camera and/or the code rate of the image transmission.
The system library may also be referred to as Native layer, which may include a plurality of functional modules. For example: media libraries (media libraries), function libraries (function libraries), graphics processing libraries (e.g., openGL ES), etc.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The function library provides multiple service API interfaces for the developer, and is convenient for the developer to integrate and realize various functions quickly.
The graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The hardware abstraction layer is an abstraction layer between the kernel layer and the Android runtime. The hardware abstraction layer may be an encapsulation of the hardware drivers, providing a unified interface for upper-layer applications to call.
The kernel layer is a layer between hardware and software. The kernel layer may include bluetooth drivers, wireless fidelity (wireless fidelity, wi-Fi) technology drivers, display drivers, camera drivers, audio drivers, battery drivers, central processor drivers, USB drivers, and the like.
It should be noted that the embodiments of the present application are described by taking the Android system only as an example; in other operating systems (such as a Windows system, an iOS system, etc.), the solutions of the present application can also be implemented as long as the functions implemented by the respective functional modules are similar to those in the embodiments of the present application.
For convenience of description, the embodiment of the application describes a temperature control method by taking a shared camera scene of a user using a mobile phone camera on a computer as an example.
The computer side may be referred to as a user, and the mobile phone side may be referred to as a used party. The user and the used party can each include various functional modules for realizing the shared camera, for example, the shared camera service, the virtualization service, the communication service, the temperature control service and the like.
For convenience of description, the shared camera service of the user will be referred to as a first shared camera service, the virtualized service of the user will be referred to as a first virtualized service, and the communication service of the user will be referred to as a first communication service. The shared camera service of the used party is referred to as a second shared camera service, the virtualized service of the used party is referred to as a second virtualized service, and the communication service of the used party is referred to as a second communication service.
Fig. 6 shows a diagram of the interaction of software modules between a computer and a cell phone.
The application layer of the user may include three-party applications capable of holding a video conference, and the user can select the camera of the used party in a certain three-party application. Taking the case where the user selects the front-facing camera of the used party as an example, in response to the operation of opening the front-facing camera of the used party, the information of opening the front-facing camera of the used party can be transmitted to the first shared camera service in the application framework layer.
After the first shared camera service receives the information of opening the front-facing camera of the used party, on the one hand, the first shared camera service can perform camera enabling and camera state synchronization through the first virtualization service.
Camera enabling may include enabling the camera of the user and enabling the camera of the used party. Camera enabling can be understood as the first virtualization service identifying and mounting a camera: when the first virtualization service identifies the camera of the user or of the used party and mounts it for the user, the camera is considered to have the capability of being used.
The first virtualization service may perform camera-enabling data transmission with the second virtualization service through a data channel established between the first communication service and the second communication service. When the second virtualization service successfully enables the camera, it can transmit the information of the successful camera enabling to the first virtualization service through the data channel between the first communication service and the second communication service; the first virtualization service can return callback information of the successful enabling of the used party's camera to the first shared camera service, and the first shared camera service can return the callback information to the three-party application, so that the interface of the three-party application can display the camera of the used party. For example, the cameras of the used party displayed in the interface of the three-party application may include the front-facing camera 204 of the mobile phone and the rear-facing camera 205 of the mobile phone in fig. 2 described above.
The synchronization of the camera states can comprise state synchronization of camera enabling, and can also comprise state synchronization of camera opening, camera opening success, camera closing success and the like in the use process of the camera.
On the other hand, the first shared camera service may exchange control signaling with the first communication service, where the control signaling may include selecting the front-facing or rear-facing camera, turning the camera on or off, and so on. For example, the first shared camera service may pass the information of opening the front-facing camera of the used party to the first communication service.
The first communication service may transmit camera data between the user and the used party, and the first communication service may transmit the information of opening the front-facing camera of the used party to the second communication service of the used party through the Bluetooth module or Wi-Fi module of the hardware abstraction layer and the Bluetooth driver or Wi-Fi driver of the kernel layer.
In the application framework layer of the used party, the second communication service can report the information of opening the front-facing camera of the used party to the second shared camera service, the second shared camera service can transmit the information to the second virtualization service, and the second virtualization service starts the camera driver by calling the camera enabling interface of the hardware abstraction layer. When the camera of the used party is turned on, the second virtualization service may transmit the result of the camera being turned on to the second communication service through the second shared camera service.
The second communication service may transmit camera data between the user and the used party, and return the result of the camera start to the first communication service of the user. The first communication service can report the result to the first shared camera service, the first shared camera service can return the result to the three-party application, and the three-party application can display in its interface information such as the icon and text indicating that the camera has been started.
It can be appreciated that the second virtualization service may also transmit the result of the camera start to the second communication service, and the second communication service returns the result of the camera start to the first communication service. The first communication service may report the result of the camera start to the first virtualization service. That is, the transmission of camera data and the interaction of signaling may be performed between the first virtualization service and the first communication service without passing through the first shared camera service. The transmitted camera data may include the frame rate at which the camera collects images, the code rate at which the used party transmits images, the content photographed by the camera, and the like, and the interactive signaling may include parameter negotiation, camera selection, camera capability enabling, and the like.
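The open-camera round trip on the used party side can be summarised as a small handler: the second communication service reports the request, the second shared camera service forwards it, and the second virtualization service drives the camera through the hardware abstraction layer before the result travels back over the data channel. The sketch below is a simplification with invented names.

```java
// Illustrative handler on the used party (mobile phone) side; all names are hypothetical.
public final class UsedPartyCameraController {

    private final SharedCameraService sharedCameraService;     // second shared camera service
    private final VirtualizationService virtualizationService; // second virtualization service
    private final CommunicationService communicationService;   // second communication service

    public UsedPartyCameraController(SharedCameraService scs,
                                     VirtualizationService vs,
                                     CommunicationService cs) {
        this.sharedCameraService = scs;
        this.virtualizationService = vs;
        this.communicationService = cs;
    }

    /** Invoked when the communication service receives "open the front camera" from the user side. */
    public void onOpenFrontCamera() {
        // The shared camera service performs the camera-related business processing...
        sharedCameraService.prepareFrontCamera();
        // ...and the virtualization service starts the camera driver via the HAL enabling interface.
        boolean started = virtualizationService.enableCamera(/* facingFront= */ true);
        // The result is returned to the user side over the data channel between the
        // second and first communication services.
        communicationService.sendCameraStartResult(started);
    }

    // Hypothetical collaborator interfaces, shown only to make the sketch self-contained.
    interface SharedCameraService { void prepareFrontCamera(); }
    interface VirtualizationService { boolean enableCamera(boolean facingFront); }
    interface CommunicationService { void sendCameraStartResult(boolean started); }
}
```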
It will be appreciated that when a three-way application is started, the first shared camera service, the first virtualization service, the first communication service, the second virtualization service, the second communication service, etc. may start running, so that the camera of the user or the used party may be enabled. When the interface of the three-party application can see the camera icon and/or the camera text, the camera is enabled successfully, and the camera has the capability of being used. When the used party receives the information of opening the camera, the second shared camera service and the temperature control service can start to operate. In this way, the second shared camera service and the temperature control service start to operate again when the camera is used, so that the power consumption of a user can be saved.
When the used party receives the information of turning off the camera, the second shared camera service and the temperature control service can stop running. In this way, the second shared camera service and the temperature control service operate only while the camera is in use, which can save the power consumption of the used party. When the three-party application is closed, the first shared camera service, the first virtualization service, the first communication service, the second virtualization service, the second communication service, and the like may cease to operate.
The implementation flow of the temperature control method according to the embodiment of the present application is described below with reference to fig. 7. It should be noted that the computer and the mobile phone perform data interaction based on the communication channel between the first communication service and the second communication service, and the implementation flow in fig. 7 does not show the interaction between the first communication service and the second communication service.
1. Start the camera sharing service.
When a user uses the camera of the mobile phone in the three-party application of the computer, the three-party application can send the information of using the camera to the mobile phone through the first camera sharing service, the first communication service, and the like of the computer, and the mobile phone can start the second camera sharing service after receiving the information. For the specific process of the computer sending the information of using the camera to the mobile phone, refer to the description in the embodiment corresponding to fig. 6, and details are not repeated.
2. Report the camera initialization frame rate and the camera initialization resolution.
After the second camera sharing service is started, the three-party application can report the camera initialization frame rate and the camera initialization resolution to the first virtualization service through the first camera sharing service. It will be appreciated that different applications may correspond to different camera initialization frame rates and camera initialization resolutions.
The first virtualization service transmits the camera initialization frame rate and the camera initialization resolution to the second virtualization service based on a communication channel between the first communication service and the second communication service, and the second virtualization service can report the camera initialization frame rate and the camera initialization resolution to the second camera sharing service. It can be appreciated that the second camera sharing service may convert the camera initialization resolution into a camera initialization code rate, and further, the second camera sharing service may store the camera initialization frame rate and the camera initialization code rate for subsequent use. The initialization frame rate may be referred to as a reference frame rate or a base frame rate, and the initialization code rate may be referred to as a reference code rate or a base code rate.
Illustratively, the camera initialization frame rate may be 30 frames per second (fps), the camera initialization resolution may be 1080P, and the second camera sharing service may convert the camera initialization resolution to a camera initialization code rate, for example, the camera initialization code rate may be 5 megabits per second (Mbps).
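As an illustration of the conversion described above, the following is a minimal sketch of how an initialization resolution might be mapped to an initialization code rate. Only the 1080P-to-5 Mbps pair comes from the example above; the other table entries, the class name, and the method name are assumptions for illustration, not part of the embodiment.

```java
import java.util.Map;

public class InitCodeRate {
    // Hypothetical resolution-to-code-rate table, in kbps.
    private static final Map<String, Integer> RESOLUTION_TO_KBPS = Map.of(
            "720P", 2500,   // assumed value
            "1080P", 5000,  // 5 Mbps, as in the example above
            "2160P", 12000  // assumed value
    );

    /** Returns the initialization (reference) code rate in kbps for an initialization resolution. */
    public static int toInitCodeRateKbps(String resolution) {
        Integer kbps = RESOLUTION_TO_KBPS.get(resolution);
        if (kbps == null) {
            throw new IllegalArgumentException("Unknown resolution: " + resolution);
        }
        return kbps;
    }

    public static void main(String[] args) {
        System.out.println(toInitCodeRateKbps("1080P") + " kbps"); // 5000 kbps = 5 Mbps
    }
}
```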
3. Register a temperature monitoring callback.
In order to timely adjust the frame rate of the image collected by the camera or the code rate of the image transmitted by the mobile phone when the temperature of the mobile phone is increased or decreased, the second camera sharing service can register the temperature monitoring callback interface with the temperature control service. When the temperature of the mobile phone shell changes, the second camera sharing service can acquire the temperature and/or the temperature grade of the mobile phone shell through the callback interface. In this embodiment of the present application, the temperature of the mobile phone shell may be understood as the surface temperature of the mobile phone, the temperature of the mobile phone shell may also be referred to as the shell temperature or the front shell temperature, the frame rate at which the camera collects the image may be referred to as the frame rate, and the code rate at which the image is transmitted by the mobile phone may be referred to as the code rate.
In a possible implementation of the temperature control service acquiring the shell temperature, the temperature sensor of the mobile phone may report the shell temperature to the temperature control service, or the temperature control service may periodically call a relevant interface to detect the shell temperature, which is not limited in the embodiment of the present application.
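The following is a minimal sketch, in Java, of the callback registration described above, assuming a single registered callback and simplified types. The interface, class, and method names are hypothetical and do not correspond to a real system API.

```java
public class TemperatureCallbackDemo {

    /** Hypothetical callback interface exposed by the temperature control service. */
    interface TemperatureCallback {
        void onTemperatureLevelChanged(int level, float shellTemperatureCelsius);
    }

    /** Hypothetical temperature control service holding a single registered callback. */
    static class TemperatureControlService {
        private TemperatureCallback callback;

        void registerCallback(TemperatureCallback cb) {
            this.callback = cb;
        }

        /** Called when a new shell temperature reading and its level are available (simplified). */
        void onShellTemperature(float celsius, int level) {
            if (callback != null) {
                callback.onTemperatureLevelChanged(level, celsius);
            }
        }
    }

    public static void main(String[] args) {
        TemperatureControlService service = new TemperatureControlService();
        // The second camera sharing service registers to be notified of level changes.
        service.registerCallback((level, temp) ->
                System.out.println("level=" + level + ", shell temperature=" + temp + " C"));
        service.onShellTemperature(43.0f, 3); // e.g. the shell temperature rises to 43 C -> level 3
    }
}
```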
It is understood that the temperature control service may adjust the corresponding temperature level according to different changes in temperature. For example, table 1 below may represent the correspondence of the shell temperature to the temperature level.
TABLE 1
Entry temperature | Back-off temperature | Temperature level
Shell temperature 37 ℃ | Front shell temperature 35 ℃ | 1
Shell temperature 40 ℃ | Front shell temperature 38 ℃ | 2
Shell temperature 43 ℃ | Front shell temperature 41 ℃ | 3
Shell temperature 48 ℃ | Front shell temperature 46 ℃ | 4
The entry temperature is understood to mean that, when the shell temperature increases to a certain temperature, the corresponding temperature level also increases accordingly. For example, when the shell temperature increases to 37 ℃, the corresponding temperature level is 1, and when the shell temperature increases to 40 ℃, the corresponding temperature level increases to 2. When the shell temperature is less than 37 ℃, the corresponding temperature level is 0, and it can be understood that the temperature level 0 can be a default temperature level of the mobile phone.
The back-off temperature is understood to mean that when the shell temperature decreases to a certain temperature, the corresponding temperature level decreases accordingly. For example, when the shell temperature is reduced to 41 ℃, the corresponding temperature level is 2, that is, the temperature level falls back to a lower level; similarly, when the shell temperature is reduced to 38 ℃, the corresponding temperature level is reduced to 1.
It may be understood that the entry temperature, the back-off temperature, and the temperature level may be customized by the temperature control service, and the values of the entry temperature, the back-off temperature, and the temperature level are not limited in this embodiment. In addition, the change in the shell temperature and the change in the temperature level may not be positively correlated; for example, the temperature level may decrease as the shell temperature increases, and increase as the shell temperature decreases. The specific shell temperature and temperature level changes are not limited in this application.
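As a sketch of how Table 1 might be applied, the following code tracks the temperature level with hysteresis: the level rises only when an entry temperature is reached and falls back only when a back-off temperature is reached. The threshold values come from Table 1; the class and method names are assumed for illustration only.

```java
public class TemperatureLevelTracker {
    private static final float[] ENTRY = {37f, 40f, 43f, 48f};     // entry temperatures for levels 1..4
    private static final float[] BACK_OFF = {35f, 38f, 41f, 46f};  // back-off temperatures for levels 1..4

    private int level = 0; // default level when the shell temperature is below 37 C

    /** Updates and returns the temperature level for a new shell temperature reading. */
    public int update(float shellTemperature) {
        // Rise while the next entry temperature has been reached.
        while (level < 4 && shellTemperature >= ENTRY[level]) {
            level++;
        }
        // Fall back while the current level's back-off temperature has been reached.
        while (level > 0 && shellTemperature <= BACK_OFF[level - 1]) {
            level--;
        }
        return level;
    }

    public static void main(String[] args) {
        TemperatureLevelTracker tracker = new TemperatureLevelTracker();
        System.out.println(tracker.update(36f)); // 0
        System.out.println(tracker.update(40f)); // 2
        System.out.println(tracker.update(43f)); // 3
        System.out.println(tracker.update(41f)); // 2 (back-off)
        System.out.println(tracker.update(38f)); // 1 (back-off)
    }
}
```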
4. Callback of temperature level 3.
When the second camera sharing service obtains callback information of temperature level 3, the corresponding shell temperature has risen to 43 ℃, which indicates that the current temperature of the mobile phone is relatively high and the user experience may be affected. Thus, the second camera sharing service may initiate the cooling strategy of temperature level 3.
5. Start the temperature level 3 cooling strategy.
In possible implementations, the cooling strategy may include (1) reducing the frame rate and/or (2) reducing the code rate.
(1) Reduce the frame rate.
The second camera sharing service may reduce the frame rate. Specifically, when the reference frame rate is greater than the preset minimum frame rate, the second camera sharing service may reduce the frame rate to 2/3 of the reference frame rate. If the temperature level continues to rise, the second camera sharing service may reduce the frame rate to 1/2 of the reference frame rate. Different electronic devices may have different preset minimum frame rate values, for example, the preset minimum frame rate may be 15fps, and the value of the preset minimum frame rate is not limited in the embodiment of the present application.
It should be noted that the reduced frame rate must not be lower than the preset minimum frame rate; that is, if the reduced frame rate would be smaller than the preset minimum frame rate, the second camera sharing service may set the frame rate to the preset minimum frame rate. In this way, the smoothness of the picture can be basically maintained, so that the user experience is not degraded too much.
Illustratively, taking the reference frame rate of 30fps and the preset minimum frame rate of 15fps as an example, the cooling strategy may be executed when the second camera sharing service obtains that the temperature level increases from the temperature level 2 to the temperature level 3. Since the reference frame rate is greater than the preset minimum frame rate, the second camera sharing service may reduce the frame rate to around 20 fps. If the temperature level continues to rise, after a rise from temperature level 3 to temperature level 4, the second camera sharing service may reduce the frame rate to around 15 fps.
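The frame rate part of the cooling strategy can be sketched as follows, assuming the 2/3 and 1/2 reduction steps and the 15 fps preset minimum from the example above. The names and the step counter are illustrative assumptions, not an actual implementation of the second camera sharing service.

```java
public class FrameRateCooling {
    static final int MIN_FPS = 15;       // preset minimum frame rate (example value)
    static final int REFERENCE_FPS = 30; // reference (initialization) frame rate

    /** Returns the frame rate to use after the cooling strategy has been triggered 'steps' times. */
    static int cooledFrameRate(int referenceFps, int steps) {
        if (referenceFps <= MIN_FPS || steps <= 0) {
            return referenceFps; // nothing to reduce at the frame rate stage
        }
        int fps = (steps == 1) ? referenceFps * 2 / 3 : referenceFps / 2;
        return Math.max(fps, MIN_FPS); // never fall below the preset minimum frame rate
    }

    public static void main(String[] args) {
        System.out.println(cooledFrameRate(REFERENCE_FPS, 1)); // level 2 -> 3: 20 fps
        System.out.println(cooledFrameRate(REFERENCE_FPS, 2)); // level 3 -> 4: 15 fps
    }
}
```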
(2) Reduce the code rate.
When the frame rate is less than or equal to the preset minimum frame rate, the second camera sharing service may reduce the code rate. The case where the frame rate is less than or equal to the preset minimum frame rate may include: (a) the reference frame rate is less than or equal to the preset minimum frame rate; (b) the second camera sharing service reduces the frame rate to the preset minimum frame rate; and the like.
(a) The reference frame rate is less than or equal to a preset minimum frame rate.
When the reference frame rate is less than or equal to the preset minimum frame rate, the second camera sharing service can keep the frame rate unchanged, and the code rate is reduced to 2/3 of the reference code rate. If the temperature level continues to rise, the second camera sharing service may reduce the code rate to 1/2 of the reference code rate.
Illustratively, taking the example that the reference frame rate is 15fps, the preset minimum frame rate is 15fps, and the reference code rate is 2.5Mbps, when the second camera sharing service obtains that the temperature level is increased from the temperature level 2 to the temperature level 3, the cooling strategy may be executed. Because the reference frame rate is equal to the preset minimum frame rate, the second camera sharing service can keep the frame rate unchanged, and the code rate is reduced to about 1.7 Mbps. If the temperature level continues to rise, after the temperature level rises from 3 to 4, the second camera sharing service may reduce the code rate to about 1.2 Mbps.
(b) The second camera sharing service reduces the frame rate to a preset minimum frame rate.
When the second camera sharing service has reduced the frame rate to the preset minimum frame rate, but the temperature level continues to rise or the shell temperature continues to rise within the first preset time period, the second camera sharing service can keep the frame rate unchanged and reduce the code rate to 2/3 of the reference code rate. If the shell temperature continues to rise within the first preset time period, the second camera sharing service can reduce the code rate to 1/2 of the reference code rate. The first preset time period may be defined by the second camera sharing service, for example, the first preset time period may be 5 minutes, and the value of the first preset time period is not limited in this embodiment of the present application.
For example, taking the example that the reference frame rate is 30fps, the preset minimum frame rate is 15fps, the reference code rate is 5Mbps, and the first preset time period is 5 minutes, when the temperature level increases from temperature level 3 to temperature level 4, the second camera sharing service reduces the frame rate to 15fps, and the code rate decreases correspondingly due to the reduction of the frame rate; therefore, the code rate may fall to about 2.5 Mbps. If the shell temperature continues to rise within 5 minutes, the second camera sharing service can reduce the code rate to about 1.7 Mbps. If the shell temperature is still rising after another 5 minutes, the second camera sharing service can reduce the code rate to about 1.2 Mbps.
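Similarly, the code rate part of the cooling strategy can be sketched as follows, assuming that once the frame rate has reached the preset minimum, successive triggers reduce the code rate to 2/3 and then 1/2 of a base code rate. Whether the base is the reference code rate or, as in the example above, the code rate reached after the frame rate reduction is an assumption of this sketch; all names are illustrative.

```java
public class CodeRateCooling {
    static final int MIN_FPS = 15; // preset minimum frame rate (example value)

    /** Returns the code rate (kbps) after 'steps' code rate reduction triggers. */
    static int cooledCodeRate(int baseKbps, int steps) {
        if (steps <= 0) {
            return baseKbps;
        }
        return (steps == 1) ? baseKbps * 2 / 3 : baseKbps / 2;
    }

    public static void main(String[] args) {
        int currentFps = MIN_FPS; // the frame rate can no longer be reduced
        int baseKbps = 2500;      // about 2.5 Mbps, as in the example above
        if (currentFps <= MIN_FPS) {
            System.out.println(cooledCodeRate(baseKbps, 1) + " kbps"); // ~1666 kbps, about 1.7 Mbps
            System.out.println(cooledCodeRate(baseKbps, 2) + " kbps"); // 1250 kbps, about 1.2 Mbps
        }
    }
}
```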
It may be understood that the above values, reducing the frame rate to 2/3 or 1/2 of the reference frame rate and reducing the code rate to 2/3 or 1/2 of the reference code rate, may be set based on empirical values, and the second camera sharing service may also reduce the frame rate or the code rate to other values.
By reducing the frame rate first to 2/3 and then to 1/2 of the reference frame rate, and/or reducing the code rate first to 2/3 and then to 1/2 of the reference code rate, the second camera sharing service achieves a gradual reduction of the frame rate or the code rate. The frame rate or code rate is not reduced by a large amount at once, the displayed interface does not become too stuttered and/or too blurry, and the user experience is affected as little as possible.
Of course, the temperature control method of the embodiment of the present application may also reduce the frame rate to the minimum value in one step, for example to 1/2 of the reference frame rate, and/or reduce the code rate to the minimum value in one step, for example to 1/2 of the reference code rate. The temperature control method of the embodiment of the application may also reduce the frame rate and the code rate in more steps, which is not limited.
In the above implementation, the code rate may be reduced when the frame rate can no longer be reduced. This is because when the frame rate is reduced, the code rate is also reduced, and thus it is not necessary to reduce the code rate while reducing the frame rate. If the frame rate and the code rate were reduced at the same time, there would be a high probability that the interface appears stuttered and/or blurred, resulting in a poor user experience.
Of course, the temperature control method of the embodiment of the present application may also achieve simultaneous reduction of the frame rate and the code rate, or reduce the code rate first and then reduce the frame rate, without limitation. After the second camera sharing service executes the cooling strategy, the second virtualization service can reduce the load of the encoder and the data transmission quantity, so that the frame rate and the code rate of the image data received by the computer are also reduced, and the temperature of the mobile phone can be reduced.
When the temperature of the mobile phone decreases, the temperature control method can further increase the frame rate of the image collected by the camera and the code rate of the image transmitted by the mobile phone, thereby improving the image quality.
6. Callback of temperature level 2.
When the second camera sharing service obtains callback information indicating that the temperature level has fallen back to 2, the corresponding shell temperature has decreased to 41 ℃, and the temperature of the mobile phone is no longer too high, so the second camera sharing service can (1) increase the frame rate and/or (2) increase the code rate.
7. Increase the frame rate and the code rate after the temperature decreases.
(1) Increase the frame rate.
If the second camera sharing service previously adopted the scheme of reducing the frame rate, it may now increase the frame rate.
In one possible implementation of increasing the frame rate, the second camera sharing service may first increase the frame rate to 2/3 of the reference frame rate, and if the temperature level continues to decrease, the second camera sharing service may increase the frame rate to the reference frame rate.
Illustratively, taking a reference frame rate of 30fps and a current frame rate of 15fps as an example, if the temperature level drops, for example, from temperature level 3 to temperature level 2, the second camera sharing service may increase the frame rate from 15fps to around 20 fps. If the temperature level continues to drop, from temperature level 2 to temperature level 1, the second camera sharing service may increase the frame rate from 20fps to 30fps.
In another possible implementation of increasing the frame rate, the second camera sharing service may increase the frame rate to the reference frame rate in one step.
Illustratively, taking a reference frame rate of 30fps and a current frame rate of 15fps as an example, if the temperature level drops, for example, from temperature level 3 to temperature level 2, the second camera sharing service may increase the frame rate from 15fps to 30fps.
In yet another possible implementation of increasing the frame rate, after increasing the frame rate to 2/3 of the reference frame rate, the second camera sharing service may increase the frame rate to the reference frame rate if the shell temperature does not rise within a second preset time period. The second preset time period may be defined by the second camera sharing service, for example, the second preset time period may be 3 minutes, and the first preset time period and the second preset time period may be the same or different; the value of the second preset time period is not limited in the embodiment of the present application.
Illustratively, taking the reference frame rate of 30fps, the current frame rate of 15fps, and the second preset time period of 3 minutes as an example, if the temperature level drops, for example, from the temperature level 3 to the temperature level 2, the second camera sharing service may increase the frame rate from 15fps to about 20 fps. If the shell temperature does not rise within 3 minutes, the second camera sharing service may increase the frame rate from 20fps to 30fps after 3 minutes.
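The three frame rate recovery options described above can be sketched as follows, assuming the 30 fps reference frame rate and the 3-minute second preset time period from the examples (shortened in the demo so it finishes quickly). All names are illustrative assumptions.

```java
import java.util.concurrent.TimeUnit;

public class FrameRateRecovery {
    static final int REFERENCE_FPS = 30;

    /** Options 1 and 3: first recovery step, e.g. 15 fps -> about 20 fps. */
    static int stepUp(int referenceFps) {
        return referenceFps * 2 / 3;
    }

    /** Option 2: one-step recovery to the reference frame rate. */
    static int restore(int referenceFps) {
        return referenceFps;
    }

    public static void main(String[] args) throws InterruptedException {
        int fps = stepUp(REFERENCE_FPS); // temperature level fell from 3 to 2 -> about 20 fps
        System.out.println("stepped up to " + fps + " fps");

        // Option 3: if the shell temperature does not rise within the second preset time period
        // (3 minutes in the example; shortened here so the demo finishes quickly), restore fully.
        Thread.sleep(TimeUnit.SECONDS.toMillis(1));
        boolean shellTemperatureRose = false; // would come from the temperature monitoring callback
        if (!shellTemperatureRose) {
            fps = restore(REFERENCE_FPS);
            System.out.println("restored to " + fps + " fps");
        }
    }
}
```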
(2) Increase the code rate.
If the second camera sharing service previously adopted the scheme of reducing the code rate, it may now increase the code rate.
In one possible implementation of increasing the code rate, the second camera sharing service may first increase the code rate to 2/3 of the reference code rate, and if the temperature level continues to decrease, the second camera sharing service may increase the code rate to the reference code rate.
For example, taking the reference code rate of 5Mbps and the current code rate of 2.5Mbps as an example, if the temperature level drops, for example, from the temperature level 3 to the temperature level 2, the second camera sharing service may increase the code rate from 2.5Mbps to about 3.3 Mbps. If the temperature level continues to drop, the second camera sharing service may increase the code rate from 3.3Mbps to 5Mbps after dropping from temperature level 2 to temperature level 1.
In another possible implementation of increasing the code rate, the second camera sharing service may increase the code rate to the reference code rate in one step.
Illustratively, taking the reference code rate of 5Mbps and the current code rate of 2.5Mbps as an example, if the temperature level drops, for example, from the temperature level 3 to the temperature level 2, the second camera sharing service may increase the code rate from 2.5Mbps to 5Mbps.
In yet another possible implementation of increasing the code rate, after increasing the code rate to 2/3 of the reference code rate, the second camera sharing service may increase the code rate to the reference code rate if the shell temperature does not rise within the second preset time period.
For example, taking the reference code rate of 5Mbps, the current code rate of 2.5Mbps, and the second preset time period of 3 minutes as an example, if the temperature level is reduced, for example, from the temperature level 3 to the temperature level 2, the second camera sharing service may increase the code rate from 2.5Mbps to about 3.3 Mbps. If the shell temperature does not rise within 3 minutes, the second camera sharing service may increase the code rate from 3.3Mbps to 5Mbps after 3 minutes.
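The code rate recovery is symmetric and can be sketched as follows, assuming a 5 Mbps reference code rate and the same step-up-then-restore behavior. All names and values are illustrative assumptions.

```java
public class CodeRateRecovery {
    /** First recovery step: raise the code rate to 2/3 of the reference code rate. */
    static int stepUp(int referenceKbps) {
        return referenceKbps * 2 / 3; // e.g. 5000 kbps reference -> about 3333 kbps
    }

    /** Final recovery step: restore the reference code rate. */
    static int restore(int referenceKbps) {
        return referenceKbps;
    }

    public static void main(String[] args) {
        int referenceKbps = 5000;           // 5 Mbps reference code rate
        int kbps = stepUp(referenceKbps);   // temperature level fell from 3 to 2
        System.out.println(kbps + " kbps"); // ~3333 kbps, about 3.3 Mbps
        kbps = restore(referenceKbps);      // level fell to 1, or the shell temperature stayed flat
        System.out.println(kbps + " kbps"); // 5000 kbps
    }
}
```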
It can be understood that, in the temperature control method of the embodiment of the present application, the frame rate may be first increased, and then the code rate may be increased. This is because the code rate increases as the frame rate increases, so it is not necessary to increase the code rate at the same time as the frame rate increases. If the frame rate is increased while the code rate is increased, the temperature of the mobile phone may be increased faster. Of course, the temperature control method of the embodiment of the application may also implement raising the frame rate and the code rate simultaneously, or raise the code rate first and then raise the frame rate, without limitation.
After the second camera sharing service increases the frame rate and/or the code rate, the second virtualization service can increase the encoder load and the amount of transmitted data, so that the frame rate and the code rate of the image data received by the computer also increase, improving the user experience.
It can be understood that, when the second camera sharing service obtains temperature level 3 again, the above cooling strategy may be executed again, which is not described again.
As can be seen from the above, according to the embodiment of the present application, under the condition that user experience is basically not affected, the frame rate of the image collected by the camera and the code rate of the image transmitted by the electronic device are adaptively adjusted based on the reference frame rate, the reference code rate, the temperature level, and the like, so that the temperature of the electronic device is adjusted and the probability that the temperature of the electronic device becomes too high is reduced.
The method according to the embodiment of the present application will be described in detail by way of specific examples. The following embodiments may be combined with each other or implemented independently, and the same or similar concepts or processes may not be described in detail in some embodiments.
Fig. 8 shows a temperature control method of an embodiment of the present application. The method comprises the following steps:
s801, the first electronic device calls a camera device of the second electronic device.
In this embodiment of the present application, the first electronic device and the second electronic device may be the same type of device, or may be different types of devices. The first electronic device may include the computer 100 in the embodiment corresponding to fig. 1, the second electronic device may include the mobile phone 101 in the embodiment corresponding to fig. 1, and the second electronic device may also include a tablet or other electronic devices, which are not limited in this embodiment.
S802, the second electronic equipment collects images through the camera device and transmits the images to the first electronic equipment.
In this embodiment of the present application, the process of transmitting the image from the second electronic device to the first electronic device may refer to the related description in the embodiment corresponding to fig. 7, which is not repeated.
S803, in the process that the second electronic device collects images by using the camera device, when the temperature of the second electronic device rises to a first preset value, if the first frame rate at which the camera device collects images is greater than the target frame rate, the second electronic device reduces the frame rate at which the camera device collects images; otherwise, if the first frame rate is less than or equal to the target frame rate, the second electronic device reduces the code rate at which the image is transmitted.
In this embodiment, the first preset value may be understood as a value indicating that the temperature of the second electronic device is relatively high; for example, the first preset value may be understood as the temperature interval value corresponding to temperature level 3 in the embodiment corresponding to fig. 7, and may also be understood as the temperature interval value corresponding to temperature level 4, which is not limited.
The first frame rate may be understood as an initialization frame rate at which the camera device captures an image, for example, the first frame rate may be understood as a reference frame rate in the embodiment corresponding to fig. 7 described above.
The target frame rate may be understood as a frame rate capable of basically maintaining the smoothness of the picture, and by way of example, the target frame rate may be understood as a preset minimum frame rate in the embodiment corresponding to fig. 7, for example, the target frame rate may be 15fps, and the specific value of the target frame rate is not limited in this embodiment.
The manner in which the second electronic device reduces the frame rate of the image acquired by the camera device and the manner in which the second electronic device reduces the code rate of the transmission image may refer to the related description in the embodiment corresponding to fig. 7, which is not repeated.
According to the method and the device of the embodiments of the present application, under the condition that user experience is basically not affected, the frame rate of the image collected by the camera and the code rate of the image transmitted by the electronic device can be adaptively adjusted based on the reference frame rate, the reference code rate, the temperature level, and the like, so that the temperature of the electronic device is adjusted and the probability that the temperature of the electronic device becomes too high is reduced.
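The decision in S803 can be sketched as follows, assuming the level-3 entry temperature as the first preset value and 15 fps as the target frame rate. The class and method names are illustrative and do not correspond to a real device API.

```java
public class Step803 {
    static final float FIRST_PRESET_TEMPERATURE = 43f; // e.g. the level-3 entry temperature
    static final int TARGET_FPS = 15;                   // target (preset minimum) frame rate

    /** Returns the action S803 would take for a shell temperature and the first frame rate. */
    static String onTemperature(float shellTemperature, int firstFrameRate) {
        if (shellTemperature < FIRST_PRESET_TEMPERATURE) {
            return "no action";
        }
        return (firstFrameRate > TARGET_FPS) ? "reduce frame rate" : "reduce code rate";
    }

    public static void main(String[] args) {
        System.out.println(onTemperature(43f, 30)); // reduce frame rate
        System.out.println(onTemperature(43f, 15)); // reduce code rate
        System.out.println(onTemperature(36f, 30)); // no action
    }
}
```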
Optionally, on the basis of the embodiment corresponding to fig. 8, in the process that the second electronic device acquires an image by using the camera device: at a first moment, the temperature of the second electronic device is a first temperature, the frame rate at which the camera device acquires the image is the first frame rate, and the first frame rate is greater than the target frame rate; at a second moment, the temperature of the second electronic device is a second temperature, and the frame rate at which the camera device acquires the image is a second frame rate, where the second moment is later than the first moment, the second temperature is higher than the first temperature, the second frame rate is lower than the first frame rate, and the second frame rate is higher than the target frame rate; at a third moment, the temperature of the second electronic device is a third temperature, and the frame rate at which the camera device acquires the image is a third frame rate, where the third moment is later than the second moment, the third temperature is higher than the second temperature, the third frame rate is lower than the second frame rate, and the third frame rate is higher than or equal to the target frame rate.
In this embodiment of the present application, the first time may be understood as a time when the temperature of the second electronic device is not too high, and the second electronic device does not reduce the frame rate. The second instant may be understood as an instant at which the temperature of the second electronic device increases and the second electronic device decreases the frame rate. The third instant may be understood as an instant at which the temperature of the second electronic device continues to rise and the second electronic device again decreases the frame rate.
The first temperature may be understood as a temperature value at which the temperature of the second electronic device is not too high, and when the temperature of the second electronic device is the first temperature, the second electronic device may not need to reduce the frame rate. For example, the first temperature may be a temperature value less than a first preset value.
The second temperature may be understood as a temperature value of the second electronic device after the temperature of the second electronic device is increased, and when the temperature of the second electronic device is the second temperature, the second electronic device may decrease the frame rate. For example, the second temperature may be understood as a temperature interval value corresponding to the temperature level 3 in the embodiment corresponding to fig. 7.
The third temperature may be understood as a temperature value of the second electronic device after the temperature of the second electronic device continues to increase, and when the temperature of the second electronic device is the third temperature, the second electronic device may decrease the frame rate again. For example, the third temperature may be understood as a temperature interval value corresponding to the temperature level 4 in the embodiment corresponding to fig. 7.
The second frame rate may be understood as the frame rate after the second electronic device decreases the frame rate at the second temperature, for example, the second frame rate may be understood as 2/3 of the reference frame rate in the embodiment corresponding to fig. 7, or may be another frame rate value smaller than the reference frame rate, which is not limited. The third frame rate may be understood as the frame rate after the second electronic device decreases the frame rate again at the third temperature, for example, the third frame rate may be understood as 1/2 of the reference frame rate in the embodiment corresponding to fig. 7, or may be another frame rate value smaller than the second frame rate, which is not limited.
In this embodiment of the present application, the second electronic device reduces the frame rate to the second frame rate and then to the third frame rate, so that the effect of gradually reducing the frame rate can be achieved; the reduction in the frame rate is not very large, the displayed interface does not become too stuttered, and the user experience is affected as little as possible.
Optionally, on the basis of the embodiment corresponding to fig. 8, between the second time and the third time, the temperature of the second electronic device is a fourth temperature, and the frame rate at which the camera device collects the image is a second frame rate, where the fourth temperature is greater than the second temperature, the fourth temperature is less than the third temperature, and both the fourth temperature and the second temperature belong to the first preset temperature interval.
In this embodiment, the first preset temperature interval may be understood as a temperature interval value corresponding to the temperature level 3 in the embodiment corresponding to fig. 7, or may be a temperature interval value corresponding to another temperature level, which is not limited.
Although the fourth temperature is greater than the second temperature, the temperature of the second electronic device increases, and the fourth temperature and the second temperature both belong to the same preset temperature interval, so the second electronic device may not reduce the frame rate. In this way, the second electronic device may not need to frequently adjust the frame rate according to the change in temperature, so that the second electronic device may transmit images to the first electronic device at a relatively stable frame rate.
Optionally, on the basis of the embodiment corresponding to fig. 8, the temperature level corresponding to the first temperature is a first level, the temperature level corresponding to the second temperature is a second level, the temperature level corresponding to the third temperature is a third level, and the temperature level corresponding to the fourth temperature is a second level.
In this embodiment, the first level may be understood as temperature level 1 or temperature level 2 in the embodiment corresponding to fig. 7, the second level may be understood as temperature level 3 in the embodiment corresponding to fig. 7, and the third level may be understood as temperature level 4 in the embodiment corresponding to fig. 7.
The second electronic device adjusts the frame rate based on changes in the temperature level, so that the frame rate does not need to be adjusted frequently according to changes in the temperature value; this reduces the number of frame rate adjustment procedures executed by the second electronic device and reduces the occupation of memory resources.
Optionally, on the basis of the embodiment corresponding to fig. 8, in the process that the second electronic device acquires an image by using the camera device: at a fourth moment, the temperature of the second electronic device is a fifth temperature, the frame rate at which the camera device acquires the image is the first frame rate, the code rate at which the second electronic device transmits the image is a first code rate, and the first frame rate is greater than the target frame rate; at a fifth moment, the temperature of the second electronic device is a sixth temperature, the frame rate at which the camera device acquires the image is the target frame rate, and the code rate at which the second electronic device transmits the image is a second code rate, where the fifth moment is later than the fourth moment, the sixth temperature is higher than the fifth temperature, and the second code rate is lower than the first code rate; at a sixth moment, the temperature of the second electronic device is a seventh temperature, the frame rate at which the camera device acquires the image is the target frame rate, and the code rate at which the second electronic device transmits the image is a third code rate, where the sixth moment is later than the fifth moment, the seventh temperature is higher than the sixth temperature, and the third code rate is lower than the second code rate.
In this embodiment of the present application, the fourth time may be understood as a time when the temperature of the second electronic device is not too high, and the frame rate is not reduced or the code rate is not reduced by the second electronic device. The fifth time may be understood as a time when the temperature of the second electronic device increases and the second electronic device decreases the frame rate. The sixth time may be understood as a time when the second electronic device decreases the code rate when the temperature of the second electronic device continues to increase.
The fifth temperature may be understood as a temperature value at which the temperature of the second electronic device is not too high, and when the temperature of the second electronic device is the fifth temperature, the second electronic device may not need to reduce the frame rate or reduce the code rate. For example, the fifth temperature may be a temperature value less than the first preset value. The fifth temperature and the first temperature may be the same or different, and are not limited.
The sixth temperature may be understood as a temperature value of the second electronic device after the temperature of the second electronic device is increased, and when the temperature of the second electronic device is the sixth temperature, the second electronic device may decrease the frame rate. For example, the sixth temperature may be understood as a temperature interval value corresponding to the temperature level 3 in the embodiment corresponding to fig. 7. The sixth temperature and the second temperature may be the same or different, and are not limited.
The seventh temperature may be understood as a temperature value of the second electronic device after the temperature of the second electronic device continues to increase, and when the temperature of the second electronic device is the seventh temperature, the second electronic device may reduce the code rate. For example, the seventh temperature may be understood as a temperature interval value corresponding to the temperature level 4 in the embodiment corresponding to fig. 7. The seventh temperature and the third temperature may be the same or different, and are not limited.
The first code rate may be understood as an initialization code rate of the transmission image, for example, the first code rate may be understood as a reference code rate in the embodiment corresponding to fig. 7 described above.
The second code rate may be understood as the code rate reached when the code rate decreases correspondingly as the second electronic device decreases the frame rate.
The third code rate may be understood as the code rate after the second electronic device reduces the code rate at the seventh temperature, for example, the third code rate may be understood as 2/3 of the reference code rate or 1/2 of the reference code rate in the embodiment corresponding to fig. 7, or may be other code rate values smaller than the reference code rate, which is not limited.
In the embodiment of the application, the second electronic device reduces the frame rate first and then reduces the code rate, and at this time, the code rate is also reduced when the frame rate is reduced, so that the probability of the occurrence of a stuck and/or blurred interface can be reduced.
Optionally, on the basis of the embodiment corresponding to fig. 8, between the fifth time and the sixth time, the temperature of the second electronic device is an eighth temperature, the frame rate at which the camera device collects the image is the target frame rate, and the code rate at which the second electronic device transmits the image is the second code rate, where the eighth temperature is greater than the sixth temperature, the eighth temperature is less than the seventh temperature, and both the eighth temperature and the sixth temperature belong to the second preset temperature interval.
In this embodiment, the second preset temperature interval may be understood as a temperature interval value corresponding to the temperature level 3 in the embodiment corresponding to fig. 7, or may be a temperature interval value corresponding to another temperature level, where the second preset temperature interval may be the same as or different from the first preset temperature interval, and is not limited.
Although the eighth temperature is greater than the sixth temperature, the temperature of the second electronic device increases, and the eighth temperature and the sixth temperature both belong to the same preset temperature interval, so the second electronic device may not reduce the code rate. In this way, the second electronic device may not need to frequently adjust the code rate according to the change in temperature, so that the second electronic device may transmit an image to the first electronic device at a relatively stable code rate.
Optionally, on the basis of the embodiment corresponding to fig. 8, in the process that the second electronic device acquires an image by using the camera device: at a seventh moment, the temperature of the second electronic device is a ninth temperature, the frame rate at which the camera device acquires the image is the first frame rate, the code rate at which the second electronic device transmits the image is the first code rate, and the first frame rate is less than or equal to the target frame rate; at an eighth moment, the temperature of the second electronic device is a tenth temperature, the frame rate at which the camera device acquires the image is the target frame rate, and the code rate at which the second electronic device transmits the image is a fourth code rate, where the eighth moment is later than the seventh moment, the tenth temperature is higher than the ninth temperature, and the fourth code rate is lower than the first code rate; at a ninth moment, the temperature of the second electronic device is an eleventh temperature, the frame rate at which the camera device acquires the image is the target frame rate, and the code rate at which the second electronic device transmits the image is a fifth code rate, where the ninth moment is later than the eighth moment, the eleventh temperature is higher than the tenth temperature, and the fifth code rate is lower than the fourth code rate.
In this embodiment of the present application, the seventh time may be understood as a time when the temperature of the second electronic device is not too high, and the second electronic device does not reduce the code rate. The eighth time may be understood as a time when the temperature of the second electronic device increases and the second electronic device decreases the code rate. The ninth time may be understood as a time when the temperature of the second electronic device continues to rise and the second electronic device again decreases the code rate.
The ninth temperature may be understood as a temperature value of the second electronic device, where the temperature of the second electronic device is not too high, and when the temperature of the second electronic device is the ninth temperature, the second electronic device may not need to reduce the code rate. For example, the ninth temperature may be a temperature value less than the first preset value. The ninth temperature and the fifth temperature may be the same or different, and are not limited.
The tenth temperature may be understood as a temperature value of the second electronic device after the temperature of the second electronic device is increased, and when the temperature of the second electronic device is the tenth temperature, the second electronic device may reduce the code rate. For example, the tenth temperature may be understood as a temperature section value corresponding to the temperature level 3 in the embodiment corresponding to fig. 7. The tenth temperature and the sixth temperature may be the same or different, and are not limited.
The eleventh temperature may be understood as a temperature value of the second electronic device after the temperature of the second electronic device continues to increase, and when the temperature of the second electronic device is the eleventh temperature, the second electronic device may reduce the code rate again. For example, the eleventh temperature may be understood as a temperature section value corresponding to the temperature level 4 in the embodiment corresponding to fig. 7. The eleventh temperature and the seventh temperature may be the same or different, and are not limited.
The fourth code rate may be understood as the code rate after the second electronic device reduces the code rate at the tenth temperature, for example, the fourth code rate may be understood as 2/3 of the reference code rate in the embodiment corresponding to fig. 7, or may be other code rate values smaller than the reference code rate, which is not limited.
The fifth code rate may be understood as the code rate after the second electronic device reduces the code rate at the eleventh temperature, for example, the fifth code rate may be understood as 1/2 of the reference code rate in the embodiment corresponding to fig. 7, or may be other code rate values smaller than the fourth code rate, which is not limited.
In the embodiment of the application, the second electronic device reduces the code rate to the fourth code rate first and then to the fifth code rate, so that the effect of gradually reducing the code rate can be achieved; the reduction in the code rate is not very large, the displayed interface does not become too blurry, and the user experience is affected as little as possible.
Optionally, on the basis of the embodiment corresponding to fig. 8, in a process that the second electronic device collects an image by using the camera device, when the temperature of the second electronic device is reduced to a second preset value, if the frame rate at which the camera device collects the image is smaller than the first frame rate, the second electronic device increases the frame rate.
In this embodiment of the present application, the second preset value may be understood as a value of the second electronic device with a relatively low temperature, for example, the second preset value may be understood as a temperature interval value corresponding to the temperature level 2 in the embodiment corresponding to fig. 7, and may also be understood as a temperature interval value corresponding to the level 1, which is not limited.
When the temperature of the second electronic device decreases, the frame rate of the image acquired by the camera can be increased, thereby improving the image quality.
Optionally, on the basis of the embodiment corresponding to fig. 8, the raising the frame rate by the second electronic device may include: the second electronic device increasing the frame rate to the first frame rate; or the second electronic equipment increases the frame rate to a fourth frame rate, the fourth frame rate is smaller than the first frame rate, and when the temperature of the second electronic equipment is reduced to a third preset value, the second electronic equipment increases the fourth frame rate to the first frame rate, and the third preset value and the second preset value belong to different temperature intervals; alternatively, the second electronic device increases the frame rate to a fourth frame rate; and when the temperature of the second electronic equipment is less than or equal to a second preset value in the first preset time period, the second electronic equipment increases the fourth frame rate to the first frame rate.
In this embodiment of the present application, the fourth frame rate may be understood as a frame rate after the second electronic device increases the frame rate when the temperature of the second electronic device decreases to the second preset value, for example, the fourth frame rate may be understood as 2/3 of the reference frame rate in the embodiment corresponding to fig. 7, or may be other frame rate values smaller than the first frame rate, which is not limited.
The third preset value may be understood as a value of relatively low temperature of the second electronic device, and the third preset value may be smaller than the second preset value, for example, the third preset value may be understood as a temperature interval value corresponding to the temperature level 1 in the embodiment corresponding to fig. 7, which is not limited.
The first preset time period may be understood as a time period preset by the second electronic device, and exemplary, the first preset time period may be understood as a second preset time period in the embodiment corresponding to fig. 7, for example, the first preset time period may be 3 minutes, or may be other time periods, which is not limited.
For the manner in which the second electronic device increases the frame rate, refer to the related description of (1) increasing the frame rate in the embodiment corresponding to fig. 7, which is not described herein.
After the frame rate of the second electronic device is increased, the frame rate of the image data received by the first electronic device is also increased, so that user experience is improved.
Optionally, on the basis of the embodiment corresponding to fig. 8, in the process that the second electronic device collects the image by using the camera device, when the temperature of the second electronic device is reduced to a fourth preset value, if the code rate of the image transmitted by the second electronic device is smaller than the first code rate, the second electronic device increases the code rate.
In this embodiment of the present application, the fourth preset value may be understood as a value of the second electronic device with a relatively low temperature, for example, the fourth preset value may be understood as a temperature interval value corresponding to the temperature level 2 in the embodiment corresponding to fig. 7, or may be understood as a temperature interval value corresponding to the level 1, and the fourth preset value and the second preset value may be the same or different, and are not limited.
When the temperature of the second electronic device decreases, the code rate at which the second electronic device transmits the image can be increased, thereby improving the image quality.
Optionally, on the basis of the embodiment corresponding to fig. 8, raising the code rate by the second electronic device may include: the second electronic device increases the code rate to the first code rate; or the second electronic equipment increases the code rate to a sixth code rate, the sixth code rate is smaller than the first code rate, when the temperature of the second electronic equipment is reduced to a fifth preset value, the second electronic equipment increases the sixth code rate to the first code rate, and the fifth preset value and the fourth preset value belong to different temperature intervals; or the second electronic equipment increases the code rate to the sixth code rate, and when the temperature of the second electronic equipment is smaller than or equal to the fourth preset value in the second preset time period, the second electronic equipment increases the sixth code rate to the first code rate.
In this embodiment of the present application, the sixth code rate may be understood as the code rate after the second electronic device increases the code rate when the temperature of the second electronic device decreases to the fourth preset value; for example, the sixth code rate may be understood as 2/3 of the reference code rate in the embodiment corresponding to fig. 7, or may be another code rate value smaller than the first code rate, which is not limited.
The fifth preset value may be understood as a value of the second electronic device having a relatively low temperature, and the fifth preset value may be smaller than the fourth preset value, for example, the fifth preset value may be understood as a temperature interval value corresponding to the temperature level 1 in the embodiment corresponding to fig. 7, and the fifth preset value and the third preset value may be the same or different, and are not limited.
The second preset time period may be understood as a time period preset by the second electronic device, and exemplary, the second preset time period may be understood as a second preset time period in the embodiment corresponding to fig. 7, for example, the second preset time period may be 3 minutes, or may be other time periods, and the second preset time period may be the same as or different from the first preset time period, and is not limited.
For the manner in which the second electronic device increases the code rate, refer to the related description of (2) increasing the code rate in the embodiment corresponding to fig. 7, which is not repeated.
After the second electronic device increases the code rate, the code rate of the image data received by the first electronic device also increases, thereby improving the user experience.
Optionally, on the basis of the embodiment corresponding to fig. 8, before the first electronic device invokes the camera device of the second electronic device, the method may further include: the first electronic device displays an interface of a target application, where the interface includes the identification of one or more cameras, and the identification of the one or more cameras includes the identification of the camera device of the second electronic device. That the first electronic device invokes the camera device of the second electronic device includes: the first electronic device invokes the camera device of the second electronic device in response to an operation of selecting the identification of the camera device of the second electronic device.
In the embodiment of the application, the target application may be understood as an application capable of providing a function of a camera device of the second electronic device, for example, the target application may include an application capable of performing a video conference. The target application may be understood as the application displayed in the computer 100 in the embodiment corresponding to fig. 1.
The identification of one or more cameras may be understood as the different types of cameras provided in the target application interface; for example, the identification of one or more cameras may be understood as the local camera, the front camera 204 of the mobile phone, the rear camera 205 of the mobile phone, and the like in the embodiment corresponding to fig. 1, which are not described herein.
The interface of the target application comprises the identification of one or more cameras and can provide related functions of sharing the cameras for the user, so that the user can use the camera device of the second electronic device to carry out video conference or video teaching and the like on the first electronic device, and user experience is improved.
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or fully authorized by each party, and the collection, use and processing of the related data need to comply with the related laws and regulations and standards of the related country and region, and provide corresponding operation entries for the user to select authorization or rejection.
The foregoing description of the solution provided in the embodiments of the present application has been mainly presented from the perspective of the method. To achieve the above functions, the electronic device includes corresponding hardware structures and/or software modules for performing the respective functions. Those skilled in the art will readily appreciate that the various illustrative method steps described in connection with the embodiments disclosed herein may be implemented as hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or as computer-software-driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application may divide the functional modules of the apparatus implementing the method according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
Fig. 9 is a schematic structural diagram of a chip according to an embodiment of the present application. Chip 900 includes one or more (including two) processors 901, communication lines 902, communication interfaces 903, and memory 904.
In some implementations, the memory 904 stores the following elements: executable modules or data structures, or a subset thereof, or an extended set thereof.
The methods described in the embodiments of the present application may be applied to the processor 901 or implemented by the processor 901. The processor 901 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 901 or by instructions in the form of software. The processor 901 may be a general purpose processor (e.g., a microprocessor or a conventional processor), a digital signal processor (digital signal processing, DSP), an application specific integrated circuit (application specific integrated circuit, ASIC), a field-programmable gate array (field-programmable gate array, FPGA) or other programmable logic device, discrete gates, transistor logic, or discrete hardware components, and the processor 901 may implement or perform the methods, steps, and logic diagrams related to the processes disclosed in the embodiments of the present application.
The steps of the methods disclosed in connection with the embodiments of the present application may be directly embodied as being performed by a hardware decoding processor, or performed by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well established in the art, such as a random access memory, a read-only memory, a programmable read-only memory, or an electrically erasable programmable read-only memory (EEPROM). The storage medium is located in the memory 904, and the processor 901 reads the information in the memory 904 and completes the steps of the above method in combination with its hardware.
The processor 901, the memory 904, and the communication interface 903 may communicate via a communication line 902.
In the above embodiments, the instructions stored in the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written into the memory in advance, or may be downloaded and installed in the memory in the form of software.
Embodiments of the present application also provide a computer program product comprising one or more computer instructions. When the computer instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (for example, infrared, radio, or microwave). The computer-readable storage medium may be, for example, a semiconductor medium (for example, a solid state disk (SSD)) or the like.
Embodiments of the present application also provide a computer-readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer-readable media may include computer storage media and communication media, and may include any medium that can transfer a computer program from one place to another. The storage medium may be any target medium that can be accessed by a computer.
As one possible design, the computer-readable medium may include a compact disc read-only memory (CD-ROM) or other optical disc storage, a RAM, a ROM, or an EEPROM; the computer-readable medium may also include magnetic disk storage or other magnetic storage devices. Moreover, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.

Claims (16)

1. A method of temperature control for a communication system, the system comprising a first electronic device and a second electronic device, the second electronic device comprising a camera device, the method comprising:
the first electronic device invokes the camera device of the second electronic device;
the second electronic device collects images by using the camera device and transmits the images to the first electronic device;
when the temperature of the second electronic device increases to a first preset value in the process of the second electronic device acquiring images by using the camera device, if a first frame rate at which the camera device acquires the images is greater than a target frame rate, the second electronic device reduces the frame rate at which the camera device acquires the images; otherwise, if the first frame rate is less than or equal to the target frame rate, the second electronic device reduces the code rate for transmitting the images.
2. The method of claim 1, wherein, in the process of the second electronic device acquiring images by using the camera device:
at a first moment, the temperature of the second electronic device is a first temperature, the frame rate of the image acquired by the camera device is the first frame rate, and the first frame rate is greater than the target frame rate;
at a second moment, the temperature of the second electronic device is a second temperature, and the frame rate of the image acquired by the camera device is a second frame rate, wherein the second moment is later than the first moment, the second temperature is greater than the first temperature, the second frame rate is less than the first frame rate, and the second frame rate is greater than the target frame rate;
and at a third moment, the temperature of the second electronic device is a third temperature, and the frame rate of the image acquired by the camera device is a third frame rate, wherein the third moment is later than the second moment, the third temperature is greater than the second temperature, the third frame rate is less than the second frame rate, and the third frame rate is greater than or equal to the target frame rate.
3. The method of claim 2, wherein, at a moment between the second moment and the third moment,
the temperature of the second electronic device is a fourth temperature, and the frame rate of the image acquired by the camera device is the second frame rate, wherein the fourth temperature is greater than the second temperature, the fourth temperature is less than the third temperature, and both the fourth temperature and the second temperature belong to a first preset temperature interval.
4. The method of claim 3, wherein the first temperature corresponds to a first level, the second temperature corresponds to a second level, the third temperature corresponds to a third level, and the fourth temperature corresponds to the second level.
5. The method of claim 1, wherein, in the process of the second electronic device acquiring images by using the camera device:
at a fourth moment, the temperature of the second electronic device is a fifth temperature, the frame rate of the image acquired by the camera device is the first frame rate, the code rate of the image transmitted by the second electronic device is a first code rate, and the first frame rate is greater than the target frame rate;
at a fifth moment, the temperature of the second electronic device is a sixth temperature, the frame rate of the image acquired by the camera device is the target frame rate, and the code rate of the image transmitted by the second electronic device is a second code rate, wherein the fifth moment is later than the fourth moment, the sixth temperature is greater than the fifth temperature, and the second code rate is less than the first code rate;
and at a sixth moment, the temperature of the second electronic device is a seventh temperature, the frame rate of the image acquired by the camera device is the target frame rate, and the code rate of the image transmitted by the second electronic device is a third code rate, wherein the sixth moment is later than the fifth moment, the seventh temperature is greater than the sixth temperature, and the third code rate is less than the second code rate.
6. The method of claim 5, wherein, at a moment between the fifth moment and the sixth moment,
the temperature of the second electronic device is an eighth temperature, the frame rate of the image acquired by the camera device is the target frame rate, and the code rate of the image transmitted by the second electronic device is the second code rate, wherein the eighth temperature is greater than the sixth temperature, the eighth temperature is less than the seventh temperature, and both the eighth temperature and the sixth temperature belong to a second preset temperature interval.
7. The method of claim 1, wherein, in the process of the second electronic device acquiring images by using the camera device:
at a seventh moment, the temperature of the second electronic device is a ninth temperature, the frame rate of the image acquired by the camera device is the first frame rate, the code rate of the image transmitted by the second electronic device is a first code rate, and the first frame rate is less than or equal to the target frame rate;
at an eighth moment, the temperature of the second electronic device is a tenth temperature, the frame rate of the image acquired by the camera device is the target frame rate, and the code rate of the image transmitted by the second electronic device is a fourth code rate, wherein the eighth moment is later than the seventh moment, the tenth temperature is greater than the ninth temperature, and the fourth code rate is less than the first code rate;
and at a ninth moment, the temperature of the second electronic device is an eleventh temperature, the frame rate of the image acquired by the camera device is the target frame rate, and the code rate of the image transmitted by the second electronic device is a fifth code rate, wherein the ninth moment is later than the eighth moment, the eleventh temperature is greater than the tenth temperature, and the fifth code rate is less than the fourth code rate.
8. The method of any one of claims 1-7, wherein, in the process of the second electronic device acquiring images by using the camera device, when the temperature of the second electronic device decreases to a second preset value, if the frame rate at which the camera device acquires images is less than the first frame rate, the second electronic device increases the frame rate.
9. The method of claim 8, wherein the second electronic device increasing the frame rate comprises:
the second electronic device increasing the frame rate to the first frame rate;
or, the second electronic device increases the frame rate to a fourth frame rate, wherein the fourth frame rate is less than the first frame rate; and when the temperature of the second electronic device decreases to a third preset value, the second electronic device increases the fourth frame rate to the first frame rate, wherein the third preset value and the second preset value belong to different temperature intervals;
or, the second electronic device increases the frame rate to the fourth frame rate; and when the temperature of the second electronic device is less than or equal to the second preset value within a first preset time period, the second electronic device increases the fourth frame rate to the first frame rate.
10. The method of any one of claims 5-9, wherein, in the process of the second electronic device acquiring images by using the camera device, when the temperature of the second electronic device decreases to a fourth preset value, if the code rate of the image transmitted by the second electronic device is less than the first code rate, the second electronic device increases the code rate.
11. The method of claim 10, wherein the second electronic device increasing the code rate comprises:
the second electronic device increases the code rate to the first code rate;
or, the second electronic device increases the code rate to a sixth code rate, wherein the sixth code rate is less than the first code rate; and when the temperature of the second electronic device decreases to a fifth preset value, the second electronic device increases the sixth code rate to the first code rate, wherein the fifth preset value and the fourth preset value belong to different temperature intervals;
or, the second electronic device increases the code rate to the sixth code rate; and when the temperature of the second electronic device is less than or equal to the fourth preset value within a second preset time period, the second electronic device increases the sixth code rate to the first code rate.
12. The method of any one of claims 1-11, wherein, before the first electronic device invokes the camera device of the second electronic device, the method further comprises:
the first electronic device displays an interface of a target application, wherein the interface comprises identifiers of one or more cameras, and the identifiers of the one or more cameras comprise an identifier of the camera device of the second electronic device;
and the first electronic device invoking the camera device of the second electronic device comprises: in response to an operation of selecting the identifier of the camera device of the second electronic device, the first electronic device invokes the camera device of the second electronic device.
13. An electronic device, comprising: a memory for storing a computer program, wherein, when the computer program is executed, the electronic device performs the method performed by the first electronic device in any one of claims 1-12, or performs the method performed by the second electronic device in any one of claims 1-12.
14. A computer-readable storage medium storing instructions that, when executed, cause a computer to perform the method performed by the first electronic device in any one of claims 1-12, or the method performed by the second electronic device in any one of claims 1-12.
15. A computer program product comprising a computer program that, when run, causes an electronic device to perform the method performed by the first electronic device in any one of claims 1-12, or the method performed by the second electronic device in any one of claims 1-12.
16. A communication system, comprising: a first electronic device configured to perform the method performed by the first electronic device in any one of claims 1-12, and a second electronic device configured to perform the method performed by the second electronic device in any one of claims 1-12.
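Read together, claims 1 and 8 to 11 above describe a throttle-and-recover loop: the frame rate is lowered before the code rate as the temperature rises, and raised back (possibly via an intermediate frame rate) before the code rate is restored as the temperature falls. The Python sketch below is one hedged interpretation of the recovery branch; all preset values, the intermediate frame rate, and the identifiers are assumptions introduced here for illustration and are not values defined by the claims.

```python
# Hypothetical sketch of the recovery behaviour described in claims 8-11:
# when the temperature drops, first restore the frame rate (optionally via an
# intermediate "fourth frame rate"), then restore the code rate.
# All numeric values and names are illustrative assumptions.

FIRST_FRAME_RATE = 30    # frame rate before throttling (assumed)
FOURTH_FRAME_RATE = 24   # intermediate frame rate of claim 9 (value assumed)
FIRST_CODE_RATE = 4000   # code rate before throttling, in kbit/s (assumed)
SECOND_PRESET = 40.0     # temperature that triggers frame-rate recovery (assumed)
THIRD_PRESET = 37.0      # lower temperature for full frame-rate recovery (assumed)
FOURTH_PRESET = 40.0     # temperature that triggers code-rate recovery (assumed)


def recover(temperature: float, frame_rate: int, code_rate: int):
    """Return (frame_rate, code_rate) after one recovery decision."""
    # Claims 8-9: raise the frame rate again once the temperature has dropped
    # to the second preset value, stepping through an intermediate frame rate.
    if temperature <= SECOND_PRESET and frame_rate < FIRST_FRAME_RATE:
        if frame_rate < FOURTH_FRAME_RATE:
            frame_rate = FOURTH_FRAME_RATE
        elif temperature <= THIRD_PRESET:
            frame_rate = FIRST_FRAME_RATE
    # Claims 10-11: raise the code rate again once the temperature has dropped
    # to the fourth preset value (restored in a single step in this sketch).
    if temperature <= FOURTH_PRESET and code_rate < FIRST_CODE_RATE:
        code_rate = FIRST_CODE_RATE
    return frame_rate, code_rate
```

With the assumed values, recover(39.0, 15, 2000) raises the frame rate to the intermediate value 24 and restores the code rate to 4000, and a later call at 36.0 then raises the frame rate back to 30, mirroring the staged recovery options recited in claims 9 and 11.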
CN202310848706.2A 2023-07-11 2023-07-11 Temperature control method and related device Pending CN117707242A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310848706.2A CN117707242A (en) 2023-07-11 2023-07-11 Temperature control method and related device

Publications (1)

Publication Number Publication Date
CN117707242A true CN117707242A (en) 2024-03-15

Family ID: 90150357

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107277374A (en) * 2017-07-28 2017-10-20 盯盯拍(深圳)技术股份有限公司 The control method of camera shooting terminal and the control device of camera shooting terminal
CN107509025A (en) * 2017-07-26 2017-12-22 维沃移动通信有限公司 A kind of camera control method and mobile terminal
CN108121524A (en) * 2017-12-19 2018-06-05 广东欧珀移动通信有限公司 The adjusting method and device, electronic equipment of electronic equipment image display preview frame per second
CN112822536A (en) * 2021-01-15 2021-05-18 闻泰通讯股份有限公司 Streaming media playing control method, device, medium and computer equipment
CN114554000A (en) * 2020-11-20 2022-05-27 华为终端有限公司 Camera calling method and system and electronic equipment
CN114584814A (en) * 2020-11-30 2022-06-03 华为技术有限公司 Method for adjusting power consumption of terminal, method for adjusting code rate and related equipment
CN114845035A (en) * 2021-01-30 2022-08-02 华为技术有限公司 Distributed shooting method, electronic equipment and medium
WO2023088061A1 (en) * 2021-11-16 2023-05-25 华为技术有限公司 Smart device control method and electronic device

Similar Documents

Publication Publication Date Title
US11567623B2 (en) Displaying interfaces in different display areas based on activities
US12032410B2 (en) Display method for flexible display, and terminal
CN115473957B (en) Image processing method and electronic equipment
CN109559270B (en) Image processing method and electronic equipment
US11861382B2 (en) Application starting method and apparatus, and electronic device
US20200249821A1 (en) Notification Handling Method and Electronic Device
CN113254120B (en) Data processing method and related device
EP4060475A1 (en) Multi-screen cooperation method and system, and electronic device
WO2023279820A1 (en) Method for adjusting touch panel sampling rate, and electronic device
US20230353862A1 (en) Image capture method, graphic user interface, and electronic device
CN110780929B (en) Method for calling hardware interface and electronic equipment
US20230168802A1 (en) Application Window Management Method, Terminal Device, and Computer-Readable Storage Medium
US20230269324A1 (en) Display method applied to electronic device, graphical user interface, and electronic device
WO2024001810A1 (en) Device interaction method, electronic device and computer-readable storage medium
CN115756268A (en) Cross-device interaction method and device, screen projection system and terminal
CN115333941A (en) Method for acquiring application running condition and related equipment
CN115119048B (en) Video stream processing method and electronic equipment
CN112817610A (en) Cota package installation method and related device
CN117707242A (en) Temperature control method and related device
WO2023169276A1 (en) Screen projection method, terminal device, and computer-readable storage medium
CN117130680B (en) Calling method of chip resources and electronic equipment
CN117076284B (en) Page loading time length detection method, equipment and storage medium
CN117130698B (en) Menu display method and electronic equipment
CN116185245B (en) Page display method and electronic equipment
CN116196621B (en) Application processing method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination