CN111263061B - Camera control method and communication terminal

Camera control method and communication terminal

Info

Publication number: CN111263061B
Authority: CN (China)
Prior art keywords: control instruction, control, instruction, executed, highest priority
Legal status: Active
Application number: CN202010074103.8A
Other languages: Chinese (zh)
Other versions: CN111263061A
Inventors: 于研文, 王志国
Current Assignee: Qingdao Hisense Medical Equipment Co Ltd
Original Assignee: Qingdao Hisense Medical Equipment Co Ltd
Application filed by Qingdao Hisense Medical Equipment Co Ltd
Priority to CN202010074103.8A
Publication of CN111263061A
Application granted
Publication of CN111263061B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices

Abstract

The application discloses a camera control method and a communication terminal. The method includes analyzing the control instructions received within a specified time period and obtaining one control instruction to be executed according to the priority corresponding to each control instruction, where the control instruction is used to control the camera to execute a corresponding action, and sending the control instruction to be executed to the camera for execution. By analyzing the at least one received control instruction and determining a single control instruction to be executed, the disordered camera motion that would result from executing every control instruction is avoided, and the service life of the camera is therefore extended.

Description

Camera control method and communication terminal
Technical Field
The embodiment of the application relates to the technical field of informatization, in particular to a camera control method and a communication terminal.
Background
In the field of hospital digitization and informatization, when multiple surgery classrooms and multiple users log in to a surgical teaching system at the same time, they control a camera jointly. If user A controls the camera to move to the left, a leftward instruction is sent and the camera immediately moves to the left; if at that moment user B controls the camera to move to the right or upward, a rightward or upward instruction is then sent. The overall motion of the camera is therefore: move to the left, then to the right or upward.
The inventors have found that, in such a scenario, if control instructions are sent many times in succession, the camera suffers from frequent rotation, jamming, and long-term wear, which shortens its service life.
Disclosure of Invention
The embodiments of the present application provide a camera control method and a terminal, which are used to solve the problem in the related art that the service life of the camera is shortened by frequent rotation, jamming, long-term wear, and the like.
According to an aspect of an exemplary embodiment, there is provided a camera control method, the method including:
analyzing the control instructions received in a specified time period, and obtaining a control instruction to be executed according to the priority corresponding to each control instruction; the control instruction is used for controlling the camera to execute corresponding actions;
and sending the control instruction to be executed to the camera for execution.
The beneficial effects produced by this embodiment are as follows: with the above method and device, the at least one control instruction that may be received within the specified time period is analyzed, and finally only one control instruction to be executed is determined and sent to the camera for execution, so that the camera executes fewer control instructions within the specified time period and its service life is prolonged.
In some exemplary embodiments, determining one control instruction to be executed according to the priority of each control instruction includes:
comparing the priorities of the control instructions to determine the control instruction with the highest priority;
when one control instruction with the highest priority exists, taking the control instruction with the highest priority as the control instruction to be executed;
when there are a plurality of control instructions with the highest priority, selecting one control instruction from the plurality of control instructions with the highest priority as the control instruction to be executed, or combining the plurality of control instructions with the highest priority into one control instruction as the control instruction to be executed.
The beneficial effects produced by the embodiment are as follows: the embodiment further provides a specific implementation mode for determining the control instruction to be executed through the priority, so that the result of determining the control instruction to be executed is more accurate.
In some exemplary embodiments, the selecting one control instruction from a plurality of control instructions with highest priority as the control instruction to be executed includes:
classifying the control instructions with the highest priority, wherein the control instructions requesting the camera to execute the same action are of one class;
determining the number of control instructions contained in each type;
when one class with the largest number of control instructions exists, determining the control instructions of the class as the control instructions to be executed;
when there are a plurality of classes each containing the largest number of control instructions, combining the control instructions with the highest priority contained in the plurality of classes into one control instruction as the control instruction to be executed.
The beneficial effects produced by the embodiment are as follows: the embodiment provides that, if there are a plurality of control instructions determined according to the priorities, the control instructions to be executed can be further determined according to the number of different control instructions, and an implementation manner that there are a plurality of control instructions with the highest priorities is provided, so that the result of determining the control instructions to be executed is more accurate.
In some exemplary embodiments, merging a plurality of highest priority control instructions into one control instruction includes:
searching a merging instruction corresponding to the control instructions with the highest priority in a pre-constructed and stored instruction dictionary;
and determining the searched merging instruction as the merged control instruction.
The beneficial effects produced by the embodiment are as follows: in this embodiment, a specific implementation manner for determining a control instruction to be executed is provided by introducing an instruction dictionary if a plurality of control instructions with the same number and the highest priority are combined, so that the determination result is more accurate.
In some exemplary embodiments, the method further comprises:
and if the merging instruction corresponding to the control instructions with the highest priority levels is not found in the pre-constructed and stored instruction dictionary, selecting one control instruction from the control instructions with the highest priority levels as the control instruction to be executed.
The beneficial effects produced by the embodiment are as follows: by the embodiment, the technical scheme is provided when the same number of control instructions cannot be combined, so that the result of determining the control instructions to be executed is more accurate.
In some exemplary embodiments, the selecting one control instruction from a plurality of control instructions with highest priority as the control instruction to be executed includes:
selecting the last received control instruction from the control instructions with the highest priority as the control instruction to be executed; or,
selecting the control instruction received first as the control instruction to be executed; or,
randomly selecting a control instruction as the control instruction to be executed.
The beneficial effects produced by the embodiment are as follows: the present embodiment is a further description of the previous embodiment, and how to select one control instruction from the control instructions with the highest priority is given by the present embodiment, so that the determined result is more accurate.
In some exemplary embodiments, comparing the priorities of the control instructions to determine the control instruction with the highest priority includes:
storing a first control instruction received within the specified time period into an instruction stack;
when a control instruction is received again within the specified time period, comparing the priority of the control instruction received again with the priority of the control instructions in the instruction stack;
if the comparison result is that the priorities are the same, storing the control instruction received again into the instruction stack;
if the comparison result is that the control instruction received again has the highest priority, filtering out the control instructions in the instruction stack whose priority is lower than that of the control instruction received again;
and when the timing of the specified time period is ended, determining that the control instruction in the instruction stack is the control instruction with the highest priority.
The beneficial effects produced by the embodiment are as follows: the embodiment provides another possible implementation manner of determining the highest priority in the control instructions according to the priorities, so that the determination result when determining the control instructions to be executed is more accurate.
According to another aspect of the exemplary embodiments, there is provided a communication terminal including:
the input and output unit is configured to receive a shooting picture of a camera to be displayed and output and display the shooting picture;
a display panel configured to display a display interface of an application program for displaying the photographing screen;
a backlight assembly configured to be positioned at a rear surface of the display panel, the backlight assembly including a plurality of backlight partitions, each of which may emit light of different brightness;
a processor respectively connected with the input and output unit, the display panel and the backlight assembly, and configured to:
analyzing the control instructions received in a specified time period, and obtaining a control instruction to be executed according to the priority corresponding to each control instruction; the control instruction is used for controlling the camera to execute corresponding actions;
and sending the control instruction to be executed to the camera for execution.
In some exemplary embodiments, the processor is configured to, when determining one control instruction to be executed according to the priority of each control instruction, perform:
comparing the priorities of the control instructions to determine the control instruction with the highest priority;
when one control instruction with the highest priority exists, taking the control instruction with the highest priority as the control instruction to be executed;
when there are a plurality of control instructions with the highest priority, selecting one control instruction from the plurality of control instructions with the highest priority as the control instruction to be executed, or combining the plurality of control instructions with the highest priority into one control instruction as the control instruction to be executed.
In some exemplary embodiments, the processor is configured to, when selecting one control instruction from the plurality of control instructions with the highest priority as the control instruction to be executed, perform:
classifying the control instructions with the highest priority, wherein the control instructions requesting the camera to execute the same action are of one class;
determining the number of control instructions contained in each type;
when one class with the largest number of control instructions exists, determining the control instructions of the class as the control instructions to be executed;
when there are a plurality of classes each containing the largest number of control instructions, combining the control instructions with the highest priority contained in the plurality of classes into one control instruction as the control instruction to be executed.
In some exemplary embodiments, the processor is configured to, when merging a plurality of control instructions with the highest priority into one control instruction, perform:
searching a merging instruction corresponding to the control instructions with the highest priority in a pre-constructed and stored instruction dictionary;
and determining the searched merging instruction as the merged control instruction.
In some exemplary embodiments, the processor is further configured to perform:
and if the merging instruction corresponding to the control instructions with the highest priority levels is not found in the pre-constructed and stored instruction dictionary, selecting one control instruction from the control instructions with the highest priority levels as the control instruction to be executed.
In some exemplary embodiments, the processor is configured to, when selecting one control instruction from the plurality of control instructions with the highest priority as the control instruction to be executed, perform:
selecting the last received control instruction from the control instructions with the highest priority as the control instruction to be executed; or,
selecting the control instruction received first as the control instruction to be executed; or,
randomly selecting a control instruction as the control instruction to be executed.
In some exemplary embodiments, the processor is configured to compare priorities of the control instructions, and when determining the control instruction with the highest priority, perform:
storing a first control instruction received within the specified time period into an instruction stack;
when a control instruction is received again within the specified time period, comparing the priority of the control instruction received again with the priority of the control instructions in the instruction stack;
if the comparison result is that the priorities are the same, storing the control instruction received again into the instruction stack;
if the comparison result is that the control instruction received again has the highest priority, filtering out the control instructions in the instruction stack whose priority is lower than that of the control instruction received again;
and when the timing of the specified time period is ended, determining that the control instruction in the instruction stack is the control instruction with the highest priority.
According to a further aspect of the exemplary embodiments, there is provided a computer storage medium having stored therein computer program instructions which, when run on a computer, cause the computer to execute the camera control method as described above.
The camera control method and the communication terminal provided by the embodiments of the application include analyzing the control instructions received in a specified time period and obtaining one control instruction to be executed according to the priority corresponding to each control instruction, the control instruction being an instruction configured to control the camera to execute a corresponding action, and sending the control instruction to be executed to the camera for execution. The at least one received control instruction is analyzed to determine one control instruction to be executed, which avoids the jamming caused by executing every received control instruction and the shortening of the camera's service life caused by long-term wear, thereby prolonging the service life of the camera.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic structural diagram of a communication terminal according to an embodiment of the present application;
fig. 2 is a schematic view of an application scenario of a camera control method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an application principle according to an embodiment of the present application;
fig. 4 is a schematic flowchart of a camera control method according to an embodiment of the present application;
FIG. 5 is an interface diagram provided in accordance with an embodiment of the present application;
fig. 6 is a schematic flowchart of a camera control method according to an embodiment of the present application;
fig. 7 is another schematic flow chart of a camera control method according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a communication terminal according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application clearer, the present application will be described in further detail with reference to the accompanying drawings, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1 shows a schematic configuration of a communication terminal 100.
The following describes an embodiment specifically taking the communication terminal 100 as an example. It should be understood that the communication terminal 100 shown in fig. 1 is only an example, and the communication terminal 100 may have more or less components than those shown in fig. 1, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
A block diagram of a hardware configuration of a communication terminal 100 according to an exemplary embodiment is exemplarily shown in fig. 1.
As shown in fig. 1, the communication terminal 100 may include, for example: RF (radio frequency) circuit 110, memory 120, display unit 130, camera 140, sensor 150, audio circuit 160, Wireless Fidelity (Wi-Fi) module 170, processor 180, bluetooth module 181, and power supply 190. In the embodiment of the present application, the input/output unit may be at least one of the audio circuit 160, the bluetooth module 181, the Wi-Fi module 170, and the camera 140.
The RF circuit 110 may be used for receiving and transmitting signals during information transmission and reception or during a call, and may receive downlink data of a base station and then send the downlink data to the processor 180 for processing; the uplink data may be transmitted to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
The memory 120 may be used to store software programs and data. The processor 180 executes various functions of the communication terminal 100 and data processing by executing software programs or data stored in the memory 120. The memory 120 may include high speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. The memory 120 stores an operating system that enables the communication terminal 100 to operate. The memory 120 may store an operating system and various application programs, and may also store codes for executing the terminal data processing method according to the embodiment of the present application.
The display unit 130 may be used, for example, to display input numbers, characters, or image information and to generate signal inputs related to user settings and function control of the communication terminal 100. Specifically, the display unit 130 may include a touch screen 131 disposed on the front of the communication terminal 100, which may collect touch operations of a user on or near it, such as clicking a button or dragging a scroll box.
The display unit 130 may also be used to display a display interface of an application program of the photographing screen, for example. Specifically, the display unit 130 may include a display screen 132 disposed on the front surface of the communication terminal 100. The display screen 132 may be configured in the form of a liquid crystal display, a light emitting diode, or the like, for example. The display unit 130 may be used to display the interfaces of the various embodiments described in this application.
The touch screen 131 may cover the display screen 132, or the touch screen 131 and the display screen 132 may be integrated to implement the input and output functions of the communication terminal 100, and after the integration, the touch screen may be referred to as a touch display screen for short. In the present application, the display unit 130 may display the application programs and the corresponding operation steps.
As an input-output device, the camera 140 may be used to capture still images or video, for example. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing elements convert the light signals into electrical signals which are then passed to the processor 180 for conversion into digital image signals. The audio circuit 160, the bluetooth module 181, the Wi-Fi module 170, etc. may interact with other devices (e.g., medical instruments or other components of the communication terminal) for example to receive or output a captured image to the other devices.
The communication terminal 100 may further comprise at least one sensor 150, such as an acceleration sensor 151, a distance sensor 152, a fingerprint sensor 153, a temperature sensor 154. The communication terminal 100 may also be configured with other sensors such as a gyroscope, barometer, hygrometer, thermometer, infrared sensor, optical sensor, motion sensor, and the like.
The audio circuitry 160, speaker 161, microphone 162 may provide an audio interface between a user and the communication terminal 100. The audio circuit 160 may transmit the electrical signal converted from the received audio data to the speaker 161, and convert the electrical signal into a sound signal for output by the speaker 161. The communication terminal 100 may also be provided with a volume button for adjusting the volume of the sound signal. On the other hand, the microphone 162 converts the collected sound signal into an electrical signal, converts the electrical signal into audio data after being received by the audio circuit 160, and outputs the audio data to the RF circuit 110 to be transmitted to, for example, another terminal or outputs the audio data to the memory 120 for further processing. In this application, the microphone 162 may capture the voice of the user.
Wi-Fi belongs to a short-distance wireless transmission technology, and the communication terminal 100 may help a user to send and receive e-mails, browse webpages, access streaming media, and the like through the Wi-Fi module 170, which provides a wireless broadband internet access for the user.
The processor 180 is a control center of the communication terminal 100, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the communication terminal 100 and processes data by running or executing software programs stored in the memory 120 and calling data stored in the memory 120.
In some embodiments, processor 180 may include one or more processing units; the processor 180 may also integrate an application processor, which mainly handles operating systems, user interfaces, applications, etc., and a baseband processor, which mainly handles wireless communications. It will be appreciated that the baseband processor described above may also be integrated into the processor 180. In the present application, the processor 180 may run an operating system, an application program, a user interface display, and a touch response, and the processing method described in the embodiments of the present application. In addition, the processor 180 is coupled with the input-output unit and the display unit.
And the bluetooth module 181 is configured to perform information interaction with other bluetooth devices having a bluetooth module through a bluetooth protocol. For example, the communication terminal 100 may establish a bluetooth connection with a device (e.g., a medical device) having a bluetooth module through the bluetooth module 181, so as to perform data interaction.
The communication terminal 100 also includes a power supply 190 (such as a battery) to power the various components. The power supply may be logically connected to the processor 180 through a power management system to manage charging, discharging, power consumption, etc. through the power management system. The communication terminal 100 may also be configured with power buttons for powering the terminal on and off, and for locking the screen.
Referring to fig. 2, an application scenario diagram of the camera control method provided in the embodiment of the present application includes a user 200, a terminal 201, a background server 202, and a camera 203.
In the method, a plurality of users 200 and terminals 201 may exist, wherein each user 200 may select a control instruction through a user interface on the terminal 201, and then the terminal 201 sends the control instruction to the backend server 202 for analysis, and after the backend server 202 analyzes the control instruction sent by each user 200, one control instruction to be executed is obtained according to a priority corresponding to each control instruction and sent to the camera 203.
The terminal 201 and the backend server 202 may be communicatively connected through a communication network, which may be a local area network, a wide area network, or the like.
It should be noted that the backend server 202 can also control a plurality of cameras 203. The present application mainly discusses the scenario in which a plurality of users control the same camera.
It should be noted that the application scenario shown in fig. 2 is only an example, and the embodiment of the present application does not limit this.
In the field of hospital digitization and informatization, when multiple surgery classrooms and multiple users log in to a surgical teaching system at the same time, they control a camera jointly. If user A controls the camera to move to the left, a leftward instruction is sent and the camera moves to the left; if at that moment, or after user A sends the leftward instruction, user B controls the camera to move to the right or upward, a rightward or upward instruction is then sent. The overall motion of the camera is therefore: move to the left, then to the right or upward. The inventors have found that, in such a scenario, if control instructions are sent many times in succession, the camera suffers from frequent rotation, jamming, and long-term wear, which shortens its service life.
Fig. 3 is a schematic diagram of an application principle according to an embodiment of the present application. The part can be implemented by a part of modules or functional components of the communication terminal shown in fig. 1, and only the main components will be described below, while other components, such as a memory, a controller, a control circuit, etc., will not be described herein again.
As shown in fig. 3, the application environment may include a user interface 310, provided via an input and output unit, that awaits a user operation, a display unit 320 for displaying the user interface, and a processor 330 controlling display of the user interface.
The display unit 320 may include a display panel 321, a backlight assembly 322. The display panel 321 is configured to display an image, the backlight assembly 322 is disposed at the back of the display panel 321, and the backlight assembly 322 may include a plurality of backlight partitions (not shown), each of which may emit light with different brightness to illuminate the display panel 321.
The processor 330 may be configured to control the backlight brightness of each backlight partition in the backlight assembly 322 and cause each backlight partition in the backlight assembly to light the display panel 321 according to the corresponding backlight brightness.
The processor 330 may include an analysis unit 331 and a transmission unit 332. The analysis unit 331 may be configured to analyze the control instructions received in a specified time period, and obtain one control instruction to be executed according to a priority corresponding to each control instruction; the control instruction is used for controlling the camera to execute corresponding actions. The sending unit 332 is configured to send the control instruction to be executed to the camera for execution.
In view of this, the present application provides a camera control method. In a scenario where the camera is controlled, a plurality of users may send control instructions to the camera, and these control instructions may be sent simultaneously; if the camera had to execute every one of them, its service life would be shortened by a series of problems. The present application therefore analyzes the control instructions, obtains one control instruction to be executed according to the priority corresponding to each control instruction, and sends it to the camera for execution. In this way, the camera moves in an orderly manner according to the selected or synthesized control instruction within the specified time period; because not every control instruction needs to be executed, disordered motion and wear of the camera are avoided and the service life of the camera is prolonged.
Based on the above description, fig. 4 shows a detailed schematic flow chart of a camera control method provided in the embodiment of the present application, where the flow chart specifically includes:
step 401: analyzing the control instructions received in a specified time period, and obtaining a control instruction to be executed according to the priority corresponding to each control instruction; the control instruction is used for controlling the camera to execute corresponding actions.
Step 402: and sending the control instruction to be executed to the camera for execution.
One possible scenario for analyzing the control instructions received within the specified time period is that the received control instructions all have corresponding priorities; therefore, for the control instructions received in the designated time period, one control instruction to be executed can be determined according to the priority of each control instruction.
In one possible implementation manner of determining the priority corresponding to the control instruction during implementation, the priority of the control instruction sent by the user account is determined by configuring the priority for each user account. In some exemplary embodiments, the prioritizing of the user account may be performed in a relatively simple configuration, including: the priority of the user account has a plurality of levels (such as high priority, medium priority, low priority, etc.), and the present application is not limited herein. A correspondence between a user account and its corresponding priority is shown in table 1:
TABLE 1
User account | Priority level
User A | High priority
User B | Low priority
User C | Low priority
It can be determined from Table 1 that user A has high priority. Therefore, if control instructions sent by user A, user B, and user C are each received within the specified time period, the decision is made according to the priority levels: user A has the highest level, so the control instruction sent by user A is taken as the control instruction to be executed and is then sent to the camera for execution. In addition, the configuration manner and correspondence of the user-account priorities in Table 1 are only one possible implementation provided by the embodiment of the present application and are not intended to limit the application; for example, the user-account priorities may also be configured as a first priority, a second priority, a third priority, and so on, and the configuration manner of the user-account priorities can be determined as needed.
It should be noted that user accounts may log in or log out at any time, and after the background server recognizes a login or logout signal of a user account, the priorities of the user accounts within a specified time period may change. For example, if user B and user C log out and user D logs in during a specified time period, and user D has a higher priority than user A, then for that specified time period user D may be determined to be high priority and user A low priority. Of course, in some exemplary embodiments, specific priorities may be configured for different user accounts, and the possible priorities of the user accounts may be determined from these specific priorities.
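As a minimal sketch of this account-based priority configuration (the Python code, account names, and helper function below are illustrative assumptions and not part of the patent), the mapping of Table 1 and the selection of the highest-priority instruction could look as follows:

```python
# Hypothetical sketch: resolve each control instruction's priority from the
# priority configured for the user account that sent it (cf. Table 1).
ACCOUNT_PRIORITY = {          # assumed configuration mirroring Table 1
    "User A": "high",
    "User B": "low",
    "User C": "low",
}
PRIORITY_RANK = {"high": 2, "medium": 1, "low": 0}   # larger value = higher priority

def instruction_priority(instruction):
    """Return a numeric rank for an instruction based on its sender's account."""
    level = ACCOUNT_PRIORITY.get(instruction["account"], "low")
    return PRIORITY_RANK[level]

# Example: three control instructions received within one specified time period.
received = [
    {"account": "User A", "action": "left"},
    {"account": "User B", "action": "right"},
    {"account": "User C", "action": "up"},
]
to_execute = max(received, key=instruction_priority)
print(to_execute)   # User A's "left" instruction: its account has the high priority
```

In this assumed setup, only the instruction sent by the high-priority account is forwarded to the camera, matching the Table 1 example above.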
In addition, in some exemplary embodiments, an instruction stack is established, and then the instruction analyzer analyzes the control instructions in the instruction stack, so as to obtain a control instruction to be executed, and send the control instruction to be executed to the camera for execution.
Specifically, for example, each control instruction sent by each user account is stored in the instruction stack in combination with the level of the priority corresponding to the control instruction. One possible implementation of the instruction stack is shown in table 2:
TABLE 2
Receiving order of control instructions | Control instruction | Priority of user account
6 | End | High
5 | End | Low
4 | Up | Low
3 | Left | High
2 | Start | Low
1 | Start | High
Wherein "1-6" in table 2 indicates the receiving order of the control instructions in the stack, wherein the ordering of "1" indicates the control instruction received first. Through the analysis of the command analyzer, it can be determined that the control commands for controlling the camera motion include two control commands ordered as "3", "4" as shown in table 2. And, by combining the judgment of the priority, since the priority of the control instruction ranked as "3" is higher, the control instruction ranked as "3" to the left "is sent to the camera as the control instruction to be executed, so that the camera executes the action to the left.
In addition, when determining a control instruction to be executed according to the priority of each control instruction, in some exemplary embodiments the priorities of the control instructions are compared to determine the control instruction with the highest priority. Several possible implementation scenarios exist, including:
scene 1: first, a first control instruction received in a specified time period can be stored in the instruction stack; and if the control instruction is received again in the appointed time period, comparing the control instruction received again with the priority of the control instruction in the instruction stack.
If the comparison result shows that the priorities are the same, storing the control instruction received again into the instruction stack; if the comparison result is that the priority of the control instruction received again is the highest, filtering out the control instruction which is lower than the priority of the control instruction received again in the instruction stack; and when the timing of the specified time period is ended, determining that the control instruction in the instruction stack is the control instruction with the highest priority.
It should be noted that the specified time period needs to be set. In some exemplary embodiments, the specified time period starts at the time point (e.g., T1) at which the instruction stack receives the first control instruction for controlling the movement of the camera (excluding the control instruction indicating "start"), and its duration is a time threshold (e.g., T) preset in the system. It can therefore be understood that the control instruction with the highest priority within the time period from T1 to T1+T is determined. Alternatively, the specified time period may be designated by the system according to the current state of the camera or of the system.
In a possible embodiment, if no other control command is received in the specified time period, the first received control command is directly executed. In another possible embodiment, if a new control instruction is received again at this specified time period, the new control instruction is compared with the control instruction currently stored in the instruction stack. One possible implementation of the comparison is shown in table 3:
TABLE 3
[Table 3 is reproduced as an image in the original publication; it lists, by sequence number, example outcomes of comparing a newly received control instruction with the control instruction currently stored in the instruction stack.]
If a control instruction is received again and its priority is higher than, or the same as, that of the control instruction currently stored in the instruction stack, the control instruction currently stored in the instruction stack is shifted out and the control instruction received again is stored in the instruction stack; for example, the rows corresponding to sequence number 1 and sequence number n in Table 3. On the contrary, if the control instruction received again has a lower priority than the control instruction currently stored in the instruction stack, the control instruction received again is filtered out, that is, disregarded; for example, the row corresponding to sequence number 2 in Table 3. In addition, prompt information such as "operation occupied" may be sent to the user account corresponding to the low-priority control instruction. For example, referring to fig. 5, an interface display diagram provided according to an embodiment of the present application: after a user selects a control instruction, a prompt message of "operation occupied" pops up on the display panel within the specified time period or after the specified time period ends, so that the user can learn that the control instruction will not be executed this time and can wait for a certain time before selecting again. Through comparison of the control instructions in the instruction stack, it can be determined that the control instruction remaining in the instruction stack at the end of the specified time period is the control instruction with the highest priority, or the most recently received one among control instructions of the same priority.
And after the appointed time period is ended, sending the control instruction currently stored in the instruction stack to the camera as the control instruction to be executed. Furthermore, after the execution of the control instruction by the camera is completed, a control instruction indicating "end" may be sent to the system, so that the system clears the instruction stack and waits for the reception of the next control instruction.
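The Scene 1 behaviour could be sketched as follows. This is only an assumed implementation: the class and method names are hypothetical, and it follows the variant described in the bulleted summary above, in which instructions of equal priority are kept side by side in the stack:

```python
import time

class InstructionStack:
    """Hypothetical sketch of Scene 1: keep only the highest-priority control
    instructions received within the specified time period."""

    def __init__(self, window_seconds):
        self.window = window_seconds
        self.stack = []            # (instruction, priority) entries kept so far
        self.window_start = None

    def receive(self, instruction, priority):
        now = time.monotonic()
        if not self.stack:                        # first instruction opens the window
            self.window_start = now
            self.stack.append((instruction, priority))
            return
        if now - self.window_start > self.window:
            return                                # outside the window; handled next round
        current_priority = self.stack[0][1]
        if priority > current_priority:
            self.stack = [(instruction, priority)]   # drop the lower-priority entries
        elif priority == current_priority:
            self.stack.append((instruction, priority))
        # otherwise the instruction is filtered out; its sender could be shown an
        # "operation occupied" prompt as in Fig. 5

    def close_window(self):
        """Called when the specified time period T has elapsed."""
        winners, self.stack = self.stack, []
        return winners            # the highest-priority instruction(s) of the window

# Usage with an assumed 2-second window:
s = InstructionStack(window_seconds=2.0)
s.receive("left", priority=2)     # high-priority user, opens the window
s.receive("up", priority=1)       # low-priority user, filtered out
print(s.close_window())           # [('left', 2)] is what gets sent to the camera
```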
Scene 2: and when one control instruction with the highest priority exists, taking the control instruction with the highest priority as the control instruction to be executed.
When the control instruction to be executed is determined, all the control instructions in the specified time period can be analyzed, so that the control instruction with the highest priority is determined. If the form of the instruction stack is adopted, in some exemplary embodiments, all instructions received within a specified time period are stored in the instruction stack, and the control instruction with the highest priority in all control instructions in the instruction stack is determined through analyzing the priority of each control instruction in the instruction stack.
In this scenario, only one control instruction has the highest priority. For example, if the user accounts have only high and low priorities and, within the specified time period, analysis determines that only one control instruction has high priority while the other control instructions were sent by low-priority user accounts, that high-priority control instruction is taken as the control instruction to be executed.
Scene 3: when there are a plurality of control instructions with the highest priority, selecting one control instruction from the plurality of control instructions with the highest priority as the control instruction to be executed, or combining the plurality of control instructions with the highest priority into one control instruction as the control instruction to be executed.
In this scenario, in contrast to Scene 2, a plurality of control instructions with the highest priority may remain after the analysis. In one embodiment, one control instruction is selected from the determined plurality of highest-priority control instructions as the control instruction to be executed.
One possible implementation is to classify the plurality of control instructions with the highest priority, where control instructions requesting the camera to execute the same action belong to one class, and then determine the number of control instructions contained in each class. When there is one class containing the largest number of control instructions, the control instruction of that class is determined as the control instruction to be executed; when there are a plurality of classes each containing the largest number of control instructions, the control instructions with the highest priority contained in these classes are combined into one control instruction as the control instruction to be executed.
For example, the plurality of highest-priority control instructions includes "up", "up", and "left"; the two "up" control instructions are identified as a first class and "left" as a second class. Because the first class contains two control instructions and the second class contains one, the control instruction represented by the first class is taken as the control instruction to be executed, that is, the "up" control instruction is sent to the camera for execution.
Or, for example, the plurality of highest-priority control instructions includes "up", "up", "left", and "left"; it may then be determined that the first class of "up" control instructions contains two and the second class of "left" control instructions also contains two. In one embodiment, the two classes of control instructions are combined into one control instruction as the control instruction to be executed.
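The classification-and-counting step could be sketched as below; the use of a Counter and the return convention are assumptions for illustration, not the patent's prescribed implementation:

```python
from collections import Counter

def pick_by_majority(highest_priority_actions):
    """Hypothetical sketch: group same-action instructions into classes and return
    either the single largest class's action or the tied actions to be merged."""
    counts = Counter(highest_priority_actions)            # e.g. {"up": 2, "left": 1}
    top = max(counts.values())
    winners = [action for action, n in counts.items() if n == top]
    if len(winners) == 1:
        return winners[0], None        # one largest class: execute its action
    return None, winners               # several tied classes: candidates for merging

print(pick_by_majority(["up", "up", "left"]))           # ('up', None)
print(pick_by_majority(["up", "up", "left", "left"]))   # (None, ['up', 'left'])
```

When the sketch returns tied actions, the merging step described next applies.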
The plurality of control instructions with the highest priority are combined into one control instruction to be used as the control instruction to be executed. In a specific implementation, the merging instruction corresponding to the control instructions with the highest priority can be searched for in a pre-constructed and stored instruction dictionary, and the found merging instruction is determined as the merged control instruction. For example, the merged instruction of the "up" and "left" control instructions found in the instruction dictionary is the "upper left" control instruction, so that control instruction is sent to the camera for execution as the control instruction to be executed. A possible construction of the instruction dictionary is shown in Table 4:
TABLE 4
Control instructions to be merged | Merged control instruction in the instruction dictionary
Up and Left | Upper left
Up and Right | Upper right
Down and Left | Lower left
Down and Right | Lower right
It should be noted that the merged control command in the command dictionary in table 4 is only one possible implementation and is not a unique limitation on the command dictionary.
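A minimal sketch of the instruction dictionary of Table 4 follows; the frozenset keys, the merged-action names, and the None fallback are assumptions about one possible data layout, not the patent's own format:

```python
# Hypothetical sketch of the instruction dictionary of Table 4.
INSTRUCTION_DICTIONARY = {
    frozenset({"up", "left"}):    "upper-left",
    frozenset({"up", "right"}):   "upper-right",
    frozenset({"down", "left"}):  "lower-left",
    frozenset({"down", "right"}): "lower-right",
}

def merge_instructions(actions):
    """Return the merged instruction if the dictionary defines one, otherwise None."""
    return INSTRUCTION_DICTIONARY.get(frozenset(actions))

print(merge_instructions(["up", "left"]))   # 'upper-left' is sent to the camera
print(merge_instructions(["up", "down"]))   # None: fall back to selecting one instruction
```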
In addition, if the merging instruction corresponding to the plurality of control instructions with the highest priority is not found in the pre-constructed and stored instruction dictionary, one control instruction is selected from the plurality of control instructions with the highest priority as the control instruction to be executed. This may be done in one of several ways (a combined sketch is given after Mode 3 below):
mode 1: and selecting the last received control instruction from the control instructions with the highest priority as the control instruction to be executed.
For example, if the plurality of control instructions with the highest priority includes "up" and "down", no merged instruction is found in the instruction dictionary, so the "down" control instruction, which was received last, is taken as the control instruction to be executed. A possible reason for this choice is that the different control instructions were sent by the same user and the one sent first was a mistake, so the control instruction sent later is taken as the control instruction to be executed.
Mode 2: and selecting the control instruction received firstly as the control instruction to be executed.
Similarly to Mode 1, the control instruction received first may instead be taken as the control instruction to be executed. A possible reason is that the same user may first want the "up" control instruction executed and only then the "down" one. If, after the specified time period is over, the user does not observe the camera executing the "down" control instruction, the user can choose to send that control instruction again.
Mode 3: and randomly selecting a control instruction as the control instruction to be executed.
With the camera control method described above, if a plurality of control instructions is received within the specified time period, only the control instruction that the camera actually needs to execute is sent to it. Compared with the related art, in which every received control instruction has to be executed within the specified time period, only one control instruction is executed, so the camera avoids the shortened service life and other problems caused by moving many times within a short time.
To better understand the present application, an implementation process in which only one user controls the camera is introduced first. Referring to fig. 6, a schematic flowchart of a camera control method provided in an embodiment of the present application is shown (a code sketch of this flow is given after step 607 below), where the process includes:
step 601: a control instruction indicating "start" is received.
Step 602: and acquiring a set specified time period (for example, the duration of the specified time period is T).
Step 603: a control instruction configured for camera movement is received.
If the specified time period is the period of duration T following the sending of the control instruction, the time at which the control instruction is sent (T1) needs to be recorded. Alternatively, the start time or the end time of the specified time period is recorded.
Step 604: it is determined whether a specified time period has arrived.
If the specified time period has not been reached, continue to step 605, and if the specified time period has been reached, continue to step 606.
Step 605: and continuing to wait.
And returns to execution step 604.
Step 606: and sending the control instruction to the camera.
Step 607: a control command is received indicating "end".
Referring to fig. 7, another schematic flowchart of a camera control method provided in an embodiment of the present application is shown; fig. 7 gives a possible flow in which multiple users control the camera (an end-to-end sketch is given after step 712 below), including:
step 701: control instructions indicating "start" sent by multiple users are received.
Step 702: an instruction stack of control instructions is generated.
Generating the instruction stack is only one possible embodiment of the present application; other manners may also be used to determine, from the plurality of control instructions received within the specified time period, the one control instruction to be executed for controlling the movement of the camera.
Step 703: an instruction analyzer of the system monitors the instruction stack for message changes in control instructions in real time.
Step 704: the command analyzer aggregates the number and levels of different control commands.
Step 705: and judging whether a control instruction with high priority exists or not.
If yes, go on to step 706, otherwise go to step 706'.
Step 706: it is determined whether there is only one control instruction of high priority.
If yes, go to step 711. Otherwise, execution continues at step 707.
Step 706': control instructions in the instruction stack that are all low priority are determined. Execution then continues at step 708.
Step 707: a plurality of high priority control instructions are recorded and low priority control instructions are ignored.
In some exemplary embodiments, ignoring the low-priority control instructions means moving them out of the instruction stack.
Step 708: the instruction analyzer aggregates the number of control instructions within the class of different control instructions of the priority.
Step 709: and judging whether the number of the control instructions of one class is the largest in each class.
If yes, proceed to step 710. Otherwise, step 710a is performed.
Step 710: and determining the control instruction of the class as the control instruction to be executed according to the majority principle.
Step 710 a: and determining the synthesized control instruction of the same class of the control instruction in the instruction dictionary.
Step 710 b: and determining the synthesized control instruction as the control instruction to be executed.
Step 711: and when the specified time period is reached, sending the control instruction to be executed to the camera for execution.
Step 712: a control command indicating "end" is received and the control command received within a specified time period is cleared.
Based on the same technical concept, fig. 8 exemplarily shows a communication terminal 800 provided in an embodiment of the present application, where the communication terminal 800 specifically includes:
an input/output unit 810, configured to receive a picture captured by a camera to be displayed and to output the captured picture for display;
a display panel 820, configured to display a display interface of an application program that displays the captured picture;
a backlight assembly 830 configured to be positioned at a rear surface of the display panel, the backlight assembly including a plurality of backlight partitions, each of which may emit light of different brightness;
a processor 840, respectively connected to the input/output unit, the display panel, and the backlight assembly, configured to:
analyzing the control instructions received in a specified time period, and obtaining a control instruction to be executed according to the priority corresponding to each control instruction; the control instruction is used for controlling the camera to execute corresponding actions;
and sending the control instruction to be executed to the camera for execution.
In some exemplary embodiments, the processor 840 is configured to, when determining one control instruction to be executed according to the priority of each control instruction, perform:
comparing the priorities of the control instructions to determine the control instruction with the highest priority;
when one control instruction with the highest priority exists, taking the control instruction with the highest priority as the control instruction to be executed;
when there are a plurality of control instructions with the highest priority, selecting one control instruction from the plurality of control instructions with the highest priority as the control instruction to be executed, or combining the plurality of control instructions with the highest priority into one control instruction as the control instruction to be executed.
In some exemplary embodiments, when selecting one control instruction from the plurality of control instructions with the highest priority as the control instruction to be executed, the processor 840 is configured to perform:
classifying the control instructions with the highest priority, wherein the control instructions requesting the camera to execute the same action are of one class;
determining the number of control instructions contained in each type;
when one class with the largest number of control instructions exists, determining the control instructions of the class as the control instructions to be executed;
when there are a plurality of classes with the largest number of control instructions, combining the plurality of highest-priority control instructions contained in these classes into one control instruction as the control instruction to be executed.
In some exemplary embodiments, the processor 840 is configured to, when merging a plurality of control instructions with the highest priority into one control instruction, perform:
searching for a merged instruction corresponding to the plurality of control instructions with the highest priority in a pre-constructed and stored instruction dictionary;
and determining the found merged instruction as the merged control instruction.
In some exemplary embodiments, the processor 840 is further configured to perform:
if no merged instruction corresponding to the plurality of control instructions with the highest priority is found in the pre-constructed and stored instruction dictionary, selecting one control instruction from the plurality of control instructions with the highest priority as the control instruction to be executed.
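A hedged sketch of this lookup and fallback follows; the dictionary contents and the choice of the last-received instruction as the fallback are illustrative assumptions only:

```python
# Hypothetical pre-constructed instruction dictionary, keyed by the set of requested actions.
MERGE_DICTIONARY = {
    frozenset({"left", "up"}): {"action": "upper_left"},
    frozenset({"right", "up"}): {"action": "upper_right"},
}

def merge_or_select(candidates):
    """Look up a merged instruction; if none is found, fall back to selecting one."""
    actions = frozenset(c["action"] for c in candidates)
    merged = MERGE_DICTIONARY.get(actions)
    if merged is not None:
        return merged            # merged instruction found in the dictionary
    return candidates[-1]        # no entry: select one candidate (here, the last received)
```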
In some exemplary embodiments, the processor 840 is configured to, when selecting one control instruction from the plurality of control instructions with the highest priority as the control instruction to be executed, perform:
selecting the last received control instruction from the plurality of control instructions with the highest priority as the control instruction to be executed; or,
selecting the control instruction received first as the control instruction to be executed; or,
randomly selecting one control instruction as the control instruction to be executed.
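The three selection strategies could be sketched as follows, assuming the candidate list preserves arrival order (an assumption, since the disclosure does not fix a data structure):

```python
import random

def select_last(candidates):
    return candidates[-1]             # the most recently received instruction

def select_first(candidates):
    return candidates[0]              # the earliest received instruction

def select_random(candidates):
    return random.choice(candidates)  # any one of the candidates
```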
In some exemplary embodiments, the processor 840 is configured to compare priorities of the control instructions, and when determining the control instruction with the highest priority, perform:
storing the first control instruction received within the specified time period into an instruction stack;
when a control instruction is received again within the specified time period, comparing the priority of the newly received control instruction with the priorities of the control instructions in the instruction stack;
if the comparison result is that the priorities are the same, storing the newly received control instruction into the instruction stack;
if the comparison result is that the newly received control instruction has the highest priority, filtering out of the instruction stack the control instructions whose priority is lower than that of the newly received control instruction;
when the timing of the specified time period ends, determining the control instructions remaining in the instruction stack as the control instructions with the highest priority.
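A minimal sketch of this instruction stack follows, assuming a numeric priority field where a larger value means a higher priority, and assuming that lower-priority instructions are simply discarded (the disclosure leaves that case implicit):

```python
class InstructionStack:
    """Keeps only the highest-priority control instructions of the current time window."""

    def __init__(self):
        self._stack = []

    def push(self, instruction):
        if not self._stack:
            self._stack.append(instruction)        # first instruction of the window
            return
        current = self._stack[-1]["priority"]
        if instruction["priority"] == current:
            self._stack.append(instruction)        # equal priority: keep both
        elif instruction["priority"] > current:
            self._stack = [instruction]            # higher priority: filter out the lower ones
        # else: lower priority, assumed here to be ignored

    def drain(self):
        """Called when the specified time period ends: the remaining instructions
        are the ones with the highest priority."""
        result, self._stack = self._stack, []
        return result
```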
For details of the above communication terminal and the implementation of its functions, reference may be made to the related description above in conjunction with fig. 1-7, which is not repeated here.
In some possible implementations, various aspects of the methods provided by the embodiments of the present application may also be implemented in the form of a program product including program code for causing a computer device to perform the steps of the methods for data processing according to various exemplary implementations of the present application described in the present specification when the program code runs on the computer device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A program product for executing data processing according to an embodiment of the present application may employ a portable compact disc read-only memory (CD-ROM), include program code, and run on a server apparatus. However, the program product of the present application is not limited thereto; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium other than a readable storage medium that can transmit, propagate, or transport the program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device.
The embodiments of the present application further provide a computing-device-readable storage medium for the data processing method, that is, a medium whose content is not lost after power failure. The storage medium stores a software program comprising program code which, when read and executed by one or more processors on a computing device, implements any of the data processing solutions of the embodiments of the present application described above.
The present application is described above with reference to block diagrams and/or flowchart illustrations of methods, apparatus (systems) and/or computer program products according to embodiments of the application. It will be understood that one block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, and/or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Accordingly, the subject application may also be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.). Furthermore, the present application may take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system. In the context of this application, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
Although the present application has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations can be made thereto without departing from the spirit and scope of the application. Accordingly, the specification and figures are merely exemplary of the present application as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the present application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is also intended to include such modifications and variations.

Claims (10)

1. A camera control method, comprising:
analyzing the control instructions received in a specified time period, and obtaining a control instruction to be executed according to the priority corresponding to each control instruction; the control instruction is used for controlling the camera to execute corresponding actions;
sending the control instruction to be executed to the camera for execution;
determining a control instruction to be executed according to the priority of each control instruction, comprising: comparing the priorities of the control instructions to determine the control instruction with the highest priority; when one control instruction with the highest priority exists, taking the control instruction with the highest priority as the control instruction to be executed; when a plurality of control instructions with the highest priority are available, combining the control instructions with the highest priority into a new control instruction to be used as the control instruction to be executed;
wherein merging the plurality of control instructions with the highest priority into a new control instruction comprises: searching for a merged instruction corresponding to the plurality of control instructions with the highest priority in a pre-constructed and stored instruction dictionary; and determining the found merged instruction as the merged new control instruction; the new control instruction is different from the plurality of highest-priority control instructions.
2. The method of claim 1, further comprising:
if no merged instruction corresponding to the plurality of control instructions with the highest priority is found in the pre-constructed and stored instruction dictionary, selecting one control instruction from the plurality of control instructions with the highest priority as the control instruction to be executed.
3. The method according to claim 2, wherein the selecting one control instruction from the plurality of highest priority control instructions as the control instruction to be executed comprises:
classifying the control instructions with the highest priority, wherein the control instructions requesting the camera to execute the same action are of one class;
determining the number of control instructions contained in each class;
when one class with the largest number of control instructions exists, determining the control instructions of the class as the control instructions to be executed;
when there are a plurality of classes with the largest number of control instructions, combining the plurality of highest-priority control instructions contained in these classes into one control instruction as the control instruction to be executed.
4. The method according to claim 2, wherein the selecting one control instruction from the plurality of highest priority control instructions as the control instruction to be executed comprises:
selecting the last received control instruction from the plurality of control instructions with the highest priority as the control instruction to be executed; or,
selecting the control instruction received first as the control instruction to be executed; or,
randomly selecting one control instruction as the control instruction to be executed.
5. The method of claim 1, wherein comparing the priorities of the control instructions to determine the control instruction with the highest priority comprises:
storing the first control instruction received within the specified time period into an instruction stack;
when a control instruction is received again within the specified time period, comparing the priority of the newly received control instruction with the priorities of the control instructions in the instruction stack;
if the comparison result is that the priorities are the same, storing the newly received control instruction into the instruction stack;
if the comparison result is that the newly received control instruction has the highest priority, filtering out of the instruction stack the control instructions whose priority is lower than that of the newly received control instruction;
and when the timing of the specified time period ends, determining the control instructions remaining in the instruction stack as the control instructions with the highest priority.
6. A communication terminal, comprising:
the input and output unit is configured to receive a shooting picture of a camera to be displayed and output and display the shooting picture;
a display panel configured to display a display interface of an application program for displaying the photographing screen;
a backlight assembly configured to be positioned at a rear surface of the display panel, the backlight assembly including a plurality of backlight partitions, each of which may emit light of different brightness;
a processor respectively connected with the input and output unit, the display panel and the backlight assembly, and configured to:
analyzing the control instructions received in a specified time period, and obtaining a control instruction to be executed according to the priority corresponding to each control instruction; the control instruction is used for controlling the camera to execute corresponding actions;
sending the control instruction to be executed to the camera for execution;
the processor is configured to, when determining one control instruction to be executed according to the priority of each control instruction, specifically perform: comparing the priorities of the control instructions to determine the control instruction with the highest priority; when one control instruction with the highest priority exists, taking the control instruction with the highest priority as the control instruction to be executed; when there are a plurality of control instructions with the highest priority, combining the plurality of control instructions with the highest priority into a new control instruction as the control instruction to be executed;
wherein the processor is configured to, when merging the plurality of control instructions with the highest priority into a new control instruction, perform: searching for a merged instruction corresponding to the plurality of control instructions with the highest priority in a pre-constructed and stored instruction dictionary; and determining the found merged instruction as the merged new control instruction; the new control instruction is different from the plurality of highest-priority control instructions.
7. The terminal of claim 6, wherein the processor is further configured to perform:
if no merged instruction corresponding to the plurality of control instructions with the highest priority is found in the pre-constructed and stored instruction dictionary, selecting one control instruction from the plurality of control instructions with the highest priority as the control instruction to be executed.
8. The terminal according to claim 7, wherein the processor is configured to, when the control instruction to be executed is selected from the plurality of control instructions with the highest priority, perform:
classifying the control instructions with the highest priority, wherein the control instructions requesting the camera to execute the same action are of one class;
determining the number of control instructions contained in each class;
when one class with the largest number of control instructions exists, determining the control instructions of the class as the control instructions to be executed;
when there are a plurality of classes with the largest number of control instructions, combining the plurality of highest-priority control instructions contained in these classes into one control instruction as the control instruction to be executed.
9. The terminal according to claim 7, wherein the processor is configured to, when the control instruction to be executed is selected from the plurality of control instructions with the highest priority, perform:
selecting the last received control instruction from the plurality of control instructions with the highest priority as the control instruction to be executed; or,
selecting the control instruction received first as the control instruction to be executed; or,
randomly selecting one control instruction as the control instruction to be executed.
10. The terminal according to claim 6, wherein the processor is configured to compare priorities of the control commands, and when determining the control command with the highest priority, perform:
storing the first control instruction received within the specified time period into an instruction stack;
when a control instruction is received again within the specified time period, comparing the priority of the newly received control instruction with the priorities of the control instructions in the instruction stack;
if the comparison result is that the priorities are the same, storing the newly received control instruction into the instruction stack;
if the comparison result is that the newly received control instruction has the highest priority, filtering out of the instruction stack the control instructions whose priority is lower than that of the newly received control instruction;
and when the timing of the specified time period ends, determining the control instructions remaining in the instruction stack as the control instructions with the highest priority.
CN202010074103.8A 2020-01-22 2020-01-22 Camera control method and communication terminal Active CN111263061B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010074103.8A CN111263061B (en) 2020-01-22 2020-01-22 Camera control method and communication terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010074103.8A CN111263061B (en) 2020-01-22 2020-01-22 Camera control method and communication terminal

Publications (2)

Publication Number Publication Date
CN111263061A CN111263061A (en) 2020-06-09
CN111263061B true CN111263061B (en) 2021-07-06

Family

ID=70954397

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010074103.8A Active CN111263061B (en) 2020-01-22 2020-01-22 Camera control method and communication terminal

Country Status (1)

Country Link
CN (1) CN111263061B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113254077A (en) * 2021-06-18 2021-08-13 深圳市欧瑞博科技股份有限公司 State control method and device of intelligent equipment, electronic equipment and storage medium
CN114143463A (en) * 2021-11-30 2022-03-04 北京达佳互联信息技术有限公司 Shared shooting method, device, equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104635920A (en) * 2013-11-13 2015-05-20 欧姆龙株式会社 Gesture recognition device and control method for the same
CN105472283A (en) * 2014-08-27 2016-04-06 中兴通讯股份有限公司 Projection control method, projection equipment and mobile terminal
CN107317839A (en) * 2012-07-04 2017-11-03 中兴通讯股份有限公司 Internet of things information processing method, apparatus and system
CN109451238A (en) * 2017-11-23 2019-03-08 北京臻迪科技股份有限公司 A kind of communication means, communication system and unmanned plane
CN208707814U (en) * 2018-09-29 2019-04-05 桂林智神信息技术有限公司 Shoot monitoring device and the clouds terrace system including it
CN110515710A (en) * 2019-08-06 2019-11-29 深圳市随手科技有限公司 Asynchronous task scheduling method, apparatus, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180234660A1 (en) * 2017-02-10 2018-08-16 Nxtgen Technology, Inc. Limited and temporary queuing of video data captured by a portable camera prior to user initiation of video recording commands

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107317839A (en) * 2012-07-04 2017-11-03 中兴通讯股份有限公司 Internet of things information processing method, apparatus and system
CN104635920A (en) * 2013-11-13 2015-05-20 欧姆龙株式会社 Gesture recognition device and control method for the same
CN105472283A (en) * 2014-08-27 2016-04-06 中兴通讯股份有限公司 Projection control method, projection equipment and mobile terminal
CN109451238A (en) * 2017-11-23 2019-03-08 北京臻迪科技股份有限公司 A kind of communication means, communication system and unmanned plane
CN208707814U (en) * 2018-09-29 2019-04-05 桂林智神信息技术有限公司 Shoot monitoring device and the clouds terrace system including it
CN110515710A (en) * 2019-08-06 2019-11-29 深圳市随手科技有限公司 Asynchronous task scheduling method, apparatus, computer equipment and storage medium

Also Published As

Publication number Publication date
CN111263061A (en) 2020-06-09

Similar Documents

Publication Publication Date Title
KR102597680B1 (en) Electronic device for providing customized quality image and method for controlling thereof
WO2020013577A1 (en) Electronic device and method for predicting user actions on ubiquitous devices
RU2670786C9 (en) System, method and apparatus for device group control
CN111651263B (en) Resource processing method and device of mobile terminal, computer equipment and storage medium
KR102488410B1 (en) Electronic device for recording image using a plurality of cameras and method of operating the same
CN104488258A (en) Method and apparatus for dual camera shutter
CN114302185B (en) Display device and information association method
WO2022198853A1 (en) Task scheduling method and apparatus, electronic device, storage medium, and program product
WO2021185244A1 (en) Device interaction method and electronic device
CN106302996B (en) Message display method and device
CN111601066B (en) Information acquisition method and device, electronic equipment and storage medium
US20190051147A1 (en) Remote control method, apparatus, terminal device, and computer readable storage medium
CN111263061B (en) Camera control method and communication terminal
US9967830B2 (en) Method for controlling content transmission and electronic device for supporting the same
CN115136570A (en) Integration of internet of things devices
EP3698259A1 (en) Method and system for classifying time-series data
TW202125274A (en) Method and apparatus for scheduling resource, electronic device and computer readable storage medium
WO2017050090A1 (en) Method and device for generating gif file, and computer readable storage medium
CN114500442A (en) Message management method and electronic equipment
JP7236551B2 (en) CHARACTER RECOMMENDATION METHOD, CHARACTER RECOMMENDATION DEVICE, COMPUTER AND PROGRAM
EP3823249A1 (en) Discovery of iot devices and subsequent capability information exchange for invoking functions corresponding to capabilities of said iot devices
CN108984294B (en) Resource scheduling method, device and storage medium
CN111556161B (en) Terminal control method for advertisement and communication server
CN113961278A (en) Page display method and related equipment
CN111599446A (en) Management method of medical display equipment and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant