CN110769577A - Atmosphere lamp control method and device - Google Patents

Atmosphere lamp control method and device

Info

Publication number
CN110769577A
CN110769577A (application CN201910993887.1A)
Authority
CN
China
Prior art keywords
atmosphere lamp
assembly
instruction
target camera
message queue
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910993887.1A
Other languages
Chinese (zh)
Other versions
CN110769577B (en)
Inventor
包玉刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Oppo Chongqing Intelligent Technology Co Ltd
Original Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo Chongqing Intelligent Technology Co Ltd
Priority to CN201910993887.1A
Publication of CN110769577A
Application granted
Publication of CN110769577B
Legal status: Active (granted)
Anticipated expiration

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B 20/00: Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B 20/40: Control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

The present disclosure relates to the field of terminal technologies, and in particular to an atmosphere lamp control method, an atmosphere lamp control device, a computer-readable medium, and a wireless communication terminal. The method is applied to a terminal device configured with an atmosphere lamp and comprises the following steps: in response to a camera assembly activation instruction for a target camera assembly, activating the target camera assembly and updating working state data of the target camera assembly; and generating an atmosphere lamp assembly activation instruction based on the updated working state data of the target camera assembly, so as to turn on the atmosphere lamp assembly according to the atmosphere lamp activation instruction. The technical solution of the present disclosure can turn the atmosphere lamp assembly on or off in real time according to the working state of the camera assembly.

Description

Atmosphere lamp control method and device
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to an ambience lamp control method, an ambience lamp control device, a computer-readable medium, and a wireless communication terminal.
Background
At present, in order to improve the screen display effect, existing intelligent terminal devices adopt a full screen, that is, an ultra-high screen-to-body ratio. Correspondingly, to accommodate the front camera, existing display solutions adopt a waterdrop screen or a notch screen. In addition, the prior art also includes lifting and sliding front-facing cameras, which can be hidden in the body when not in use.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide an atmosphere lamp control method, an atmosphere lamp control device, a computer-readable medium, and a wireless communication terminal, which can invoke and control an atmosphere lamp according to the operating state of a camera assembly.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an ambience lamp control method including:
in response to a camera assembly activation instruction for a target camera assembly, activating the target camera assembly and updating working state data of the target camera assembly;
and generating an atmosphere lamp assembly activation instruction based on the updated working state data of the target camera assembly, so as to turn on the atmosphere lamp assembly according to the atmosphere lamp activation instruction.
According to a second aspect of the present disclosure, there is provided an ambience lamp control device including:
a target camera assembly control module, configured to activate a target camera assembly in response to a camera assembly activation instruction for the target camera assembly, and to update working state data of the target camera assembly;
and an atmosphere lamp assembly control module, configured to generate an atmosphere lamp assembly activation instruction based on the updated working state data of the target camera assembly, so as to turn on the atmosphere lamp assembly according to the atmosphere lamp activation instruction.
According to a third aspect of the present disclosure, there is provided a computer-readable medium having stored thereon a computer program which, when executed by a processor, implements the above-mentioned atmosphere lamp control method.
According to a fourth aspect of the present disclosure, there is provided a wireless communication terminal comprising:
one or more processors;
a storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the above-described ambience light control method.
According to the atmosphere lamp control method provided by the embodiments of the present disclosure, the working state of the camera assembly is monitored, an atmosphere lamp assembly activation instruction is automatically generated when the state data of the camera assembly is updated, and the instruction is executed, so that the atmosphere lamp assembly is turned on or off in real time according to the working state of the camera assembly.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 schematically illustrates a flow diagram of an ambience lamp control method in an exemplary embodiment of the disclosure;
fig. 2 schematically illustrates a schematic structural diagram of a terminal device with a liftable front camera in an exemplary embodiment of the disclosure;
FIG. 3 schematically illustrates a schematic view of an atmosphere lamp mounting position in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a schematic composition diagram of an ambience lamp control device in an exemplary embodiment of the disclosure;
fig. 5 schematically shows a structural diagram of a computer system of a wireless communication device in an exemplary embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The present exemplary embodiment provides an atmosphere lamp control method, which can be applied to mobile intelligent terminal devices equipped with an atmosphere lamp, such as mobile phones and tablet computers. For example, the terminal device 20 shown in fig. 2 is provided with a liftable camera 21, where the camera 21 is a front camera. The camera 21 may include components such as a sliding assembly for sliding the camera up and down along a fixed track, a sensor, a lens, and a camera module; its structure can be implemented with conventional solutions and is not described again in this disclosure. In addition, referring to fig. 3, the camera 21 may include an upper cover 211, and an atmosphere lamp assembly 220 surrounding the lens 212 is disposed outside the lens 212 under the upper cover 211. The atmosphere lamp assembly may be a transparent lamp tube. Alternatively, the atmosphere lamp may also take the form of a light strip. For example, referring to fig. 3, the atmosphere lamp assembly may be an atmosphere lamp 231 disposed at a side of the housing of the terminal device 20, or an atmosphere lamp 232 disposed at the bottom. The specific assembly position and the form of the atmosphere lamp are not particularly limited by the present disclosure.
Referring to fig. 1, the above-described atmosphere lamp control method may include the steps of:
s11, responding to a camera shooting assembly activating instruction of a target camera shooting assembly, activating the target camera shooting assembly, and updating the working state data of the target camera shooting assembly;
and S12, generating an atmosphere lamp assembly activating instruction based on the updated working state data of the target camera assembly, and turning on the atmosphere lamp assembly according to the atmosphere lamp activating instruction.
In the atmosphere lamp control method provided by the present exemplary embodiment, on the one hand, the working state of the camera assembly is monitored, and when the state data of the camera assembly is updated, an atmosphere lamp assembly activation instruction is automatically generated and executed, so that the atmosphere lamp assembly is turned on or off in real time according to the working state of the camera assembly. On the other hand, the atmosphere lamp allows a user to know the running state of the camera assembly at a glance.
Hereinafter, each step of the atmosphere lamp control method in the present exemplary embodiment will be described in more detail with reference to the drawings and examples.
In step S11, in response to a camera assembly activation instruction for a target camera assembly, the target camera assembly is activated, and the working state data of the target camera assembly is updated.
In this exemplary embodiment, the camera assembly may be activated when a user takes a picture using the camera function or calls the camera using a third party application. Specifically, the method may include:
step S111, generating a target camera shooting assembly activation instruction in response to touch operation, and writing the target camera shooting assembly activation instruction into a control message queue;
and step S112, reading the control message queue, and executing the target camera shooting component activating instruction to start the target camera shooting component.
In this exemplary embodiment, the touch operation may be a touch operation performed by a user on the front camera or rear camera of the terminal device, or a touch operation that calls the front or rear camera during the use of another application program. An activation instruction for the target camera assembly is generated based on this touch operation. For example, when taking a selfie or using an instant messaging application, an activation instruction is issued for the front camera.
The following uses a front-facing camera with a liftable camera assembly as an example to explain the method of the present disclosure.
For example, for a liftable front camera, a slider service may be provided to control the raising and lowering of the front camera. When a user takes a selfie, the front camera of the terminal device is the target camera assembly; the user's touch operation is converted into an activation instruction for the front camera, which can be transmitted to the slider service through a socket. After receiving the activation instruction, the slider service may generate ascending instruction information for controlling the rise and add it to a preset control message queue. The control message queue may be a message queue used to store control instructions of the liftable front camera and may be used to control the motor of the sliding assembly. For example, the message queue may employ a handler message mechanism to effect control of the motor of the sliding assembly.
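As an illustration of the socket-based hand-off mentioned above, the Java sketch below shows how a client (for example the camera application) might forward the activation request to such a slider service over an Android LocalSocket. The socket name "slider_service" and the "CAMERA_UP" payload are hypothetical placeholders; the disclosure does not specify the actual protocol.

```java
import android.net.LocalSocket;
import android.net.LocalSocketAddress;

import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

/** Hypothetical client side: converts the touch operation into an activation instruction for the slider service. */
public class CameraActivationClient {
    static final String SLIDER_SOCKET = "slider_service";   // assumed abstract socket name, not from the disclosure

    /** Called from the touch handler when the front camera is requested. */
    public static void sendActivateFrontCamera() {
        LocalSocket socket = new LocalSocket();
        try {
            socket.connect(new LocalSocketAddress(SLIDER_SOCKET));
            OutputStream out = socket.getOutputStream();
            out.write("CAMERA_UP".getBytes(StandardCharsets.UTF_8));  // the activation instruction payload
            out.flush();
        } catch (IOException e) {
            // Delivery failure handling is implementation specific.
        } finally {
            try {
                socket.close();
            } catch (IOException ignored) {
                // nothing further to do
            }
        }
    }
}
```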
In this exemplary embodiment, after the ascending instruction information is added to the control message queue, the messages in the queue may be read in sequence. The ascending instruction information is written into the bottom-layer BSP (Board Support Package), so that the sliding assembly responds to it, the motor is driven, the front camera rises, and the target camera assembly is started. Meanwhile, when the front camera is raised, the BSP layer may feed back the state information that the front camera has started and is rising to the slider service, for example by reporting it as an input event, so as to update the working state data of the front camera.
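The control message queue itself can be sketched with the handler message mechanism mentioned above. In the illustrative Java below, the SliderService class, the message codes and the BSP node path are assumptions made for the sketch only; the real queue consumer and node names are not given by the disclosure.

```java
import android.os.Handler;
import android.os.HandlerThread;
import android.os.Message;

import java.io.FileWriter;
import java.io.IOException;

/** Hypothetical slider service: consumes a control message queue and drives the motor via the BSP layer. */
public class SliderService {
    static final int MSG_CAMERA_UP = 1;      // ascending instruction
    static final int MSG_CAMERA_DOWN = 2;    // descending instruction

    // Hypothetical BSP node exposed by the board support package for the slider motor.
    static final String SLIDER_BSP_NODE = "/sys/devices/platform/slider/motor_ctrl";

    /** Callback used to update the working state data of the front camera. */
    public interface StateListener { void onStateChanged(String state); }

    private final HandlerThread worker = new HandlerThread("slider-ctrl");
    private final Handler queue;
    private final StateListener listener;

    public SliderService(StateListener listener) {
        this.listener = listener;
        worker.start();
        // The Handler's MessageQueue plays the role of the control message queue:
        // instructions are enqueued here and executed strictly in order.
        queue = new Handler(worker.getLooper()) {
            @Override
            public void handleMessage(Message msg) {
                switch (msg.what) {
                    case MSG_CAMERA_UP:
                        writeBsp("up");                       // drive the motor, the camera rises
                        listener.onStateChanged("RISING");    // update working state data
                        // In a full implementation the BSP layer would later report
                        // completion (e.g. "RISE_DONE") back as an input event.
                        break;
                    case MSG_CAMERA_DOWN:
                        writeBsp("down");
                        listener.onStateChanged("FALLING");
                        break;
                }
            }
        };
    }

    /** Called when the activation instruction arrives, e.g. over the local socket shown earlier. */
    public void requestCameraUp() {
        queue.obtainMessage(MSG_CAMERA_UP).sendToTarget();    // write the ascending instruction into the queue
    }

    private void writeBsp(String cmd) {
        try (FileWriter w = new FileWriter(SLIDER_BSP_NODE)) {
            w.write(cmd);                                     // hand the instruction to the BSP layer
        } catch (IOException e) {
            // In a real system the error would be reported back through the state listener.
        }
    }
}
```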
In step S12, an atmosphere lamp assembly activation instruction is generated based on the updated operating state data of the target camera assembly, so as to turn on the atmosphere lamp assembly according to the atmosphere lamp activation instruction.
In the present exemplary embodiment, the ambience light component may be activated after updating the operating state information of the target camera component. Specifically, the method may include:
step S121, acquiring current state parameters of the target camera shooting assembly, and identifying the current state parameters;
step S122, when the current state parameters are identified to include first parameters, generating an atmosphere lamp activation instruction, and writing the atmosphere lamp activation instruction into a control message queue;
step S123, reading the control message queue to execute the atmosphere lamp activation command to turn on the atmosphere lamp component.
In this exemplary embodiment, the first parameter may be a parameter describing that the front camera has started. Based on the above embodiment, after the slider service receives the state information of the target camera assembly reported by the bottom layer, it can parse the current state information and identify that the front-facing camera has started. At this point, an atmosphere lamp activation instruction may be generated. Likewise, the atmosphere lamp activation instruction may be written into the control message queue described above.
In this example embodiment, after the atmosphere lamp activation instruction is written into the control message queue, it may be read in sequence and written into the BSP layer, so that the atmosphere lamp assembly responds to and executes the atmosphere lamp activation instruction and lights the atmosphere lamp.
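For steps S121 to S123 (and the mirror-image turn-off path described further below), the following sketch shows how the reported state parameter might be inspected and the corresponding atmosphere lamp instruction written into the same control message queue. The parameter values "RISING" and "RISE_DONE" and the message codes are assumptions used only for illustration.

```java
import android.os.Handler;

/** Hypothetical mapping from camera state parameters to atmosphere lamp instructions. */
public class AmbientLightController {
    static final int MSG_LIGHT_ON = 10;
    static final int MSG_LIGHT_OFF = 11;

    private final Handler controlQueue;   // same handler/queue used by the slider service

    public AmbientLightController(Handler controlQueue) {
        this.controlQueue = controlQueue;
    }

    /** Called whenever the working state data of the target camera assembly is updated. */
    public void onCameraStateChanged(String stateParameter) {
        if ("RISING".equals(stateParameter)) {
            // First parameter: camera started and rising -> generate a lamp activation instruction.
            controlQueue.sendEmptyMessage(MSG_LIGHT_ON);
        } else if ("RISE_DONE".equals(stateParameter)) {
            // Second parameter: rise finished -> generate a lamp turn-off instruction.
            controlQueue.sendEmptyMessage(MSG_LIGHT_OFF);
        }
    }
}
```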
Based on the above, in the present exemplary embodiment, the ambience lamp activation instruction may further include information of operating parameters of the ambience lamp, such as color, flashing frequency, or duration. For example, the operating parameters of the atmosphere lamp may be configuration parameters pre-customized by a user, or operating parameters generated by the system in a random manner or a default configuration manner.
In the present example embodiment, when generating an atmosphere lamp assembly activation instruction, an atmosphere lamp parameter monitoring service may be used to obtain current configuration parameters of the atmosphere lamp and configure the current configuration parameters as the atmosphere lamp operating parameters; or when the ambient light parameter monitoring service does not acquire the current configuration parameters, configuring the pre-configuration parameters as the ambient light operation parameters.
For example, the slider service may establish a monitoring task for the atmosphere lamp operating parameters in advance, for example by monitoring, in a setting observer (settings monitor) manner, the configuration content entered by the user for the atmosphere lamp operating parameters on the system settings interface, so as to obtain the current configuration parameters of the atmosphere lamp assembly and use them as the atmosphere lamp operating parameters. When an activation instruction is generated, these parameters are written to the BSP layer together with it, so that the atmosphere lamp operates according to the operating parameters.
Alternatively, if the user has not configured the operating parameters of the atmosphere lamp in advance, the monitoring task obtains an empty user configuration; in that case the operating parameters of the atmosphere lamp can be generated randomly, or the system default atmosphere lamp operating parameters can be used. For example, the color scheme information of the terminal device is read and used as the color scheme of the atmosphere lamp.
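The "setting observer" style monitoring task and its default fallback could be realized roughly as in the Java sketch below. The settings key ambient_light_params, the parameter string format, and the default values are assumptions standing in for whatever configuration store an actual implementation uses.

```java
import android.content.Context;
import android.database.ContentObserver;
import android.net.Uri;
import android.os.Handler;
import android.provider.Settings;

/** Hypothetical monitoring task for user-configured atmosphere lamp operating parameters. */
public class LampParamMonitor extends ContentObserver {
    // Assumed settings key and defaults; the real names and format are implementation specific.
    static final String KEY_LAMP_PARAMS = "ambient_light_params";
    static final String DEFAULT_PARAMS = "color=#00BFFF;blink_hz=2";  // pre-configured fallback

    private final Context context;
    private volatile String currentParams = DEFAULT_PARAMS;

    public LampParamMonitor(Context context, Handler handler) {
        super(handler);
        this.context = context;
        Uri uri = Settings.System.getUriFor(KEY_LAMP_PARAMS);
        context.getContentResolver().registerContentObserver(uri, false, this);
        onChange(false);  // pick up any value configured before the monitor started
    }

    @Override
    public void onChange(boolean selfChange) {
        String configured = Settings.System.getString(context.getContentResolver(), KEY_LAMP_PARAMS);
        // Empty configuration -> fall back to default (or randomly generated) operating parameters.
        currentParams = (configured == null || configured.isEmpty()) ? DEFAULT_PARAMS : configured;
    }

    /** Parameters to embed in the next atmosphere lamp activation instruction. */
    public String getOperatingParams() {
        return currentParams;
    }
}
```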
Based on the above, in the present exemplary embodiment, the method described above may further include:
step S21, acquiring the current state parameters of the target camera shooting assembly, and identifying the current state parameters;
step S22, when the current state parameters are identified to include second parameters, generating an atmosphere lamp closing instruction, and writing the atmosphere lamp closing instruction into a control message queue;
step S23, reading the control message queue to execute the ambience light turn-off command and turn off the ambience light component.
In the present exemplary embodiment, the second parameter may be parameter information describing the state in which the rise operation has ended or the front camera has finished rising. When the slider assembly has risen, that is, the front camera has been raised and reaches the top, state information indicating the end of the rise can be generated and fed back to the slider service. After receiving the state information and updating the current state parameters, the slider service can generate an atmosphere lamp turn-off instruction and write it into the control message queue.
The control message queue is then read and the atmosphere lamp turn-off instruction is written into the BSP layer, so that the atmosphere lamp assembly responds to and executes the turn-off instruction, thereby turning off the atmosphere lamp assembly.
Alternatively, in other exemplary embodiments of the present disclosure, delayed turn-off time information of the atmosphere lamp may also be included in the atmosphere lamp activation instruction used to start the atmosphere lamp. For example, the rise time may be calculated in advance from the rise distance and the rise speed of the slider assembly and taken as the delayed turn-off time of the atmosphere lamp assembly. When the atmosphere lamp activation instruction is written into the BSP layer, the instruction and the delayed turn-off duration are written together, so that when the delay expires, the atmosphere lamp turn-off instruction is executed. This enables timely activation and deactivation of the atmosphere lamp assembly.
Alternatively, after the slider service receives the state information fed back by the BSP layer indicating that the slider assembly has finished rising, the delayed atmosphere lamp turn-off instruction may be executed.
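A minimal sketch of the delayed turn-off variant, assuming illustrative values for the rise distance and speed of the slider assembly: the rise time is computed up front and used as the delay of the turn-off message, so both instructions sit in the same control message queue.

```java
import android.os.Handler;

/** Hypothetical delayed turn-off: light the lamp now, schedule the turn-off after the rise time. */
public class DelayedLampSwitch {
    static final int MSG_LIGHT_ON = 10;
    static final int MSG_LIGHT_OFF = 11;

    // Assumed mechanical characteristics of the slider assembly (illustrative values only).
    static final float RISE_DISTANCE_MM = 9.0f;
    static final float RISE_SPEED_MM_PER_S = 10.0f;

    public static void lightDuringRise(Handler controlQueue) {
        long riseTimeMs = (long) (RISE_DISTANCE_MM / RISE_SPEED_MM_PER_S * 1000);
        controlQueue.sendEmptyMessage(MSG_LIGHT_ON);                      // activation instruction
        controlQueue.sendEmptyMessageDelayed(MSG_LIGHT_OFF, riseTimeMs);  // delayed turn-off instruction
    }
}
```

Because both messages pass through the same queue and the turn-off is delayed, the turn-off cannot be executed before the activation.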
Based on the above, in the present exemplary embodiment, the method may further include:
step S31, responding to the first control operation of the target camera shooting assembly, generating an atmosphere lamp activation instruction, and writing the atmosphere lamp activation instruction into a control message queue;
step S32, reading the control message queue to turn on the atmosphere lamp component according to the atmosphere lamp activation instruction; and
step S33, generating an atmosphere lamp closing instruction, and writing the atmosphere lamp closing instruction into the control message queue;
step S34, reading the control message queue to turn off the ambience light component according to the ambience light turn-off command.
In this exemplary embodiment, the first control operation may be the selection of a shooting mode such as a timed shooting mode, a video recording mode, or a time-lapse shooting mode. Alternatively, it may be a selection operation for a certain execution state in a third-party application. The following description takes timed shooting as an example.
After the front camera is started, when the user selects timed shooting, an atmosphere lamp activation instruction can be generated; the instruction may include start instruction information, atmosphere lamp color information, flicker frequency information, and a delayed atmosphere lamp turn-off instruction with time information corresponding to the countdown duration of the timed shooting. In this way, when the atmosphere lamp activation instruction is written into the control message queue, the atmosphere lamp turn-off instruction is also written into the control message queue.
After the current control message queue is read, the atmosphere lamp activation instruction may be written to the BSP layer in sequence, so that the atmosphere lamp assembly responds to it and lights the atmosphere lamp. After the delay period, the atmosphere lamp turn-off instruction is written to the BSP layer, so that the atmosphere lamp assembly responds to it and turns off the atmosphere lamp assembly.
By activating the atmosphere lamp in response to a user operation, the user can invoke the atmosphere lamp assembly in other operating modes of the camera or while using other applications.
In addition, when the atmosphere lamp activation instruction is generated in response to the first control operation, the control information list may be read first to determine whether an atmosphere lamp turn-off instruction already exists. If such an instruction exists, the currently pending atmosphere lamp turn-off instruction in the control information list is removed, and a new atmosphere lamp activation instruction and a delayed atmosphere lamp turn-off instruction are written. This avoids the problem that the countdown fails and the atmosphere lamp cannot be lit correctly when a user enters the self-timer countdown immediately after switching to the front camera.
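With the handler message mechanism, the stale-instruction check described above maps naturally onto hasMessages and removeMessages, as in the sketch below; the countdown duration and message codes are placeholders rather than values taken from this disclosure.

```java
import android.os.Handler;

/** Hypothetical timed-shooting path: replace any pending turn-off with a fresh on/off pair. */
public class TimedShotLamp {
    static final int MSG_LIGHT_ON = 10;
    static final int MSG_LIGHT_OFF = 11;

    public static void startCountdown(Handler controlQueue, long countdownMs) {
        // If a delayed turn-off from the camera-raise phase is still queued, drop it first;
        // otherwise it would fire mid-countdown and the lamp could not be lit correctly.
        if (controlQueue.hasMessages(MSG_LIGHT_OFF)) {
            controlQueue.removeMessages(MSG_LIGHT_OFF);
        }
        controlQueue.sendEmptyMessage(MSG_LIGHT_ON);                       // new activation instruction
        controlQueue.sendEmptyMessageDelayed(MSG_LIGHT_OFF, countdownMs);  // off when the countdown ends
    }
}
```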
Based on the above, in the present exemplary embodiment, the method may further include:
step S41, writing a camera shooting component closing instruction and an atmosphere lamp activating instruction for controlling the target camera shooting component into a control message queue when shutdown broadcast information is received;
step S42, reading the control message queue to turn off the target camera component according to the camera component turning-off instruction and turn on the atmosphere lamp component according to the atmosphere lamp activation instruction; and
step S43, generating an atmosphere lamp closing instruction, and writing the atmosphere lamp closing instruction into a control message queue;
step S44, reading the control message queue to turn off the ambience light component according to the ambience light turn-off command.
In this exemplary embodiment, in order to avoid the situation that the atmosphere lamp cannot be lit when the device is shut down while the front-facing camera is raised, a shutdown broadcast message monitoring task may be established. When the shutdown message is monitored, the front camera closing instruction, the atmosphere lamp activation instruction, and the delayed atmosphere lamp turn-off instruction are written directly into the message queue. When the slider service closes the front camera, the atmosphere lamp assembly can be lit normally while the slider assembly executes the descending instruction, and can be turned off normally when the front camera finishes descending.
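One possible wiring of the shutdown monitoring task is a broadcast receiver for Intent.ACTION_SHUTDOWN, sketched below. On shutdown it enqueues the camera closing instruction, the atmosphere lamp activation instruction, and the delayed turn-off instruction in order; the message codes and the assumed retraction time are placeholders, not values taken from this disclosure.

```java
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.os.Handler;

/** Hypothetical shutdown handling: retract the camera and run the lamp while it descends. */
public class ShutdownLampReceiver extends BroadcastReceiver {
    static final int MSG_CAMERA_DOWN = 2;
    static final int MSG_LIGHT_ON = 10;
    static final int MSG_LIGHT_OFF = 11;
    static final long RETRACT_TIME_MS = 900;   // assumed descent duration of the slider assembly

    private final Handler controlQueue;

    public ShutdownLampReceiver(Handler controlQueue) {
        this.controlQueue = controlQueue;
    }

    public void register(Context context) {
        context.registerReceiver(this, new IntentFilter(Intent.ACTION_SHUTDOWN));
    }

    @Override
    public void onReceive(Context context, Intent intent) {
        // Write all three instructions into the control message queue in order.
        controlQueue.sendEmptyMessage(MSG_CAMERA_DOWN);                        // close the front camera
        controlQueue.sendEmptyMessage(MSG_LIGHT_ON);                           // light the lamp during descent
        controlQueue.sendEmptyMessageDelayed(MSG_LIGHT_OFF, RETRACT_TIME_MS);  // turn off after descent
    }
}
```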
According to the atmosphere lamp control method provided by the embodiments of the present disclosure, a control message queue is established in advance, so that when a user activates the target camera, the activation instruction of the target camera, the atmosphere lamp activation instruction, and the delayed atmosphere lamp turn-off instruction can all be written into the message queue. The instruction information is then written into the BSP layer in sequence, so that the camera assembly and the atmosphere lamp assembly are started and stopped according to the instructions, and the target camera and the atmosphere lamp can be started and stopped synchronously.
It is to be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the method according to an exemplary embodiment of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Further, referring to fig. 4, the present exemplary embodiment also provides an ambience lamp control device 40, including: a target camera assembly control module 401 and an ambience light assembly control module 402. Wherein:
the target camera component control module 401 may be configured to activate a target camera component in response to a camera component activation instruction for the target camera component, and update operating state data of the target camera component.
The atmosphere lamp assembly control module 402 may be configured to generate an atmosphere lamp assembly activation instruction based on the updated operating state of the target camera assembly, so as to turn on the atmosphere lamp assembly according to the atmosphere lamp activation instruction.
In one example of the present disclosure, the ambience light assembly control module 402 may include: a first state parameter acquiring unit, an atmosphere lamp activation instruction writing unit, and an atmosphere lamp activation instruction execution unit (not shown in the figure). Wherein:
the first state parameter acquiring unit may be configured to acquire a current state parameter of the target camera component, and identify the current state parameter.
The atmosphere lamp activation instruction writing unit may be configured to generate an atmosphere lamp activation instruction when it is identified that the current state parameter includes a first parameter, and write the atmosphere lamp activation instruction into a control message queue.
The atmosphere light activation instruction execution unit may be configured to read the control message queue to execute the atmosphere light activation instruction and turn on the atmosphere light assembly.
In one example of the present disclosure, the atmosphere light assembly control module 402 may include: a second state parameter acquiring unit, an atmosphere lamp turn-off instruction writing unit, and an atmosphere lamp turn-off instruction execution unit (not shown in the figure). Wherein:
the second state parameter acquiring unit can be used for acquiring the current state parameter of the target camera shooting assembly and identifying the current state parameter;
the atmosphere lamp turning-off instruction writing unit may be configured to generate an atmosphere lamp turning-off instruction when it is recognized that the current state parameter includes a second parameter, and write the atmosphere lamp turning-off instruction into a control message queue.
The ambience light turn off instruction execution unit may be configured to read the control message queue to execute the ambience light turn off instruction and turn off the ambience light assembly.
In one example of the present disclosure, the ambience light component activation instruction includes: operating parameters of the atmosphere lamp; the apparatus 40 further comprises: and an atmosphere lamp operation parameter acquisition module (not shown in the figure).
The atmosphere lamp operating parameter acquiring module may be configured to acquire a current configuration parameter of the atmosphere lamp by using an atmosphere lamp parameter monitoring service, and configure the current configuration parameter as the atmosphere lamp operating parameter; or when the ambient light parameter monitoring service does not acquire the current configuration parameters, configuring the pre-configuration parameters as the ambient light operation parameters.
In one example of the present disclosure, the target camera assembly control module 401 includes: a target image pickup assembly activation instruction writing unit, a target image pickup assembly activation instruction execution unit (not shown in the figure).
The target camera shooting assembly activation instruction writing unit may be configured to generate a target camera shooting assembly activation instruction in response to a touch operation, and write the target camera shooting assembly activation instruction into a control message queue.
The target image capturing component activation instruction execution unit may be configured to read the control message queue, and execute the target image capturing component activation instruction to start the target image capturing component.
In one example of the present disclosure, the apparatus 40 further includes: a first control operation response module and an atmosphere light assembly control module (not shown). Wherein:
the first control operation response module may be configured to generate an atmosphere lamp activation instruction in response to a first control operation on the target camera component, and write the atmosphere lamp activation instruction into a control message queue.
The atmosphere light assembly control module may be configured to read the control message queue to turn on the atmosphere light assembly according to the atmosphere light activation instruction; generating an atmosphere lamp closing instruction, and writing the atmosphere lamp closing instruction into the control message queue; reading the control message queue to turn off the ambience light component according to the ambience light turn-off command.
In one example of the present disclosure, the apparatus 40 further includes: a shutdown broadcast information monitoring module and a shutdown execution module (not shown in the figure). Wherein:
the shutdown broadcast information monitoring module may be configured to write a camera shooting component shutdown instruction and an atmosphere lamp activation instruction for controlling the target camera shooting component into the control message queue when shutdown broadcast information is received.
The shutdown execution module may be configured to read the control message queue, to turn off the target camera component according to the camera component turn-off instruction, and to turn on the ambience lamp component according to the ambience lamp activation instruction; generating an atmosphere lamp closing instruction, and writing the atmosphere lamp closing instruction into a control message queue; reading the control message queue to turn off the ambience light component according to the ambience light turn-off command.
The details of each module in the above-mentioned ambience lamp control device have been described in detail in the corresponding ambience lamp control method, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Fig. 5 illustrates a schematic block diagram of a computer system suitable for use with a wireless communication device to implement an embodiment of the present invention.
It should be noted that the computer system 500 of the electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of the application of the embodiment of the present invention.
As shown in fig. 5, the computer system 500 includes a Central Processing Unit (CPU)501 that can perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for system operation are also stored. The CPU 501, ROM 502, and RAM 503 are connected to each other via a bus 504. An Input/Output (I/O) interface 505 is also connected to bus 504.
The following components are connected to the I/O interface 505: an input section 506 including a keyboard, a mouse, and the like; an output section 507 including a cathode ray tube (CRT) or liquid crystal display (LCD), a speaker, and the like; a storage section 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN (Local Area Network) card, a modem, or the like. The communication section 509 performs communication processing via a network such as the Internet. The drive 510 is also connected to the I/O interface 505 as necessary. A removable medium 511, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 510 as necessary, so that a computer program read out from it is installed into the storage section 508 as necessary.
In particular, according to an embodiment of the present invention, the processes described below with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the invention include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511. The computer program executes various functions defined in the system of the present application when executed by a Central Processing Unit (CPU) 501.
It should be noted that the computer readable medium shown in the embodiment of the present invention may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present invention may be implemented by software, or may be implemented by hardware, and the described units may also be disposed in a processor. Wherein the names of the elements do not in some way constitute a limitation on the elements themselves.
It should be noted that, as another aspect, the present application also provides a computer-readable medium, which may be included in the electronic device described in the above embodiment; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method as described in the embodiments below. For example, the electronic device may implement the steps shown in fig. 1.
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (10)

1. An atmosphere lamp control method is applied to a terminal device configured with an atmosphere lamp, and the method comprises the following steps:
responding to a shooting component activating instruction for a target shooting component, activating the target shooting component, and updating working state data of the target shooting component;
and generating an atmosphere lamp assembly activating instruction based on the updated working state data of the target camera assembly so as to start the atmosphere lamp assembly according to the atmosphere lamp activating instruction.
2. The method of claim 1, wherein generating an ambience light assembly activation command based on the updated operating state data of the target camera assembly to activate the ambience light assembly in accordance with the ambience light activation command comprises:
acquiring current state parameters of the target camera shooting assembly, and identifying the current state parameters;
when the current state parameters are identified to include first parameters, generating an atmosphere lamp activation instruction, and writing the atmosphere lamp activation instruction into a control message queue;
reading the control message queue to execute the ambience light activation instruction and turn on the ambience light component.
3. The method of claim 2, wherein upon turning on the ambience lamp assembly, the method further comprises:
acquiring current state parameters of the target camera shooting assembly, and identifying the current state parameters;
when the current state parameters are identified to include second parameters, generating an atmosphere lamp closing instruction, and writing the atmosphere lamp closing instruction into a control message queue;
reading the control message queue to execute the ambience light turn-off command and turn off the ambience light component.
4. The method of claim 1, wherein the ambient light assembly activation instructions comprise: operating parameters of the atmosphere lamp;
in the generating an ambience light component activation instruction, the method further comprises:
acquiring current configuration parameters of the atmosphere lamp by using an atmosphere lamp parameter monitoring service, and configuring the current configuration parameters as the atmosphere lamp operation parameters; or
And when the ambient light parameter monitoring service does not acquire the current configuration parameters, configuring the pre-configuration parameters as the ambient light operation parameters.
5. The method of claim 1, wherein activating a target camera assembly and updating operational status data for the target camera assembly in response to a camera assembly activation instruction for the target camera assembly comprises:
generating a target camera shooting assembly activating instruction in response to touch operation, and writing the target camera shooting assembly activating instruction into a control message queue;
and reading the control message queue, and executing the target camera shooting assembly activating instruction to start the target camera shooting assembly.
6. The method of claim 1, wherein after activating the target camera assembly, the method further comprises:
responding to a first control operation on the target camera shooting assembly, generating an atmosphere lamp activation instruction, and writing the atmosphere lamp activation instruction into a control message queue;
reading the control message queue to turn on the atmosphere light assembly according to the atmosphere light activation instruction; and
generating an atmosphere lamp closing instruction, and writing the atmosphere lamp closing instruction into the control message queue;
reading the control message queue to turn off the ambience light component according to the ambience light turn-off command.
7. The method of claim 1, further comprising:
writing a camera shooting assembly closing instruction and an atmosphere lamp activating instruction for controlling the target camera shooting assembly into a control message queue when shutdown broadcast information is received;
reading the control message queue to turn off the target camera component according to the camera component turning-off instruction and turn on the atmosphere lamp component according to the atmosphere lamp activation instruction; and
generating an atmosphere lamp closing instruction, and writing the atmosphere lamp closing instruction into a control message queue;
reading the control message queue to turn off the ambience light component according to the ambience light turn-off command.
8. An ambience lamp control device, comprising:
the target camera shooting assembly control module is used for responding to a camera shooting assembly activating instruction of a target camera shooting assembly, activating the target camera shooting assembly and updating working state data of the target camera shooting assembly;
and the atmosphere lamp assembly control module is used for generating an atmosphere lamp assembly activation instruction based on the updated working state data of the target camera assembly so as to start the atmosphere lamp assembly according to the atmosphere lamp activation instruction.
9. A computer-readable medium, on which a computer program is stored, which computer program, when being executed by a processor, carries out an ambience lamp control method as claimed in any one of claims 1 to 7.
10. A wireless communication terminal, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the ambience lamp control method as claimed in any one of claims 1 to 7.
CN201910993887.1A 2019-10-18 2019-10-18 Atmosphere lamp control method and device Active CN110769577B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910993887.1A CN110769577B (en) 2019-10-18 2019-10-18 Atmosphere lamp control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910993887.1A CN110769577B (en) 2019-10-18 2019-10-18 Atmosphere lamp control method and device

Publications (2)

Publication Number Publication Date
CN110769577A (en) 2020-02-07
CN110769577B (en) 2022-02-25

Family

ID=69332591

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910993887.1A Active CN110769577B (en) 2019-10-18 2019-10-18 Atmosphere lamp control method and device

Country Status (1)

Country Link
CN (1) CN110769577B (en)

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201467285U (en) * 2009-06-05 2010-05-12 天津三星光电子有限公司 Digital camera with MP3 colorful playing function
US9576349B2 (en) * 2010-12-20 2017-02-21 Microsoft Technology Licensing, Llc Techniques for atmospheric and solar correction of aerial images
US20130135871A1 (en) * 2011-11-30 2013-05-30 Hooshmand Harooni Multi-Purpose LED Lighting and Mirror Accessory for Use with Mobile Phone Devices
CN203206337U (en) * 2013-05-15 2013-09-18 广东欧珀移动通信有限公司 Photographing mobile phone
CN104301598A (en) * 2013-07-18 2015-01-21 国龙信息技术(上海)有限公司 Method of setting lighting effect of front camera by mobile terminal
CN203984737U (en) * 2014-07-03 2014-12-03 上海信耀电子有限公司 Atmosphere lamp control system in car based on wireless telecommunications
CN104113690A (en) * 2014-07-18 2014-10-22 四川长虹电器股份有限公司 Method for photo-taking correction based on intelligent lamp
EP3229463A1 (en) * 2016-04-08 2017-10-11 Rotolight Limited Lighting system and control thereof
CN106132020A (en) * 2016-07-29 2016-11-16 温州立地电子有限公司 A kind of with stroboscopic, the LED driver of constant power mode
CN106170170A (en) * 2016-08-25 2016-11-30 广东欧珀移动通信有限公司 The control method of atmosphere lamp and device
CN109275061A (en) * 2016-10-26 2019-01-25 绍兴上虞威拓机械电子有限公司 Multi-functional audio
CN207049694U (en) * 2017-03-27 2018-02-27 广东欧珀移动通信有限公司 Light-emitting device and mobile terminal
CN107339056A (en) * 2017-05-12 2017-11-10 安徽后青春工业设计研究院有限公司 A kind of New intellectual burglar-proof door system
CN206835229U (en) * 2017-06-27 2018-01-02 清远市奇盛科技有限公司 A kind of IP Camera
CN107333370A (en) * 2017-08-10 2017-11-07 佛山市三水区彦海通信工程有限公司 A kind of intelligent atmosphere lamp adjusting method
CN109448610A (en) * 2018-12-13 2019-03-08 深圳市万普拉斯科技有限公司 Display device
CN109799722A (en) * 2019-02-18 2019-05-24 珠海格力电器股份有限公司 Control method, device, system and the storage medium of smart home system, equipment
CN110086968A (en) * 2019-04-15 2019-08-02 维沃移动通信有限公司 Camera module, the preparation method of proactive lampshade and terminal device
CN110297378A (en) * 2019-06-30 2019-10-01 Oppo广东移动通信有限公司 The control method of filming apparatus, electronic equipment and electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
方拓: "Intelligent LED Home Light Environment Design: Intelligent Light Environment for Bathroom Space", China Excellent Master's Theses Full-text Database, Engineering Science and Technology II *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112203029A (en) * 2020-09-30 2021-01-08 海信视像科技股份有限公司 Display device and atmosphere lamp assembly control method

Also Published As

Publication number Publication date
CN110769577B (en) 2022-02-25

Similar Documents

Publication Publication Date Title
WO2021175213A1 (en) Refresh rate switching method and electronic device
RU2712118C1 (en) Method of controlling screen display and device using said method
US7532191B2 (en) Apparatus, method and computer program product for controlling screen brightness of mobile terminal
EP3173923A1 (en) Method and device for image display
EP2991067B1 (en) Backlight brightness control method and device
US20200043427A1 (en) Backlight adjusting method and backlight adjusting device
CN108710306B (en) Control method and device of intelligent equipment and computer readable storage medium
CN105208191A (en) Mode switching method and mode switching device
KR102508108B1 (en) Image acquisition module, electronic equipment, image acquisition method and storage medium
EP3282644A1 (en) Timing method and device
CN110769577B (en) Atmosphere lamp control method and device
CN112087611B (en) Electronic equipment and display screen adjusting method thereof
EP3407583A1 (en) Method and device for switching display mode
CN111916032A (en) Gamma adjusting method and device for display panel
CN108184103B (en) Method and apparatus for displaying image
JP2014165725A (en) Portable terminal device
CN109901886B (en) Page language switching method, system, device and computer readable storage medium
CN116092449A (en) Screen brightness determining method and device and electronic equipment
CN109189198A (en) Image display method, device, terminal and storage medium
CN113873122A (en) Independent flash control method, system and device for double-flash camera and storage medium
CN106773753B (en) Equipment control method and device
CN108874332B (en) Interface display method and device
CN113760080B (en) Display method, device and storage medium
CN111381407A (en) Display panel, display device, scanning method and device
US11798516B2 (en) Method and device for adjusting display brightness, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant