CN116997065A - Light management method, device, vehicle and storage medium - Google Patents

Light management method, device, vehicle and storage medium

Info

Publication number
CN116997065A
CN116997065A CN202311048352.XA CN202311048352A
Authority
CN
China
Prior art keywords
vehicle
image
mounted terminal
processor
control instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311048352.XA
Other languages
Chinese (zh)
Inventor
龚华琴 (Gong Huaqin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Changan Automobile Co Ltd
Original Assignee
Chongqing Changan Automobile Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Changan Automobile Co Ltd filed Critical Chongqing Changan Automobile Co Ltd
Priority to CN202311048352.XA priority Critical patent/CN116997065A/en
Publication of CN116997065A publication Critical patent/CN116997065A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/175Controlling the light source by remote control
    • H05B47/18Controlling the light source by remote control via data-bus transmission
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00Circuit arrangements for operating light-emitting diodes [LED]

Landscapes

  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present application relates to the field of vehicle control technologies, and in particular, to a light management method, a device, a vehicle, and a storage medium. The method is applied to the vehicle-mounted terminal of the vehicle; the vehicle-mounted terminal is connected with the interactive lamp processor of the vehicle through a Controller Area Network (CAN) bus and an Ethernet; the CAN bus is used for transmitting control instructions, and the Ethernet is used for transmitting data. The method comprises the following steps: generating a first control instruction; and sending the first control instruction to the interactive lamp processor through the CAN bus, so that the interactive lamp processor determines a target image from a plurality of images stored in the interactive lamp processor according to the image identifier of the target image, and lights the lamp source corresponding to the target image; the plurality of images are sent to the interactive lamp processor by the vehicle-mounted terminal through the Ethernet. In this way, the problems of time-consuming, inefficient image transmission caused by an excessively high CAN bus load rate when complex images are transmitted over the CAN bus can be solved.

Description

Light management method, device, vehicle and storage medium
Technical Field
The application relates to the technical field of vehicle control, in particular to an interactive lamp editing and control method based on the coexistence of Ethernet and a controller area network (CAN), and particularly relates to a light management method, a device, a vehicle and a storage medium.
Background
With the development of automotive electronics and pure electric vehicles, the automobile is no longer merely a means of transportation but has become an indispensable intelligent companion in daily life; meanwhile, with the popularization and development of light emitting diode (LED) light sources in automobile lighting systems, higher demands are being placed on the interactivity and customizability of automobile lamps.
In order to improve the user's personalized experience with the interactive lamps of a vehicle, the related art proposes methods that allow users to customize the light effects of the interactive lamps. However, these methods generally transmit both the images corresponding to the light effect and the control instructions over the CAN bus; when the user-defined light effect is complex, the amount of image data to be transmitted increases, the load rate of the CAN bus becomes too high, and image transmission is time-consuming and inefficient.
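To see why image data strains a classical CAN bus, consider the rough arithmetic below. The numbers are illustrative assumptions, not figures from the patent: a classical CAN frame carries at most 8 payload bytes and, with arbitration field, CRC and bit stuffing, occupies roughly 110-135 bits on the wire; a 500 kbit/s bitrate is a common vehicle configuration.

```python
def can_transfer_time(image_bytes: int, bitrate_bps: int = 500_000,
                      bits_per_frame: int = 130, payload_per_frame: int = 8) -> float:
    """Approximate seconds of bus time to push `image_bytes` over classical CAN."""
    frames = -(-image_bytes // payload_per_frame)  # ceiling division
    return frames * bits_per_frame / bitrate_bps

# A modest 64x64 RGB image (12,288 bytes) already needs ~1,536 frames:
t = can_transfer_time(64 * 64 * 3)
print(f"{t:.2f} s")  # → 0.40 s of exclusive bus time for one small image
```

Since the bus is shared with other vehicle traffic, even this small image would monopolize a significant share of the bus for a noticeable period, which is the load-rate problem the application targets.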
Disclosure of Invention
The application provides a light management method, a device, a vehicle and a storage medium, which are used for at least solving the problems of excessively high load rate of a CAN bus, long time consumption and low efficiency of image transmission when complex images are transmitted through the CAN bus. The technical scheme of the application is as follows:
According to a first aspect of the present application, there is provided a light management method applied to a vehicle-mounted terminal of a vehicle; the vehicle-mounted terminal is connected with the interactive lamp processor of the vehicle through a CAN bus and an Ethernet; the CAN bus is used for transmitting control instructions, and the Ethernet is used for transmitting data; the method comprises the following steps: generating a first control instruction; the first control instruction comprises an image identifier of the target image; the first control instruction is used for controlling the interactive lamp processor to light the lamp source corresponding to the target image; sending a first control instruction to the interactive lamp processor through the CAN bus, so that the interactive lamp processor determines a target image from a plurality of images stored in the interactive lamp processor according to the image identification of the target image, and lights a lamp source corresponding to the target image; the multiple images are sent to the interactive lamp processor by the vehicle-mounted terminal through the Ethernet.
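The key point of the first aspect is that the control instruction only needs to carry a small image identifier, so it fits in a single CAN frame, while the bulky image data travels separately over Ethernet. The sketch below illustrates this; the opcode value, the payload layout, and the field widths are assumptions for illustration, not the patent's wire format.

```python
import struct

CMD_LIGHT_IMAGE = 0x01  # assumed opcode for "light the lamp source for image X"

def build_first_control_instruction(image_id: int) -> bytes:
    """Pack an opcode plus a 16-bit image identifier into one CAN payload."""
    payload = struct.pack(">BH", CMD_LIGHT_IMAGE, image_id)
    assert len(payload) <= 8, "classical CAN frames carry at most 8 data bytes"
    return payload

frame = build_first_control_instruction(image_id=42)
print(frame.hex())  # → 01002a
```

Three bytes on the CAN bus select an image that may be kilobytes in size on the processor side, which is why routing the instruction over CAN and the image over Ethernet avoids the load-rate problem.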
Compared with the related art, in which the vehicle-mounted terminal first sends the control instruction to a controller and the controller then forwards it to the interactive lamp processor, which can cause the control instruction to be sent with a delay, in the method provided by the application the vehicle-mounted terminal sends the control instruction directly to the interactive lamp processor over the CAN bus, improving the timeliness of instruction delivery.
In addition, in the related art the vehicle-mounted terminal sends the image to the controller through the CAN bus and the controller then sends the image to the interactive lamp processor through the CAN bus; this transmission process is inefficient, and when the transmitted image is complex it can cause an excessively high CAN bus load rate. In the method provided by the application, the image is transmitted directly between the vehicle-mounted terminal and the interactive lamp processor of the vehicle through the Ethernet, so that image transmission efficiency can be improved; when the transmitted image is complex and the data volume is large, the transmission time is shortened and the user experience is improved.
In one possible embodiment, before generating the first control instruction, the method includes: receiving a scene trigger instruction sent by a controller; the scene triggering instruction is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target scene; generating a first control instruction, comprising: and responding to the scene trigger instruction, and generating a first control instruction.
According to the technical means, when the target scene is triggered by the controller, the controller sends a scene triggering instruction; after receiving the scene triggering instruction, the vehicle-mounted terminal generates the first control instruction, so that the first control instruction can subsequently be sent directly to the interactive lamp processor through the CAN bus.
In another possible embodiment, before generating the first control instruction, the method includes: receiving a first trigger operation; the first triggering operation is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target image; generating a first control instruction, comprising: and responding to the first trigger operation, and generating a first control instruction.
According to the technical means, when the target scene is triggered by the first triggering operation, the vehicle-mounted terminal can directly generate the first control instruction, so that the vehicle-mounted terminal can subsequently send the first control instruction directly to the interactive lamp processor through the CAN bus. Meanwhile, in the embodiment of the application, the vehicle-mounted terminal is directly connected with the interactive lamp processor, which ensures that the first control instruction can be sent to the interactive lamp processor quickly, improving the transmission efficiency of the first control instruction and further improving the user experience.
In another possible embodiment, the method further comprises: responding to the received image editing operation, and displaying an edited image on an image editing interface of the vehicle-mounted terminal; the image editing operation is used for editing the image on the image editing interface; responding to the received image saving operation, and saving the image displayed by the image editing interface; and sending the image displayed by the image editing interface to the interactive lamp processor through the Ethernet.
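The edited image could be pushed to the lamp processor in many ways; the patent only specifies "Ethernet". The sketch below assumes a raw TCP socket and a simple 4-byte length-prefixed framing, both of which are illustrative assumptions rather than the patent's protocol (the patent's own custom Ethernet data format is shown in FIG. 7).

```python
import socket
import struct

def frame_image(image_bytes: bytes) -> bytes:
    """Prefix the image blob with its 4-byte big-endian length (assumed framing)."""
    return struct.pack(">I", len(image_bytes)) + image_bytes

def send_image(host: str, port: int, image_bytes: bytes) -> None:
    """Send one saved image from the terminal to the lamp processor over TCP."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(frame_image(image_bytes))
```

The length prefix lets the receiving processor know where one image ends and the next begins on the byte stream.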
According to the technical means, compared with the prior art that the image is sent to the controller through the CAN bus and then sent to the interactive lamp processor through the CAN bus, the transmission efficiency of the image is low, and when the image is complex, the problem that the load rate of the CAN bus is too high is solved; according to the method provided by the application, the user edits the image of the interactive lamp processor at the vehicle-mounted terminal, and after the editing is finished, the vehicle-mounted terminal directly transmits the image to the interactive lamp processor through the Ethernet, so that the transmission efficiency of the image CAN be improved, and when the image is more complex, the load rate of the CAN bus CAN be slowed down, the time consumption of image transmission is shortened, and the use experience of the user is further improved.
In another possible embodiment, the method further comprises: receiving a second trigger operation; the second triggering operation is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target video; the target video comprises N video frames; and responding to the second triggering operation, and sequentially sending N video frames to the interactive lamp processor through the Ethernet so that the interactive lamp processor sequentially lights the lamp sources corresponding to each video frame in the N video frames.
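The video path can be sketched as follows: the terminal wraps each of the N frames with its index and the total count before streaming them over Ethernet, so the processor can light the matching lamp sources in order. The header layout is an illustrative assumption, not the patent's format.

```python
import struct

def frame_packets(video_frames: list[bytes]) -> list[bytes]:
    """Wrap each frame as: index (2B) | total N (2B) | payload length (4B) | data."""
    n = len(video_frames)
    return [struct.pack(">HHI", i, n, len(f)) + f
            for i, f in enumerate(video_frames)]

packets = frame_packets([b"frame0", b"frame1", b"frame2"])
print(len(packets), packets[1][:8].hex())  # → 3 0001000300000006
```

Carrying the index in-band lets the processor detect out-of-order or missing frames before lighting the corresponding lamp sources.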
According to the technical means, unlike the related art, in which the vehicle-mounted terminal sends the N video frames to the interactive lamp processor through a low-voltage differential signaling (LVDS) interface after receiving the second triggering operation, the method provided by the embodiment of the application does not use an LVDS interface, but sends the N video frames to the interactive lamp processor in sequence through the Ethernet, so that the wiring-harness cost of the vehicle can be reduced while the transmission rate is ensured.
According to a second aspect of the present application, there is provided a light management method applied to an interactive light processor of a vehicle; the interactive lamp processor is connected with the vehicle-mounted terminal of the vehicle through a CAN bus and an Ethernet; the CAN bus is used for transmitting control instructions, and the Ethernet is used for transmitting data; the method comprises the following steps: receiving a first control instruction sent by a vehicle-mounted terminal through a CAN bus; the first control instruction comprises an image identifier of the target image; the first control instruction is used for controlling the interactive lamp processor to light the lamp source corresponding to the target image; determining a target image from a plurality of images stored in an interactive lamp processor according to the image identification of the target image, and lighting a lamp source corresponding to the target image; the multiple images are sent to the interactive lamp processor by the vehicle-mounted terminal through the Ethernet.
According to the technical means, unlike the related art, in which the interactive lamp processor is not directly connected with the vehicle-mounted terminal, in the method provided by the application the interactive lamp processor receives the first control instruction sent by the vehicle-mounted terminal through the CAN bus, so that the vehicle-mounted terminal can transmit the control instruction directly to the interactive lamp processor, reducing the transmission time of the control instruction and improving its real-time performance. Meanwhile, in the method provided by the application, the interactive lamp processor receives the image sent by the vehicle-mounted terminal through the Ethernet, so that image transmission efficiency can be improved; when the transmitted image is complex and the data volume is large, the transmission time is shortened and the user experience is improved.
In one possible embodiment, the method further comprises: receiving an image sent by a vehicle-mounted terminal through an Ethernet; and configuring an image identifier for the image, and sending the image identifier of the image to the vehicle-mounted terminal.
According to the technical means, after receiving the image sent by the vehicle-mounted terminal, the interactive lamp processor returns the image identifier corresponding to the image; each image corresponds to a unique identifier, which makes it convenient for the vehicle-mounted terminal to manage the images and their image identifiers, and to place the image identifier of the target image into the first control instruction when the vehicle-mounted terminal needs to control the interactive lamp processor to display the light effect corresponding to the target image.
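A minimal sketch of this identifier-assignment step on the processor side, assuming (the patent does not specify) that identifiers are small sequential integers kept in an in-memory map:

```python
import itertools

class ImageRegistry:
    """Assumed per-processor store mapping assigned identifiers to image data."""

    def __init__(self) -> None:
        self._next_id = itertools.count(1)  # identifiers start at 1 by assumption
        self._images: dict[int, bytes] = {}

    def store(self, image_bytes: bytes) -> int:
        """Store an image received over Ethernet; return the id sent back to the terminal."""
        image_id = next(self._next_id)
        self._images[image_id] = image_bytes
        return image_id

    def lookup(self, image_id: int) -> bytes:
        """Resolve the target image named by a first control instruction."""
        return self._images[image_id]

reg = ImageRegistry()
first = reg.store(b"\x00" * 16)
print(first, reg.lookup(first) == b"\x00" * 16)  # → 1 True
```

Because the identifier is small enough to fit in a single CAN frame, the later control instruction never needs to carry image data on the bus.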
In another possible embodiment, the method further comprises: sequentially receiving N video frames sent by the vehicle-mounted terminal through the Ethernet; and sequentially lighting the lamp sources corresponding to each video frame in the N video frames.
According to the technical means, the interactive lamp processor does not need to receive video frames through the LVDS interface, but sequentially lights the lamp sources corresponding to each video frame in the N video frames after receiving the N video frames through the Ethernet, so that the harness cost of the vehicle can be reduced while the transmission rate is ensured.
According to a third aspect of the present application, there is provided a light management apparatus applied to an in-vehicle terminal of a vehicle; the vehicle-mounted terminal is connected with the interactive lamp processor of the vehicle through a CAN bus and an Ethernet; the CAN bus is used for transmitting control instructions, and the Ethernet is used for transmitting data; the light management device includes: the generation module is used for generating a first control instruction; the first control instruction comprises an image identifier of the target image; the first control instruction is used for controlling the interactive lamp processor to light the lamp source corresponding to the target image; the sending module is used for sending the first control instruction to the interactive lamp processor through the CAN bus so that the interactive lamp processor can determine a target image from a plurality of images stored in the interactive lamp processor according to the image identifier of the target image and light the lamp source corresponding to the target image; the multiple images are sent to the interactive lamp processor by the vehicle-mounted terminal through the Ethernet.
In one possible embodiment, the light management device further includes: a receiving module; the receiving module is used for receiving a scene trigger instruction sent by the controller before the first control instruction is generated; the scene triggering instruction is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target scene; the generation module is specifically configured to generate a first control instruction in response to a scene trigger instruction.
In another possible embodiment, the light management device further includes: a receiving module; the receiving module is used for receiving a first trigger operation before generating a first control instruction; the first triggering operation is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target image; the generating module is specifically configured to generate a first control instruction in response to a first trigger operation.
In another possible embodiment, the light management device further includes: a display module and a storage module; the display module is used for responding to the received image editing operation and displaying the edited image on an image editing interface of the vehicle-mounted terminal; the image editing operation is used for editing the image on the image editing interface; the storage module is used for responding to the received image storage operation and storing the image displayed by the image editing interface; and the sending module is also used for sending the image to the interactive lamp processor through the Ethernet.
In another possible embodiment, the light management device further includes: a receiving module; the receiving module is used for receiving a second triggering operation; the second triggering operation is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target video; the target video comprises N video frames; and the sending module is further used for responding to the second triggering operation and sequentially sending the N video frames to the interactive lamp processor through the Ethernet so that the interactive lamp processor sequentially lights the lamp sources corresponding to each video frame in the N video frames.
According to a fourth aspect of the present application there is provided a light management device for use in an interactive light processor of a vehicle; the interactive lamp processor is connected with the vehicle-mounted terminal of the vehicle through a CAN bus and an Ethernet; the CAN bus is used for transmitting control instructions, and the Ethernet is used for transmitting data; the light management device includes: the receiving module is used for receiving a first control instruction sent by the vehicle-mounted terminal through the CAN bus; the first control instruction comprises an image identifier of the target image; the first control instruction is used for controlling the interactive lamp processor to light the lamp source corresponding to the target image; the lighting module is used for determining a target image from a plurality of images stored in the interactive lamp processor according to the image identification of the target image and lighting a lamp source corresponding to the target image; the multiple images are sent to the interactive lamp processor by the vehicle-mounted terminal through the Ethernet.
In one possible embodiment, the light management device further includes: a configuration module; the receiving module is also used for receiving the image sent by the vehicle-mounted terminal through the Ethernet; the configuration module is used for configuring the image identification for the image and sending the image identification of the image to the vehicle-mounted terminal.
In another possible implementation manner, the receiving module is further configured to sequentially receive N video frames sent by the vehicle-mounted terminal through ethernet; and the lighting module is also used for sequentially lighting the lamp sources corresponding to each video frame in the N video frames.
According to a fifth aspect of the present application, there is provided a vehicle comprising: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor executing the program to implement the method of the first to second aspects and any one of the possible embodiments thereof.
According to a sixth aspect of the present application, there is provided a computer-readable storage medium; when instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to perform the method of the first to second aspects and any one of their possible implementations.
Therefore, the technical characteristics of the application have the following beneficial effects:
(1) Compared with the related art, in which the vehicle-mounted terminal first sends the control instruction to a controller and the controller then forwards it to the interactive lamp processor, which can cause the control instruction to be sent with a delay, in the method provided by the application the vehicle-mounted terminal sends the control instruction directly to the interactive lamp processor over the CAN bus, improving the timeliness of instruction delivery.
In addition, in the related art the vehicle-mounted terminal sends the image to the controller through the CAN bus and the controller then sends the image to the interactive lamp processor through the CAN bus; this transmission process is inefficient, and when the transmitted image is complex it can cause an excessively high CAN bus load rate. In the method provided by the application, the image is transmitted directly between the vehicle-mounted terminal and the interactive lamp processor of the vehicle through the Ethernet, so that image transmission efficiency can be improved; when the transmitted image is complex and the data volume is large, the transmission time is shortened and the user experience is improved.
(2) When the target scene is triggered by the controller, the controller sends a scene triggering instruction; after receiving the scene trigger instruction, the vehicle-mounted terminal generates a first control instruction so as to facilitate the follow-up direct transmission of the first control instruction to the interactive lamp processor through the CAN bus.
(3) When the target scene is triggered by the first triggering operation, the vehicle-mounted terminal can directly generate the first control instruction, so that the vehicle-mounted terminal can subsequently send the first control instruction directly to the interactive lamp processor through the CAN bus. Meanwhile, in the embodiment of the application, the vehicle-mounted terminal is directly connected with the interactive lamp processor, which ensures that the first control instruction can be sent to the interactive lamp processor quickly, improving the transmission efficiency of the first control instruction and further improving the user experience.
(4) In the related art, the image is first sent to the controller through the CAN bus and then sent by the controller to the interactive lamp processor through the CAN bus, so the image transmission efficiency is low, and when the image is complex the CAN bus load rate becomes too high. In the method provided by the application, the user edits the image of the interactive lamp processor on the vehicle-mounted terminal, and after editing is finished the vehicle-mounted terminal transmits the image directly to the interactive lamp processor through the Ethernet; this improves the transmission efficiency of the image, and when the image is complex it reduces the load rate of the CAN bus, shortens the time consumed by image transmission, and further improves the user experience.
(5) Unlike the related art, in which the vehicle-mounted terminal sends the N video frames to the interactive lamp processor through a low-voltage differential signaling (LVDS) interface after receiving the second triggering operation, the method provided by the embodiment of the application does not use an LVDS interface, but sends the N video frames to the interactive lamp processor in sequence through the Ethernet, so that the wiring-harness cost of the vehicle can be reduced while the transmission rate is ensured.
(6) Unlike the related art, in which the interactive lamp processor is not directly connected with the vehicle-mounted terminal, in the method provided by the application the interactive lamp processor receives the first control instruction sent by the vehicle-mounted terminal through the CAN bus, so that the vehicle-mounted terminal can transmit the control instruction directly to the interactive lamp processor, reducing the transmission time of the control instruction and improving its real-time performance. Meanwhile, in the method provided by the application, the interactive lamp processor receives the image sent by the vehicle-mounted terminal through the Ethernet, so that image transmission efficiency can be improved; when the transmitted image is complex and the data volume is large, the transmission time is shortened and the user experience is improved.
(7) After receiving the image sent by the vehicle-mounted terminal, the interactive lamp processor returns the image identifier corresponding to the image; each image corresponds to a unique identifier, which makes it convenient for the vehicle-mounted terminal to manage the images and their image identifiers, and to place the image identifier of the target image into the first control instruction when the vehicle-mounted terminal needs to control the interactive lamp processor to display the light effect corresponding to the target image.
(8) The interactive lamp processor does not need to receive video frames through an LVDS interface, but lights the lamp sources corresponding to each video frame in the N video frames in sequence after receiving the N video frames through the Ethernet, so that the harness cost of the vehicle can be reduced while the transmission rate is ensured.
It should be noted that, the technical effects caused by any implementation manner of the third aspect to the sixth aspect may refer to the technical effects caused by corresponding implementation manners of the first aspect and the second aspect, which are not described herein again.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and, together with the description, serve to explain the principles of the application; they do not constitute an undue limitation on the application.
FIG. 1 is a block diagram of a light management system according to an exemplary embodiment;
FIG. 2 is a schematic diagram illustrating a method of calculating a single frame load rate according to an exemplary embodiment;
FIG. 3 is a flow chart illustrating a method of light management according to an exemplary embodiment;
FIG. 4 is a flowchart illustrating another light management method according to an exemplary embodiment;
FIG. 5 is a flow chart illustrating another method of light management according to an exemplary embodiment;
FIG. 6 is a flowchart illustrating another light management method according to an exemplary embodiment;
FIG. 7 is a schematic diagram of a customized Ethernet data format shown in accordance with an exemplary embodiment;
FIG. 8 is a schematic diagram illustrating a format of image data according to an exemplary embodiment;
FIG. 9 is a flowchart illustrating another light management method according to an exemplary embodiment;
FIG. 10 is a flowchart illustrating another light management method according to an exemplary embodiment;
FIG. 11 is a flowchart illustrating another light management method according to an exemplary embodiment;
FIG. 12 is a flowchart illustrating another light management method according to an exemplary embodiment;
FIG. 13 is a flowchart illustrating another light management method according to an exemplary embodiment;
FIG. 14 is a flowchart illustrating another light management method according to an exemplary embodiment;
FIG. 15 is a flowchart illustrating another light management method according to an exemplary embodiment;
fig. 16 is a schematic diagram of a light management device according to an exemplary embodiment;
fig. 17 is a schematic diagram of another light management device shown according to an exemplary embodiment;
fig. 18 is a block diagram of a vehicle according to an exemplary embodiment.
Detailed Description
In order to enable a person skilled in the art to better understand the technical solutions of the present application, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
A light management method, apparatus, vehicle, and storage medium according to embodiments of the present application are described below with reference to the accompanying drawings. With the development of automotive electronics and pure electric vehicles, the automobile is no longer merely a means of transportation but has become an indispensable intelligent companion in daily life; meanwhile, with the popularization and development of light emitting diode (LED) light sources in automobile lighting systems, higher demands are being placed on the interactivity and customizability of automobile lamps. In order to improve the user's personalized experience with the interactive lamps of a vehicle, the related art proposes methods that allow users to customize the light effects of the interactive lamps. However, these methods generally transmit both the images corresponding to the light effect and the control instructions over the CAN bus; when the user-defined light effect is complex, the amount of image data to be transmitted increases, the load rate of the CAN bus becomes too high, and image transmission is time-consuming and inefficient.
Therefore, on the premise of preserving the control efficiency of quick-start scenes (such as a welcome scene, an automatic headlight scene, a charge/discharge reminder scene, and the like), and in order to prevent the load rate of the CAN bus from becoming too high and to solve the long transmission time of complex interactive-lamp images, the present application provides a light management method. In the related art, the vehicle-mounted terminal sends an image to the controller through the CAN bus, and the controller then forwards the image to the interactive lamp processor through the CAN bus; this transmission process is inefficient, and when the transmitted image is complex, it can drive the CAN bus load rate too high. In the method provided by the present application, the image is instead transmitted directly between the vehicle-mounted terminal and the interactive lamp processor of the vehicle through Ethernet, which improves image transmission efficiency, shortens the transmission time when the transmitted image is complex and the data volume is large, and improves the user experience.
For ease of understanding, embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of a light management system according to an embodiment of the present application, where the light management system includes: a controller 100, an in-vehicle terminal 200, and an interactive lamp processor 300; wherein, the controller 100 is connected with the vehicle-mounted terminal 200 through a CAN bus; the controller 100 is electrically connected with the interactive lamp processor 300; the vehicle-mounted terminal 200 is connected with the interactive lamp processor 300 through a CAN bus and Ethernet; the CAN bus is used for transmitting control instructions, and the Ethernet is used for transmitting data.
A controller 100 for managing and controlling various functions and systems of the vehicle.
In some embodiments, the controller 100 is specifically configured to send a scene trigger instruction to the in-vehicle terminal 200; the scene triggering instruction is used for indicating the vehicle-mounted terminal 200 to trigger the light effect corresponding to the target scene.
In some embodiments, the controller 100 may be a vehicle domain controller.
The in-vehicle terminal 200 is used for providing various entertainment functions and services.
In some embodiments, the in-vehicle terminal 200 may be an in-vehicle entertainment terminal of a vehicle.
In some embodiments, the in-vehicle terminal 200 includes a system-on-chip (SOC) and a microcontroller (microcontroller unit, MCU).
A lighting modeling application, responsible for generating the patterns, animations, or text displayed by the interactive lamp processor, is deployed in the Android system of the SOC.
As a possible implementation manner, the vehicle-mounted terminal 200 is specifically configured to generate a first control instruction in response to the above-mentioned scene trigger instruction; the first control instruction comprises an image identifier of the target image; the first control instruction is used for controlling the interactive lamp processor 300 to light the lamp source corresponding to the target image.
In another possible implementation manner, the vehicle terminal 200 is specifically configured to receive the first trigger operation.
The first triggering operation is used for indicating the vehicle-mounted terminal 200 to trigger the light effect corresponding to the target image.
In some embodiments, the vehicle-mounted terminal 200 is further configured to generate a first control instruction in response to the first trigger operation.
In some embodiments, the vehicle-mounted terminal 200 is specifically configured to send a first control instruction to the interaction light processor 300 through the CAN bus, so that the interaction light processor 300 determines a target image from a plurality of images stored in the interaction light processor 300 according to an image identifier of the target image, and lights a light source corresponding to the target image; wherein the plurality of images are transmitted to the interactive lamp processor 300 through the ethernet by the in-vehicle terminal 200.
Illustratively, the MCU of the vehicle-mounted terminal 200 sends a first control instruction to the interactive lamp processor 300 through the CAN bus, so that the interactive lamp processor 300 determines a target image from a plurality of images stored in the interactive lamp processor 300 according to the image identifier of the target image, and lights a lamp source corresponding to the target image.
In some embodiments, the vehicle-mounted terminal 200 is further configured to display the edited image on an image editing interface of the vehicle-mounted terminal in response to the received image editing operation; responding to the received image saving operation, and saving the image displayed by the image editing interface; and transmits the image displayed by the image editing interface to the interactive lamp processor 300 through the ethernet.
The image editing operation is used for editing the image on the image editing interface.
For example, the lighting modeling application in the SOC of the in-vehicle terminal 200 displays the edited image on the image editing interface after receiving the image editing operation, and sends the image to the interactive lamp processor 300 through the Ethernet.
In some embodiments, the vehicle-mounted terminal 200 is further configured to receive a second trigger operation, and in response to the second trigger operation, sequentially send N video frames to the interactive lamp processor 300 through the ethernet, so that the interactive lamp processor 300 sequentially lights up a lamp source corresponding to each of the N video frames.
Wherein the target video comprises N video frames.
An interactive lamp processor 300, used for information interaction and display.
In some embodiments, the interactive lamp processor 300 is specifically configured to receive, through the CAN bus, a first control instruction sent by the vehicle-mounted terminal 200.
The first control instruction comprises an image identifier of the target image; the first control instruction is used for controlling the interactive lamp processor 300 to light the lamp source corresponding to the target image.
The interactive lamp processor 300 reads the image identifier of the target image after receiving the first control instruction, and controls the light dot matrix display area to light the lamp sources at the corresponding positions according to the pattern of the target image.
In some embodiments, the interactive lamp processor 300 is further configured to determine a target image from a plurality of images stored in the interactive lamp processor 300 according to the image identifier of the target image, and light a lamp source corresponding to the target image.
Wherein the plurality of images are transmitted to the interactive lamp processor 300 through the ethernet by the in-vehicle terminal 200.
In some embodiments, the interactive lamp processor 300 includes: turn-signal lamps, brake lamps, front and rear headlamps, position lamps, fog lamps, reversing lamps, warning lamps, interior light modules, and the like.
It will be appreciated that in various embodiments, the interactive light processor 300 may be one or more, and embodiments of the present application are not limited in this regard.
It will be appreciated that, as shown in fig. 2, a method for calculating the load rate of a single frame is provided. For a controller area network with flexible data rate (CANFD) standard frame that supports a maximum of 64 bytes of data, the actual length of each transmitted frame is 572 bits.
Wherein, for the frame structure of a CANFD standard frame, the 29-bit arbitration-phase portion is calculated at a rate of 500 Kbps. Specifically, the 29 bits comprise: a 1-bit SOF segment + a 12-bit arbitration segment + a 1-bit IDE segment + a 1-bit FDF segment + a 1-bit R0 segment + a 1-bit BRS segment + a 2-bit ACK segment + a 7-bit EOF segment + a 3-bit interframe space. The bit time of one bit is 1/(500 × 1000 bit/s) = 2000 ns, so the load factor contributed by these 29 bits in a single frame is ((29 × 2000 ns)/1 s) × 100% = 0.0058%.
For the frame structure of a CANFD standard frame, the 543 bits of the data phase are calculated at a rate of 2 Mbps. Specifically, the 543 bits comprise: a 1-bit ESI segment + a 4-bit DLC segment + a 512-bit data segment + a 26-bit CRC segment. The bit time of one bit is 1/(2 × 1000000 bit/s) = 500 ns, so the load factor contributed by these 543 bits in a single frame is ((543 × 500 ns)/1 s) × 100% = 0.02715%.
So the load of one CANFD standard frame is 0.0058% + 0.02715% = 0.03295%. Take as an example an interactive lamp consisting of 3000 single-color LED beads, where each bead uses 8 bits of data to represent its brightness (graded in 1% steps from 0 to 100%). If CANFD standard frames are used for transmission, one picture requires 3000/64 ≈ 47 frames of data; with a 20 ms period, the load rate is (1000 ms/20 ms) × 0.03295% × 47 ≈ 77.43%, far higher than the optimal CAN bus load rate of 30%. If instead the optimal CAN bus load rate of 30% is imposed and the period N is solved from (1000/N) × 0.03295% × 47 = 30%, then N ≈ 52 ms, so the data transmission of a single picture requires 52 ms × 47 = 2444 ms. For Ethernet, each transmitted frame may carry 1500 bytes, so transmitting the same picture requires only 3000/1500 = 2 frames, and the data transmission of the single picture takes only 52 ms × 2 = 104 ms, far less than the CANFD standard-frame transmission time.
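The load-rate comparison above can be reproduced as a short calculation. This is a sketch of the arithmetic only, using the bit counts and rates stated in the text; the variable names are illustrative.

```python
# Reproduce the single-frame load-rate comparison between CANFD and Ethernet.
ARBITRATION_BITS = 29   # SOF + arbitration + IDE + FDF + R0 + BRS + ACK + EOF + interframe
DATA_PHASE_BITS = 543   # ESI + DLC + 512-bit data + CRC

arb_bit_time = 1 / 500_000      # 2000 ns per bit at 500 Kbps
data_bit_time = 1 / 2_000_000   # 500 ns per bit at 2 Mbps

# Fraction of one second consumed by a single CANFD standard frame.
frame_load = ARBITRATION_BITS * arb_bit_time + DATA_PHASE_BITS * data_bit_time
print(f"single-frame load: {frame_load * 100:.5f}%")   # 0.03295%

beads = 3000                             # one byte of brightness per bead
frames_per_picture = -(-beads // 64)     # ceil(3000/64) = 47 CANFD frames
period_ms = 20
bus_load = (1000 / period_ms) * (frame_load * 100) * frames_per_picture
print(f"bus load at 20 ms period: {bus_load:.2f}%")    # about 77.43%

# Period N needed to stay at the 30% optimal CAN bus load rate.
n_ms = (1000 * frame_load * 100 * frames_per_picture) / 30
print(f"required period: {n_ms:.0f} ms")               # about 52 ms
print(f"CANFD picture time: {round(n_ms) * frames_per_picture} ms")  # 2444 ms

eth_frames = -(-beads // 1500)   # 2 Ethernet frames of up to 1500 payload bytes
print(f"Ethernet picture time: {round(n_ms) * eth_frames} ms")       # 104 ms
```

Running this reproduces the 77.43%, 2444 ms, and 104 ms figures derived above.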
Fig. 3 is a schematic diagram of a light management method according to an embodiment of the present application, which is applied to a vehicle-mounted terminal of a vehicle; the vehicle-mounted terminal is connected with the interactive lamp processor of the vehicle through a Controller Area Network (CAN) bus and an Ethernet; the CAN bus is used for transmitting control instructions, and the Ethernet is used for transmitting data; the method comprises steps S101-S102.
S101, generating a first control instruction.
The first control instruction comprises an image identifier of the target image; the first control instruction is used for controlling the interactive lamp processor to light the lamp source corresponding to the target image.
As a possible implementation manner, before the step S101, the method further includes: and receiving a scene trigger instruction sent by the controller.
The scene triggering instruction is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target scene.
In some embodiments, a functional module is included in the controller; the function module is used for identifying the corresponding function scene.
Illustratively, the greeting function module is configured to identify a greeting scene.
As one possible implementation manner, after the functional module in the controller identifies the corresponding functional scene, it detects whether the condition for the interactive lamp to display the specified pattern is satisfied; if so, it sends a scene trigger instruction to the vehicle-mounted terminal through the CAN bus, so that the vehicle-mounted terminal triggers the light effect corresponding to the target scene according to the scene trigger instruction.
It will be appreciated that in some embodiments, the controller and the interactive lamp processor may communicate via the CAN bus. After the functional module in the controller identifies the corresponding functional scene, it detects whether the condition for the interactive lamp to display the specified pattern is satisfied; if so, it sends a scene trigger instruction to the interactive lamp processor through the CAN bus, so that the interactive lamp processor obtains the image identifier of the target image according to the scene trigger instruction, reads the target image, and lights the light sources at the corresponding positions according to the read target image.
In some embodiments, as shown in fig. 4, the step S101 may be specifically implemented as:
S1011, generating a first control instruction in response to the scene trigger instruction.
It can be understood that when the target scene is triggered by the controller, the controller sends a scene trigger instruction; after receiving the scene trigger instruction, the vehicle-mounted terminal generates a first control instruction so as to facilitate the follow-up direct transmission of the first control instruction to the interactive lamp processor through the CAN bus.
In another possible implementation manner, before step S101, the method further includes: receiving a first trigger operation.
The first triggering operation is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target image.
In some embodiments, after the user clicks the button corresponding to the target image on the interface of the vehicle-mounted terminal, the vehicle-mounted terminal may receive the first trigger operation of the user.
In some embodiments, as shown in fig. 5, the step S101 may be specifically implemented as: step S1012.
S1012, responding to the first triggering operation, and generating a first control instruction.
In some embodiments, the vehicle-mounted terminal generates the first control instruction according to a first trigger operation of the user.
Optionally, when a plurality of buttons corresponding to a plurality of images are triggered at the same time, the vehicle-mounted terminal may determine the plurality of images, and select the target image from the plurality of images according to the current scene of the vehicle and weights of different scenes.
It can be understood that when the target scene is triggered by the first trigger operation, the vehicle-mounted terminal can directly generate the first control instruction, so that it can subsequently send the first control instruction to the interactive lamp processor directly through the CAN bus. Moreover, in the embodiment of the present application, the vehicle-mounted terminal is directly connected with the interactive lamp processor, which ensures that the first control instruction is sent to the interactive lamp processor quickly, improving the transmission efficiency of the first control instruction and further improving the user experience.
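The weight-based selection mentioned above (several buttons triggered at once, one target image chosen by scene weight) can be sketched as follows. The scene names, weights, and function names are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch: resolve several simultaneously triggered images into
# one target image using per-scene weights. Weights are assumed values.
SCENE_WEIGHTS = {"welcome": 3, "charging_reminder": 2, "music": 1}

def select_target_image(triggered: dict, current_scenes: list) -> str:
    """triggered maps scene name -> image identifier whose button was pressed;
    return the image of the highest-weight scene currently active."""
    candidates = [s for s in current_scenes if s in triggered]
    best_scene = max(candidates, key=lambda s: SCENE_WEIGHTS.get(s, 0))
    return triggered[best_scene]

image_id = select_target_image(
    {"welcome": "img_01", "music": "img_07"},  # buttons pressed together
    ["music", "welcome"],                      # scenes the vehicle is in
)
print(image_id)  # the welcome image wins on weight
```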
S102, sending a first control instruction to the interactive lamp processor through the CAN bus, so that the interactive lamp processor determines a target image from a plurality of images stored in the interactive lamp processor according to the image identification of the target image, and lights a lamp source corresponding to the target image.
The multiple images are sent to the interactive lamp processor by the vehicle-mounted terminal through the Ethernet.
In some embodiments, the vehicle-mounted terminal sends a first control instruction to the interactive lamp processor through the CAN bus; after receiving the first control instruction, the interactive lamp processor determines a target image from the plurality of images stored in the interactive lamp processor according to the image identifier of the target image, and lights the light source corresponding to the target image.
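The division of labor in step S102 (images pre-stored via Ethernet, only the identifier carried over CAN) can be illustrated with a minimal sketch. The class and method names are assumptions for illustration only.

```python
# Minimal sketch of the processor side of S102: images arrive earlier over
# Ethernet and are stored; the CAN instruction carries only the identifier.
class InteractiveLampProcessor:
    def __init__(self):
        self.images = {}   # image identifier -> pixel data

    def store_image(self, image_id: str, pixels: list) -> None:
        """Called for each image received over the Ethernet channel."""
        self.images[image_id] = pixels

    def on_first_control_instruction(self, image_id: str) -> list:
        """Called when the CAN frame carrying the instruction arrives;
        returns the pixel data that would drive the lamp dot matrix."""
        target = self.images[image_id]
        # ... drive the LED dot-matrix display area with `target` here ...
        return target

proc = InteractiveLampProcessor()
proc.store_image("img_01", [255, 0, 0] * 4)           # via Ethernet beforehand
pixels = proc.on_first_control_instruction("img_01")  # triggered via CAN
```

Because the bulky pixel data never crosses the CAN bus at trigger time, the instruction itself stays a single small frame.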
In some embodiments, as shown in fig. 6, the method further comprises: steps S201 to S203.
Step S201, in response to the received image editing operation, displaying the edited image on the image editing interface of the in-vehicle terminal.
The image editing operation is used for editing the image on the image editing interface.
In some embodiments, when the user needs to edit an image for the interactive lamp processor, the vehicle-mounted terminal wakes up the SOC, and the user opens the image editing interface of the interactive lamp processor to edit the image.
Wherein the image editing interface includes functions such as switching the brush color, drawing images by hand, inputting text, and switching among steady-on/breathing/flashing/fade-in modes.
In some embodiments, the image editing interface supports selecting and modifying images from an image library of the in-vehicle terminal.
Step S202, responding to the received image saving operation, and saving the image displayed by the image editing interface.
In some embodiments, after the user clicks Save on the image editing interface, the vehicle-mounted terminal saves the image displayed by the image editing interface in response to the received image save operation.
And step S203, transmitting the image displayed by the image editing interface to the interactive lamp processor through the Ethernet.
In some embodiments, after the vehicle-mounted terminal saves the image displayed by the image editing interface, one or more groups of frame images are generated according to the image analysis and generation technology.
After the Ethernet transmission channel is established between the vehicle-mounted terminal and the interactive lamp processor, one or more groups of frame pictures are transmitted to the interactive lamp processor through a customized Ethernet data format.
Wherein, fig. 7 schematically shows a customized Ethernet data format. The customized Ethernet data format includes: an 8-byte preamble, a 6-byte destination MAC address, a 6-byte source MAC address, a 2-byte or 4-byte type/length field, 46-1500 bytes of image data, and a 4-byte frame check sequence (FCS).
It will be appreciated that in different embodiments, the data format of the ethernet may be different depending on the actual use requirements. The customized ethernet data format is only an example given by the embodiment of the present application, which is not limited to the customized ethernet data format.
Further, fig. 8 shows a schematic diagram of the format of the image data within the customized Ethernet data format. The image data includes: a 4-bit pattern type, a 4-bit image identifier, a 1-byte total frame count of the image, a 1-byte present-frame number, an M-byte total pixel count of the present frame, and 4-byte pixel values.
Wherein the pattern type represents the operation to be performed on the currently transmitted image.
Wherein, the pattern types include: newly added and saved, modified and saved, deleted, and directly displayed without saving. Specifically, 0x00 may represent newly added and saved, 0x01 modified and saved, 0x02 deleted, and 0x03 directly displayed and not saved.
For example, after the Ethernet transmission channel is established between the vehicle-mounted terminal and the interactive lamp processor, one or more groups of frame pictures are transmitted to the interactive lamp processor through the customized Ethernet data format; if the pattern type of the image data is 0x00, it indicates that the one or more groups of frame pictures are to be newly added and saved.
The image identification may be a character string to represent an identification of the image currently desired to be transmitted.
The total frame number of the image indicates the total frame number that the image currently required to be transmitted needs to be transmitted.
The present-frame number indicates which frame of the image currently being transmitted the present frame data belongs to.
The total pixel count of the present frame indicates the total number of pixels in the currently transmitted frame data.
Each pixel value may be a string of characters that represents the value of each pixel (including red, green, blue and luminance values) involved in the currently transmitted frame.
It will be appreciated that in different embodiments, the format of the image data may be different depending on the actual use requirements. The format of the image data described above is only one example given in the embodiment of the present application, and the embodiment of the present application does not limit the format of the image data.
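As a hedged illustration, the image-data layout of fig. 8 can be serialized as follows. The field widths follow the text (4-bit pattern type + 4-bit identifier, 1-byte frame counts, 4 bytes per pixel); the 2-byte width chosen for the pixel count is an assumption, since the text only calls it "M bytes", and the helper name is illustrative.

```python
import struct

# Pattern-type codes from the text.
TYPE_ADD_AND_SAVE = 0x0     # 0x00: newly added and saved
TYPE_MODIFY_AND_SAVE = 0x1  # 0x01: modified and saved
TYPE_DELETE = 0x2           # 0x02: deleted
TYPE_DISPLAY_ONLY = 0x3     # 0x03: directly displayed, not saved

def pack_image_frame(pattern_type, image_id, total_frames, frame_no, pixels):
    """pixels: list of (red, green, blue, luminance) tuples, one per pixel.
    Pixel-count width (H, 2 bytes) is an assumed value of M."""
    header = struct.pack(
        ">BBBH",
        (pattern_type << 4) | (image_id & 0x0F),  # 4-bit type | 4-bit id
        total_frames,
        frame_no,
        len(pixels),
    )
    body = b"".join(struct.pack(">BBBB", *p) for p in pixels)
    return header + body

# Frame 1 of 1 for image 0x5: three red pixels at 80% luminance.
frame = pack_image_frame(TYPE_ADD_AND_SAVE, 0x5, 1, 1, [(255, 0, 0, 80)] * 3)
```

The payload would then be carried in the image-data field of the customized Ethernet frame of fig. 7.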
In some embodiments, as shown in fig. 9, the method further comprises: steps S301-S302.
S301, receiving a second triggering operation.
The second triggering operation is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target video; the target video includes N video frames.
In some embodiments, the target video may be a video animation, or a video stream composed of video frames of lyrics subtitles.
The target video may be a Music Video (MV) of a song, or a video stream composed of video frames of a lyrics subtitle of a song, for example.
In some embodiments, the user may generate the second trigger operation after clicking the associated trigger button on the in-vehicle terminal interface.
For example, when the vehicle-mounted terminal interface is playing an MV or displaying lyric subtitles, if the user clicks the button for triggering the light effect of the interactive lamp processor, a second trigger operation is generated.
S302, responding to a second triggering operation, and sequentially sending N video frames to the interactive lamp processor through the Ethernet so that the interactive lamp processor sequentially lights a lamp source corresponding to each video frame in the N video frames.
In some embodiments, after receiving the second trigger operation, the vehicle-mounted terminal generates N video frames from the target video, establishes an ethernet transmission channel with the interactive lamp processor, and sequentially sends the N video frames to the interactive lamp processor through the customized ethernet data format.
Wherein the customized ethernet data format is as shown in fig. 7, and the format of the image data in the customized ethernet data format is as shown in fig. 8.
For example, when the on-vehicle terminal establishes an ethernet transmission channel with the interactive light processor and then sequentially transmits N video frames to the interactive light processor, if the pattern type of the image data is 0x03, it indicates that the N video frames are directly displayed and not stored.
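The video path of steps S301-S302 can be sketched as below: the terminal splits the target video into N frames and streams each one with pattern type 0x03 (display directly, not saved). `send_over_ethernet` stands in for the real Ethernet transmission channel and, like the 3-byte header layout, is an assumption for illustration.

```python
# Hypothetical sketch of streaming N video frames with pattern type 0x03.
sent = []

def send_over_ethernet(payload: bytes) -> None:
    sent.append(payload)  # placeholder for the actual socket write

def stream_video(frames: list) -> None:
    total = len(frames)
    for i, frame in enumerate(frames, start=1):
        # 0x03 = display directly and do not save (per the pattern-type table)
        header = bytes([0x03, total, i])  # type, total frames, present frame
        send_over_ethernet(header + frame)

stream_video([b"frame-a", b"frame-b", b"frame-c"])
print(len(sent))  # one payload per video frame
```

On the processor side, each arriving frame drives the lamp dot matrix immediately and is discarded, matching the "display and do not save" semantics.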
It can be understood that, unlike the related art in which the vehicle-mounted terminal sends the N video frames to the interactive lamp processor through an LVDS interface after receiving the second trigger operation, the method provided by the embodiment of the present application does not use an LVDS interface but sends the N video frames to the interactive lamp processor in turn through Ethernet. This guarantees the transmission rate, enables real-time sending and display by the interactive lamp processor (displaying the corresponding video stream, lyric subtitles, and the like in real time), improves the user experience, and also reduces the wiring-harness cost of the vehicle.
It can be understood that in the related art, the image is sent to the controller through the CAN bus and then forwarded to the interactive lamp processor through the CAN bus, which yields low image transmission efficiency and, when the image is complex, can cause an excessive CAN bus load rate. In the method provided by the present application, the user edits the image for the interactive lamp processor at the vehicle-mounted terminal, and after editing, the vehicle-mounted terminal transmits the image directly to the interactive lamp processor through Ethernet. This improves image transmission efficiency, relieves the CAN bus load rate when the image is complex, shortens the image transmission time, and further improves the user experience.
FIG. 10 is a schematic illustration showing a light management method according to an embodiment of the present application, applied to an interactive light processor of a vehicle; the interactive lamp processor is connected with the vehicle-mounted terminal of the vehicle through a CAN bus and an Ethernet; the CAN bus is used for transmitting control instructions, and the Ethernet is used for transmitting data; the method comprises the following steps: steps S401 to S402.
S401, receiving a first control instruction sent by the vehicle-mounted terminal through the CAN bus.
The first control instruction comprises an image identifier of the target image; the first control instruction is used for controlling the interactive lamp processor to light the lamp source corresponding to the target image.
In some embodiments, the specific implementation of step S401 may refer to step S102, and details are not repeated here.
S402, determining a target image from a plurality of images stored in the interactive lamp processor according to the image identification of the target image, and turning on a lamp source corresponding to the target image.
The multiple images are sent to the interactive lamp processor by the vehicle-mounted terminal through the Ethernet.
In some embodiments, after receiving the first control instruction, the interactive light processor determines a target image from the stored multiple images according to the image identifier of the target image in the first control instruction, and lights a light source corresponding to the target image according to the target image, so as to realize the light effect of the interactive light processor.
It can be appreciated that, compared with the related art in which the interactive lamp processor is not directly connected with the vehicle-mounted terminal, in the method provided by the present application the interactive lamp processor receives the first control instruction sent by the vehicle-mounted terminal through the CAN bus, so that the vehicle-mounted terminal can transmit the control instruction directly to the interactive lamp processor, reducing the transmission time of the control instruction and improving its real-time performance. Meanwhile, in the method provided by the present application, the interactive lamp processor receives the image sent by the vehicle-mounted terminal through Ethernet, which improves image transmission efficiency and, when the transmitted image is complex and the data volume is large, shortens the transmission time and improves the user experience.
In some embodiments, as shown in fig. 11, the method further comprises: steps S501-S502.
S501, receiving an image sent by the vehicle-mounted terminal through the Ethernet.
In some embodiments, after receiving an image sent by the vehicle-mounted terminal, the interactive lamp processor first identifies a pattern type of the image, and performs a corresponding operation on the image according to the pattern type.
For example, when the pattern type is newly added and saved, the interactive light processor will newly add the image to the database and save it.
In some embodiments, the specific implementation of step S501 may refer to step S203, and details are not repeated here.
S502, configuring an image identifier for the image, and sending the image identifier of the image to the vehicle-mounted terminal.
In some embodiments, when the pattern type is newly added and saved or modified and saved, the interactive lamp processor performs an operation corresponding to the pattern type, then configures an image identifier for the image, and sends the image identifier of the image to the vehicle-mounted terminal.
It can be understood that after the interactive lamp processor receives the image sent by the vehicle-mounted terminal, it returns the image identifier corresponding to the image. Each image corresponds to a unique identifier, which makes it convenient for the vehicle-mounted terminal to manage the images and their identifiers, and to place the image identifier of the target image into the first control instruction when it needs to control the interactive lamp processor to display the light effect corresponding to the target image.
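The identifier handshake of steps S501-S502 can be illustrated as below: the processor assigns a unique identifier to each saved image and returns it, and the terminal keeps the mapping so it can later fill the identifier into a first control instruction. All class and method names here are assumptions.

```python
import itertools

# Illustrative sketch of the save-and-return-identifier handshake.
class Processor:
    def __init__(self):
        self._ids = itertools.count(1)  # unique identifier per image
        self.images = {}

    def save_image(self, pixels) -> int:
        image_id = next(self._ids)
        self.images[image_id] = pixels
        return image_id                 # returned to the terminal

class Terminal:
    def __init__(self, processor):
        self.processor = processor
        self.known_ids = {}             # user-visible name -> image id

    def upload(self, name, pixels):
        # Image goes over Ethernet; the identifier comes back and is stored.
        self.known_ids[name] = self.processor.save_image(pixels)

    def first_control_instruction(self, name) -> dict:
        # Only the small identifier needs to travel over the CAN bus.
        return {"image_id": self.known_ids[name]}

term = Terminal(Processor())
term.upload("welcome", [0] * 16)
instr = term.first_control_instruction("welcome")
```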
In some embodiments, as shown in fig. 12, the method further comprises: steps S601 to S602.
S601, N video frames sent by the vehicle-mounted terminal are sequentially received through the Ethernet.
In some embodiments, the specific implementation of step S601 may refer to step S302, and details are not repeated here.
S602, sequentially lighting the lamp sources corresponding to each video frame in the N video frames.
In some embodiments, after receiving N video frames sent by the vehicle-mounted terminal, the interactive light processor first identifies a pattern type of each video frame, and performs a corresponding operation on the N video frames according to the pattern type.
For example, when the pattern type is direct display and not save, the interactive light processor will sequentially illuminate the light source corresponding to each of the N video frames and not save the N video frames.
It can be understood that the interactive lamp processor does not need to receive video frames through an LVDS interface; instead, after receiving the N video frames through Ethernet, it lights the light sources corresponding to each of the N video frames in turn, which reduces the wiring-harness cost of the vehicle while guaranteeing the transmission rate.
To facilitate understanding of the light management method provided by the present application, specific implementations of the method are described below by way of embodiments.
In some embodiments, as shown in fig. 13, when the target scene is triggered at the vehicle-mounted terminal, the management of the interactive light processor may be specifically implemented as the following steps:
step a1, a functional module of the vehicle-mounted terminal identifies a corresponding functional scene.
Step a2, the vehicle-mounted terminal judges whether the condition for the interactive lamp processor to display the specified image is met; if not, the control of the interactive lamp processor ends; if yes, step a3 is performed.
Step a3, the vehicle-mounted terminal sends a first control instruction containing the image identifier of the target image to the interactive lamp processor through the CAN bus.
And a4, after the interactive lamp processor receives the first control instruction, reading the target image according to the image identification.
And a5, the interactive lamp processor lights the light sources at the corresponding positions of the lamp dot-matrix display area according to the read target image.
In some embodiments, as shown in fig. 14, the sending of the edited image to the interactive light processor by the vehicle terminal may be implemented as follows:
and b1, waking up the SOC of the vehicle-mounted terminal, and opening an image editing interface by a user to edit the image.
Step b2, in response to a save request from the user, the vehicle-mounted terminal generates an image according to image parsing and generation techniques. Wherein the image is composed of one or more frames.
And b3, establishing a transmission channel with the interactive lamp processor, and sending the image to the interactive lamp processor through the Ethernet.
And b4, the interactive lamp processor receives and stores the image sent by the vehicle-mounted terminal.
And b5, the interactive lamp processor generates an image identifier of the image and returns the image identifier to the vehicle-mounted terminal.
And b6, the vehicle-mounted terminal receives and stores the image identification.
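Steps b3 to b6 can be sketched as the following exchange. All names are hypothetical, and the identifier scheme (a simple counter) is an assumption for illustration only:

```python
import itertools


# Hypothetical sketch of steps b3–b6: the edited image travels once over
# Ethernet; the processor stores it and returns an identifier that later
# CAN control instructions can reference compactly.
class LightProcessorStore:
    def __init__(self):
        self._ids = itertools.count(1)  # assumed scheme: incrementing ids
        self.images = {}

    def store_image(self, image_bytes: bytes) -> int:
        image_id = next(self._ids)
        self.images[image_id] = image_bytes
        return image_id  # step b5: identifier returned to the terminal


class VehicleTerminal:
    def __init__(self, processor: LightProcessorStore):
        self.processor = processor
        self.known_ids = []

    def upload_edited_image(self, image_bytes: bytes) -> int:
        # Step b3: send over Ethernet; step b6: save the returned identifier.
        image_id = self.processor.store_image(image_bytes)
        self.known_ids.append(image_id)
        return image_id


terminal = VehicleTerminal(LightProcessorStore())
print(terminal.upload_edited_image(b"\x00\x01"))  # → 1
```

The key property is that both sides end up agreeing on the identifier, so the heavyweight image transfer happens exactly once.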
In some embodiments, to better illustrate the necessity of transmitting the image and the control instruction separately in the light management method provided by the present application, the welcome scene is taken as an example and the whole process of the light management method is described below.
Specifically, as shown in fig. 15, the light management method in the welcome scene may be implemented as the following steps:
Step c1: the SOC of the vehicle-mounted terminal is woken up, and the user opens the image editing interface to edit the welcome image.
Step c2: in response to a save request from the user, the vehicle-mounted terminal generates a welcome image using image analysis and generation technology. The welcome image consists of one or more frames.
Step c3: the vehicle-mounted terminal sends the welcome image to the interactive lamp processor; the vehicle-mounted terminal and the interactive lamp processor transmit images through the Ethernet.
Step c4: the interactive lamp processor receives and stores the welcome image sent by the vehicle-mounted terminal, and generates an image identifier for the welcome image.
Step c5: the interactive lamp processor sends the image identifier to the vehicle-mounted terminal.
Step c6: the vehicle-mounted terminal receives and stores the image identifier.
Step c7: the key detection module of the controller detects the key and wakes up the controller.
Step c8: the welcome functional module in the controller determines whether the wake-up condition of the welcome scene is met.
Step c9: if the wake-up condition of the welcome scene is met, the power management module powers on the vehicle-mounted terminal and the interactive lamp processor; if the wake-up condition is not met, the light management method ends.
Step c10: the network management module in the controller sends a first CAN message to the vehicle-mounted terminal; the first CAN message is used to wake up the MCU in the vehicle-mounted terminal.
Step c11: the vehicle-mounted terminal sends a second CAN message to the interactive lamp processor; the second CAN message is used to wake up the interactive lamp processor.
Step c12: the controller places the image identifier of the welcome image corresponding to the welcome scene into a scene wake-up instruction.
Step c13: the controller sends the scene wake-up instruction to the vehicle-mounted terminal; the controller is connected with the vehicle-mounted terminal through the CAN bus.
Step c14: the vehicle-mounted terminal generates a welcome control instruction.
Step c15: the vehicle-mounted terminal sends the welcome control instruction to the interactive lamp processor; the welcome control instruction is sent through the CAN bus.
Step c16: the interactive lamp processor reads the corresponding personalized welcome image according to the image identifier, carried in the welcome control instruction, of the image corresponding to the welcome scene.
Step c17: the interactive lamp processor lights the lamp sources at the corresponding positions of the dot-matrix display area of the vehicle lamp according to the read personalized welcome image.
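The flow above divides into two phases: a one-time Ethernet upload (steps c1 to c6), then a fast CAN-only trigger when the key is detected (steps c7 to c17). The two phases can be condensed into a short sketch with hypothetical names, not the patent's implementation:

```python
# Hypothetical end-to-end sketch of the welcome flow.
class WelcomeFlowDemo:
    def __init__(self):
        self._images = {}
        self._next_id = 1

    def upload_welcome_image(self, image) -> int:
        # Phase 1 (editing time): bulk image data over Ethernet,
        # identifier returned to the terminal.
        image_id = self._next_id
        self._images[image_id] = image
        self._next_id += 1
        return image_id

    def on_key_detected(self, image_id: int) -> str:
        # Phase 2 (key detected): only the short welcome control instruction
        # crosses the CAN bus, so the light effect appears with low latency.
        return f"welcome effect: {self._images[image_id]}"


demo = WelcomeFlowDemo()
welcome_id = demo.upload_welcome_image("personalized-welcome")
print(demo.on_key_detected(welcome_id))
# → welcome effect: personalized-welcome
```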
It can be understood that, regarding the management requirements for the interactive lamp processor in the welcome scene, the time from detecting the key to the user boarding is short, so if the interactive lamp processor is to properly display the image effect corresponding to the welcome scene, the timeliness with which it receives the control instruction must be considered. Meanwhile, to satisfy the user's demand for a personalized light effect, the image corresponding to the light effect of the interactive lamp processor must support custom editing and modification.
In the related art, the vehicle-mounted terminal transmits the control instruction to the controller, which then forwards it to the interactive lamp processor; this relay can make delivery of the control instruction untimely. In the method provided by the application, the vehicle-mounted terminal sends the control instruction to the interactive lamp processor directly through the CAN bus, avoiding this delay.
In addition, in the related art, the vehicle-mounted terminal sends the image to the controller through the CAN bus, and the controller forwards the image to the interactive lamp processor through the CAN bus; this transmission process is inefficient, and when the transmitted image is complex it can push the CAN bus load rate too high. In the method provided by the application, the image is transmitted directly between the vehicle-mounted terminal and the interactive lamp processor of the vehicle through the Ethernet, which improves image transmission efficiency; when the transmitted image is complex and the data volume is large, the transmission time is shortened and the user experience is improved.
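A back-of-envelope comparison illustrates the bandwidth gap; every number below (image size, bus speeds, efficiency factors) is an assumption for illustration and does not come from the patent:

```python
# Illustrative comparison of transferring a 50 KB image over classical CAN
# vs automotive Ethernet. All figures are assumed for the example.
def transfer_time_s(payload_bytes: int, effective_bits_per_s: float) -> float:
    return payload_bytes * 8 / effective_bits_per_s


IMAGE_BYTES = 50_000
# Classical CAN: 500 kbit/s bus, assume ~50% effective after framing overhead.
can_s = transfer_time_s(IMAGE_BYTES, 500_000 * 0.5)
# 100 Mbit/s automotive Ethernet, assume ~90% effective throughput.
eth_s = transfer_time_s(IMAGE_BYTES, 100_000_000 * 0.9)
print(f"CAN ≈ {can_s:.2f} s, Ethernet ≈ {eth_s:.4f} s")
```

Under these assumed figures the Ethernet transfer is several hundred times faster, which is why bulk image data and short control instructions are routed over different links.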
The solution provided by the embodiments of the present application has been described above mainly from the perspective of the method. To implement the above functions, the light management device or electronic device includes corresponding hardware structures and/or software modules. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or as computer-software-driven hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
According to the above method, the light management device or the electronic device may be divided into functional modules; for example, it may include one functional module for each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware or as a software functional module. It should be noted that the division of modules in the embodiments of the present application is schematic and merely a logical function division; other division manners are possible in actual implementation.
Fig. 16 is a schematic view of a light management device according to an exemplary embodiment, applied to an in-vehicle terminal of a vehicle; the vehicle-mounted terminal is connected with the interactive lamp processor of the vehicle through a CAN bus and an Ethernet; the CAN bus is used for transmitting control instructions, and the Ethernet is used for transmitting data; the light management apparatus 400 includes: a generating module 401, a transmitting module 402, a display module 403, a saving module 404 and a receiving module 405.
A generating module 401, configured to generate a first control instruction; the first control instruction comprises an image identifier of the target image; the first control instruction is used for controlling the interactive lamp processor to light the lamp source corresponding to the target image; a sending module 402, configured to send the first control instruction to the interactive lamp processor through the CAN bus, so that the interactive lamp processor determines the target image from a plurality of images stored in the interactive lamp processor according to the image identifier of the target image, and lights the lamp source corresponding to the target image; the multiple images are sent to the interactive lamp processor by the vehicle-mounted terminal through the Ethernet.
In a possible implementation manner, the receiving module 405 is configured to receive a scene trigger instruction sent by the controller before generating the first control instruction; the scene triggering instruction is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target scene; the generating module 401 is specifically configured to generate a first control instruction in response to a scene trigger instruction.
In another possible implementation manner, the receiving module 405 is configured to receive a first trigger operation before generating a first control instruction; the first triggering operation is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target image; the generating module 401 is specifically configured to generate a first control instruction in response to a first trigger operation.
In another possible implementation manner, the display module 403 is configured to display the edited image on the image editing interface of the vehicle-mounted terminal in response to the received image editing operation; the image editing operation is used for editing the image on the image editing interface; a saving module 404, configured to save the image displayed by the image editing interface in response to the received image saving operation; and the sending module is also used for sending the image to the interactive lamp processor through the Ethernet.
In another possible implementation manner, the receiving module 405 is configured to receive a second triggering operation; the second triggering operation is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target video; the target video comprises N video frames; the sending module 402 is further configured to send N video frames to the interactive light processor in turn through the ethernet in response to the second trigger operation, so that the interactive light processor sequentially lights up a light source corresponding to each video frame in the N video frames.
The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method and will not be repeated here.
FIG. 17 is a schematic diagram of a light management device applied to an interactive light processor of a vehicle, shown in accordance with an exemplary embodiment; the interactive lamp processor is connected with the vehicle-mounted terminal of the vehicle through a CAN bus and an Ethernet; the CAN bus is used for transmitting control instructions, and the Ethernet is used for transmitting data; the light management device 500 includes: a receiving module 501, a lighting module 502 and a configuring module 503;
a receiving module 501, configured to receive a first control instruction sent by the vehicle-mounted terminal through a CAN bus; the first control instruction comprises an image identifier of the target image; the first control instruction is used for controlling the interactive lamp processor to light the lamp source corresponding to the target image; the lighting module 502 is configured to determine a target image from a plurality of images stored in the interaction lamp processor according to an image identifier of the target image, and light a lamp source corresponding to the target image; the multiple images are sent to the interactive lamp processor by the vehicle-mounted terminal through the Ethernet.
In a possible implementation manner, the receiving module 501 is further configured to receive, through ethernet, an image sent by the vehicle-mounted terminal; a configuration module 503, configured to configure an image identifier for an image, and send the image identifier of the image to the vehicle-mounted terminal.
In another possible implementation manner, the receiving module 501 is further configured to sequentially receive N video frames sent by the vehicle terminal through ethernet; the lighting module 502 is further configured to sequentially light a light source corresponding to each of the N video frames.
The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method and will not be repeated here.
Fig. 18 is a block diagram of a vehicle, according to an example embodiment. As shown in fig. 18, vehicle 600 includes, but is not limited to: a processor 601 and a memory 602.
The memory 602 is used for storing executable instructions of the processor 601. It will be appreciated that the processor 601 is configured to execute instructions to implement the light management method of the above embodiments.
It should be noted that, as will be appreciated by those skilled in the art, the vehicle structure shown in fig. 18 does not limit the vehicle; the vehicle may include more or fewer components than shown in fig. 18, combine some components, or use a different arrangement of components.
The processor 601 is a control center of the vehicle and utilizes various interfaces and lines to connect various parts of the entire vehicle, and by running or executing software programs and/or modules stored in the memory 602 and invoking data stored in the memory 602, performs various functions of the vehicle and processes the data, thereby performing overall monitoring of the vehicle. The processor 601 may include one or more processing units. Alternatively, the processor 601 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., and a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 601.
The memory 602 may be used to store software programs as well as various data. The memory 602 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, application programs (such as a determination unit, a processing unit, etc.) required for at least one functional module, and the like. In addition, the memory 602 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
In an exemplary embodiment, a computer readable storage medium is also provided, such as a memory 602, comprising instructions executable by the processor 601 of the vehicle 600 to implement the light management method of the above embodiments.
In actual implementation, the functions of the generating module 401, the transmitting module 402, the display module 403, the saving module 404, and the receiving module 405 in fig. 16, and the receiving module 501, the lighting module 502, and the configuring module 503 in fig. 17 may be implemented by the processor 601 in fig. 18 calling a computer program stored in the memory 602. The specific implementation process may refer to the description of the light management method in the above embodiment, and will not be repeated here.
Alternatively, the computer-readable storage medium may be a non-transitory computer-readable storage medium, for example, a read-only memory (ROM), a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, embodiments of the application also provide a computer program product comprising one or more instructions executable by the processor 601 of the vehicle to perform the light management method of the above-described embodiments.
It should be noted that, when the instructions in the computer readable storage medium or one or more instructions in the computer program product are executed by the processor of the vehicle, the respective processes of the embodiments of the light management method are implemented, and the technical effects similar to those of the light management method can be achieved, so that repetition is avoided, and no detailed description is given here.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional modules is illustrated; in practical application, the above functions may be allocated to different functional modules as needed, i.e., the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, located in one place or distributed across a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiments.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application, or the portion contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium; the software product includes several instructions to cause a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the method of the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disk, or the like.
The present application is not limited to the above embodiments, and any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (14)

1. A light management method is characterized by being applied to a vehicle-mounted terminal of a vehicle; the vehicle-mounted terminal is connected with the interactive lamp processor of the vehicle through a Controller Area Network (CAN) bus and an Ethernet; the CAN bus is used for transmitting control instructions, and the Ethernet is used for transmitting data; the method comprises the following steps:
generating a first control instruction; the first control instruction comprises an image identifier of a target image; the first control instruction is used for controlling the interactive lamp processor to light a lamp source corresponding to the target image;
sending the first control instruction to the interactive lamp processor through the CAN bus, so that the interactive lamp processor determines the target image from a plurality of images stored in the interactive lamp processor according to the image identification of the target image, and lights a lamp source corresponding to the target image; the images are sent to the interactive lamp processor by the vehicle-mounted terminal through the Ethernet.
2. The method of claim 1, wherein prior to the generating the first control instruction, the method further comprises:
receiving a scene trigger instruction sent by a controller; the scene triggering instruction is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target scene;
the generating a first control instruction includes:
and responding to the scene trigger instruction, and generating the first control instruction.
3. The method of claim 1, wherein prior to the generating the first control instruction, the method further comprises:
receiving a first trigger operation; the first triggering operation is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target image;
the generating a first control instruction includes:
and responding to the first trigger operation, and generating the first control instruction.
4. The method according to claim 1, wherein the method further comprises:
responding to the received image editing operation, and displaying an edited image on an image editing interface of the vehicle-mounted terminal; the image editing operation is used for editing an image on the image editing interface;
responding to the received image saving operation, and saving the image displayed by the image editing interface;
and sending the image displayed by the image editing interface to the interactive lamp processor through the Ethernet.
5. The method according to claim 1, wherein the method further comprises:
receiving a second trigger operation; the second triggering operation is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target video; the target video comprises N video frames;
and responding to the second triggering operation, and sequentially sending N video frames to the interactive lamp processor through the Ethernet so that the interactive lamp processor sequentially lights a lamp source corresponding to each video frame in the N video frames.
6. A light management method, characterized by being applied to an interactive light processor of a vehicle; the interactive lamp processor is connected with the vehicle-mounted terminal of the vehicle through a CAN bus and an Ethernet; the CAN bus is used for transmitting control instructions, and the Ethernet is used for transmitting data; the method comprises the following steps:
receiving a first control instruction sent by the vehicle-mounted terminal through the CAN bus; the first control instruction comprises an image identifier of a target image; the first control instruction is used for controlling the interactive lamp processor to light a lamp source corresponding to the target image;
determining the target image from a plurality of images stored in the interactive lamp processor according to the image identification of the target image, and lighting a lamp source corresponding to the target image; the images are sent to the interactive lamp processor by the vehicle-mounted terminal through the Ethernet.
7. The method of claim 6, wherein the method further comprises:
receiving an image sent by the vehicle-mounted terminal through the Ethernet;
and configuring an image identifier for the image, and sending the image identifier of the image to the vehicle-mounted terminal.
8. The method of claim 6, wherein the method further comprises:
sequentially receiving N video frames sent by the vehicle-mounted terminal through the Ethernet;
and sequentially lighting the lamp sources corresponding to each video frame in the N video frames.
9. A light management device, characterized by being applied to a vehicle-mounted terminal of a vehicle; the vehicle-mounted terminal is connected with the interactive lamp processor of the vehicle through a Controller Area Network (CAN) bus and an Ethernet; the CAN bus is used for transmitting control instructions, and the Ethernet is used for transmitting data; the light management device includes:
The generation module is used for generating a first control instruction; the first control instruction comprises an image identifier of a target image; the first control instruction is used for controlling the interactive lamp processor to light a lamp source corresponding to the target image;
the sending module is used for sending the first control instruction to the interactive lamp processor through the CAN bus, so that the interactive lamp processor determines the target image from a plurality of images stored in the interactive lamp processor according to the image identification of the target image and lights a lamp source corresponding to the target image; the images are sent to the interactive lamp processor by the vehicle-mounted terminal through the Ethernet.
10. The apparatus of claim 9, wherein the apparatus further comprises: a receiving module;
the receiving module is used for receiving a scene trigger instruction sent by the controller before generating the first control instruction; the scene triggering instruction is used for indicating the vehicle-mounted terminal to trigger the light effect corresponding to the target scene;
the generating module is specifically configured to generate the first control instruction in response to the scene trigger instruction.
11. A light management device, characterized by being applied to an interactive lamp processor of a vehicle; the interactive lamp processor is connected with the vehicle-mounted terminal of the vehicle through a CAN bus and an Ethernet; the CAN bus is used for transmitting control instructions, and the Ethernet is used for transmitting data; the light management device includes:
The receiving module is used for receiving a first control instruction sent by the vehicle-mounted terminal through the CAN bus; the first control instruction comprises an image identifier of a target image; the first control instruction is used for controlling the interactive lamp processor to light a lamp source corresponding to the target image;
the lighting module is used for determining the target image from a plurality of images stored in the interaction lamp processor according to the image identification of the target image, and lighting a lamp source corresponding to the target image; the images are sent to the interactive lamp processor by the vehicle-mounted terminal through the Ethernet.
12. The apparatus of claim 11, wherein the light management apparatus further comprises: a configuration module;
the receiving module is further used for receiving the image sent by the vehicle-mounted terminal through the Ethernet;
the configuration module is used for configuring the image identification for the image and sending the image identification of the image to the vehicle-mounted terminal.
13. A vehicle, characterized by comprising: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor executing the program to implement the light management method of any one of claims 1 to 8.
14. A computer-readable storage medium, characterized in that, when computer-executable instructions stored in the computer-readable storage medium are executed by a processor of an electronic device, the electronic device is capable of performing the light management method as claimed in any one of claims 1 to 8.
CN202311048352.XA 2023-08-18 2023-08-18 Light management method, device, vehicle and storage medium Pending CN116997065A (en)


Publications (1)

Publication Number Publication Date
CN116997065A true CN116997065A (en) 2023-11-03

Family

ID=88528297



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination