CN114179613B - Audio-video touch interactive control method for co-driver control panel - Google Patents

Audio-video touch interactive control method for co-driver control panel

Info

Publication number
CN114179613B
CN114179613B
Authority
CN
China
Prior art keywords
rgb
task
touch
control panel
control
Prior art date
Legal status
Active
Application number
CN202111505618.XA
Other languages
Chinese (zh)
Other versions
CN114179613A (en)
Inventor
金瑞鸣
邓亮
王金磊
谢正华
薛蔚平
Current Assignee
Changzhou Xingyu Automotive Lighting Systems Co Ltd
Original Assignee
Changzhou Xingyu Automotive Lighting Systems Co Ltd
Priority date
Filing date
Publication date
Application filed by Changzhou Xingyu Automotive Lighting Systems Co Ltd
Priority to CN202111505618.XA
Publication of CN114179613A
Application granted
Publication of CN114179613B
Status: Active
Anticipated expiration


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/28Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/448Execution paradigms, e.g. implementations of programming paradigms
    • G06F9/4482Procedural
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/143Touch sensitive instrument input devices
    • B60K2360/1434Touch panels
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/143Touch sensitive instrument input devices
    • B60K2360/1446Touch switches
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/162Visual feedback on control action
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/164Infotainment

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

The invention discloses an audio-video touch interactive control method for a co-driver control panel, which comprises the following steps. Step S1: using the FreeRTOS system, create a Smart RGB refresh task, a touch control task and a CAN message parsing task on the MCU of the co-driver control panel, and periodically update the global variables in these tasks to the vehicle head unit through a timer. Step S2: the head unit sends a Smart RGB light-effect selection signal over the CAN bus to the CAN message parsing task of the co-driver control panel MCU, and the user interacts by tapping the center-control display screen. Step S3: analyze the customer requirement, obtain the RGB frame sequence of each frame from the video images, further obtain the RGB time-sequence matrix, and transmit it frame by frame from the serial data port to the Smart RGB sequence according to the frame timing. The invention provides an audio-video touch interactive control method for a co-driver control panel that redesigns the touch control of animation, volume, air volume, songs and sunroof position on the co-driver control panel and integrates entertainment and control functions.

Description

Audio-video touch interactive control method for co-driver control panel
Technical Field
The invention relates to an audio-video touch interactive control method for a co-driver control panel and belongs to the technical field of vehicle-mounted LED driving and ambient lighting.
Background
At present, with the development of artificial intelligence, manufacturing is undergoing a shift toward intelligence and functional diversity. Because of this diversity, many high-end products are becoming increasingly integrated, meaning that one product implements multiple functions: from multipurpose remote controls in the smart home to in-vehicle control interfaces that resemble mobile-phone interfaces. Users no longer care only about whether integration is achieved; they care more about how it is achieved, so providing a more novel and distinctive way of integrated control has become a key means for enterprises to win more customers.
As in the home, control functions in vehicles fall into two categories: entertainment and functionality. Among them, LED light effects are increasingly used for their illumination, entertainment and information-transfer functions, and the driving and programming of the LED sequence is an important link in determining the output light effect. Unlike an LED display screen, the programmed driving of Smart RGB usually adopts a logic-design method. On the one hand, because Smart RGB simplifies wiring, it is generally used as a chained sequence, and the MCU needs only one GPIO port to communicate directly with the serial data input port of the sequence. On the other hand, the MCU memory is limited, which makes it difficult to transmit video frame data directly (usually a 3-row, N-column sub-matrix per frame, where N is the total number of LEDs; a video with M frames yields M sub-matrices, and when N is large each frame's sub-matrix occupies considerable memory and may even cause program build errors), and the customer requirement (a video) generally cannot be used directly as the RGB values of the LED sequence. This creates a gap between the driving design and the customer requirement (the video), and sometimes the target light effect cannot be achieved through logic design at all.
In addition, in-vehicle audio, air conditioning, sunroof and the like greatly enrich the in-cabin environment, but adjusting them one by one is not only time-consuming but also easily distracts or disturbs the driver. If the front passenger makes the adjustments instead, it is inconvenient to reach the center-control display screen located on the driver's side, and repeated operation may interfere with the driver's actions or block the view of a mirror, such as the right-side rearview mirror, affecting driving safety. Therefore, placing a panel in front of the passenger that integrates the adjustment of multiple functions is an urgent problem to be solved: the panel does not need to be as multifunctional as the head unit, but should have a more attractive appearance than the head-unit display, so that it not only enables integrated control but also creates a more comfortable and pleasant in-cabin environment for the front passenger and even for all occupants.
Disclosure of Invention
The invention aims to solve the technical problem of overcoming the defects of the prior art by providing an audio-video touch interactive control method for a co-driver control panel, which redesigns the touch control of animation, volume, air volume, songs and sunroof position on the co-driver control panel and integrates entertainment and control functions.
In order to solve the above technical problem, the technical solution of the invention is as follows:
An audio-video touch interactive control method for a co-driver control panel comprises the following steps:
Step S1: using the FreeRTOS system, create a Smart RGB refresh task, a touch control task and a CAN message parsing task on the MCU of the co-driver control panel, and periodically update the global variables in these tasks to the vehicle head unit through a timer;
Step S2: the head unit sends a Smart RGB light-effect selection signal over the CAN bus to the CAN message parsing task of the co-driver control panel MCU, and the user interacts by tapping the center-control display screen;
Step S3: analyze the customer requirement, obtain the RGB frame sequence of each frame from the video images, further obtain the RGB time-sequence matrix, and transmit it frame by frame from the serial data port to the Smart RGB sequence according to the frame timing;
Step S4: according to the target Smart RGB lighting and the touch-key lighting, laser-etch onto the co-driver control panel the patterns required for light projection and for key indication, respectively;
Step S5: the front passenger selects the desired volume, air volume, song and sunroof position on the co-driver control panel by sliding or tapping, and the light-projection effect can also be selected through the main interface of the head unit.
Further, the priority of the Smart RGB refresh task is the same as that of the touch control task, both are higher than the priority of the CAN message parsing task, and execution is switched from the higher-priority tasks to the lower-priority task through vTaskDelay().
Further, the creation of the touch control task in step S1 specifically comprises:
creating a 100 ms timer through xTimerCreate() and setting its callback function to:
vCAN_Send_TimerCallback();
periodically sending, through the callback function, the touch information of volume, air volume, song and sunroof that share the same message ID; if the four pieces of touch information belong to different messages, setting four timers of 100 ms, 200 ms, 300 ms and 400 ms and sending each touch signal separately in a time-shared manner, with multi-contact control of the entertainment and functional peripherals performed through the touch chip.
Further, the multi-contact control of the entertainment and functional peripherals through the touch chip specifically comprises:
listing all truth-value combinations of the available output pins of the touch chip in a truth table, and selecting each group of touch signals in the MCU of the co-driver control panel through if-else statements, so as to realize contact control of volume, air volume, song and sunroof position at different contacts.
Further, the multi-contact control of the entertainment and functional peripherals through the touch chip specifically comprises:
having the touch chip first send a selection signal and then a data signal; the contact object to be controlled, such as volume, air volume, song or sunroof position, is determined from the selection signal, and after the corresponding sub-condition judgment statement is entered, the subsequently sent data signal is evaluated so that the correct contact control is performed.
Further, the creation of the Smart RGB refresh task in step S1 specifically comprises:
initializing the CAN controller and setting the CAN event callback function to CAN_ISR() through CAN_InstallEventCallBack(); after the head unit sends a CAN message and the message is received, the CAN event callback function is invoked automatically and passes the message content to the CAN message parsing task through xQueueSendFromISR();
after the higher-priority tasks yield temporarily through vTaskDelay(), the CAN message parsing task, whose blocking in xQueueReceive() is cleared once a member exists in the queue, assigns the data frame of the message to the corresponding global signal variable according to the message ID, so as to guide the Smart RGB refresh task to switch the light effect.
Further, the establishment of the RGB frame sequence in step S3 specifically comprises:
for each frame of the video required by the customer, converting the video image through OpenCV into an (R, G, B) three-value matrix and a grayscale single-value matrix respectively, setting a reasonable threshold, checking through three-dimensional plotting whether the target light effect presented by the grayscale single-value matrix fully reflects the customer requirement, and converting the grayscale single-value matrix into a 0/1 matrix through the threshold;
obtaining the element-wise product matrix of the (R, G, B) three-value matrix and the 0/1 matrix through NumPy arrays, taking the product matrix as the input data of Smart RGB and converting it into an RGB sequence, storing the RGB sequence in one row of an Excel sheet, and arranging the RGB sequence of each frame in order into the RGB time-sequence matrix;
storing the RGB time-sequence matrix in the flash of the co-driver control panel MCU in the form of sub-function local variables, repeatedly calling the sub-function from the main function while the program runs so that the memory is continuously released, and selectively initializing only one frame's RGB sequence as a local variable at a time through a switch statement inside the sub-function.
By adopting the above technical solution, the invention integrates entertainment and control functions. The touch function is mainly used by the front passenger to directly adjust the volume, air volume, song and sunroof position, realizing integrated control: the front passenger operates the co-driver control panel directly, which is more convenient, does not disturb the driver, and improves driving safety. The Smart RGB animation shown through the laser-etched light-transmitting surface offers the occupants multiple light-effect selections; an effect can be selected once through the head unit and then played in a loop, creating a more comfortable and pleasant in-cabin environment for the front passenger and even for all occupants in the vehicle.
Drawings
FIG. 1 is a schematic view of an in-vehicle installation of the present invention;
FIG. 2 is a block diagram of the audio-video touch interactive control method for a co-driver control panel according to the present invention;
FIG. 3 is a flowchart of the video-image animation analysis in the audio-video touch interactive control method for a co-driver control panel according to the present invention;
FIG. 4 is a flowchart of the Smart RGB algorithm of the co-driver control panel MCU in the audio-video touch interactive control method for a co-driver control panel according to the present invention.
Detailed Description
In order that the invention may be more readily understood, a more particular description of the invention will be rendered by reference to specific embodiments that are illustrated in the appended drawings.
As shown in fig. 2, the present embodiment provides an audio-video touch interactive control method for a co-driver control panel, which comprises the following steps.
Step S1: using the FreeRTOS system, create a Smart RGB refresh task, a touch control task and a CAN message parsing task on the MCU of the co-driver control panel, and periodically update the global variables in these tasks to the vehicle head unit through a timer. The priority of the Smart RGB refresh task is the same as that of the touch control task, both are higher than the priority of the CAN message parsing task, and execution is switched from the higher-priority tasks to the lower-priority task through vTaskDelay().
Step S2: the head unit sends a Smart RGB light-effect selection signal over the CAN bus to the CAN message parsing task of the co-driver control panel MCU, and the user interacts by tapping the center-control display screen.
Step S3: analyze the customer requirement, obtain the RGB frame sequence of each frame from the video images, further obtain the RGB time-sequence matrix, and transmit it frame by frame from the serial data port to the Smart RGB sequence according to the frame timing.
Step S4: according to the target Smart RGB lighting and the touch-key lighting, laser-etch onto the co-driver control panel the patterns required for light projection and for key indication, respectively.
Step S5: the front passenger selects the desired volume, air volume, song and sunroof position on the co-driver control panel by sliding or tapping, and the light-projection effect can also be selected through the main interface of the head unit. The co-driver control panel transmits the tap or slide signal to the co-driver control panel MCU through a Cypress touch chip, and the MCU parses the tap or slide gesture into the corresponding control signal, so that the gesture controls the entertainment and functional devices, for example continuously increasing the volume, setting the air volume to level 3, or opening the sunroof to 50%. In this embodiment, the Cypress touch chip is a CY8C4025PVS-S412 and the co-driver control panel MCU is an NXP S32K144.
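As a hypothetical illustration of how such a gesture could be turned into a control signal, the sketch below scales a slide distance into a volume value that the periodic CAN timer then reports to the head unit; the event structure, scaling factor and limits are assumptions and are not specified by the invention.

#include <stdint.h>

typedef enum { GESTURE_NONE, GESTURE_TAP, GESTURE_SLIDE } gesture_t;

typedef struct {
    gesture_t type;
    int16_t   delta;      /* signed slide distance reported by the touch chip */
} touch_event_t;

static uint8_t g_volume;  /* global signal variable, sent periodically over CAN */

void handle_volume_gesture(const touch_event_t *ev)
{
    if (ev->type == GESTURE_SLIDE) {
        int16_t v = (int16_t)g_volume + ev->delta / 8;  /* coarse, assumed scaling */
        if (v < 0)   v = 0;
        if (v > 100) v = 100;
        g_volume = (uint8_t)v;  /* picked up by vCAN_Send_TimerCallback() */
    }
}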
As shown in fig. 2, for the touch control task, as many entertainment and functional peripherals as possible should be controlled with as few touch-chip pins as possible, so the creation of the touch control task in step S1 specifically comprises:
creating a 100 ms timer through xTimerCreate() and setting its callback function to:
vCAN_Send_TimerCallback();
periodically sending, through the callback function, the touch information of volume, air volume, song and sunroof that share the same message ID; if the four pieces of touch information belong to different messages, setting four timers (or groups of timers) of 100 ms, 200 ms, 300 ms and 400 ms and sending each touch signal separately in a time-shared manner, with multi-contact control of the entertainment and functional peripherals performed through the touch chip.
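A minimal sketch of this software timer is shown below, using the FreeRTOS calls named above; the CAN transmission itself is only indicated by a comment, and the packing of the four signals into one frame is an assumption.

#include "FreeRTOS.h"
#include "timers.h"

void vCAN_Send_TimerCallback(TimerHandle_t xTimer)
{
    (void)xTimer;
    /* Pack the current volume, air-volume, song and sunroof signals into one
       CAN frame (same message ID) and send it to the head unit. */
}

void create_touch_timer(void)
{
    TimerHandle_t t = xTimerCreate("touch100ms",
                                   pdMS_TO_TICKS(100),  /* 100 ms period     */
                                   pdTRUE,              /* auto-reload       */
                                   NULL,                /* timer ID not used */
                                   vCAN_Send_TimerCallback);
    if (t != NULL) {
        xTimerStart(t, 0);
    }
}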
In the touch control task, in order to control more contacts with a single touch chip, a truth table can be used to list all truth-value combinations of the available output pins of the touch chip, and each group of touch signals is selected in the MCU of the co-driver control panel through if-else statements, so that the contacts for volume, air volume, song and sunroof position are controlled separately.
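A sketch of this decoding with two assumed output pins (four combinations) follows; the pin count, the combination-to-target mapping and the global signal variables are illustrative assumptions only.

#include <stdint.h>

static uint8_t g_volume, g_air_volume, g_song, g_sunroof;  /* global signal variables */

void decode_touch_pins(uint8_t pinA, uint8_t pinB, uint8_t value)
{
    if (pinA == 0 && pinB == 0) {
        g_volume = value;             /* 00 -> volume           */
    } else if (pinA == 0 && pinB == 1) {
        g_air_volume = value;         /* 01 -> air volume       */
    } else if (pinA == 1 && pinB == 0) {
        g_song = value;               /* 10 -> song             */
    } else {
        g_sunroof = value;            /* 11 -> sunroof position */
    }
}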
In addition, in the touch control task, the touch chip can first send a selection signal and then send a data signal: the contact object to be controlled, such as volume, air volume, song or sunroof position, is determined from the selection signal, and after the corresponding sub-condition judgment statement is entered, the subsequently sent data signal is evaluated so that the correct contact control is performed.
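The alternative "selection signal first, data signal second" scheme can be sketched as a small state machine that remembers which target was selected and applies the next data byte to it; the target codes, the way a selection byte is distinguished from a data byte, and the global variables are assumptions.

#include <stdint.h>

typedef enum { TARGET_NONE, TARGET_VOLUME, TARGET_AIR, TARGET_SONG, TARGET_SUNROOF } target_t;

static target_t g_target = TARGET_NONE;
static uint8_t  g_volume, g_air_volume, g_song, g_sunroof;

/* is_selection distinguishes a selection byte from a data byte
   (for example signalled by a dedicated pin or a flag bit). */
void on_touch_byte(uint8_t value, int is_selection)
{
    if (is_selection) {
        g_target = (target_t)value;   /* remember what the next data byte controls */
        return;
    }
    switch (g_target) {               /* sub-condition judgment on the data byte */
    case TARGET_VOLUME:  g_volume     = value; break;
    case TARGET_AIR:     g_air_volume = value; break;
    case TARGET_SONG:    g_song       = value; break;
    case TARGET_SUNROOF: g_sunroof    = value; break;
    default: break;
    }
}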
For the Smart RGB refresh task, its creation in step S1 specifically comprises:
initializing the CAN controller and setting the CAN event callback function to CAN_ISR() through CAN_InstallEventCallBack(); after the head unit sends a CAN message and the message is received, the CAN event callback function is invoked automatically and passes the message content to the CAN message parsing task through xQueueSendFromISR();
after the higher-priority tasks yield temporarily through vTaskDelay(), the CAN message parsing task, whose blocking in xQueueReceive() is cleared once a member exists in the queue, assigns the data frame of the message to the corresponding global signal variable according to the message ID, so as to guide the Smart RGB refresh task to switch the light effect.
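A sketch of this reception path is given below: the ISR callback posts the received frame to a queue and the parsing task blocks on that queue. The frame structure, queue length and message ID are assumptions; only the callback name CAN_ISR(), the use of CAN_InstallEventCallBack() and the xQueueSendFromISR()/xQueueReceive() pairing come from the text, and reading the frame from the CAN controller is left as a comment.

#include "FreeRTOS.h"
#include "task.h"
#include "queue.h"
#include <stdint.h>

typedef struct { uint32_t id; uint8_t data[8]; } can_frame_t;

static QueueHandle_t xCanQueue;   /* created before CAN interrupts are enabled */
static uint8_t g_light_effect;    /* global signal variable read by the Smart RGB refresh task */

void can_queue_init(void)
{
    xCanQueue = xQueueCreate(8, sizeof(can_frame_t));
}

/* Installed as the CAN event callback via CAN_InstallEventCallBack(). */
void CAN_ISR(void)
{
    can_frame_t frame = { 0 };
    BaseType_t woken = pdFALSE;
    /* read the received frame from the CAN controller into `frame` here */
    xQueueSendFromISR(xCanQueue, &frame, &woken);
    portYIELD_FROM_ISR(woken);
}

void vCanParseTask(void *pv)
{
    can_frame_t frame;
    (void)pv;
    for (;;) {
        /* Blocks in xQueueReceive() until the ISR posts a frame. */
        if (xQueueReceive(xCanQueue, &frame, portMAX_DELAY) == pdPASS) {
            if (frame.id == 0x123u) {            /* assumed light-effect message ID */
                g_light_effect = frame.data[0];  /* steers the Smart RGB refresh task */
            }
        }
    }
}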
As shown in fig. 3 and 4, the establishment of the RGB frame sequence in step S3 specifically comprises:
the video required by the customer is extracted frame by frame into images, and each video image is converted through OpenCV into an (R, G, B) three-value matrix and a grayscale single-value matrix, where the (R, G, B) three-value matrix is the original color frame image and the grayscale single-value matrix is the grayscale frame image. A reasonable threshold is set, and whether the target light effect presented by the grayscale single-value matrix fully reflects the customer requirement is checked through three-dimensional plotting, for example by setting conditional colors with a matrix-plotting function in OriginLab or with conditional formatting in Excel; the grayscale single-value matrix is then converted into a 0/1 matrix through the threshold, the 0/1 matrix being the 0/1 frame image;
the element-wise product matrix of the (R, G, B) three-value matrix and the 0/1 matrix is obtained through NumPy arrays, i.e. the background color is filtered out by multiplying the 0/1 frame image with the original color frame image, so that the background noise in the original color frame image is removed and the light effect can be played continuously. The product matrix is used as the input data of Smart RGB: the filtered color image is converted into an RGB sequence, the RGB sequence is stored in one row of an Excel sheet, and the RGB sequences of all frames are arranged in order into the RGB time-sequence matrix;
the RGB time sequence matrix is stored in the flash of the MCU of the copilot control panel in the form of a sub-function local variable, the sub-function is repeatedly called through a main function when the program runs, so that the memory is continuously released, and only one frame of RGB sequence is selectively initialized as the local variable at a time through a switch statement in the sub-function, so that the memory (RAM) of a large amount of frame data is repeatedly utilized. Through the switch statement in the subfunction, only one frame of RGB sequence is initialized each time and temporarily stored as a local variable, a large amount of video frame data can be stored by utilizing the advantage that the flash space is much larger than the RAM, and only one frame of data occupies the RAM space each time, so that the bearing capacity of a program is improved. The sequence data of each frame is directly transmitted to the serial input port of the LED sequence by the MCU of the co-driver control panel through a single GPIO port by adopting the sequence formed by Smart RGB, so that pin resources are saved, and wiring is facilitated.
In addition, when the vehicle manufacturer has not preset a light effect, the array is not initialized with an RGB matrix; instead, the elements of the empty array are modified directly through logic statements such as if, while and switch. The self-designed light effect is assigned to the array through these logic statements and refreshed continuously, and the co-driver control panel MCU sends each frame of data to the LED sequence through the Smart RGB serial data port at the desired frame interval.
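A hypothetical example of such a logic-driven effect (a single dot moving along the sequence) is sketched below; the effect itself, the LED count and the frame interval are assumptions added for illustration.

#include <stdint.h>
#include <string.h>
#include "FreeRTOS.h"
#include "task.h"

#define NUM_LEDS 30

void send_rgb_sequence(const uint8_t *seq, uint16_t len);  /* as in the previous sketch */

void run_logic_effect(void)
{
    uint8_t frame[NUM_LEDS * 3];
    for (uint16_t pos = 0; ; pos = (uint16_t)((pos + 1) % NUM_LEDS)) {
        memset(frame, 0, sizeof frame);    /* start from an empty array */
        frame[3 * pos] = 0xFF;             /* light one LED in red      */
        send_rgb_sequence(frame, sizeof frame);
        vTaskDelay(pdMS_TO_TICKS(40));     /* assumed frame interval    */
    }
}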
Fig. 1 is a schematic diagram of the in-vehicle installation of the invention, in which: 1, co-driver volume adjustment; 2, co-driver air-volume adjustment; 3, co-driver Smart RGB laser-etched light-transmitting areas; 4, the painted panel of the co-driver control panel, such as real wood or brushed metal; 5, the vehicle center-control display screen. In this embodiment the co-driver control panel uses painted real wood or brushed metal to create an atmosphere close to nature and daily life.
The technical problems addressed by the invention, its technical solutions and its beneficial effects are described in further detail in the above specific embodiments. It should be understood that the above are only specific embodiments of the invention and are not intended to limit it; any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the invention shall fall within its scope of protection.

Claims (2)

1. An audio-video touch interactive control method for a co-driver control panel, characterized by comprising the following steps:
Step S1: using the FreeRTOS system, create a Smart RGB refresh task, a touch control task and a CAN message parsing task on the MCU of the co-driver control panel, and periodically update the global variables in these tasks to the vehicle head unit through a timer;
the creation of the touch control task specifically comprises:
creating a 100 ms timer through xTimerCreate() and setting its callback function to:
vCAN_Send_TimerCallback();
periodically sending, through the callback function, the touch information of volume, air volume, song and sunroof that share the same message ID; if the four pieces of touch information belong to different messages, setting four timers of 100 ms, 200 ms, 300 ms and 400 ms and sending each touch signal separately in a time-shared manner, with multi-contact control of the entertainment and functional peripherals performed through a touch chip;
the multi-contact control of the entertainment and functional peripherals through the touch chip specifically comprises:
listing all truth-value combinations of the available output pins of the touch chip in a truth table, and selecting each group of touch signals in the MCU of the co-driver control panel through if-else statements, so as to realize contact control of volume, air volume, song and sunroof position at different contacts;
the touch chip first sends a selection signal and then sends a data signal; the contact object to be controlled, such as volume, air volume, song or sunroof position, is determined from the selection signal, and after the corresponding sub-condition judgment statement is entered, the subsequently sent data signal is evaluated so that the correct contact control is performed;
the creation of the Smart RGB refresh task specifically comprises:
initializing the CAN controller and setting the CAN event callback function to CAN_ISR() through CAN_InstallEventCallBack(); after the head unit sends a CAN message and the message is received, the CAN event callback function is invoked automatically and passes the message content to the CAN message parsing task through xQueueSendFromISR();
after the higher-priority tasks yield temporarily through vTaskDelay(), the CAN message parsing task, whose blocking in xQueueReceive() is cleared once a member exists in the queue, assigns the data frame of the message to the corresponding global signal variable according to the message ID, so as to guide the Smart RGB refresh task to switch the light effect;
Step S2: the head unit sends a Smart RGB light-effect selection signal over the CAN bus to the CAN message parsing task of the co-driver control panel MCU, and the user interacts by tapping the center-control display screen;
Step S3: analyze the customer requirement, obtain the RGB frame sequence of each frame from the video images, further obtain the RGB time-sequence matrix, and transmit it frame by frame from the serial data port to the Smart RGB sequence according to the frame timing;
the establishment of the RGB frame sequence specifically comprises:
for each frame of the video required by the customer, converting the video image through OpenCV into an (R, G, B) three-value matrix and a grayscale single-value matrix respectively, setting a reasonable threshold, checking through three-dimensional plotting whether the target light effect presented by the grayscale single-value matrix fully reflects the customer requirement, and converting the grayscale single-value matrix into a 0/1 matrix through the threshold;
obtaining the element-wise product matrix of the (R, G, B) three-value matrix and the 0/1 matrix through NumPy arrays, taking the product matrix as the input data of Smart RGB and converting it into an RGB sequence, storing the RGB sequence in one row of an Excel sheet, and arranging the RGB sequence of each frame in order into the RGB time-sequence matrix;
storing the RGB time-sequence matrix in the flash of the co-driver control panel MCU in the form of sub-function local variables, repeatedly calling the sub-function from the main function while the program runs so that the memory is continuously released, and selectively initializing only one frame's RGB sequence as a local variable at a time through a switch statement inside the sub-function;
Step S4: according to the target Smart RGB lighting and the touch-key lighting, laser-etch onto the co-driver control panel the patterns required for light projection and for key indication, respectively;
Step S5: the front passenger selects the desired volume, air volume, song and sunroof position on the co-driver control panel by sliding or tapping, and the light-projection effect can also be selected through the main interface of the head unit.
2. The audio-video touch interactive control method for a co-driver control panel according to claim 1, characterized in that: the priority of the Smart RGB refresh task is the same as that of the touch control task, both are higher than the priority of the CAN message parsing task, and execution is switched from the higher-priority tasks to the lower-priority task through vTaskDelay().
CN202111505618.XA 2021-12-10 2021-12-10 Audio-video touch interactive control method for co-driver control panel Active CN114179613B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111505618.XA CN114179613B (en) 2021-12-10 2021-12-10 Audio-video touch interactive control method for co-driver control panel

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111505618.XA CN114179613B (en) 2021-12-10 2021-12-10 Audio-video touch interactive control method for co-driver control panel

Publications (2)

Publication Number Publication Date
CN114179613A CN114179613A (en) 2022-03-15
CN114179613B (en) 2024-03-05

Family

ID=80604347

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111505618.XA Active CN114179613B (en) 2021-12-10 2021-12-10 Audio-video touch interactive control method for co-driver control panel

Country Status (1)

Country Link
CN (1) CN114179613B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105751984A (en) * 2015-01-02 2016-07-13 现代自动车株式会社 Display Apparatus For Vehicle And Vehicle Having The Display Apparatus
CN105966247A (en) * 2016-07-14 2016-09-28 北京新能源汽车股份有限公司 Instrument panel for vehicle and vehicle with same
CN107317866A (en) * 2017-06-30 2017-11-03 昆明自动化成套集团股份有限公司 A kind of intelligent communication server and its construction method based on finite-state automata framework
CN207360092U (en) * 2017-10-17 2018-05-15 北京车和家信息技术有限公司 Vehicular meter component and vehicle
CN111469663A (en) * 2019-01-24 2020-07-31 宝马股份公司 Control system for a vehicle
CN111816189A (en) * 2020-07-03 2020-10-23 斑马网络技术有限公司 Multi-tone-zone voice interaction method for vehicle and electronic equipment
CN112429007A (en) * 2019-08-23 2021-03-02 比亚迪股份有限公司 Vehicle and auxiliary control method and device thereof, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10116748B2 (en) * 2014-11-20 2018-10-30 Microsoft Technology Licensing, Llc Vehicle-based multi-modal interface

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105751984A (en) * 2015-01-02 2016-07-13 现代自动车株式会社 Display Apparatus For Vehicle And Vehicle Having The Display Apparatus
CN105966247A (en) * 2016-07-14 2016-09-28 北京新能源汽车股份有限公司 Instrument panel for vehicle and vehicle with same
CN107317866A (en) * 2017-06-30 2017-11-03 昆明自动化成套集团股份有限公司 A kind of intelligent communication server and its construction method based on finite-state automata framework
CN207360092U (en) * 2017-10-17 2018-05-15 北京车和家信息技术有限公司 Vehicular meter component and vehicle
CN111469663A (en) * 2019-01-24 2020-07-31 宝马股份公司 Control system for a vehicle
CN112429007A (en) * 2019-08-23 2021-03-02 比亚迪股份有限公司 Vehicle and auxiliary control method and device thereof, electronic equipment and storage medium
CN111816189A (en) * 2020-07-03 2020-10-23 斑马网络技术有限公司 Multi-tone-zone voice interaction method for vehicle and electronic equipment

Also Published As

Publication number Publication date
CN114179613A (en) 2022-03-15

Similar Documents

Publication Publication Date Title
CN110949248A (en) Vehicle multi-mode atmosphere lamp control system and method
CN109104794B (en) Vehicle atmosphere lamp control system and method
CN110001505A (en) A kind of customizing method and system, vehicle of Vehicle lamp effect
US11001197B2 (en) Individualizable lighting system for a vehicle
CN102227704B (en) Apparatus and method for controlling sound reproduction apparatus
CN114416000A (en) Multi-screen interaction method and multi-screen interaction system applied to intelligent automobile
TWI738132B (en) Human-computer interaction method based on motion analysis, in-vehicle device
CN114179613B (en) Audio-video touch interactive control method for co-driver control panel
US20240227668A9 (en) Motor Vehicle Comprising a Plurality of Interior Light Modules
CN114913811A (en) Automobile dot matrix car lamp control method and system and vehicle
CN211280814U (en) Ceiling lighting system of vehicle and vehicle
CN213108907U (en) Air conditioner control switch control system with atmosphere lamp follow-up function and automobile
CN110297562B (en) Display driving method, display panel and display device
CN116691546A (en) Control method and device of vehicle-mounted equipment, vehicle and storage medium
CN115848138A (en) Cabin visual angle switching method, device and equipment and vehicle
CN114721615A (en) Method for setting automobile liquid crystal instrument
CN114084162B (en) Display method, device, equipment and storage medium
CN114670747A (en) Atmosphere lamp control system and vehicle
CN114537269A (en) Control method for light in automobile and automobile
CN114872542A (en) Automobile external signal interaction method and system, electronic equipment and automobile
CN112849118B (en) Control method of vehicle steering wheel, computer device, storage medium, and vehicle
US10466657B2 (en) Systems and methods for global adaptation of an implicit gesture control system
CN209395747U (en) Sunroof control system
KR20100012654A (en) Operation method of integration switch for vehicles
CN111949109A (en) Power consumption control method of vehicle-mounted terminal, vehicle-mounted terminal and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant