CN114872650A - Control method and device of vehicle-mounted equipment, electronic equipment and storage medium - Google Patents

Control method and device of vehicle-mounted equipment, electronic equipment and storage medium

Info

Publication number
CN114872650A
CN114872650A
Authority
CN
China
Prior art keywords
passenger
seat
state
output
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210527606.5A
Other languages
Chinese (zh)
Inventor
张西涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Leading Technology Co Ltd
Original Assignee
Nanjing Leading Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Leading Technology Co Ltd filed Critical Nanjing Leading Technology Co Ltd
Priority to CN202210527606.5A priority Critical patent/CN114872650A/en
Publication of CN114872650A publication Critical patent/CN114872650A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/037Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60NSEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
    • B60N2/00Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
    • B60N2/002Seats provided with an occupancy detection means mounted therein or thereon
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • B60R16/0231Circuits relating to the driving or the functioning of the vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/593Recognising seat occupancy

Abstract

The application discloses a control method and apparatus for a vehicle-mounted device, an electronic device, and a storage medium, belonging to the field of vehicle technology. An intelligent cabin host is installed on the vehicle and is connected with at least one output device, each output device corresponding to one seat. The method includes: the intelligent cabin host obtains passenger state characterization information, analyzes it to obtain the riding state of each passenger in the vehicle, determines the output control strategy corresponding to the riding state of the passenger based on a pre-established correspondence between riding states and output control strategies, and controls the output state of the output device corresponding to the seat in which the passenger sits according to the determined output control strategy. An intelligent cabin design oriented to passengers is thereby provided, and because the intelligent cabin host operates independently without accessing the vehicle's head unit (in-vehicle infotainment) system, implementation of the passenger-oriented intelligent cabin design is not restricted.

Description

Control method and device of vehicle-mounted equipment, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of vehicle technologies, and in particular, to a method and an apparatus for controlling a vehicle-mounted device, an electronic device, and a storage medium.
Background
With the rapid development of vehicle technology, a variety of vehicle-mounted devices have emerged, all aimed at providing a better user experience.
In the prior art, intelligent cabin solutions are designed mainly around the driver rather than the passengers, and implementing them requires access to the vehicle's head unit (in-vehicle infotainment) system. However, constrained by the technology of the vehicle manufacturer, the head unit system is not sufficiently open and cannot be freely customized, so the implementation of intelligent cabin solutions is limited to a certain extent. In addition, most head unit chips lack sufficient computing power, which further limits the implementation of intelligent cabin solutions.
Therefore, the prior art suffers from the absence of a passenger-oriented intelligent cabin solution and from restrictions on implementing intelligent cabin solutions.
Disclosure of Invention
The embodiment of the application provides a control method of vehicle-mounted equipment, which is used for solving the problems that no intelligent cabin scheme for passengers exists in the prior art and the implementation of the intelligent cabin scheme is limited.
In a first aspect, an embodiment of the present application provides a method for controlling an on-board device, where an intelligent cabin host is installed on a vehicle, the intelligent cabin host is connected with at least one output device, and each output device corresponds to a seat, and the method includes:
the intelligent cabin host acquires passenger state representation information;
analyzing the passenger state representation information to obtain the riding state of each passenger in the vehicle;
determining an output control strategy corresponding to the riding state of the passenger based on a corresponding relation between the riding state and the output control strategy which is established in advance;
and controlling the output state of the output equipment corresponding to the seat where the passenger sits according to the determined output control strategy.
In some embodiments, the passenger state representation information includes an in-vehicle image sequence, and analyzing the passenger state representation information to obtain a riding state of each passenger in the vehicle includes:
performing human body recognition on each in-vehicle image in the in-vehicle image sequence to determine whether a passenger is on each seat; if any seat has a passenger, determining that the corresponding passenger is in a seating state, and if no passenger is on the seat, determining that the corresponding passenger is in an out-of-seating state; and/or
And performing face analysis on each in-vehicle image in the in-vehicle image sequence to determine whether each passenger is watching the output device, and if the same passenger is determined not to be watching the output device N consecutive times, determining that the passenger is in a non-watching state, wherein N is an integer greater than 1.
In some embodiments, the passenger status characterizing information includes sensing information of each seat sensor, and analyzing the passenger status characterizing information to obtain a riding status of each passenger in the vehicle includes:
if the passenger is determined to be in the seat based on the sensing information of any seat sensor, determining that the passenger is in a seating state;
and if it is determined that no passenger is on the seat based on the sensing information of the seat sensor, determining that the corresponding passenger is in the out-of-seat state.
In some embodiments, after determining that the riding state of any passenger is the seating state, the method further comprises:
determining head pitch information for the passenger;
determining angle adjustment information for an output device corresponding to a seat in which the passenger is seated based on the head pitch information;
and adjusting the angle of the corresponding output equipment based on the angle adjustment information.
In some embodiments, prior to determining the head pitch information of the passenger, further comprising:
and determining that the sitting posture of the passenger accords with a stable sitting posture condition based on the sensing information of the seat sensor corresponding to the passenger.
In some embodiments, after determining that the riding state of any passenger is the seating state, the method further comprises:
acquiring multimedia data locally or from a server;
and playing the multimedia data through an output device corresponding to the seat taken by the passenger.
In some embodiments, obtaining multimedia data from a server comprises:
sending user representation data of the passenger to the server, selecting, by the server, multimedia data matching the passenger based on the user representation data;
and receiving the multimedia data sent by the server.
In some embodiments, after determining that the riding state of any passenger is the seating state, the method further comprises:
receiving a screen projection request sent by a passenger through a terminal;
and playing the multimedia data acquired from the terminal through an output device in front of the seat where the passenger is.
In a second aspect, an embodiment of the present application provides a control device for an on-board device, where an intelligent cabin host is installed on a vehicle, the intelligent cabin host is connected with at least one output device, each output device corresponds to a seat, and the device is disposed in the intelligent cabin host, and the device includes:
the acquisition module is used for acquiring passenger state representation information;
the analysis module is used for analyzing the passenger state representation information to obtain the riding state of each passenger in the vehicle;
the determining module is used for determining an output control strategy corresponding to the riding state of the passenger based on the corresponding relation between the riding state and the output control strategy which is established in advance;
and the control module is used for controlling the output state of the output equipment corresponding to the seat in which the passenger takes according to the determined output control strategy.
In some embodiments, the passenger state characterization information includes an in-vehicle image sequence, and the analysis module is specifically configured to:
performing human body recognition on each in-vehicle image in the in-vehicle image sequence to determine whether a passenger is on each seat; if any seat has a passenger, determining that the corresponding passenger is in a seating state, and if no passenger is on the seat, determining that the corresponding passenger is in an out-of-seating state; and/or
And performing face analysis on each in-vehicle image in the in-vehicle image sequence to determine whether each passenger watches the output device, and if the same passenger does not watch the output device for N times, determining that the passenger is in a non-watching state, wherein N is an integer greater than 1.
In some embodiments, the passenger status characterizing information includes sensing information of each seat sensor, and the analyzing module is specifically configured to:
if the passenger is determined to be in the seat based on the sensing information of any seat sensor, determining that the corresponding passenger is in a seating state;
and if it is determined that no passenger is on the seat based on the sensing information of the seat sensor, determining that the corresponding passenger is in the out-of-seat state.
In some embodiments, the apparatus further comprises an adjustment module configured to:
determining head pitch information of any passenger after determining that the riding state of the passenger is a sitting state;
determining angle adjustment information for an output device corresponding to a seat in which the passenger is seated based on the head pitch information;
and adjusting the angle of the corresponding output equipment based on the angle adjustment information.
In some embodiments, the adjustment module is further configured to:
determining that the sitting posture of the passenger is in accordance with a stable sitting posture condition based on the sensing information of the seat sensor corresponding to the passenger before determining the head pitch information of the passenger.
In some embodiments, further comprising an output module to:
after determining that the riding state of any passenger is the sitting state, acquiring multimedia data from a local or server;
and playing the multimedia data through an output device corresponding to the seat taken by the passenger.
In some embodiments, the output module is specifically configured to:
sending user representation data of the passenger to the server, selecting, by the server, multimedia data matching the passenger based on the user representation data;
and receiving the multimedia data sent by the server.
In some embodiments, the system further comprises a screen projection module for:
after determining that the riding state of any passenger is a sitting state, receiving a screen projection request sent by the passenger through a terminal;
and playing the multimedia data acquired from the terminal through an output device in front of the seat where the passenger is.
In a third aspect, an embodiment of the present application provides an electronic device, including: at least one processor, and a memory communicatively coupled to the at least one processor, wherein:
the memory stores a computer program executable by the at least one processor, and the computer program is executed by the at least one processor to enable the at least one processor to perform the control method of the vehicle-mounted device described above.
In a fourth aspect, embodiments of the present application provide a storage medium storing a computer program which, when executed by a processor of an electronic device, enables the electronic device to perform the control method of the vehicle-mounted device described above.
In the embodiments of the present application, an intelligent cabin host is installed on the vehicle, the intelligent cabin host is connected with at least one output device, and each output device corresponds to one seat. The intelligent cabin host can obtain passenger state characterization information, analyze it to obtain the riding state of each passenger in the vehicle, determine the output control strategy corresponding to the riding state of the passenger based on a pre-established correspondence between riding states and output control strategies, and control the output state of the output device corresponding to the seat in which the passenger sits according to the determined output control strategy. An intelligent cabin design oriented to passengers is thereby provided, and because the intelligent cabin host operates independently without accessing the vehicle's head unit system, implementation of the passenger-oriented intelligent cabin design is not restricted.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is an application scene diagram of a control method for a vehicle-mounted device according to an embodiment of the present application;
fig. 2 is a flowchart of a control method for an on-board device according to an embodiment of the present disclosure;
fig. 3 is a flowchart of a control method for another vehicle-mounted device according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a control device of an on-board device according to an embodiment of the present application;
fig. 5 is a schematic hardware structure diagram of an electronic device for implementing a control method of an in-vehicle device according to an embodiment of the present application.
Detailed Description
In order to solve the problems that no intelligent cabin scheme for passengers exists in the prior art and the implementation of the intelligent cabin scheme is limited, embodiments of the application provide a control method and device for an on-board device, an electronic device and a storage medium.
The preferred embodiments of the present application will be described below with reference to the accompanying drawings of the specification. It should be understood that the preferred embodiments described herein are merely intended to illustrate and explain the present application, not to limit it, and that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
First, it should be noted that the control method for the vehicle-mounted device provided in the embodiments of the present application may be applied to ride-hailing (online car-hailing) scenarios as well as non-ride-hailing scenarios; the following description takes the ride-hailing application as an example.
To make up for the absence of a passenger-oriented intelligent cabin solution in the related art, the embodiments of the present application provide a vehicle-mounted device that is independent of the vehicle's head unit system and includes an intelligent cabin host and at least one output device connected to it. The intelligent cabin host can obtain passenger state characterization information, analyze it to obtain the riding state of each passenger in the vehicle, determine the output control strategy corresponding to the riding state of the passenger based on a pre-established correspondence between riding states and output control strategies, and control the output state of the output device corresponding to the seat in which the passenger sits according to the determined output control strategy.
The intelligent cabin host can be powered by an on-board 12V power supply, and the number of output devices connected to it can be flexibly configured according to the number of seats: for example, an output device may be arranged in front of the passenger seat, no output device may be arranged in front of the passenger seat, output devices may be arranged in front of all seats other than the passenger seat, or output devices may be arranged in front of only some of the seats other than the passenger seat.
Fig. 1 is an application scenario diagram of a control method for a vehicle-mounted device provided in an embodiment of the present application; the scenario includes one intelligent cabin host and three output devices. Through the three corresponding interfaces, the intelligent cabin host can separately control the output state of each of the three output devices (screen off, mute, screen on, and the like) and output multimedia content to them.
In addition, cameras are provided inside the vehicle, for example a single camera at the front of the cabin or several cameras distributed throughout the vehicle; the positions and number of cameras are preferably chosen to cover all viewing angles inside the vehicle. A seat sensor may also be provided for each seat. The intelligent cabin host periodically acquires the in-vehicle images captured by the cameras and the sensing information collected by each seat sensor, and controls the output state of each output device and the multimedia content it outputs based on the acquired information.
After introducing an application scenario of the embodiment of the present application, a control method of an in-vehicle device proposed by the present application is described below with specific embodiments. Fig. 2 is a flowchart of a control method of an on-board device according to an embodiment of the present application, where the method is applied to the intelligent cabin host in fig. 1, and the method includes the following steps.
In step 201, passenger status characterizing information is obtained.
In some embodiments, the passenger state characterization information may include a sequence of in-vehicle images; in some embodiments, the occupant status characterizing information may include sensed information of each seat sensor.
In step 202, the passenger status characterization information is analyzed to obtain the riding status of each passenger in the vehicle.
The riding state of a passenger is, for example, a seating state, an out-of-seat state, a watching state, or a non-watching state.
The intelligent cabin host can locally deploy several visual analysis algorithms serving different analysis purposes, such as a human body recognition algorithm for analyzing whether a person is in a seat, and a gaze analysis algorithm for analyzing whether a passenger is watching the display screen of the output device. Alternatively, the intelligent cabin host may deploy none or only some of these visual analysis algorithms locally and invoke the corresponding algorithm on the server whenever a particular kind of visual analysis needs to be performed.
In some embodiments, the passenger state characterization information includes a sequence of in-vehicle images.
In this case, a human body recognition algorithm deployed locally or invoked on the server may be used to perform human body recognition on each in-vehicle image in the in-vehicle image sequence to determine whether there is a passenger on each seat; if there is a passenger on a seat, the corresponding passenger is determined to be in the seating state, and if there is no passenger on a seat, the corresponding passenger is determined to be in the out-of-seat state.
In a specific implementation, the position of each seat headrest in the in-vehicle image can be calibrated in advance according to the vehicle model; a face detection frame is considered valid, for example, when it lies above the shoulder and neck key points and its x coordinate falls within the x-coordinate range of the left and right shoulders. If the overlap between the headrest frame and the face detection frame is larger than 50%, it is judged that a passenger is on the corresponding seat. Subsequently, when the intelligent cabin host performs human body recognition on each in-vehicle image, it analyzes the positional relationship between the face detection frame and the headrest frame in the image and thereby determines whether a passenger is on each seat.
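A compact way to read the 50% criterion above is as an overlap ratio between the calibrated headrest frame and the detected face frame. The sketch below is only an illustration under that reading (overlap measured against the headrest area); the box format and function names are assumptions and not part of the patent.

```python
# Minimal sketch of the seat-occupancy check described above.
# Assumed box format: (x1, y1, x2, y2) in image coordinates.

def box_overlap_area(a, b):
    """Area of the intersection of two axis-aligned boxes."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(0.0, w) * max(0.0, h)

def seat_occupied(headrest_box, face_box, threshold=0.5):
    """Judge a seat occupied when the face detection frame covers more
    than `threshold` of the calibrated headrest frame."""
    headrest_area = (headrest_box[2] - headrest_box[0]) * (headrest_box[3] - headrest_box[1])
    if headrest_area <= 0:
        return False
    return box_overlap_area(headrest_box, face_box) / headrest_area > threshold
```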
In some embodiments, the occupant status characterizing information includes sensing information of seat sensors.
In this case, the sensing information of the seat sensors may be analyzed to determine whether or not a passenger is seated on each seat; if any seat has a passenger, the corresponding passenger is determined to be in a seating state, and if no passenger is on any seat, the corresponding passenger is determined to be in an out-of-seating state.
The sensing information of each seat sensor may include pressure information, and if it is determined that the pressure indicated by the pressure information is greater than a preset value, it is determined that there is a passenger in the seat, and if it is determined that the pressure indicated by the pressure information is not greater than the preset value, it is determined that there is no passenger in the seat.
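As a minimal illustration of the pressure check above, the following sketch maps a sensor reading to a seating or out-of-seat state; the field name and the threshold value are assumptions for illustration.

```python
# Minimal sketch of the pressure-based occupancy check described above.

PRESSURE_THRESHOLD_N = 50.0  # assumed preset value, e.g. in newtons

def seat_state_from_sensor(sensing_info: dict) -> str:
    """Return 'seating' when the measured pressure exceeds the preset
    value, otherwise 'out_of_seat'."""
    if sensing_info.get("pressure", 0.0) > PRESSURE_THRESHOLD_N:
        return "seating"
    return "out_of_seat"
```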
In the case where the passenger state characterization information includes the in-vehicle image sequence, a gaze analysis algorithm deployed locally or on the server may be used to perform face analysis on each in-vehicle image to determine whether each passenger is watching the output device. When any in-vehicle image shows that a passenger is watching the output device, that passenger may be determined to be in the watching state; if the same passenger is determined not to be watching the output device N times in succession, the passenger may be determined to be in the non-watching state.
A passenger is determined not to be watching the output device when the passenger is not looking at its display screen, for example when the passenger's eyes are closed or the passenger's head is turned to one side.
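The "N times in succession" rule can be kept with a simple per-passenger counter, as sketched below; the class and state names are assumptions for illustration, not terms from the patent.

```python
# Minimal sketch of the consecutive-miss rule for the non-watching state.

class GazeTracker:
    def __init__(self, n: int):
        self.n = n          # N from the description, an integer greater than 1
        self.misses = 0     # consecutive images without gaze on the screen

    def update(self, is_watching: bool) -> str:
        """Feed one per-image gaze result and return the inferred state."""
        if is_watching:
            self.misses = 0
            return "watching"
        self.misses += 1
        return "non_watching" if self.misses >= self.n else "watching"
```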
In step 203, an output control strategy corresponding to the riding state of the passenger is determined based on the correspondence relationship between the riding state and the output control strategy established in advance.
In a specific implementation, the correspondence between riding states and output control strategies is, for example: the seating state corresponds to turning the screen on; the out-of-seat state corresponds to turning the screen off and muting the sound; the watching state corresponds to turning the screen on; and the non-watching state corresponds to turning the screen off, or to turning the screen off and muting the sound.
The output state corresponding to the non-watching state may be determined according to the multimedia content currently being played. For example, if the content currently being played is music, the non-watching state may correspond to turning the screen off only, so that the passenger can continue listening with the screen off; if the content currently being played is a video, the non-watching state may correspond to turning the screen off and muting the sound, so that the passenger is not disturbed.
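Taken together, the pre-established correspondence in this step can be represented as a small lookup that also considers the currently playing content for the non-watching state. The sketch below is an assumed illustration; the state and strategy names are not defined by the patent.

```python
# Minimal sketch of the riding-state -> output-control-strategy mapping.

def output_strategy(riding_state, playing=None):
    """Return the control actions for one output device."""
    if riding_state in ("seating", "watching"):
        return {"screen": "on"}
    if riding_state == "out_of_seat":
        return {"screen": "off", "sound": "mute"}
    if riding_state == "non_watching":
        if playing == "music":
            return {"screen": "off"}            # keep the audio playing
        return {"screen": "off", "sound": "mute"}
    return {}
```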
In step 204, the output state of the output device corresponding to the seat in which the passenger is seated is controlled according to the determined output control strategy.
That is, the output state of the output device corresponding to the seat in which the passenger sits is controlled according to the determined strategy, such as turning the screen off, muting the sound, turning the screen on, or turning the screen off and muting the sound.
Fig. 3 is a flowchart of a control method for another vehicle-mounted device according to an embodiment of the present disclosure, where the method is applied to the intelligent cabin host in fig. 1, and the method includes the following steps.
In step 301, passenger status characterization information is obtained in real time.
In step 302, the passenger status characterization information is analyzed to obtain the riding status of each passenger in the vehicle.
This step can be implemented as described for step 202 and is not repeated here.
In step 303, if it is determined that the riding state of the passenger is the seating state, the display screen of the output device corresponding to the seat on which the passenger sits is controlled to be lit.
In step 304, multimedia data is obtained locally or from a server.
In some embodiments, the intelligent cabin host may store some multimedia data locally, such as introduction videos of attractions along the route, music, and the like.
In some embodiments, the intelligent cabin host may obtain the multimedia data from the server. For personalized recommendation, the intelligent cabin host may further send the user profile data of the current passenger to the server, so that the server selects multimedia data according to the user profile data and uses the selected multimedia data as the multimedia data matched with the passenger.
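A minimal sketch of this exchange with the server is shown below; the endpoint URL, payload fields, and response format are hypothetical and are not an API defined by the patent.

```python
# Minimal sketch of requesting passenger-matched multimedia from the server.
import requests

def fetch_matched_media(server_url, passenger_profile):
    """Send the passenger's user profile data and receive the multimedia
    items the server selected for this passenger."""
    resp = requests.post(f"{server_url}/media/match", json=passenger_profile, timeout=5)
    resp.raise_for_status()
    return resp.json()  # e.g. a list of media descriptors chosen by the server
```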
In step 305, the acquired multimedia data is played through an output device corresponding to the seat in which the passenger sits.
In step 306, it is determined whether the sitting posture of the passenger meets the stable sitting posture condition based on the sensing information of the seat sensor of the seat in which the passenger sits; if yes, the process proceeds to step 307, and if not, step 306 is repeated.
In some embodiments, the sitting posture determination may be made with the aid of a seat sensor.
For example, the sensing information of each seat sensor may further include the contact area between the passenger and the seat cushion and whether the passenger is leaning against the seat backrest. Correspondingly, the stable sitting posture condition may be, for example, that the passenger's contact area with the seat cushion is no less than two-thirds, or that the contact area is no less than two-thirds and the passenger is leaning against the backrest.
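The sensor-based condition can be expressed as a simple predicate, as sketched below; the field names and the optional backrest requirement are assumptions for illustration.

```python
# Minimal sketch of the sensor-based stable-sitting-posture check.

def stable_posture(sensing_info: dict, require_backrest: bool = False) -> bool:
    """True when the cushion contact ratio is at least two-thirds and,
    optionally, the passenger is leaning against the backrest."""
    enough_contact = sensing_info.get("cushion_contact_ratio", 0.0) >= 2.0 / 3.0
    against_backrest = sensing_info.get("against_backrest", False)
    return enough_contact and (against_backrest or not require_backrest)
```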
In some embodiments, the sitting posture determination may be made with the aid of an in-vehicle image.
Generally, after the seats occupied by passengers are identified from each in-vehicle image, the sitting posture of each passenger is determined by combining the coordinates of the upper-body key points in the in-vehicle image with the position of the face detection frame. In this case, the stable sitting posture condition may be, for example, that the passenger is leaning against the seat back.
In step 307, head pitch information for the passenger is determined.
In a specific implementation, the intelligent cabin host may locate the face region in each in-vehicle image using a corresponding visual analysis algorithm, estimate the head pitch angle from the face region, and take that angle as the head pitch information of the corresponding passenger.
In step 308, based on the head pitch information of the passenger, angle adjustment information for an output device corresponding to a seat in which the passenger is seated is determined.
For example, if the head pitch information of the passenger indicates that the head pitch angle is pitch, then -pitch may be used as the angle adjustment information for the output device corresponding to the seat in which the passenger sits; that is, the screen is tilted by the opposite angle to compensate.
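In code, this adjustment is simply the negated pitch angle, as in the sketch below; the function name and degree units are assumptions.

```python
# Minimal sketch of deriving the screen angle adjustment from head pitch.

def screen_angle_adjustment(head_pitch_deg: float) -> float:
    """Return the tilt to apply to the output device for the seat:
    the negative of the passenger's head pitch angle."""
    return -head_pitch_deg

# e.g. a passenger looking 10 degrees downward yields a -10 degree tilt
```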
In step 309, the angle of the corresponding output device is adjusted based on the angle adjustment information.
In step 310, if it is determined that the riding state of the passenger is the non-watching state or the out-of-seat state, the output device corresponding to the seat in which the passenger sits is controlled to turn off the screen and mute the sound.
It should be noted that, in the above flow, steps 301 and 302 are performed periodically, steps 303 to 310 are performed selectively based on the result obtained each time step 302 is executed, and there is no strict ordering between steps 304-305 and steps 306-309.
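For orientation, the periodic flow of Fig. 3 can be sketched as a single loop that reuses the illustrative helpers above; the `host` and `device` objects and their methods are hypothetical stand-ins for whatever interfaces an actual intelligent cabin host exposes.

```python
# Minimal sketch of the periodic control loop (steps 301-310), under the
# assumption of a hypothetical `host` object exposing the listed methods.
import time

def control_loop(host, period_s=1.0):
    while True:
        info = host.acquire_passenger_state_info()            # step 301
        for seat, state in host.analyse(info).items():        # step 302
            device = host.output_device(seat)
            if state == "seating":                            # steps 303-305
                device.screen_on()
                device.play(host.get_media(seat))
                if stable_posture(info.seat_sensor(seat)):    # steps 306-309
                    device.tilt(screen_angle_adjustment(info.head_pitch(seat)))
            elif state in ("non_watching", "out_of_seat"):    # step 310
                device.screen_off()
                device.mute()
        time.sleep(period_s)
```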
In practical applications, after boarding, a passenger can connect his or her terminal to the intelligent cabin host through in-vehicle 4G, 5G, or Wi-Fi and send a screen projection request to the host through the terminal; after receiving the screen projection request, the intelligent cabin host can obtain multimedia data from the terminal and play it through the output device in front of the seat where the passenger is located.
In addition, the intelligent cabin host can connect to the server through 4G, 5G, Wi-Fi, and the like, obtain ride-hailing trip status events from the server, and control the multimedia content output by the corresponding output devices based on these events, where the trip status events include a passenger boarding and a passenger alighting.
For example, when it is determined that a passenger has boarded, the output device corresponding to the seat taken by the passenger is controlled to play a welcome message and a smiling face is displayed on its screen; when it is determined that the passenger has alighted, the output device corresponding to that seat is controlled to play a farewell message.
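A minimal event handler for these trip status events might look like the sketch below; the event names and device methods are assumptions for illustration.

```python
# Minimal sketch of reacting to ride-hailing trip status events pushed by
# the server (field names and device methods are illustrative assumptions).

def on_trip_event(event: dict, host) -> None:
    device = host.output_device(event["seat"])
    if event["type"] == "passenger_boarded":
        device.play_audio("welcome_aboard")
        device.show_image("smiling_face")
    elif event["type"] == "passenger_alighted":
        device.play_audio("goodbye")
```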
By means of the scheme provided by the embodiments of the present application, in a ride-hailing scenario, when a passenger is seated in a certain seat, the output device corresponding to that seat is automatically controlled to light its display screen, after which the output device can be controlled to play multimedia content. Moreover, if there are multiple passengers in the vehicle, each passenger can independently control the multimedia content played by the output device in front of his or her seat, or passively let the output device intelligently play trip-related information and video content, and each passenger can also turn the corresponding display screen off and/or on through the buttons of the output device.
Therefore, when a passenger takes a ride-hailing vehicle, the passenger can experience personalized intelligent cabin service no matter which seat is taken; the user experience is better, power is saved, and the service life of the output device is extended.
Based on the same technical concept, the embodiments of the present application further provide a control apparatus for a vehicle-mounted device. Since the principle by which this apparatus solves the problem is similar to that of the control method of the vehicle-mounted device, the implementation of the apparatus can refer to the implementation of the method, and repeated details are omitted.
Fig. 4 is a schematic structural diagram of a control apparatus of an in-vehicle device according to an embodiment of the present application, and includes an obtaining module 401, an analyzing module 402, a determining module 403, and a control module 404.
An obtaining module 401, configured to obtain passenger state representation information;
an analysis module 402, configured to analyze the passenger state representation information to obtain a riding state of each passenger in the vehicle;
a determining module 403, configured to determine an output control strategy corresponding to a riding state of the passenger based on a correspondence relationship between a riding state and the output control strategy that is established in advance;
and the control module 404 is configured to control an output state of an output device corresponding to a seat in which the passenger sits according to the determined output control strategy.
In some embodiments, the passenger state characterization information includes an in-vehicle image sequence, and the analysis module 402 is specifically configured to:
performing human body recognition on each in-vehicle image in the in-vehicle image sequence to determine whether a passenger is on each seat; if any seat has a passenger, determining that the corresponding passenger is in a seating state, and if no passenger is on the seat, determining that the corresponding passenger is in an out-of-seating state; and/or
And performing face analysis on each in-vehicle image in the in-vehicle image sequence to determine whether each passenger watches the output device, and if the same passenger does not watch the output device for N times, determining that the passenger is in a non-watching state, wherein N is an integer greater than 1.
In some embodiments, the passenger status characterizing information includes sensing information of each seat sensor, and the analyzing module 402 is specifically configured to:
if the passenger is determined to be in the seat based on the sensing information of any seat sensor, determining that the corresponding passenger is in a seating state;
and if it is determined that no passenger is on the seat based on the sensing information of the seat sensor, determining that the corresponding passenger is in the out-of-seat state.
In some embodiments, an adjustment module 405 is further included for:
determining head pitch information of any passenger after determining that the riding state of the passenger is a sitting state;
determining angle adjustment information for an output device corresponding to a seat in which the passenger is seated based on the head pitch information;
and adjusting the angle of the corresponding output equipment based on the angle adjustment information.
In some embodiments, the adjustment module 405 is further configured to:
determining that the sitting posture of the passenger is in accordance with a stable sitting posture condition based on the sensing information of the seat sensor corresponding to the passenger before determining the head pitch information of the passenger.
In some embodiments, an output module 406 is further included for:
after determining that the riding state of any passenger is the seating state, acquiring multimedia data from a local or server;
and playing the multimedia data through an output device corresponding to the seat where the passenger takes.
In some embodiments, the output module 406 is specifically configured to:
sending user profile data for the passenger to the server, the server selecting multimedia data matching the passenger based on the user profile data;
and receiving the multimedia data sent by the server.
In some embodiments, a screen projection module 407 is further included for:
after determining that the riding state of any passenger is a sitting state, receiving a screen projection request sent by the passenger through a terminal;
and playing the multimedia data acquired from the terminal through an output device in front of the seat where the passenger is.
In the embodiments of the present application, the division into modules is schematic and is only a division by logical function; other divisions are possible in actual implementation. In addition, the functional modules in the embodiments of the present application may be integrated in one processor, may exist physically alone, or two or more modules may be integrated in one module. The modules may be coupled to each other through interfaces, which are typically electrical communication interfaces, though mechanical or other forms of interface are not excluded. Accordingly, modules illustrated as separate components may or may not be physically separate, and may be located in one place or distributed in different locations on the same or different devices. An integrated module may be implemented in the form of hardware or in the form of a software functional module.
Having described the control method and apparatus of the vehicle-mounted device according to the exemplary embodiments of the present application, an electronic device according to another exemplary embodiment of the present application is described next.
An electronic device 130 implemented according to this embodiment of the present application is described below with reference to fig. 5. The electronic device 130 shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 5, the electronic device 130 is represented in the form of a general electronic device. The components of the electronic device 130 may include, but are not limited to: the at least one processor 131, the at least one memory 132, and a bus 133 that connects the various system components (including the memory 132 and the processor 131).
Bus 133 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, or a local bus using any of a variety of bus architectures.
The memory 132 may include readable media in the form of volatile memory, such as Random Access Memory (RAM)1321 and/or cache memory 1322, and may further include Read Only Memory (ROM) 1323.
Memory 132 may also include a program/utility 1325 having a set (at least one) of program modules 1324, such program modules 1324 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The electronic device 130 may also communicate with one or more external devices 134 (e.g., keyboard, pointing device, etc.), with one or more devices that enable a user to interact with the electronic device 130, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 130 to communicate with one or more other electronic devices. Such communication may occur via input/output (I/O) interfaces 135. Also, the electronic device 130 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 136. As shown, network adapter 136 communicates with other modules for electronic device 130 over bus 133. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 130, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
In an exemplary embodiment, there is also provided a storage medium storing a computer program which, when executed by a processor of an electronic device, enables the electronic device to perform the control method of the vehicle-mounted device described above. Alternatively, the storage medium may be a non-transitory computer-readable storage medium, for example a ROM, a Random Access Memory (RAM), a CD-ROM, or the like.
In an exemplary embodiment, the electronic device of the present application may include at least one processor and a memory communicatively connected to the at least one processor, wherein the memory stores a computer program executable by the at least one processor, and the computer program, when executed by the at least one processor, may cause the at least one processor to perform the steps of the control method of the vehicle-mounted device provided by any embodiment of the present application.
In an exemplary embodiment, a computer program product is also provided, which, when executed by an electronic device, enables the electronic device to implement any of the exemplary methods provided herein.
Also, a computer program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing.
The program product for implementing the control method of the vehicle-mounted device in the embodiment of the application may adopt a memory card and include program codes, and may be run on a computing device. However, the program product of the present application is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, Radio Frequency (RF), etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In situations involving remote computing devices, the remote computing device may be connected to the user's computing device over any kind of network, such as a Local Area Network (LAN) or Wide Area Network (WAN), or may be connected to an external computing device (e.g., over the Internet using an Internet service provider).
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, such division is merely exemplary and not mandatory. Indeed, the features and functions of two or more units described above may be embodied in one unit, according to embodiments of the application. Conversely, the features and functions of one unit described above may be further divided into embodiments by a plurality of units.
Further, while the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all changes and modifications that fall within the scope of the present application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (11)

1. A control method of vehicle-mounted equipment is characterized in that a vehicle is provided with an intelligent cabin host machine, the intelligent cabin host machine is connected with at least one output equipment, and each output equipment corresponds to a seat, and the method comprises the following steps:
the intelligent cabin host acquires passenger state representation information;
analyzing the passenger state representation information to obtain the riding state of each passenger in the vehicle;
determining an output control strategy corresponding to the riding state of the passenger based on a corresponding relation between the riding state and the output control strategy which is established in advance;
and controlling the output state of the output equipment corresponding to the seat in which the passenger takes according to the determined output control strategy.
2. The method of claim 1, wherein the passenger state characterization information comprises a sequence of in-vehicle images, and analyzing the passenger state characterization information to obtain a riding state of each passenger in the vehicle comprises:
performing human body recognition on each in-vehicle image in the in-vehicle image sequence to determine whether a passenger is on each seat; if any seat has a passenger, determining that the corresponding passenger is in a seating state, and if no passenger is on the seat, determining that the corresponding passenger is in an out-of-seating state; and/or
And performing face analysis on each in-vehicle image in the in-vehicle image sequence to determine whether each passenger watches the output device, and if the same passenger does not watch the output device for N times, determining that the passenger is in a non-watching state, wherein N is an integer greater than 1.
3. The method of claim 1, wherein the passenger status characterizing information includes sensing information of seat sensors, and analyzing the passenger status characterizing information to obtain a riding status of each passenger in the vehicle comprises:
if the passenger is determined to be in the seat based on the sensing information of any seat sensor, determining that the corresponding passenger is in a seating state;
and if it is determined that no passenger is on the seat based on the sensing information of the seat sensor, determining that the corresponding passenger is in the out-of-seat state.
4. The method of claim 2 or 3, wherein after determining that the riding status of any passenger is a seating status, further comprising:
determining head pitch information for the passenger;
determining angle adjustment information for an output device corresponding to a seat in which the passenger is seated based on the head pitch information;
and adjusting the angle of the corresponding output equipment based on the angle adjustment information.
5. The method of claim 4, further comprising, prior to determining the head pitch information of the passenger:
and determining that the sitting posture of the passenger accords with a stable sitting posture condition based on the sensing information of the seat sensor corresponding to the passenger.
6. The method of claim 2 or 3, wherein after determining that the riding status of any passenger is a seating status, further comprising:
acquiring multimedia data from a local or server;
and playing the multimedia data through an output device corresponding to the seat taken by the passenger.
7. The method of claim 6, wherein obtaining multimedia data from a server comprises:
sending user representation data of the passenger to the server, selecting, by the server, multimedia data matching the passenger based on the user representation data;
and receiving the multimedia data sent by the server.
8. The method of claim 2 or 3, wherein after determining that the riding status of any passenger is a seating status, further comprising:
receiving a screen projection request sent by a passenger through a terminal;
and playing the multimedia data acquired from the terminal through an output device in front of the seat where the passenger is.
9. A control apparatus for a vehicle-mounted device, wherein an intelligent cabin host is installed on a vehicle, the intelligent cabin host is connected with at least one output device, and each output device corresponds to a seat, and wherein the apparatus is disposed in the intelligent cabin host and comprises:
the acquisition module is used for acquiring passenger state representation information;
the analysis module is used for analyzing the passenger state representation information to obtain the riding state of each passenger in the vehicle;
the determining module is used for determining an output control strategy corresponding to the riding state of the passenger based on the corresponding relation between the riding state and the output control strategy which is established in advance;
and the control module is used for controlling the output state of the output equipment corresponding to the seat in which the passenger takes according to the determined output control strategy.
10. An electronic device, comprising: at least one processor, and a memory communicatively coupled to the at least one processor, wherein:
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-8.
11. A storage medium, wherein, when a computer program in the storage medium is executed by a processor of an electronic device, the electronic device is capable of performing the method according to any one of claims 1-8.
CN202210527606.5A 2022-05-16 2022-05-16 Control method and device of vehicle-mounted equipment, electronic equipment and storage medium Pending CN114872650A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210527606.5A CN114872650A (en) 2022-05-16 2022-05-16 Control method and device of vehicle-mounted equipment, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210527606.5A CN114872650A (en) 2022-05-16 2022-05-16 Control method and device of vehicle-mounted equipment, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114872650A true CN114872650A (en) 2022-08-09

Family

ID=82674796

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210527606.5A Pending CN114872650A (en) 2022-05-16 2022-05-16 Control method and device of vehicle-mounted equipment, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114872650A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115637901A (en) * 2022-10-09 2023-01-24 东风汽车集团股份有限公司 Child lock control system, method and equipment based on OMS

Similar Documents

Publication Publication Date Title
US11498573B2 (en) Pacification method, apparatus, and system based on emotion recognition, computer device and computer readable storage medium
CN108725357B (en) Parameter control method and system based on face recognition and cloud server
CN107367841B (en) Moving object, system, and storage medium
CN108873887A (en) System and method for selecting the driving mode in autonomous vehicle
JP2020525884A (en) Vehicle control method and system, in-vehicle intelligent system, electronic device and medium
CN106467062A (en) Camera chain in vehicle
CN104883588A (en) Vehicle multi-screen interconnection system and realization method thereof
CN109960407A (en) A kind of method, computer installation and the computer readable storage medium of on-vehicle machines people active interaction
CN207443136U (en) A kind of system for user portable apparatus to be connected with occupant's display
DE102018126721A1 (en) SYSTEMS AND METHODS OF DELIVERING DISCREET AUTONOMOUS VEHICLE OWN NOTIFICATIONS
US11580938B2 (en) Methods and systems for energy or resource management of a human-machine interface
TWI738132B (en) Human-computer interaction method based on motion analysis, in-vehicle device
CN114872650A (en) Control method and device of vehicle-mounted equipment, electronic equipment and storage medium
CN113459975B (en) Intelligent cabin system
CN204681525U (en) A kind of vehicle-mounted multi-screen interconnected systems
CN112440900A (en) Vehicle control method and device, control equipment and automobile
CN111717083A (en) Vehicle interaction method and vehicle
CN115214434A (en) Seat posture memory and automatic regulating system
CN116424173A (en) Special vehicle
CN114760434A (en) Automobile intelligent cabin capable of realizing multi-person online video conference and method
US20210179139A1 (en) Vehicle and control method thereof
CN112172712A (en) Cabin service method and cabin service system
US20230241974A1 (en) Head-mounted display system and manned device
CN112918381A (en) Method, device and system for welcoming and delivering guests by vehicle-mounted robot
CN111703385B (en) Content interaction method and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination