WO2024021720A1 - Method, apparatus and mobile carrier for controlling display - Google Patents

Method, apparatus and mobile carrier for controlling display

Info

Publication number
WO2024021720A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
area
user
screen
display area
Prior art date
Application number
PCT/CN2023/091297
Other languages
English (en)
French (fr)
Inventor
李平 (Li Ping)
黄颖华 (Huang Yinghua)
李庆雷 (Li Qinglei)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2024021720A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416: Control or interface arrangements specially adapted for digitisers
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for: electric constitutive elements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0414: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means: using force sensing means to determine a position
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172: Classification, e.g. identification
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/178: Human faces, e.g. facial parts, sketches or expressions: estimating age from face image; using age information for improving recognition

Definitions

  • Embodiments of the present application relate to the field of smart cockpits, and more specifically, to a method, device and mobile carrier for controlling display.
  • Embodiments of the present application provide a method, device, and mobile carrier for controlling display, which help improve the user's experience when using the cockpit.
  • the mobile carrier in this application may include road vehicles, water vehicles, air vehicles, industrial equipment, agricultural equipment, or entertainment equipment, etc.
  • the mobile carrier can be a vehicle in a broad sense, including means of transportation (such as commercial vehicles, passenger cars, motorcycles, flying cars, trains, etc.), industrial vehicles (such as forklifts, trailers, tractors, etc.), engineering vehicles (such as excavators, bulldozers, cranes, etc.), agricultural equipment (such as lawn mowers, harvesters, etc.), amusement equipment, toy vehicles, etc.
  • the embodiments of this application do not specifically limit the types of vehicles.
  • the mobile carrier can be a vehicle such as an airplane or a ship.
  • a method of controlling display is provided.
  • the device is arranged in the cabin of a mobile carrier.
  • the method can be executed by the mobile carrier; or by a vehicle-mounted terminal of the mobile carrier, such as the head unit; or by a chip or circuit used in a vehicle-mounted terminal, which is not limited in this application.
  • the following description takes mobile carrier execution as an example.
  • the method may include: detecting whether there is a user in a first area of the cockpit, where the cockpit is in a first state; and when a user is detected in the first area, controlling the cockpit to be in a second state.
  • in this way, when a user is detected in the first area, the cockpit is controlled to change state, which helps to improve the user's welcome experience when using the cockpit and meet the needs of users in different areas.
  • the cockpit may include one or more vehicle display screens; the cockpit may also include a vehicle fragrance system, vehicle air conditioning, a vehicle audio and video system, etc.; or the cockpit may also include other equipment.
  • the first state of the cabin may include at least one of the following: the vehicle display screen is in a screen-off state; the vehicle fragrance is in a closed state, that is, the vehicle fragrance does not release fragrance; the vehicle air conditioner is in a shutdown state; the vehicle audio and video system is in a shutdown state.
  • controlling the cockpit to be in the second state may include: controlling the display screen corresponding to the first area to display interface elements.
  • the display screen corresponding to the first area may include a display screen disposed in the first area.
  • For example, when the first area is the main driving area, the display screen corresponding to the first area may be a driving screen such as the instrument panel or the central control screen.
  • the cabin being in the first state includes the vehicle air conditioner being turned off, and controlling the cabin to be in the second state may include: controlling the vehicle air conditioner to be turned on.
  • the vehicle air conditioner corresponding to the first area can be controlled to be turned on.
  • the vehicle-mounted air conditioner corresponding to the first area may include a vehicle-mounted air conditioner arranged in the first area.
  • For example, when the first area is the main driving area, the vehicle-mounted air conditioner corresponding to the first area may be the main driving area's air conditioner.
  • the cabin being in the first state includes the vehicle audio and video system being turned off
  • controlling the cabin to be in the second state may include: controlling the vehicle audio and video system to be turned on.
  • audio such as music or radio channels can be played.
  • controlling the cockpit to switch from the first state to the second state may include one of the above examples, or it may be a combination of two or more of the above examples.
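The first-state/second-state transition described above can be sketched as a small state model. This is an illustrative reading of the claim language, not an implementation from the patent; the names `CockpitState` and `enter_second_state` are invented for the sketch.

```python
# Hypothetical sketch of the "welcome" transition: when a user is detected in
# an area of the cockpit, the devices switch from the first (idle) state to
# the second (active) state. All names here are illustrative.
from dataclasses import dataclass

@dataclass
class CockpitState:
    screen_on: bool = False      # vehicle display screen off = first state
    fragrance_on: bool = False   # vehicle fragrance closed = first state
    ac_on: bool = False          # vehicle air conditioner shut down
    audio_on: bool = False       # audio and video system shut down

def enter_second_state(state: CockpitState, detected_user: bool) -> CockpitState:
    """If a user is detected in the first area, activate the cockpit devices."""
    if detected_user:
        return CockpitState(screen_on=True, fragrance_on=True,
                            ac_on=True, audio_on=True)
    return state  # no user: remain in the first state
```

As the text notes, an actual second state may activate any combination of these devices rather than all of them.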
  • the first state of the cockpit includes the main driver's display screen being turned off and the vehicle fragrance being turned off.
  • the second state of the cockpit may include controlling the vehicle fragrance to be turned on and controlling the main driver's display screen to display interface elements.
  • the first state of the cockpit includes the main driver's display screen being turned off and the vehicle fragrance and vehicle air conditioner being turned off.
  • the second state of the cockpit may include controlling the vehicle fragrance and vehicle air conditioner to be turned on and controlling the main driver's display screen to display interface elements.
  • the first display area of the cockpit is in a screen-off state, and the first display area corresponds to the first area; when a user is detected in the first area, the first display area is controlled to switch from the screen-off state to displaying interface elements.
  • the first display area can be a display screen in a certain area of the cockpit, such as the central control screen at the main driver's seat, the front-passenger entertainment screen, or a screen behind the headrest of a front seat; or the first display area can also be a certain area of a certain display screen.
  • the first display area can be an area on the long screen.
  • the screen-off state may include a black screen state, such as when the display is powered off; or it may also include a screen-off state when the display is in a low-power consumption state such as hibernation, standby, or shutdown.
  • the screen-off state may include displaying a dormant display interface.
  • the self-illuminating characteristics of the display screen can be used to light up some areas of the display screen to show information such as the clock, date, notifications, and animations, so that users can view relevant information while the screen is off.
  • the user controls the cockpit to power on before entering the cockpit, but the vehicle screen may still be powered off at this time. Further, after the user enters the cockpit, the vehicle screen or an area on the vehicle screen corresponding to the user's position in the cockpit is powered on and interface elements are displayed.
  • the interface elements are displayed on the vehicle screen or the area of the vehicle screen corresponding to the user's area, which helps to save energy consumption and improve the user's interactive experience when using the vehicle screen.
  • different interface elements can be displayed on different display areas to meet the needs of different users.
  • the method further includes: controlling the first display area to display the interface element according to the user's human body characteristics.
  • the human body characteristics include but are not limited to gender, age, emotion, etc.
  • the type of interface element displayed in the first display area is determined according to the user's gender. For example, when the user is female, the first display area can be controlled to display one or more interface elements preferred by female users; when the user is male, the first display area can be controlled to display one or more interface elements preferred by male users.
  • the type of interface element displayed in the first display area is determined according to the user's age. For example, when the user is a teenage user, the first display area can be controlled to display one or more interface elements preferred by teenage users; when the user is an elderly user, the first display area can be controlled to display one or more interface elements preferred by elderly users.
  • the type of interface element displayed in the first display area is determined according to the user's emotion. For example, when the user is depressed, the first display area can be controlled to display the one or more entertainment-related interface elements.
  • the above-mentioned "interface elements preferred by men", "interface elements preferred by women", "interface elements preferred by young users", and "interface elements preferred by elderly users" can be set by the user, set when the mobile carrier leaves the factory, or set in other ways, which is not specifically limited in the embodiments of this application.
  • personalized interface elements can be pushed to different users based on the user's human body characteristics, which helps to improve the user's entertainment experience.
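The characteristic-based push described above can be sketched as a lookup from detected traits to preferred elements. The preference table and priority order (emotion, then age, then gender) are assumptions for illustration; the patent only says such preferences may be set by the user, at the factory, or in other ways.

```python
# Illustrative mapping from detected human body characteristics to the
# interface elements pushed to the first display area. Keys are
# (gender, age_group, emotion) patterns; all entries are placeholders.
PREFERENCES = {
    ("female", None, None): ["shopping", "music"],
    ("male", None, None): ["news", "sports"],
    (None, "teenager", None): ["games", "video"],
    (None, "elderly", None): ["radio", "weather"],
    (None, None, "depressed"): ["entertainment"],
}

def select_elements(gender=None, age_group=None, emotion=None):
    """Return interface elements matching the first applicable rule."""
    # Emotion takes priority in this sketch, then age group, then gender.
    for key in [(None, None, emotion), (None, age_group, None),
                (gender, None, None)]:
        if key in PREFERENCES:
            return PREFERENCES[key]
    return ["default"]
```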
  • the method further includes: controlling the first display area to display the interface element according to the user's identity information.
  • the identity information includes but is not limited to biometric information stored in the mobile carrier, account number, etc.
  • biometric information includes but is not limited to fingerprints, palmprints, facial information, iris information, gait information, etc.
  • the account number may include account information for logging into the vehicle system, etc.
  • the interface element displayed in the first display area is associated with the user's identity information.
  • For example, when the user appears in the main driving area, the cockpit displays the interface elements associated with the user through the main driving display screen; when the user appears in the co-pilot area, the cockpit displays the interface elements associated with the user through the co-pilot entertainment screen.
  • the interface elements associated with the user can be set by the user in advance; or the cockpit can also determine them according to how frequently the user has used each application within a certain period of time.
  • the type of interface element to be displayed is determined according to the user's identity information.
  • in this way, after the user moves to another area, the interface element associated with the user can be displayed on the display screen corresponding to that area, helping to improve the user's interactive experience and driving experience.
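The "elements follow the user" behaviour can be sketched as below: elements associated with a recognised identity are shown on whichever display corresponds to the area the user currently occupies. The identities, areas, and element lists are hypothetical examples, not from the patent.

```python
# Illustrative identity-to-elements and area-to-display tables.
USER_ELEMENTS = {"alice": ["navigation", "podcasts"]}
AREA_TO_DISPLAY = {"driver": "main_display", "co_pilot": "entertainment_screen"}

def render_for(user_id: str, area: str) -> dict:
    """Return {display: elements} for the user's current area."""
    display = AREA_TO_DISPLAY[area]
    return {display: USER_ELEMENTS.get(user_id, [])}
```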
  • the method further includes: controlling the first display area to display the interface element according to the driving status information of the mobile carrier, the mobile carrier including the cockpit.
  • For example, when the mobile carrier is in a driving state, the main driver's display area can be controlled to display driving interface elements, such as map application icons and navigation cards; when the mobile carrier is in a parking state, the main driver's display area can be controlled to display entertainment interface elements, such as video application icons and music cards.
  • the driving interface elements displayed in the main driver's display area can also be determined based on the user's identity information, or can also be determined based on the user's human body characteristics.
  • the entertainment interface elements displayed in the main driver's display area may also be determined based on the user's identity information, or may also be determined based on the user's human body characteristics.
  • it should be noted that when the mobile carrier is in a driving state, the main driver's display area can also display interface elements other than driving elements, such as icons of communication applications; when the mobile carrier is in a parking state, the main driver's display area can also display interface elements other than entertainment elements, such as icons of map applications.
  • driving-related interface elements can be mainly displayed in the main driver's display area, which helps to improve driving safety.
  • different interface elements are displayed to help improve the user's interactive experience.
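The driving-state filtering above can be sketched as follows. The element names and the rule that other categories may still appear alongside the primary ones are taken from the surrounding text; the exact ordering is an assumption.

```python
# Illustrative element lists for the main driver's display area.
DRIVING = ["map_icon", "navigation_card"]
ENTERTAINMENT = ["video_icon", "music_card"]
OTHER = ["communication_icon"]

def main_display_elements(is_driving: bool) -> list:
    # Driving state: lead with driving elements; other elements
    # (e.g. communication apps) may still be shown after them.
    if is_driving:
        return DRIVING + OTHER
    # Parking state: lead with entertainment; non-entertainment items
    # such as a map icon may still follow.
    return ENTERTAINMENT + ["map_icon"]
```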
  • the interface element includes one or more tabs
  • the method further includes: upon detecting the user's input for the first tab displayed in the first display area, controlling the first display area to display icons of one or more application programs; wherein the one or more tabs include the first tab, and the first tab includes the icons of the one or more application programs.
  • the user's input for the first tab displayed in the first display area can be an operation of clicking the first tab; or it can be a voice instruction for the first tab, for example, "open the first tab"; or it can be a preset gesture for the first tab; or it can be another form of operation for the first tab, which is not specifically limited in the embodiments of this application.
  • the icons of the one or more application programs included in the first tab are icons of application programs of the same category.
  • the icons of the applications included in the first tab can be edited by the user through the setting function.
  • For example, the icons of financial management, sports, work, and learning applications are respectively placed in the "Financial Management", "Sports", "Work", and "Study" tabs.
  • the name of the page tab and the icon of the application program included in each page tab may also be set when the mobile carrier leaves the factory, which is not specifically limited in the embodiment of the present application.
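The tab behaviour described above amounts to a simple grouping of same-category application icons. A minimal sketch, with tab names taken from the examples in the text and application names invented for illustration:

```python
# Each tab groups icons of applications of the same category;
# selecting a tab shows its icons.
TABS = {
    "Financial Management": ["bank_app", "stocks_app"],
    "Sports": ["fitness_app"],
    "Work": ["mail_app", "calendar_app"],
    "Study": ["dictionary_app"],
}

def open_tab(name: str) -> list:
    """Return the application icons contained in the selected tab."""
    return TABS.get(name, [])
```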
  • the method further includes: when detecting the user's input to the first display area, controlling the first display area to switch from the screen-off state to displaying the interface element.
  • For example, after entering the cockpit the user may not want to use the in-vehicle screen, in which case the in-vehicle screen, or the area of the in-vehicle screen corresponding to the user's location, can be controlled to display the sleep interface.
  • the first display area is controlled to display the interface element through input to the first display area.
  • the sleep display interface may be an interface displayed on the vehicle screen when it is sleeping, such as static or dynamic wallpaper, or it may be other forms of sleep display interfaces, which are not specifically limited in the embodiments of this application.
  • the user's input for the first display area may be an operation of clicking the first display area; or it can be a voice instruction directed to the first display area, for example, "open the first display area"; or it can be a preset gesture directed to the first display area; or it can be another form of operation directed to the first display area, which is not specifically limited in the embodiments of this application.
  • in the off-screen state, the vehicle screen is unpowered, while in the sleep display state it is powered on.
  • in both the off-screen state and the sleep display state, the vehicle screen can show a black screen, that is, display no information.
  • however, in the sleep display state the vehicle screen responds to the user's click operation or voice command and immediately switches to displaying one or more interfaces; when the vehicle screen is off, clicking the screen or entering a voice command will not cause it to display any interface.
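The distinction between the two visually identical black-screen states can be sketched as a tiny state machine: in the sleep (powered) state a tap or voice command wakes the display, while in the off (unpowered) state input is ignored. The state names are invented for the sketch.

```python
from enum import Enum

class ScreenState(Enum):
    OFF = "off"          # unpowered black screen: input is ignored
    SLEEP = "sleep"      # powered black screen: input wakes the display
    SHOWING = "showing"  # displaying one or more interfaces

def handle_input(state: ScreenState) -> ScreenState:
    """Apply a click or voice command to the current screen state."""
    if state is ScreenState.SLEEP:
        return ScreenState.SHOWING
    return state  # OFF screens do not react; SHOWING stays as-is
```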
  • the cockpit includes a first display screen
  • the first display screen includes the first display area and a second display area
  • the second display area corresponds to the second area of the cockpit.
  • the method further includes: when no user is detected in the second area, controlling the second display area to be in a screen-off state; and when a user is detected in the second area, controlling the second display area to display Interface elements.
  • For example, the first display screen is a long screen that can extend across the main driving area and the co-pilot area, and the first display area and the second display area can respectively be the display area of the main driving area (the first area) and the display area of the co-pilot area (the second area) on the long screen.
  • when a user is detected in the first area and no user is detected in the second area, the first display area (that is, the area of the long screen in the main driving area) can be controlled to display interface elements, and the second display area (that is, the area of the long screen in the co-pilot area) can be controlled to turn off the screen.
  • when a user is detected in the second area, the second display area can be controlled to display interface elements.
  • this interface element can be the same as or different from the interface element displayed in the first display area, which is not specifically limited in the embodiment of the present application.
  • the zoning control of the long vehicle screen can help save energy consumption during the use of the long vehicle screen.
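The zoned control of the long screen can be sketched as a per-seat occupancy check: each display area lights up only when its seat is occupied, which is the energy-saving behaviour described above. The area names are illustrative.

```python
def zone_states(driver_present: bool, copilot_present: bool) -> dict:
    """Each zone of the long screen shows content only when its seat
    is occupied; unoccupied zones stay off to save energy."""
    return {
        "first_display_area": "showing" if driver_present else "off",
        "second_display_area": "showing" if copilot_present else "off",
    }
```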
  • the interface element further includes at least one of a card, an application icon, a wallpaper, and an animation.
  • the cockpit includes a vehicle-mounted fragrance
  • the cockpit being in the first state includes the vehicle-mounted fragrance being in a stop-release state; and controlling the cockpit to be in the second state includes: controlling the vehicle fragrance to turn on.
  • the cabin includes a vehicle air conditioner; the cabin being in the first state includes the vehicle air conditioner being in a shutdown state; and controlling the cabin to be in the second state includes: controlling the vehicle air conditioner to turn on.
  • the cockpit includes a vehicle-mounted audio and video system; the cockpit being in the first state includes the vehicle-mounted audio and video system being in a shutdown state; and controlling the cockpit to be in the second state includes: controlling the vehicle audio and video system to turn on.
  • the vehicle fragrance, vehicle air conditioning, and vehicle audio and video systems are turned on, which helps to further improve the user's welcome experience when using the cockpit.
  • the method further includes: obtaining resource occupancy information, the resource occupancy information being used to characterize the resource capacity allocated to the first display area; and controlling the first display area to display the interface element according to the resource occupancy information.
  • For example, when it is detected that the resource capacity allocated to the first display area is small, the first display area can be controlled to display fewer interface elements, which helps to reduce resource occupation when the first display area is displaying.
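The resource-occupancy rule can be sketched as trimming the element list to what the allocated capacity supports. The capacity units and per-element cost are invented for illustration; the patent only states that fewer elements are shown when capacity is small.

```python
def elements_to_show(elements: list, capacity_units: int,
                     cost_per_element: int = 1) -> list:
    """Show only as many interface elements as the resource capacity
    allocated to the display area can support."""
    limit = max(0, capacity_units // cost_per_element)
    return elements[:limit]
```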
  • a method for controlling display is provided. The method may include: when detecting that the first user is located in the first area of the cockpit, controlling the first display area of the cockpit to display the first interface element according to the first user's human body characteristics or identity information, where the first area corresponds to the first display area.
  • the type of interface elements can be determined based on the user's human body characteristics or identity information, that is, personalized interface element recommendations are made based on different users, which helps to improve the user's interactive experience, driving experience, and entertainment experience.
  • the method further includes: when detecting that the first user is located in the second area of the cockpit, controlling the second display area of the cockpit to display the first Interface element, the second area corresponds to the second display area.
  • the display area corresponding to the first user's current position can be controlled to display the first interface element, without the need for manual adjustment by the user, which helps to improve the user's driving experience.
  • controlling the first display area of the cockpit to display the first interface element according to the human body characteristics or identity information of the first user includes: according to the driving state of the mobile carrier The information, the human body characteristics or the identity information of the first user controls the first display area of the cockpit to display the first interface element, and the mobile carrier includes the cockpit.
  • the type of the interface element can be further determined according to the driving status information of the mobile carrier. For example, when the user is in the main driving position and the mobile carrier is in a driving state, the interface element is determined to be an interface element related to assisted driving, which helps to improve driving safety. The assisted-driving-related interface elements are associated with the user's human body characteristics or identity information, eliminating the need for manual selection by the user; this improves the intelligence of the mobile carrier and helps improve the user's driving experience.
  • the method further includes: when detecting that the second user is located in the second area, controlling the second display area to display the second interface element according to the human body characteristics or identity information of the second user.
  • the method further includes: when detecting that the second user is located in the first area and the first user is located in the second area, controlling the first display area to display the second interface element and controlling the second display area to display the first interface element.
  • in this way, when the positions of the first user and the second user change, the display areas where the first interface element and the second interface element are located can be adjusted accordingly, without the need for the user to make manual adjustments, which helps to improve the degree of intelligence of the mobile carrier and the user's driving experience.
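The seat-swap behaviour above can be sketched as re-deriving the display assignments from each user's current area: when the two users exchange areas, the interface elements associated with each identity move with them. The user, area, and element names are illustrative.

```python
def assign_displays(area_of_user: dict, elements_of_user: dict,
                    area_to_display: dict) -> dict:
    """Map each display area to the elements of the user now sitting there."""
    return {area_to_display[area]: elements_of_user[user]
            for user, area in area_of_user.items()}
```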
  • controlling the first display area to display the first interface element according to the human body characteristics or identity information of the first user includes: according to the human body characteristics of the first user or identity information, to control the first display area to switch from the screen-off state to displaying the first interface element.
  • the first interface element includes one or more tabs
  • the method further includes: upon detecting the first user's input for the first tab displayed in the first display area, controlling the first display area to display one or more application icons; wherein the one or more tabs include the first tab, and the first tab includes the one or more application icons.
  • the method further includes: upon detecting the first user's input directed to the first display area, controlling the first display area to switch from the screen-off state to displaying the first interface element.
  • the cockpit includes a first display screen
  • the first display screen includes the first display area and a second display area
  • the second display area corresponds to the second area of the cockpit.
  • the method further includes: when no user is detected in the second area, controlling the second display area to be in a screen-off state; and when a user is detected in the second area, controlling the second display area to display Interface elements.
  • the first interface element further includes at least one of a card, an application icon, a wallpaper, and an animation.
  • a method for controlling display is provided. The method includes: determining the first area where the user is located in the cockpit; and controlling the first display area to display a first interface element according to the first area, where the first area corresponds to the first display area.
  • in this way, the display area corresponding to the area where the user is located is controlled to display interface elements, which helps to improve the user's welcome experience when using the cockpit and meet the needs of users in different areas.
  • the first display area is a central control screen, and the first area is any area in the cockpit.
  • the central control screen corresponding to the main driving area is controlled to display icons of all applications installed on the mobile carrier.
  • controlling the first display area to display the first interface element includes: controlling the first display area to display the first interface according to the user's human body characteristics or identity information. element.
  • the first display area is controlled to display interface elements in a targeted manner according to the user's human body characteristics, which makes it easier for the user to see the interface elements of interest in the first display area and helps to improve the user's interactive experience.
  • the first area is the co-pilot area in the cockpit or the rear area in the cockpit.
  • the rear row area may be the second row area of the vehicle; taking a 7-seat vehicle as an example, the rear row area may be the second row area and/or the third row area.
  • the rear area may also include other rear areas in addition to the main driver's area and the passenger's area.
  • the method further includes: controlling the first display area to be in a screen-off state when there is no user in the first area.
  • the vehicle screen or the area of the vehicle screen corresponding to the area is controlled to enter the screen-off state, which helps to save energy consumption.
  • controlling the first display area to display the first interface element includes: controlling the first display area to display the first interface element according to the driving status information of the mobile carrier,
  • the mobile carrier includes the cabin.
  • driving-related interface elements can be mainly displayed in the main driver's display area, which helps to improve driving safety.
  • the first interface element includes one or more tabs
  • the method also includes: when detecting the user's input for the first tab displayed in the first display area, controlling the first display area to display icons of one or more application programs; wherein, the one or more pages The tab includes the first tab, and the first tab includes icons of the one or more application programs.
  • the icons of the one or more applications included in the first tab are icons of applications of the same category.
  • the cockpit includes a first display screen
  • the first display screen includes the first display area and a second display area
  • the second display area corresponds to the second area of the cockpit.
  • the method further includes: when no user is detected in the second area, controlling the second display area to be in a screen-off state; and when a user is detected in the second area, controlling the second display area to display Second interface element.
  • the cockpit includes a third display area
  • the third display area corresponds to the third area of the cockpit
  • the method further includes: when a user is detected in the third area, controlling the third display area to display a third interface element.
  • the first interface element further includes at least one of a card, an application icon, a wallpaper, and an animation.
  • a device for controlling display is provided. The device includes: a detection unit, configured to detect whether there is a user in a first area of the cockpit, wherein the cockpit is in a first state; and a processing unit, configured to control the cockpit to be in a second state when a user is detected in the first area.
  • in the fourth aspect, it is detected whether there is a user in the first area of the cockpit, wherein the first display area of the cockpit is in a screen-off state and the first display area corresponds to the first area; when a user is detected in the first area, the first display area is controlled to switch from the screen-off state to displaying interface elements.
  • the processing unit is specifically configured to: control the first display area to display the interface element according to the user's human body characteristics.
  • the processing unit is specifically configured to: control the first display area to display the interface element according to the user's identity information.
  • the processing unit is specifically configured to: control the first display area to display the interface element according to the driving status information of the mobile carrier, and the mobile carrier includes the cockpit.
  • the interface element includes one or more tabs
  • the processing unit is further configured to: when the detection unit detects the user's input for the first tab displayed in the first display area, control the first display area to display icons of one or more application programs; wherein the one or more tabs include the first tab, and the first tab includes the icons of the one or more application programs.
  • the icons of the one or more application programs included in the first tab are icons of application programs of the same category.
  • the processing unit is specifically configured to: when the detection unit detects the user's input to the first display area, control the first display area to switch from the screen-off state to displaying the interface element.
  • the cockpit includes a first display screen
  • the first display screen includes the first display area and a second display area
  • the second display area is connected to the first display area of the cockpit.
  • the processing unit is also used to: when the detection unit does not detect a user in the second area, control the second display area to be in a screen-off state; when the detection unit detects a user in the second area, control the second display area to display interface elements.
  • the interface element further includes at least one of a card, an application icon, a wallpaper, and an animation.
  • the cabin includes a vehicle-mounted fragrance, and the cabin being in the first state includes the vehicle-mounted fragrance being in a stopped-release state; the processing unit is specifically used to: control the vehicle-mounted fragrance to start releasing fragrance.
  • the cockpit includes a vehicle air conditioner, and the cockpit being in the first state includes the vehicle air conditioner being in a shutdown state; the processing unit is specifically used to: control the vehicle air conditioner to turn on.
  • the cockpit includes a vehicle-mounted audio and video system, and the cockpit being in the first state includes the vehicle-mounted audio and video system being in a shutdown state; the processing unit is specifically used to: control the vehicle-mounted audio and video system to turn on.
  • the device further includes: an acquisition unit, configured to acquire resource occupancy information, where the resource occupancy information is used to characterize the resource capacity allocated to the first display area;
  • the processing unit is also configured to control the first display area to display the interface element according to the resource occupation information.
  • a device for controlling display may include: a processing unit configured to, when the detection unit detects that the first user is located in the first area in the cockpit, control the first display area of the cockpit to display the first interface element based on the first user's human body characteristics or identity information, where the first area corresponds to the first display area.
  • the processing unit is further configured to: when the detection unit detects that the first user is located in the second area of the cockpit, control the second display area of the cockpit to display the first interface element, where the second area corresponds to the second display area.
  • the processing unit is further configured to: control the first display area display of the cockpit according to the driving status information of the mobile carrier, the human body characteristics or identity information of the first user The first interface element, the mobile carrier includes the cockpit.
  • the processing unit is further configured to: when the detection unit detects that the second user is located in the second area, based on the human body characteristics or identity information of the second user The second display area is controlled to display a second interface element.
  • the processing unit is further configured to: when the detection unit detects that the second user is located in the first area and the first user is located in the second area, control the first display area to display the second interface element and the second display area to display the first interface element.
  • the processing unit is specifically configured to: control the first display area to switch from the screen-off state to displaying the first interface element according to the human body characteristics or identity information of the first user.
  • the first interface element includes one or more tabs
  • the processing unit is specifically configured to: when detecting the first user's input for the first tab displayed in the first display area, control the first display area to display icons of one or more application programs; wherein the one or more tabs include the first tab, and the first tab includes the icons of the one or more applications.
  • the processing unit is further configured to: when detecting the first user's input to the first display area, control the first display area to switch from the screen-off state to displaying the first interface element.
  • the cockpit includes a first display screen
  • the first display screen includes the first display area and a second display area
  • the second display area corresponds to the second area of the cockpit.
  • the processing unit is also configured to: when a user is not detected in the second area, control the second display area to be in a screen-off state; when a user is detected in the second area, control the second display Area displays interface elements.
  • the first interface element further includes at least one of a card, an application icon, a wallpaper, and an animation.
  • a device for controlling display includes: a determining unit configured to determine a first area where the user is located in the cockpit; and a processing unit configured to control the first display area to display a first interface element based on the first area, where the first area corresponds to the first display area.
  • the first display area is a central control screen, and the first area is any area in the cockpit.
  • the processing unit is configured to: control the first display area to display the first interface element according to the user's human body characteristics or identity information.
  • the first area is the co-pilot area in the cockpit or the rear area in the cockpit.
  • the processing unit is further configured to: control the first display area to be in a screen-off state when there is no user in the first area.
  • the processing unit is configured to: control the first display area to display the first interface element according to the driving status information of the mobile carrier, the mobile carrier including the cockpit.
  • the device further includes a detection unit, the first interface element includes one or more tabs, and the processing unit is further configured to: when the detection unit detects the When the user inputs on the first tab displayed in the first display area, the first display area is controlled to display one or more application icons; wherein the one or more tabs include the first tab, and the The first tab includes icons of the one or more applications.
  • the icons of the one or more application programs included in the first tab are icons of application programs of the same category.
  • the device further includes a detection unit
  • the cockpit includes a first display screen
  • the first display screen includes the first display area and a second display area
  • the second display area corresponds to the second area of the cockpit
  • the processing unit is also used to: when the detection unit does not detect a user in the second area, control the second display area to be in a screen-off state; when the detection unit detects a user in the second area, control the second display area to display a second interface element.
  • the device further includes a detection unit, the cockpit includes a third display area, the third display area corresponds to the third area of the cockpit, and the processing unit is configured to: when the detection unit detects a user in the third area, control the third display area to display a third interface element.
  • the first interface element further includes at least one of a card, an application icon, a wallpaper, and an animation.
  • a seventh aspect provides a device for controlling display.
  • the device includes: a memory for storing a program; a processor for executing the program stored in the memory.
  • the processor is configured to execute the method in the above-mentioned third aspect.
  • a mobile carrier which includes the device in any one of the possible implementations of the fourth to seventh aspects.
  • the mobile carrier is a vehicle.
  • a computer program product includes: computer program code.
  • when the computer program code is run on a computer, it enables the computer to execute the method in any one of the possible implementations of the first aspect or the third aspect.
  • the above computer program code may be stored in whole or in part on a first storage medium, where the first storage medium may be packaged together with the processor, or may be packaged separately from the processor. This is not specifically limited in the embodiments of this application.
  • a computer-readable medium stores instructions.
  • when the instructions are executed by a processor, the processor implements the method in any one of the possible implementations of the first aspect or the third aspect.
  • an eleventh aspect provides a chip, which includes a processor for calling a computer program or computer instructions stored in a memory, so that the processor executes the method in any one of the possible implementations of the first aspect or the third aspect.
  • the processor is coupled to the memory through an interface.
  • the chip system further includes a memory, and a computer program or computer instructions are stored in the memory.
  • the embodiments provide a method, device and mobile carrier for controlling display, which can control changes in the status of the cockpit when a user is detected in the cockpit, helping to improve the user's welcome experience when using the cockpit. It can also save energy consumption of equipment in the cockpit.
  • each vehicle screen only supports the display of a small number of applications (apps) and/or cards, and the displayed applications, cards, etc. are all fixed, so users can only choose within a small range, resulting in a poor driving experience.
  • the display screen of the cockpit area or the display screen area corresponding to the cockpit area is controlled to display interface elements, which helps to improve the user's welcome experience.
  • the type of interface elements can be determined based on the user's human body characteristics or identity information, and can also be determined based on the driving status of the mobile carrier; that is, personalized interface element recommendations are made for different users, further enhancing the user's interactive experience, driving experience, and entertainment experience.
  • when the user enters the cockpit, he may not want to use the in-vehicle screen; in this case, the in-vehicle screen, or the area of the in-vehicle screen corresponding to the user's location, can be controlled to display the sleep interface.
  • the first display area is controlled to display the interface element through input to the first display area; that is, whether the vehicle screen displays interface elements can be controlled according to the user's selection and/or operation, which not only helps reduce energy consumption, but also improves the user's interactive experience when using the vehicle screen.
  • different interface elements can be displayed on the display screens corresponding to the locations of different users based on their physical characteristics or identity information and/or the driving status of the mobile carrier, which helps to provide different services and options for different users.
  • the interface elements include one or more tabs
  • the specific icons of the applications to display can be determined according to the tab selected by the user, which can provide the user with more application icon options while occupying less computing resources, helping to improve the user experience.
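As a rough sketch of how a selected tab could gate which application icons are shown, consider the lookup below. All tab names, categories, and the `icons_for_tab` helper are invented for illustration; they do not come from this application.

```python
# Minimal sketch of tab-driven icon display: only the icons belonging to
# the selected tab's category are produced, keeping resource usage low.
# TAB_CATEGORIES and icons_for_tab are illustrative assumptions.

TAB_CATEGORIES = {
    "Media": ["Music", "Radio", "Podcasts"],
    "Navigation": ["Maps", "Charging Stations"],
    "Comfort": ["Climate", "Seat Massage"],
}

def icons_for_tab(selected_tab):
    """Return the application icons to display for the selected tab."""
    return TAB_CATEGORIES.get(selected_tab, [])

print(icons_for_tab("Media"))
```

An unknown tab simply yields no icons, so the display area can fall back to its default state.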
  • Figure 1 is a schematic functional block diagram of a vehicle provided by an embodiment of the present application.
  • Figure 2 is a schematic diagram of a vehicle cockpit scene provided by an embodiment of the present application.
  • FIG. 3 is a schematic block diagram of a system for controlling display provided by an embodiment of the present application.
  • FIG. 4 is another schematic block diagram of a system for controlling display provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of an application scenario of a method for controlling display provided by an embodiment of the present application.
  • FIG. 6 is another schematic diagram of an application scenario of a method for controlling display provided by an embodiment of the present application.
  • FIG. 7 is another schematic diagram of an application scenario of a method for controlling display provided by an embodiment of the present application.
  • FIG. 8 is another schematic diagram of an application scenario of a method for controlling display provided by an embodiment of the present application.
  • FIG. 9 is another schematic diagram of an application scenario of a method for controlling display provided by an embodiment of the present application.
  • Figure 10 is another schematic diagram of an application scenario of a method for controlling display provided by an embodiment of the present application.
  • FIG. 11 is another schematic diagram of an application scenario of a method for controlling display provided by an embodiment of the present application.
  • Figure 12 is a schematic flow chart of a method for controlling display provided by an embodiment of the present application.
  • Figure 13 is a schematic flow chart of a method for controlling display provided by an embodiment of the present application.
  • Figure 14 is a schematic flow chart of a method for controlling display provided by an embodiment of the present application.
  • Figure 15 is a schematic flow chart of a method for controlling display provided by an embodiment of the present application.
  • Figure 16 is a schematic block diagram of a device for controlling display provided by an embodiment of the present application.
  • Figure 17 is a schematic block diagram of a device for controlling display provided by an embodiment of the present application.
  • Figure 18 is a schematic block diagram of a device for controlling display provided by an embodiment of the present application.
  • FIG. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
  • Vehicle 100 may include a perception system 120 , a display device 130 , and a computing platform 150 , where perception system 120 may include one or more sensors that sense information about the environment surrounding vehicle 100 .
  • the sensing system 120 may include a positioning system.
  • the positioning system may be a global positioning system (GPS), a BeiDou system, or another positioning system; the sensing system 120 may include one or more of an inertial measurement unit (IMU), a lidar, a millimeter-wave radar, an ultrasonic radar, and a camera device; the sensing system 120 may also include a pressure sensor, disposed under the seat, for detecting whether there is a user on the seat; the sensing system 120 may also include an acoustic wave sensor for detecting audio information in the cockpit.
  • the computing platform 150 may include one or more processors, such as processors 151 to 15n (n is a positive integer).
  • the processor is a circuit with signal processing capabilities.
  • in one implementation, the processor may be a circuit with instruction reading and running capabilities, such as a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU, which can be understood as a kind of microprocessor), or a digital signal processor (DSP); in another implementation, the processor can achieve certain functions through the logical relationship of a hardware circuit, where the logical relationship of the hardware circuit is fixed or reconfigurable.
  • for example, the processor may be a dedicated integrated circuit, such as an application-specific integrated circuit (ASIC) or a programmable logic device (PLD), for example a field programmable gate array (FPGA).
  • the process of the processor loading the configuration file and realizing the hardware circuit configuration can be understood as the process of the processor loading instructions to realize the functions of some or all of the above units.
  • it can also be a hardware circuit designed for artificial intelligence, which can be understood as an ASIC, such as a neural network processing unit (NPU), a tensor processing unit (TPU), or a deep learning processing unit (DPU).
  • the computing platform 150 may also include a memory, which is used to store instructions. Some or all of the processors 151 to 15n may call instructions in the memory and execute the instructions to implement corresponding functions.
  • the display device 130 in the cockpit is mainly divided into two categories.
  • the first category is a vehicle-mounted display screen;
  • the second category is a projection display screen, such as a head-up display (HUD).
  • the vehicle display screen is a physical display screen and an important part of the vehicle infotainment system.
  • There can be multiple displays in the cockpit, such as a digital instrument display, a central control screen, a display in front of the passenger in the co-pilot seat (also known as the front passenger), a display in front of the left rear passenger, and a display in front of the right rear passenger; even the car window can be used as a display.
  • A head-up display, also known as a head-up display system, is mainly used to display driving information such as speed and navigation on a display device in front of the driver (such as the windshield).
  • HUDs include, for example, the combiner head-up display (C-HUD) system, the windshield head-up display (W-HUD) system, and the augmented reality head-up display (AR-HUD) system.
  • the processor can obtain the seat pressure information, the user's face information, audio information, etc. detected by the sensing system 120, and then, based on at least one of the seat pressure information, the user's face information, and the audio information, control the display device 130 to display icons and/or cards of application programs.
  • the above-mentioned seat pressure information, user's face information, audio information and other information can also be stored in the memory in the computing platform 150 in the form of data.
  • FIG. 2 is a schematic diagram of a vehicle cockpit scene provided by an embodiment of the present application.
  • One or more vehicle-mounted display screens are provided inside the smart cockpit, including but not limited to the display screen 201 (also called the central control screen), the display screen 202 (also called the passenger entertainment screen), the display screen 203 (also called the screen behind the main driver's headrest), the display screen 204 (also called the screen behind the passenger's headrest), and the instrument screen.
  • the display screen 201 may also be a long screen extending to the passenger area.
  • one or more cameras can also be installed in the cockpit to capture images inside or outside the cabin, such as the camera of the driver monitoring system (DMS), the camera of the cabin monitoring system (CMS), and the dashcam camera.
  • the cameras used to capture the interior and exterior of the cabin can be the same camera or different cameras.
  • one or more pressure sensors and sound wave sensors are installed in the cockpit to monitor whether there is a user inside the cockpit and the user's location.
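The occupancy monitoring described above, combining per-seat pressure readings with other detection signals, could be sketched as follows. The threshold value, seat names, and `occupied_seats` helper are illustrative assumptions, not values from this application.

```python
# Sketch: fuse per-seat pressure readings with a face-detection flag to
# decide which seats are occupied. A seat counts as occupied if either
# signal indicates a user. PRESSURE_THRESHOLD is an arbitrary assumption.
PRESSURE_THRESHOLD = 20.0  # arbitrary units

def occupied_seats(pressure_by_seat, face_by_seat):
    """Return the set of seats where a user appears to be present."""
    seats = set()
    for seat in set(pressure_by_seat) | set(face_by_seat):
        if (pressure_by_seat.get(seat, 0.0) >= PRESSURE_THRESHOLD
                or face_by_seat.get(seat, False)):
            seats.add(seat)
    return seats

print(occupied_seats({"driver": 55.0, "copilot": 0.0}, {"rear_left": True}))
```

The returned seat set would then drive which display areas are switched on, in the manner of the embodiments below.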
  • the display screens 201 to 204 may display a graphical user interface (GUI), which may include one or more icons of application programs, and/or one or more cards.
  • the cockpit can include a central control screen, a passenger entertainment screen, a screen behind the main driver's headrest, a screen behind the passenger's headrest, an entertainment screen in the left area of the third row, and an entertainment screen in the right area of the third row.
  • the cockpit may include a front row entertainment screen and a rear row entertainment screen; or, the cockpit may include a display screen in the driving area and an entertainment screen in the passenger area.
  • the entertainment screen in the passenger area can also be placed on the top of the cabin.
  • FIG. 3 shows a schematic block diagram of a system for controlling display provided by an embodiment of the present application.
  • the system includes an application deployment module, a device 301, and a plurality of display devices 351 to 35n controlled by the device 301.
  • the application deployment module can deploy all applications (and/or cards).
  • the application deployment module can save the correspondence between each screen in the cockpit and the icons (and/or cards) of the applications displayed therein.
  • screen 201 can display icons (and/or cards) of all applications
  • screens 202 to 204 can display icons (and/or cards) of some applications.
  • Table 1 shows each cockpit screen and the applications it displays.
  • the application deployment module can also save the correspondence between the user's human body characteristics or identity information and the applications (and/or cards), for example, which applications (and/or cards) male, female, juvenile, and elderly users respectively correspond to, which applications (and/or cards) user A's identity information corresponds to, and so on. Tables 2 and 3 respectively show human body characteristics and identity information and their corresponding applications.
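The correspondences the application deployment module saves (screen to applications as in Table 1, user characteristics or identity to applications as in Tables 2 and 3) can be pictured as simple lookup tables. Every entry and name below is an invented placeholder, not data from this application.

```python
# Sketch of the deployment module's correspondences as lookup tables.
# SCREEN_APPS: which apps each screen may display (Table 1 analogue).
# PROFILE_APPS: which apps a user profile prefers (Tables 2/3 analogue).

SCREEN_APPS = {
    "201": ["Maps", "Music", "Phone", "Settings"],  # central control: all apps
    "202": ["Music", "Video"],                      # passenger entertainment
    "203": ["Video", "Games"],                      # rear-seat screen
}

PROFILE_APPS = {
    "child": ["Cartoons", "Games"],
    "adult": ["Music", "News"],
}

def apps_for(screen, profile=None):
    """Apps deployable on a screen, optionally filtered by user profile.

    Falls back to the screen's full list when the profile filter
    would leave nothing to display.
    """
    candidates = SCREEN_APPS.get(screen, [])
    if profile is None:
        return candidates
    preferred = PROFILE_APPS.get(profile, [])
    return [app for app in candidates if app in preferred] or candidates
```

For example, `apps_for("202", "adult")` narrows the passenger entertainment screen to the apps an adult profile corresponds to.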
  • the application deployment module may include one or more processors in computing platform 150 shown in FIG. 1 .
  • the device 301 includes a detection module and a processing module.
  • the detection module may include one or more camera devices and one or more sensors in the sensing system 120 shown in Figure 1; the processing module may include one or more processors in the computing platform 150 shown in Figure 1.
  • the display devices 351 to 35n may include one or more of the display devices 130 in FIG. 1.
  • the display devices 351 to 35n may be one or more of the screens 201 to 204 shown in FIG. 2.
  • the processing module determines whether there is a user inside the cockpit and the user's location based on the seat pressure information, the user's face information, audio information, etc., and then controls the display device at the user's location to display the application's icons and/or cards, where which applications' icons the display device specifically displays may be determined based on the correspondence stored by the application deployment module.
  • the processing module can confirm the user's age, gender, etc.
  • the system for controlling display may also include a device 302 and a device 303, as shown in (b) of Figure 3 .
  • the device 302 can control the display device 351, and the device 303 can control the display devices 352 to 35n. It should be understood that which application icons (and/or cards) the device 302 controls the display device 351 to display can be determined by combining the information detected by the detection module with the correspondence saved by the application deployment module.
  • the system for controlling display may also include a device 304, as shown in (c) in Figure 3.
  • the device 304 can also control other vehicle-mounted devices, where other vehicle equipment includes but is not limited to the vehicle fragrance, the vehicle air conditioner, and the vehicle audio and video system.
  • when the device 304 controls the display devices 351 to 35n, the types of application icons and/or cards displayed by the display devices 351 to 35n can be determined according to the correspondence saved by the application deployment module.
  • other vehicle-mounted devices mentioned in the above embodiments can also be controlled by devices different from the above-mentioned devices 301 to 304 that control the display device. This is not specifically limited in the embodiments of the present application. It should be understood that the above-mentioned modules and devices are only examples. In actual applications, the above-mentioned modules and devices may be added or deleted according to actual needs.
  • the "identity information” involved in the embodiments of this application includes but is not limited to biometric information stored in the vehicle, as well as account numbers, etc.
  • biometric information includes but is not limited to fingerprints, palmprints, facial information, iris information, gait information, etc.
  • the account number may include account information for logging into the vehicle system, etc.
  • FIG. 4 is a schematic diagram of a system architecture for controlling display provided by an embodiment of the present application.
  • the device 301 shown in (a) in FIG. 3 may be a system-on-a-chip (SOC).
  • the device 301 can simulate, through multiple virtual machines, the functions of the hardware systems required by different types of operating systems, so that different types of operating systems can run in the device 301, and the multiple operating systems can be managed through a virtual machine manager.
  • virtual machine 1 can run a real-time operating system (RTOS), and virtual machine 2 can run a guest Linux operating system.
  • the device 301 can allocate appropriate proportions of resources, such as CPU, memory, and cache, to different operating systems through the virtual machine manager.
  • the application domain 1 can be run on the virtual machine 1, and the application domain 1 can include the application of the instrument domain.
  • the instrument screen can display the application of the instrument domain.
  • application domain 2 can be run on the virtual machine 2, and the application domain 2 can include in-vehicle infotainment (IVI) applications.
  • the programs running in application domain 1 and application domain 2 may be determined by the application deployment module.
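The proportional resource split between the instrument-domain VM and the IVI VM mentioned above can be illustrated with a small allocator. The share values, totals, and VM names below are invented for illustration; the application does not specify any figures.

```python
# Sketch: split total CPU cores and memory between VMs according to
# fixed proportions, as a virtual machine manager might. All numbers
# here are illustrative assumptions.

def allocate(total_cpus, total_mem_mb, shares):
    """Return (cpus, mem_mb) per VM, proportional to its share.

    shares maps VM name -> fraction of total resources; fractions
    must sum to 1.
    """
    assert abs(sum(shares.values()) - 1.0) < 1e-9
    return {vm: (round(total_cpus * s), round(total_mem_mb * s))
            for vm, s in shares.items()}

plan = allocate(8, 8192, {"vm1_rtos": 0.25, "vm2_ivi": 0.75})
print(plan)
```

A real hypervisor would enforce these limits at scheduling time rather than computing them once, but the proportional bookkeeping is the same idea.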
  • FIG. 5 is a schematic diagram of an application scenario of a method for controlling display provided by an embodiment of the present application.
  • when the vehicle detects that user A is present in the main driving area, the vehicle may control the display screen 201 to display the icons of application programs. Since there are no users in the passenger area or the rear area of the vehicle, the display screens 202 to 204 can be controlled to turn off, or, as shown in (a) of Figure 5, display only the time, date, and other information. When no user is present in the main driving area, the display screen 201 can turn off or display only the time, date, and other information; when user B is detected in the left rear seat, the display screen 203 can be controlled to display the icons of applications, as shown in (b) of Figure 5.
  • when the vehicle detects that user A is present in the main driving area, the vehicle can control the display screen 201 to display icons of all applications.
  • all applications can refer to all applications installed on the vehicle.
  • the vehicle may control the display screen 201 to display icons for all applications when a user is detected anywhere in the vehicle.
  • when the vehicle detects a user at a certain seat, it can control the screen at that location to display a sleep display interface, such as an always-on display (AOD) or a static or dynamic wallpaper, and display the application icons after detecting the user's click on the screen.
  • For example, the display screen 203 can be controlled to display the sleep display information shown in 301 (for example, static or dynamic wallpaper, current time, date, region, and weather); after user B clicks the screen, the application icons shown in 302 are displayed.
  • the content displayed on the screen can also be controlled by detecting whether the user's gaze remains on the screen. For example, when it is detected that the user's gaze is on the screen, the screen is controlled to display sleep display information; when the user's gaze leaves the screen for more than a preset period of time, the screen is controlled to turn off.
  • the above-mentioned preset time length may be 5 seconds, or 10 seconds, or other time lengths, which are not specifically limited in this embodiment of the present application.
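The gaze-based policy above, showing sleep information while the user looks at the screen and turning it off once the gaze has been away longer than the preset duration, can be sketched as a small decision function. The 5-second value is one of the examples given; the state names are assumptions.

```python
# Sketch of the gaze-based screen policy described above.
# Returns "sleep_display" while the user's gaze is on the screen or has
# only briefly left it, and "off" once the gaze has been away longer
# than the preset duration.

PRESET_AWAY_SECONDS = 5.0  # one of the example values mentioned above

def screen_state(gaze_on_screen, seconds_since_gaze_left):
    if gaze_on_screen:
        return "sleep_display"   # wallpaper, time, date, weather, etc.
    if seconds_since_gaze_left > PRESET_AWAY_SECONDS:
        return "off"
    return "sleep_display"       # brief glances away do not turn it off

print(screen_state(False, 7.0))
```

Keeping the sleep display through brief glances avoids flickering the screen off every time the user looks away momentarily.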
  • the embodiment of the present application provides a method for controlling display.
  • for screens corresponding to seats without users, applications are not pushed, which helps to save energy consumption; in some scenarios, it also helps to allocate more computing resources to the vehicle screen corresponding to the user's seat, allowing the user to obtain a better driving experience.
  • pushing the application after the user enters the vehicle cockpit can also help improve the user's human-computer interaction experience when using the vehicle.
  • when the vehicle detects that there is user A in the main driving area, the vehicle can control the display screen 201 to display one or more cards, as shown in (a) of Figure 6; for a screen whose corresponding seat has no user, the vehicle can control it to turn off the screen or display sleep display information.
  • when the vehicle detects that there is user B in the left rear seat, the vehicle can control the display screen 203 to display one or more cards, as shown in (b) of Figure 6; for a screen whose corresponding seat has no user, the vehicle can control it to turn off the screen or display sleep display information.
  • when the vehicle detects a user at a certain seat, it can control the screen at that location to display a static or dynamic wallpaper and, after detecting the user's tap on the screen, display one or more cards.
  • the display screen 203 is controlled to display the sleep display information shown at 401, including static or dynamic wallpaper, current time, date, region, weather, and the like; after user B taps the screen, the card shown at 402 is displayed.
  • the cards described above may be associated with certain applications installed on the vehicle.
  • the card can be associated with a car music app.
  • the card can display the name of the singer corresponding to a certain piece of music, lyrics information, a playback progress bar, a like control, a control for switching to the previous piece of music, a pause/play control, a control for switching to the next piece of music, and the like.
  • when the vehicle detects the user's tap on the card, the vehicle can display the display interface of the in-car music application through the on-board screen.
  • the above card can display text information and control information, or only the application icon.
  • the card can display only the icon of the car music app.
  • Cards can also be associated with some local functions of the vehicle.
  • the card can be associated with information about the vehicle's remaining power and the cruising range corresponding to the remaining power.
  • when the vehicle detects the user's tap on the card, the vehicle screen can display the remaining power of the vehicle and the display interface of the cruising range corresponding to the remaining power.
  • Cards can also be associated with the display interface of certain functions of the application.
  • a card can be linked to a payment function in a payment application. When the vehicle detects that the user clicks on the card, it may not display the homepage of the payment application but directly display the display interface related to the payment function in the payment application.
  • Cards can also be associated with multiple application display lists.
  • when the vehicle detects the user's tap on the card, the vehicle can display the display interfaces of multiple applications installed on the vehicle through the vehicle screen.
  • the above cards can be displayed on the car screen through user settings. For example, users can edit the cards they wish to display in the GUI through the settings function. Alternatively, the card may also be set when the vehicle leaves the factory, which is not specifically limited in the embodiment of the present application.
  • the above GUI only takes four cards as an example for explanation.
  • the GUI may also include more or fewer cards, which is not limited in the embodiments of the present application.
  • when the vehicle detects a user at a certain seat, the vehicle can control the screen corresponding to the seat to display one or more tabs, where each tab includes icons of one or more applications, and the user can tap anywhere on a tab to select it.
  • after a tab is selected, the icons of the one or more applications included in the tab are displayed on the vehicle screen. For example, as shown in (a) of Figure 7, when the vehicle detects a user in the passenger area, the vehicle controls the display screen 202 to display tab 1, tab 2, and tab 3 as shown at 601. After the user taps tab 3, the vehicle controls the display screen 202 to display the icons of the applications included in tab 3, as shown at 602.
  • the icons of the applications in the above tabs can be displayed on the car screen through the user's settings.
  • users can edit the specific name of the tab and the icons of the applications included in the tab through the settings function.
  • the name of the page tab and/or the icon of the included application program may also be set when the vehicle leaves the factory, which is not specifically limited in the embodiment of the present application.
  • the icons of shopping applications can be set in the "Shopping" tab, and the icons of video applications, music applications, and game applications can be set in the "Entertainment” tab.
  • set the icon of the navigation application and the icon of the map application in the "Driving Assistant" tab as shown in (b) of Figure 7.
  • multiple tabs can be displayed in a scrolling form.
  • the "Driving Assistant" tab is displayed at the top of the vehicle screen.
  • the vehicle screen display switches to that shown at 606, and the "Entertainment" tab is displayed at the top of the vehicle screen.
  • the vehicle screen displays the icons of the applications included in the "Entertainment" tab, as shown at 607.
  • multiple tabs can also be displayed in a tiled form, as shown in (d) in Figure 7.
  • the tab displayed in the middle of the vehicle screen can be switched.
  • the "Driving Assistant" tab is displayed in the middle of the vehicle screen.
  • the vehicle screen display switches to that shown at 609, and the "Entertainment" tab is displayed in the middle of the vehicle screen.
  • the vehicle screen displays the icons of the applications included in the "Entertainment" tab, as shown at 610.
  • the multiple tabs can also be arranged up and down on the car screen.
  • when the user slides a finger up and down on the vehicle screen, the tab displayed at the top or in the middle of the vehicle screen can be switched; alternatively, multiple tabs can also be displayed on the vehicle screen in other forms, which is not specifically limited in the embodiments of this application.
  • the names of the above tabs and the icons of the applications included in each tab can be displayed on the car screen through the user's settings. For example, users can edit through the settings function and set tabs such as "Financial Management", “Sports”, “Work”, and “Study”. Alternatively, the names of the tabs and the icons of the applications included in each tab may also be set when the vehicle leaves the factory, which is not specifically limited in the embodiment of the present application.
  • the display control method provided by the embodiments of the present application displays more application icons in the form of tabs. After the user selects a specific tab through a tap operation, the icons of the applications in that tab are pushed, which provides users with more choices while occupying fewer computing resources and helps improve the user experience.
  • the type of application or card displayed by the display device can be determined according to the real-time status of the vehicle (such as driving or parking); the type of application or card displayed by the display device can also be determined according to the user's gender, age, and the like.
  • for example, when the vehicle is in a driving state, the display screen 201 can be controlled to display icons of driving-related applications (such as icons of navigation applications and map applications) as well as icons of other commonly used applications (such as icons of music applications and communication applications), and the display screens 202 to 204 can be controlled to display icons of entertainment-related applications, such as icons of video applications, music applications, and game applications.
  • the display screen 201 can be controlled to display icons of entertainment-related applications, as shown in (b) of FIG. 8 .
  • the icons of the applications displayed on the screen can be switched.
  • when the vehicle is in a driving state, the first page of the display screen 201 can be controlled to display icons of driving applications; when the vehicle is in a parking state, the first page of the display screen 201 can be controlled to display icons of entertainment applications.
  • the display screen 201 is controlled to display the application icons on the second page.
  • the first page of the display screen 201 displays icons of entertainment applications.
  • the above fixed duration may be 30 seconds, or 60 seconds, or other durations, which are not specifically limited in this embodiment of the present application.
  • the display control method provided by the embodiment of the present application can determine the icon or card type of the application displayed by the display device based on the real-time status of the vehicle, the user's gender, age, etc., which helps to improve the user's human-computer interaction experience.
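The state- and seat-dependent icon selection described above can be sketched as a simple lookup. The screen IDs (201 to 204) follow the figures; the application catalogs, state strings, and function name are hypothetical stand-ins introduced for the example.

```python
def icons_for_screen(screen_id: int, vehicle_state: str) -> list[str]:
    """Choose which application icons a screen shows (hypothetical catalog).

    Screen 201 (main driver) shows driving apps while driving and
    entertainment apps while parked; screens 202-204 show entertainment
    apps in either state.
    """
    driving_apps = ["navigation", "map", "music", "phone"]
    entertainment_apps = ["video", "music", "game"]
    if screen_id == 201 and vehicle_state == "driving":
        return driving_apps
    return entertainment_apps
```

The same lookup could be extended with the user's gender or age as extra keys, as the text suggests.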
  • the central control screen and the passenger entertainment screen in the vehicle cockpit can be the same screen, as shown in Figure 11.
  • the screen can be divided into two display areas, namely display area 1 and display area 2.
  • Display area 1 may be a display area close to the main driver, and display area 2 may be a display area close to the passenger area.
  • the display area 1 and the display area 2 may display icons or cards of the same application program, or the display area 1 and the display area 2 may display icons or cards of different application programs.
  • when the user's finger slides in display area 1, the switching of the display content in display area 1 can be controlled; when the user's finger slides in display area 2, the switching of the display content in display area 2 can be controlled.
  • the switching of the display content in the two areas does not affect each other.
  • both the vehicle screen display area 1 and the display area 2 display icons of applications on the first page, as shown in 901 in (a) of Figure 11 .
  • the first page displayed in display area 1 can display icons of applications commonly used during driving, including the icon of Navigation Assistant 1, the icon of Navigation Assistant 2, the icon of the map application, the icon of the music application, the icon of the address book application, and the icon of the phone application.
  • the first page displayed in display area 2 can display icons of commonly used entertainment applications, including the Huawei Video icon, the browser application icon, the shopping application icon, the mall application icon, the game application icon, the music application icon, the stock application icon, and the camera application icon.
  • the display area 2 displays the application icons on the second page, and the display area 1 keeps displaying the application icons on the first page unchanged, as shown in 902.
  • the display area 1 displays the application icons on the second page, and the display area 2 keeps displaying the application icons on the first page unchanged, as shown at 904 in (b) of Figure 11.
  • when there is no user in the front passenger seat, the front passenger entertainment screen can be controlled to turn off or to display the sleep display interface, as shown in (c) of Figure 11; when there is no user in the main driver's seat, the main driving screen can be controlled to turn off or to display the sleep display interface, as shown in (d) of Figure 11.
  • the type of icon or card of the application displayed in display area 1 and/or display area 2 can also be determined according to the real-time state of the vehicle (such as a driving state or a parking state); alternatively, the type of icon or card of the application displayed in display area 1 and/or display area 2 may be determined according to the user's gender, age, and the like.
  • icons of driving applications may be displayed in display area 1; when the vehicle is in a parking state, icons of entertainment applications may be displayed in display area 1.
  • for example, when the vehicle is in a driving state, icons of driving applications can be displayed on the first page in display area 1, as shown at 901; when the vehicle is in a parking state, icons of entertainment applications can be displayed on the first page in display area 1, as shown at 903.
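The independently paged display areas described above can be sketched as a small class. The page contents, method names, and clamping behavior are hypothetical; the point the sketch illustrates is that a swipe in one area never changes the page index of the other.

```python
class SplitScreen:
    """One physical screen split into two display areas, paged independently."""

    def __init__(self, pages_area_1, pages_area_2):
        self.pages = {1: pages_area_1, 2: pages_area_2}  # area id -> pages
        self.index = {1: 0, 2: 0}                        # current page per area

    def swipe(self, area: int, direction: int) -> None:
        """Swipe in one area (+1 next page, -1 previous); the other area is untouched."""
        last = len(self.pages[area]) - 1
        self.index[area] = max(0, min(last, self.index[area] + direction))

    def shown(self, area: int):
        """Return the page currently shown in the given area."""
        return self.pages[area][self.index[area]]
```

For instance, swiping in display area 2 advances only its own page, matching the behavior at 901/902 in Figure 11.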
  • the following example illustrates how the vehicle determines whether there is a user in the cockpit and the user's position in the cockpit.
  • the vehicle can detect whether a user is present at a certain seat through sensors, including but not limited to: detecting whether a user is present at a certain seat through an acoustic wave sensor; detecting whether a user is present at a certain seat through a camera device or an in-cabin visual sensor; and detecting whether a user is present at a seat through a pressure sensor installed at the seat.
  • an acoustic wave sensor is used to detect whether a user is present at a certain seat.
  • the vehicle can determine whether there is a user in the cockpit and the actual location of the user based on the audio information obtained by the acoustic wave sensor.
  • the audio information may be audio information obtained by excluding various invalid audio information from the collected audio information inside the vehicle, and the invalid audio information may be audio information with a volume that is too low.
  • the sound source location can be the location of the sound source corresponding to the audio information.
  • the position of the sound source may be the relative position to the vehicle screen based on sound source positioning, or it may be a specific position coordinate, which is not specifically limited in the embodiment of the present application.
  • the sound source location can be determined based on the audio information collected by multiple acoustic wave sensors based on the time difference of arrival (TDOA) principle.
  • sound wave sensors A and B respectively detect audio signals emitted from sound source S. The time when the sound source signal of sound source S reaches sound wave sensor A is t1, and the time when it reaches sound wave sensor B is t2.
  • the time difference dt = t1 − t2, together with the speed of sound, can then be used to determine the position of sound source S.
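For the special case where the source lies on the line between the two sensors, the TDOA relation above has a closed form: |SA| + |SB| equals the sensor spacing, and |SA| − |SB| equals the speed of sound times dt. A minimal sketch under that assumption, using a nominal speed of sound of 343 m/s (the function name and signature are introduced for illustration):

```python
SPEED_OF_SOUND = 343.0  # m/s, in air at roughly 20 degrees C

def locate_1d(dt: float, sensor_distance: float) -> float:
    """Distance from sensor A to a source S lying between sensors A and B.

    dt = t1 - t2 is the arrival-time difference (time to reach A minus
    time to reach B).  From |SA| + |SB| = sensor_distance and
    |SA| - |SB| = SPEED_OF_SOUND * dt, the distance from A follows:
    """
    return (sensor_distance + SPEED_OF_SOUND * dt) / 2.0
```

In the general 2-D/3-D case, each sensor pair's dt constrains the source to a hyperbola, and several pairs are intersected to resolve the position.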
  • the audio information can be a voice containing a specific wake-up word, such as "turn on the car", "turn on the smart screen", etc.
  • the above-mentioned acoustic wave sensor may be an ultrasonic transceiver device integrated or installed on a vehicle screen or an ultrasonic transceiver device installed inside the vehicle cabin (including but not limited to a microphone sensor or a microphone sensor array).
  • a camera device or an in-cabin visual sensor is used to detect whether a user is present at a certain seat. Specifically, the user's face information is obtained through the camera device or in-cabin visual sensor, and then whether there is a user in the cabin and the user's actual location are determined based on the user's face information.
  • the above-mentioned camera devices or in-cabin visual sensors include but are not limited to camera sensors integrated or installed on the vehicle screen or installed inside the vehicle cabin, such as red-green-blue (RGB) cameras, red-green-blue-infrared (RGB-IR) cameras, and time-of-flight (TOF) cameras.
  • the user's gender, age, and the like can be determined through algorithms such as facial attribute recognition algorithms and facial gender classification algorithms. For example, after the user's face information is obtained, a gaze estimation algorithm or a gaze tracking algorithm can also be used to determine whether the user's gaze focus is on the vehicle screen, and the content displayed on the vehicle screen can then be controlled accordingly.
  • it is also possible to detect whether a user is present at a certain seat through lidar integrated or installed on the vehicle screen or installed inside the vehicle cabin, radio transceiver devices on the vehicle screen or at the edge of the vehicle screen (including but not limited to millimeter-wave radar or centimeter-wave radar), infrared sensing devices integrated on the vehicle screen or at the edge of the vehicle screen (including but not limited to infrared rangefinders and laser rangefinders), eye trackers, and the like.
  • the pressure sensor provided at the seat is used to detect whether a user is present at the seat. Specifically, when the pressure at a certain seat is greater than or equal to a preset threshold, it is confirmed that a user is present at the seat.
  • the preset threshold may be 100 newtons (N), or 200 N, or other values, which are not specifically limited in the embodiments of the present application.
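The pressure-threshold check above reduces to a single comparison. A minimal sketch; the 100 N default mirrors the example value in the text and is otherwise arbitrary, and the function name is introduced for illustration:

```python
PRESET_THRESHOLD_N = 100.0  # newtons; 200 N or other values are also possible

def seat_occupied(pressure_n: float,
                  threshold_n: float = PRESET_THRESHOLD_N) -> bool:
    """A seat counts as occupied when its pressure reading
    is greater than or equal to the preset threshold."""
    return pressure_n >= threshold_n
```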
  • any one of the above methods can be used, or a combination of the above methods, or other methods; this is not specifically limited in the embodiments of the present application.
  • Figure 12 shows a schematic flowchart of a method 1100 for controlling display provided by an embodiment of the present application.
  • the method 1100 may be applied to the vehicle 100 shown in FIG. 1 , and the method may also be performed by the system shown in FIG. 3 .
  • the steps or operations of the method of controlling display shown in FIG. 12 are only exemplary illustrations, and embodiments of the present application may also perform other operations or modifications of each operation in FIG. 12 .
  • the method 1100 includes:
  • S1110: Detect whether there is a user in the first area of the cockpit, where the cockpit is in the first state.
  • the cockpit may include at least one of the display screens 201 to 204 in the above embodiment; or the cockpit may also include a car fragrance, a car air conditioner, a car audio and video system, etc. in the above embodiment; or the The cockpit may also include other equipment, which is not specifically limited in the embodiments of this application.
  • the cabin being in the first state may include at least one of the following: the vehicle display screen (such as the display screens 201 to 204) is in a screen-off state; the vehicle fragrance is in a closed state, that is, the vehicle fragrance does not release fragrance; the vehicle air conditioner is in an off state; the car audio and video system is in an off state.
  • controlling the cockpit to be in the second state may include: controlling the display screen corresponding to the first area to display icons and/or cards of one or more applications.
  • controlling the display screen corresponding to the first area to display icons and/or cards of one or more applications please refer to the description in the above embodiments, and will not be described again here.
  • the display screen corresponding to the first area may include a display screen disposed in the first area.
  • for example, when the first area is the main driving area, the display screen corresponding to the first area may be a main driving screen such as the instrument panel or the central control screen.
  • the cabin being in the first state includes the vehicle air conditioner being turned off; controlling the cabin being in the second state may include: controlling the vehicle air conditioner to be turned on.
  • the vehicle air conditioner corresponding to the first area can be controlled to be turned on.
  • the vehicle-mounted air conditioner corresponding to the first area may include a vehicle-mounted air conditioner arranged in the first area.
  • for example, when the first area is the main driving area, the vehicle-mounted air conditioner corresponding to the first area may be the air conditioner of the main driving area.
  • the cabin being in the first state includes the vehicle audio and video system being turned off; controlling the cabin being in the second state may include: controlling the vehicle audio and video system to be turned on. For example, when the car audio and video system is turned on, audio such as music or radio channels can be played.
  • controlling the cockpit to switch from the first state to the second state may include one of the above examples, or may be a combination of two or more of the above examples.
  • for example, when the cockpit is in a state in which the main driver's display is off and the car fragrance is off (the first state), and a user is detected in the main driving area, the car fragrance is controlled to be turned on and the main driver's display is controlled to display icons and/or cards of one or more applications (the second state).
  • for another example, when the cockpit is in a state in which the main driving area's display screen is off and the car fragrance and car air conditioner are off (the first state), and a user is detected in the main driving area, the car fragrance and car air conditioner are controlled to be turned on and the main driver's display is controlled to display icons and/or cards of one or more applications (the second state).
  • the display control method provided by the embodiment of the present application can control the status change of the cockpit when a user is detected in the cockpit, which helps to improve the user's welcome experience when using the cockpit.
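The first-state-to-second-state transition of method 1100 can be sketched as a function over a cockpit state dictionary. The device names and state strings below are hypothetical stand-ins for whatever the cockpit controller actually uses; the sketch only shows the shape of the transition.

```python
def to_second_state(cockpit: dict, user_detected_in_first_area: bool) -> dict:
    """Switch the cockpit from the first state to the second state.

    First state: screen off, fragrance / air conditioner / audio-video off.
    On detecting a user in the first area, the devices present in the
    cockpit are turned on and the screen shows icons and/or cards.
    """
    if not user_detected_in_first_area:
        return cockpit  # no user detected: stay in the first state
    second = dict(cockpit)
    if cockpit.get("display") == "off":
        second["display"] = "icons_and_cards"
    for device in ("fragrance", "air_conditioner", "audio_video"):
        if cockpit.get(device) == "off":
            second[device] = "on"
    return second
```

Any subset of the devices may participate, matching the "one of the above examples, or a combination" wording.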
  • Figure 13 shows a schematic flowchart of a method 1300 for controlling display provided by an embodiment of the present application.
  • the method 1300 may be applied to the vehicle 100 shown in FIG. 1 , and the method may also be performed by the system shown in FIG. 3 .
  • the steps or operations of the method for controlling display shown in FIG. 13 are only illustrative; the embodiments of the present application may also perform other operations or variations of each operation in FIG. 13.
  • the method 1300 includes:
  • the first area may be the main driving area in the above embodiment, and the first display area may be the main driving screen in the above embodiment, such as the screen 201 shown in Figure 2; the first area may be the passenger area in the above embodiment, and the first display area may be the passenger screen in the above embodiment, such as the screen 202 shown in Figure 2; the first area may be the left area of the second row of the vehicle in the above embodiment, and the first display area may be the screen behind the main driver's headrest in the above embodiment, such as the screen 203 shown in Figure 2; the first area may be the right area of the second row of the vehicle in the above embodiment, and the first display area may be the screen behind the front passenger's headrest in the above embodiment, such as the screen 204 shown in Figure 2.
  • the interface element may be at least one of the cards, application icons, wallpapers, and animations in the above embodiments, or may be other content displayed on the vehicle screen.
  • the method process of controlling the first display area to switch from the screen-off state to displaying interface elements may refer to the description in the above embodiment.
  • the central control screen corresponding to the main driving area is controlled to switch from a screen-off interface or a sleep display interface to displaying interface elements;
  • the screen behind the main driver's headrest corresponding to the left rear seat is controlled to switch from a screen-off interface or a sleep display interface to displaying interface elements.
  • the central control screen corresponding to the main driving area is controlled to switch from a screen-off interface or a sleep display interface to displaying icons of all applications installed in the vehicle.
  • the type of interface element to be displayed can also be determined in combination with the user's human body characteristics or identity information.
  • for example, if the user is a male user (or a young user), the screen behind the main driver's headrest is controlled to switch from a screen-off interface or a sleep display interface to displaying icons of applications preferred by men (or young users); if the user is a female user (or an elderly user), the screen behind the main driver's headrest is controlled to switch from a screen-off interface or a sleep display interface to displaying icons of applications preferred by women (or elderly users).
  • the type of the icon of the displayed application is determined in combination with the correspondence between the user's identity information and the applications shown in Table 3. For example, if user A is detected in the main driving area, the central control screen corresponding to the main driving area can be controlled to switch from the screen-off interface or the sleep display interface to displaying the icon of Application 1, the icon of Application 2, and the icon of Application 3.
  • when the first display area is the main driver's screen, such as the screen 201 shown in FIG. 2, the first area may be the entire area in the cockpit of the vehicle. Further, when a user is detected in any area in the cockpit, the main driver's screen is controlled to switch from a screen-off interface or a sleep display interface to displaying interface elements. In some possible implementations, when a user is detected in any area in the cockpit, the main driver's screen is controlled to switch from a screen-off interface or a sleep display interface to displaying icons of all applications installed in the vehicle.
  • when the first area is one of the passenger area, the left area of the second row of the vehicle, and the right area of the second row of the vehicle, the first display area corresponds to each of the above areas respectively.
  • the screen 201 is also included. That is, when the first area is the passenger area, the first display area is the screen 201 and the screen 202; when the first area is the left area of the second row of the vehicle, the first display area is the screen 201 and the screen 203; when the first area is the right area of the second row of the vehicle, the first display area is the screen 201 and the screen 204.
  • when a user is detected in the passenger area, the screen 201 and the screen 202 are controlled to switch from a screen-off interface or a sleep display interface to displaying interface elements; when a user is detected in the left area of the second row of the vehicle, the screen 201 and the screen 203 are controlled to switch from a screen-off interface or a sleep display interface to displaying interface elements; when a user is detected in the right area of the second row of the vehicle, the screen 201 and the screen 204 are controlled to switch from a screen-off interface or a sleep display interface to displaying interface elements.
  • screen 201 can display icons of all applications installed in the vehicle, and the interface elements displayed on screens 202 to 204 can be determined based on the user's human body characteristics or identity information.
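The area-to-screen correspondence described above (the main driver's screen 201 plus the screen of the occupied seat) can be captured in a small table. The area labels are hypothetical names introduced for the example; the screen numbers follow Figure 2.

```python
# Area -> screens to wake.  Screen 201 (main driver's screen) is always
# included for non-driver areas in this variant of the embodiment.
AREA_TO_SCREENS = {
    "main_driver": [201],
    "passenger": [201, 202],
    "second_row_left": [201, 203],
    "second_row_right": [201, 204],
}

def screens_to_wake(area: str) -> list[int]:
    """Screens to switch from screen-off / sleep display to interface elements."""
    return AREA_TO_SCREENS.get(area, [])
```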
  • the embodiment of the present application provides a method for controlling display. After a user is detected, interface elements are displayed on the vehicle screen, or on the area of the vehicle screen, corresponding to the area where the user is located, which helps to save energy consumption and improve the user's human-computer interaction experience when using the vehicle screen. In addition, when users are detected in different areas of the cockpit, different interface elements can be displayed in different display areas to meet the needs of different users.
  • Figure 14 shows a schematic flowchart of a method 1400 for controlling display provided by an embodiment of the present application.
  • the method 1400 may be applied to the vehicle 100 shown in FIG. 1 , and the method may also be performed by the system shown in FIG. 3 .
  • the steps or operations of the method for controlling display shown in FIG. 14 are only illustrative examples, and embodiments of the present application may also perform other operations or modifications of each operation in FIG. 14 .
  • the method 1400 includes:
  • the first area may be the main driving area in the above embodiment, and the first display area may be the main driving area screen in the above embodiment, such as screen 201 shown in FIG. 2 .
  • the first area can also be another area in the cockpit, and the first display area can also be another display screen corresponding to the first area or a certain area on such a display screen; for details, refer to the description in the above embodiments, which is not repeated here.
  • S1420: Control the first display area of the cockpit to display the first interface element according to the first user's human body characteristics or identity information, where the first area corresponds to the first display area.
  • the human body characteristics may be as described in the above embodiments, including but not limited to gender, age, emotion, etc.
  • the identity information may be as described in the above embodiment, including but not limited to biometric information stored in the vehicle, account number, etc.
  • biometric information includes but is not limited to fingerprints, palmprints, facial information, iris information, gait information, etc.
  • the account number may include account information for logging into the vehicle system, etc.
  • the specific method of controlling the first display area of the cockpit to display the first interface element according to the identity information of the first user may refer to the description in the above embodiment.
  • the type of the icon of the displayed application is determined based on the correspondence between the user's identity information and the applications shown in Table 3. For example, if user A is detected in the main driving area, the central control screen corresponding to the main driving area can be controlled to display the icon of Application 1, the icon of Application 2, and the icon of Application 3.
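The Table 3 style correspondence between identity information and applications can be sketched as a dictionary lookup. The entries below are hypothetical placeholders, since Table 3 itself is not reproduced here; the fallback of showing all installed applications for an unrecognized user follows the earlier examples and is also an assumption.

```python
# Hypothetical stand-in for the Table 3 mapping from identity to applications.
IDENTITY_TO_APPS = {
    "user_a": ["application_1", "application_2", "application_3"],
}
DEFAULT_APPS = ["all_installed"]

def apps_for_user(identity: str) -> list[str]:
    """Pick which application icons to display for a recognized user."""
    return IDENTITY_TO_APPS.get(identity, DEFAULT_APPS)
```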
  • a method for controlling display provided by the embodiments of the present application can determine the type of interface elements based on the user's human body characteristics or identity information, that is, recommend personalized interface elements to different users, which helps to improve the user's interactive experience, driving experience, and entertainment experience.
  • when the first area is one of the passenger area, the left area of the second row of the vehicle, and the right area of the second row of the vehicle, the first display area corresponds to each of the above areas respectively.
  • the screen 201 is also included. That is, when the first area is the passenger area, the first display area is the screen 201 and the screen 202; when the first area is the left area of the second row of the vehicle, the first display area is the screen 201 and the screen 203; when the first area is the right area of the second row of the vehicle, the first display area is the screen 201 and the screen 204.
  • when a user is detected in the passenger area, both the screen 201 and the screen 202 are controlled to display interface elements; when a user is detected in the left area of the second row of the vehicle, the screen 201 and the screen 203 are controlled to display interface elements; when a user is detected in the right area of the second row of the vehicle, the screen 201 and the screen 204 are controlled to display interface elements.
  • screen 201 can display icons of all applications installed in the vehicle, and the interface elements displayed on screens 202 to 204 can be determined based on the user's human body characteristics or identity information.
  • the second display area of the cockpit is controlled to display the first interface element, and the second area corresponds to the second display area.
  • the first display area of the cockpit may be controlled to display the first interface element based on the driving status information of the vehicle, the first user's human body characteristics or identity information, and the vehicle includes the cockpit.
  • the screen corresponding to the main driving area can be controlled to display driving-related applications such as navigation or maps.
  • the icon of the above-mentioned application program may be determined based on the first user's human body characteristics or identity information.
  • the screen corresponding to the main driver can be controlled to display icons of entertainment applications.
  • the icons of the above-mentioned application programs can be determined based on the first user's human body characteristics or identity information.
  • the method further includes: when detecting that the second user is located in the second area, controlling the second display area to display the second interface element according to the human body characteristics or identity information of the second user.
  • the first interface element and the second interface element are the same.
  • the method further includes: when detecting that the second user is located in the first area and the first user is located in the second area, controlling the first display area to display the second interface element and controlling the second display area to display the first interface element.
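The seat-swap behavior above can be sketched as a simple exchange of the elements shown in the two display areas; the function name and the dictionary representation are assumptions for illustration:

```python
# Sketch: when two users swap areas, the interface elements shown in their
# corresponding display areas swap with them. `shown` maps display area -> element.
def swap_on_seat_change(shown: dict, area_a: str, area_b: str) -> dict:
    """Return a new mapping with the elements of the two display areas exchanged."""
    updated = dict(shown)  # keep other display areas untouched
    updated[area_a], updated[area_b] = shown[area_b], shown[area_a]
    return updated
```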
  • the method further includes: controlling the first display area to switch from a screen-off state to displaying the first interface element according to the first user's human body characteristics or identity information.
  • the first interface element includes one or more tabs; the method further includes: upon detecting the first user's input for the first tab displayed in the first display area, controlling the first display area to display icons of one or more application programs; wherein the one or more tabs include the first tab, and the first tab includes the icons of the one or more application programs.
  • the passenger entertainment screen is controlled to display one or more tabs. After the user clicks one of the tabs, the icons of the one or more application programs included in that tab are displayed.
  • the first display area is controlled to switch from the screen-off state to displaying the first interface element.
  • the cockpit includes a first display screen
  • the first display screen includes the first display area and a second display area
  • the second display area corresponds to the second area of the cockpit
  • the method further includes: when a user is not detected in the second area, controlling the second display area to be in a screen-off state; when a user is detected in the second area, controlling the second display area to display interface elements.
  • for example, when users are detected in both the main driving area and the passenger area, display area 1 and display area 2 are controlled to display interface elements; when a user is detected in the main driving area but not in the passenger area, display area 2 is controlled to display interface elements and display area 1 is controlled to turn off the screen.
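The occupancy-driven on/off behavior above can be sketched as follows; the function name `screen_states` and the area names are illustrative assumptions:

```python
# Sketch of occupancy-driven screen control: a display area shows interface
# elements only while a user is detected in its corresponding cockpit area,
# and stays in the screen-off state otherwise.
def screen_states(occupied: dict[str, bool]) -> dict[str, str]:
    """Map each display area to 'display' or 'off' from its area's occupancy."""
    return {area: ("display" if present else "off")
            for area, present in occupied.items()}
```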
  • Figure 15 shows a schematic flowchart of a method 1500 for controlling display provided by an embodiment of the present application.
  • the method 1500 may be applied to the vehicle 100 shown in FIG. 1 , and the method may also be performed by the system shown in FIG. 3 .
  • the steps or operations of the method for controlling display shown in FIG. 15 are only illustrative examples, and embodiments of the present application may also perform other operations or modifications of each operation in FIG. 15 .
  • the method 1500 includes:
  • S1510: determine the first area in which the user is located in the cockpit.
  • S1520: control the first display area to display the first interface element according to the first area, where the first area corresponds to the first display area.
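Steps S1510 and S1520 above can be sketched as a two-step flow, assuming a hypothetical `detect_area` callback and an illustrative area-to-display mapping:

```python
# Sketch of method 1500: S1510 determines the user's area, S1520 controls
# the display area corresponding to that area. Names are assumptions.
AREA_TO_DISPLAY = {"driver": "screen201", "passenger": "screen202"}

def control_display(detect_area, element: str) -> tuple[str, str]:
    area = detect_area()             # S1510: determine the user's first area
    display = AREA_TO_DISPLAY[area]  # the first area corresponds to a display area
    return display, element          # S1520: show the element on that display
```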
  • the first area corresponding to the first display area may include the following: the first area may be the main driving area in the above embodiment, and the first display area may be the screen of the main driving area in the above embodiment (such as the central control screen), shown as screen 201 in Figure 2; that is, the main driving area corresponds to the main driving screen. The first area may be the co-pilot area in the above embodiment, and the first display area may be the passenger side screen in the above embodiment, shown as screen 202 in Figure 2; that is, the passenger side area corresponds to the passenger side screen. The first area may be the left area of the second row of the vehicle in the above embodiment, and the first display area may be the screen behind the main driver's headrest in the above embodiment, shown as screen 203 in Figure 2; that is, the left area of the second row corresponds to the screen behind the main driver's headrest. The first area may be the right area of the second row of the vehicle in the above embodiment, and the first display area may be the screen behind the passenger headrest in the above embodiment, shown as screen 204 in Figure 2; that is, the right area of the second row corresponds to the screen behind the passenger headrest.
  • the first area is any area in the cockpit. That is, all areas of the cockpit correspond to the central control screen.
  • the central control screen is controlled to display the first interface element.
  • when the first area is one of the passenger area, the left area of the second row of the vehicle, and the right area of the second row of the vehicle, the first display area corresponding to each of these areas also includes screen 201. That is, when the first area is the passenger area, the first display area corresponding to the first area is screen 201 and screen 202; when the first area is the left area of the second row of the vehicle, the first display area corresponding to the first area is screen 201 and screen 203; when the first area is the right area of the second row of the vehicle, the first display area corresponding to the first area is screen 201 and screen 204.
  • the central control screen corresponding to the main driving area is controlled to display icons of all applications installed in the vehicle.
  • controlling the first display area to display the first interface element includes: controlling the first display area to display the first interface element according to the user's human body characteristics.
  • the rear row area may include the second row left area and the second row right area in the above embodiment.
  • the above-mentioned rear row area can be a second row area and/or a third row area.
  • the rear area may also include other rear areas in addition to the main driver's area and the passenger's area.
  • controlling the first display area to display the first interface element includes: controlling the first display area to display the first interface element according to the user's identity information.
  • the specific method of controlling the first display area of the cockpit to display the first interface element according to the identity information of the first user may refer to the description in the above embodiment.
  • the method further includes: controlling the first display area to be in a screen-off state when there is no user in the first area.
  • controlling the first display area to display the first interface element includes: controlling the first display area to display the first interface element according to the driving status information of the mobile carrier, and the mobile carrier includes the cockpit.
  • the first interface element includes one or more page tabs
  • the method further includes: when detecting the user's input for the first page tab displayed in the first display area, controlling the first display area to display icons of one or more application programs; wherein the one or more page tabs include the first page tab, and the first page tab includes the icons of the one or more application programs.
  • the icons of the one or more application programs included in the first page tab are icons of application programs of the same category.
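The grouping of same-category application icons under page tabs can be sketched as follows; the tab names and categories are illustrative assumptions:

```python
# Sketch of the page-tab behavior: each tab groups icons of applications of
# the same category; selecting a tab reveals that tab's icons.
TABS = {
    "media": ["music", "video", "radio"],
    "navigation": ["maps", "charging_finder"],
}

def icons_on_tab_input(tab: str) -> list[str]:
    """Return the application icons shown when the user selects a tab."""
    return TABS[tab]
```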
  • the cockpit includes a first display screen
  • the first display screen includes the first display area and a second display area
  • the second display area corresponds to the second area of the cockpit
  • the method further includes: When the user is not detected in the second area, the second display area is controlled to be in a screen-off state; when the user is detected in the second area, the second display area is controlled to display the second interface element.
  • the first interface element and the second interface element may be the same interface element.
  • the cockpit includes a third display area corresponding to the third area of the cockpit, and the method further includes: when a user is detected in the third area, controlling the third display area to display the third Three interface elements.
  • for example, when a user is detected in the main driving area, the central control screen (such as the display screen 201) is controlled to display the first interface element; when user C is detected in the passenger area, the screen at the passenger side (such as the display screen 202) is controlled to display the third interface element.
  • the central control screen (such as display screen 201) is controlled to display the first interface element;
  • the screen behind the passenger headrest (such as display screen 204) is controlled to display the third interface element.
  • the first interface element and the third interface element may be the same interface element.
  • the first interface element also includes at least one of a card, an application icon, a wallpaper, and an animation.
  • Figure 16 shows a schematic block diagram of a display control device 2000 provided by an embodiment of the present application.
  • the device 2000 includes a detection unit 2010 and a processing unit 2020.
  • the detection unit 2010 can be used to perform detection and/or implement corresponding communication functions
  • the processing unit 2020 can be used to perform data processing.
  • the device 2000 may include one or more of the devices 301 to 304 in the system shown in FIG. 3, or may also include devices for controlling other devices other than the devices 301 to 304 (such as the vehicle-mounted fragrance, the vehicle air conditioner, and the vehicle audio and video system); the embodiments of the present application do not specifically limit this.
  • the device 2000 may also include a storage unit, which may be used to store instructions and/or data, and the processing unit 2020 may read the instructions and/or data in the storage unit, so that the device implements the foregoing method embodiments.
  • the apparatus 2000 may include means for performing the methods in Figures 12 to 14. Moreover, each unit in the device 2000 and the above-mentioned other operations and/or functions are respectively intended to implement the corresponding processes of the method embodiments in Figures 12 to 14.
  • the detection unit 2010 can be used to execute S1110 in the method 1100
  • the processing unit 2020 can be used to execute S1120 in the method 1100.
  • the device 2000 includes: a detection unit 2010, used to detect whether there is a user in the first area of the cockpit, where the cockpit is in the first state; and a processing unit 2020, used to control the cockpit to be in the second state when a user is detected in the first area.
  • the cockpit includes a vehicle-mounted fragrance, and the cockpit being in the first state includes the vehicle-mounted fragrance being in a stopped-release state; the processing unit 2020 is specifically used to: control the vehicle-mounted fragrance to turn on.
  • the cockpit includes a vehicle air conditioner, and the cockpit being in the first state includes the vehicle air conditioner being turned off; the processing unit 2020 is specifically used to: control the vehicle air conditioner to turn on.
  • the cockpit includes a vehicle-mounted audio and video system, and the cockpit being in the first state includes the vehicle-mounted audio and video system being in a shutdown state; the processing unit 2020 is specifically used to: control the vehicle-mounted audio and video system to turn on.
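The first-state to second-state transitions above (fragrance, air conditioner, audio and video system) can be sketched as follows; the state names and the `IDLE_TO_ACTIVE` table are assumptions for illustration:

```python
# Sketch of the cockpit state transition: detecting a user in the first area
# switches each listed device from its idle (first) state to its active
# (second) state.
IDLE_TO_ACTIVE = {
    "fragrance": ("stopped", "releasing"),
    "air_conditioner": ("off", "on"),
    "audio_video": ("shutdown", "on"),
}

def cockpit_state(user_detected: bool) -> dict[str, str]:
    """Return each device's state given whether a user is in the first area."""
    idx = 1 if user_detected else 0
    return {dev: states[idx] for dev, states in IDLE_TO_ACTIVE.items()}
```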
  • the detection unit 2010 is also used to detect whether there is a user in the first area of the cockpit, where the first display area of the cockpit is in a screen-off state, and the first display area corresponds to the first area.
  • the processing unit 2020 is configured to control the first display area to switch from the screen-off state to display interface elements when a user is detected in the first area.
  • the processing unit 2020 is specifically configured to: control the first display area to display the interface element according to the user's human body characteristics.
  • the processing unit 2020 is specifically configured to: control the first display area to display the interface element according to the user's identity information.
  • the processing unit 2020 is specifically configured to: control the first display area to display the interface element according to the driving status information of the mobile carrier, and the mobile carrier includes the cockpit.
  • the interface element includes one or more tabs
  • the processing unit 2020 is also configured to: when the detection unit 2010 detects the user's input for the first tab displayed in the first display area, control the first display area to display icons of one or more application programs; wherein the one or more tabs include the first tab, and the first tab includes the icons of the one or more application programs.
  • the one or more applications included in the first tab are applications of the same category.
  • the processing unit 2020 is specifically configured to: when the detection unit 2010 detects the user's input to the first display area, control the first display area to switch from the screen-off state to displaying the interface elements.
  • the cockpit includes a first display screen
  • the first display screen includes the first display area and a second display area
  • the second display area corresponds to the second area of the cockpit
  • the processing unit 2020 is also used to: when the detection unit 2010 does not detect a user in the second area, control the second display area to be in a screen-off state; when the detection unit 2010 detects a user in the second area, control the second display area to display interface elements.
  • the interface element also includes at least one of a card, an application icon, a wallpaper, and an animation.
  • the device 2000 can also be used to perform the method 1300 in Figure 13.
  • the detection unit 2010 can be used to perform S1310 in the method 1300
  • the processing unit 2020 may be used to perform S1320 in the method 1300.
  • the device 2000 includes: a detection unit 2010, used to detect whether there is a user in the first area of the cockpit, where the first display area of the cockpit is in a screen-off state and the first display area corresponds to the first area; and a processing unit 2020, configured to control the first display area to switch from the screen-off state to displaying interface elements when a user is detected in the first area.
  • the processing unit 2020 is specifically configured to: control the first display area to display the interface element according to the user's human body characteristics.
  • the processing unit 2020 is specifically configured to: control the first display area to display the interface element according to the user's identity information.
  • the processing unit 2020 is specifically configured to: control the first display area to display the interface element according to the driving status information of the mobile carrier, and the mobile carrier includes the cockpit.
  • the interface element includes one or more tabs
  • the processing unit 2020 is also configured to: when the detection unit 2010 detects the user's input for the first tab displayed in the first display area, control the first display area to display icons of one or more application programs; wherein the one or more tabs include the first tab, and the first tab includes the icons of the one or more application programs.
  • the one or more applications included in the first tab are applications of the same category.
  • the processing unit 2020 is specifically configured to: when the detection unit 2010 detects the user's input to the first display area, control the first display area to switch from the screen-off state to displaying the interface element.
  • the cockpit includes a first display screen
  • the first display screen includes the first display area and a second display area
  • the second display area corresponds to the second area of the cockpit
  • the processing unit 2020 is also used to: when the detection unit 2010 does not detect a user in the second area, control the second display area to be in a screen-off state; when the detection unit 2010 detects a user in the second area, control the second display area to display interface elements.
  • the interface element also includes at least one of a card, an application icon, a wallpaper, and an animation.
  • the device further includes: an acquisition unit, configured to acquire resource occupancy information, the resource occupancy information being used to characterize the resource capacity allocated to the first display area; the processing unit 2020 is also configured to control the first display area to display the interface element according to the resource occupancy information.
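The resource-aware control above can be sketched as follows; the capacity thresholds and element names are illustrative assumptions, not from the application:

```python
# Sketch of resource-aware display control: what the display area shows is
# scaled to the resource capacity allocated to it. Thresholds are assumptions.
def elements_for_capacity(capacity: float) -> list[str]:
    """Pick interface elements by allocated resource capacity (0.0-1.0)."""
    if capacity >= 0.7:
        return ["animation", "cards", "icons"]  # plenty of resources
    if capacity >= 0.3:
        return ["cards", "icons"]               # moderate resources
    return ["icons"]                            # minimal resources
```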
  • the device 2000 can also be used to perform the method 1400 in Figure 14.
  • the detection unit 2010 can be used to perform S1410 in the method 1400
  • the processing unit 2020 may be used to perform S1420 in the method 1400.
  • the device 2000 includes: a detection unit 2010, used to detect the first area in the cockpit where the first user is located; and a processing unit 2020, used to control the first display area of the cockpit to display the first interface element according to the first user's human body characteristics or identity information, where the first area corresponds to the first display area.
  • the processing unit 2020 is also configured to: when the detection unit 2010 detects that the first user is located in the second area of the cockpit, control the second display area of the cockpit to display the first interface element, where the second area corresponds to the second display area.
  • the processing unit 2020 is also configured to: control the first display area of the cockpit to display the first interface element according to the driving status information of the mobile carrier, the human body characteristics or identity information of the first user, the mobile The carrier includes the cabin.
  • the processing unit 2020 is also configured to: when the detection unit 2010 detects that the second user is located in the second area, control the second display area to display the second interface element according to the human body characteristics or identity information of the second user.
  • the processing unit 2020 is also configured to: when the detection unit 2010 detects that the second user is located in the first area and the first user is located in the second area, control the first display area to display the second interface element and control the second display area to display the first interface element.
  • the processing unit 2020 is specifically configured to: control the first display area to switch from a screen-off state to displaying the first interface element according to the first user's human body characteristics or identity information.
  • the first interface element includes one or more tabs
  • the processing unit 2020 is specifically configured to: upon detecting the first user's input for the first tab displayed in the first display area, control the first display area to display icons of one or more application programs; wherein the one or more tabs include the first tab, and the first tab includes the icons of the one or more application programs.
  • the processing unit 2020 is also configured to: when detecting the first user's input to the first display area, control the first display area to switch from the screen-off state to displaying the first interface element.
  • the cockpit includes a first display screen
  • the first display screen includes the first display area and a second display area
  • the second display area corresponds to the second area of the cockpit
  • the processing unit 2020 is also used to: when a user is not detected in the second area, control the second display area to be in a screen-off state; when a user is detected in the second area, control the second display area to display interface elements.
  • the first interface element also includes at least one of a card, an application icon, a wallpaper, and an animation.
  • Figure 17 shows a schematic block diagram of a device 2100 for controlling display provided by an embodiment of the present application.
  • the device 2100 includes a determination unit 2110 and a processing unit 2120.
  • the apparatus 2100 may comprise means for performing the method in Figure 15. Moreover, each unit in the device 2100 and the above-mentioned other operations and/or functions are respectively intended to implement the corresponding processes of the method embodiment in Figure 15.
  • the device 2100 includes: a determining unit 2110, used to determine the first area where the user is located in the cockpit; and a processing unit 2120, used to control the first display area to display interface elements according to the first area, where the first area corresponds to the first display area.
  • the first area is any area in the cockpit, and the first display area is a central control screen.
  • the processing unit 2120 is specifically configured to control the first display area to display the interface element according to the user's human body characteristics or identity information.
  • the first area is a co-pilot area in the cockpit or a rear area in the cockpit.
  • the processing unit 2120 is also configured to control the first display area to be in a screen-off state when there is no user in the first area.
  • the processing unit 2120 is specifically configured to: control the first display area to display the interface element according to the driving status information of the mobile carrier, which includes the cockpit.
  • the device 2100 further includes a detection unit, the interface element includes one or more page tabs, and the processing unit 2120 is further configured to: when the detection unit detects the user's input for the first page tab displayed in the first display area, control the first display area to display icons of one or more application programs; wherein the one or more page tabs include the first page tab, and the first page tab includes the icons of the one or more application programs.
  • the icons of the one or more application programs included in the first page tab are icons of application programs of the same category.
  • the cockpit includes a first display screen
  • the first display screen includes the first display area and a second display area
  • the second display area corresponds to the second area of the cockpit
  • the processing unit 2120 is also used to: when the detection unit does not detect a user in the second area, control the second display area to be in a screen-off state; when the detection unit detects a user in the second area, control the second display area to display the interface element.
  • the device further includes a detection unit
  • the cockpit includes a third display area
  • the third display area corresponds to the third area of the cockpit
  • the processing unit 2120 is specifically configured to: when the detection unit detects a user in the third area, control the third display area to display interface elements.
  • the interface element also includes at least one of a card, an application icon, a wallpaper, and an animation.
  • the division into units in the above device is only a division of logical functions.
  • the units may be fully or partially integrated into a physical entity, or may be physically separated.
  • the unit in the device can be implemented in the form of a processor calling software; for example, the device includes a processor, the processor is connected to a memory, instructions are stored in the memory, and the processor calls the instructions stored in the memory to implement any of the above methods.
  • the processor is, for example, a general-purpose processor, such as a CPU or a microprocessor
  • the memory is a memory within the device or a memory outside the device.
  • the units in the device can be implemented in the form of hardware circuits, and the functions of some or all of the units can be realized through the design of the hardware circuits.
  • the hardware circuit can be understood as one or more processors; for example, in one implementation, the hardware circuit is an ASIC, and the functions of some or all of the above units are realized through the design of the logical relationship of the components in the circuit; for another example, in another implementation, the hardware circuit can be implemented through a PLD. Taking an FPGA as an example, it can include a large number of logic gate circuits, and the connection relationship between the logic gate circuits is configured through a configuration file, thereby realizing the functions of some or all of the above units. All units of the above device may be realized entirely by the processor calling software, entirely by hardware circuits, or partly by the processor calling software with the remainder realized by hardware circuits.
  • the processor is a circuit with signal processing capabilities.
  • the processor may be a circuit with instruction reading and execution capabilities, such as a CPU, a microprocessor, a GPU, or a DSP; in another implementation, the processor can realize certain functions through the logical relationship of a hardware circuit, where the logical relationship of the hardware circuit is fixed or reconfigurable.
  • the processor is a hardware circuit implemented by an ASIC or a PLD, for example, an FPGA.
  • the process of the processor loading the configuration file and realizing the hardware circuit configuration can be understood as the process of the processor loading instructions to realize the functions of some or all of the above units.
  • it can also be a hardware circuit designed for artificial intelligence, which can be understood as an ASIC, such as an NPU, TPU, or DPU.
  • each unit in the above device can be one or more processors (or processing circuits) configured to implement the above method, such as: a CPU, GPU, NPU, TPU, DPU, microprocessor, DSP, ASIC, FPGA, or a combination of at least two of these processor forms.
  • each unit in the above device may be integrated together in whole or in part, or may be implemented independently. In one implementation, these units are integrated together and implemented as a SOC.
  • the SOC may include at least one processor for implementing any of the above methods or implementing the functions of each unit of the device.
  • the at least one processor may be of different types, such as a CPU and an FPGA, a CPU and an artificial intelligence processor, or a CPU and a GPU, etc.
  • each operation performed by the detection unit 2010 and the processing unit 2020 may be performed by the same processor, or may be performed by different processors, for example, by multiple processors.
  • one or more processors may be connected to one or more sensors in the perception system 120 in FIG. 1 to obtain the user's position information in the cockpit from the one or more sensors and process it.
  • one or more processors may also be connected to one or more display devices in the display device 130 to control the icons and/or cards of the application program displayed by the display device.
  • the one or more processors described above may be processors provided in a vehicle machine, or may also be processors provided in other vehicle-mounted terminals.
  • the above-mentioned device 2000 or device 2100 may be a chip provided in a vehicle machine or other vehicle-mounted terminal.
  • the above-mentioned device 2000 or device 2100 may be the computing platform 150 as shown in FIG. 1 provided in the vehicle.
  • the device 2000 or the device 2100 may include at least one of the devices 301 to 304 shown in FIG. 3 .
  • Embodiments of the present application also provide a device, which includes a processing unit and a storage unit, where the storage unit is used to store instructions, and the processing unit executes the instructions stored in the storage unit, so that the device performs the methods or steps performed in the above embodiments.
  • the above-mentioned processing unit may include at least one of the processors 151-15n shown in FIG. 1; the above-mentioned determination unit may include at least one of the processors 151-15n shown in FIG. 1.
  • the above-mentioned detection unit may be a sensor in the sensing system 120 shown in FIG. 1 , or may also be the processor 151 - 15n shown in FIG. 1 .
  • Figure 18 is a schematic block diagram of a device for controlling display according to an embodiment of the present application.
  • the apparatus 2200 for controlling display shown in Figure 18 may include a processor 2210, a transceiver 2220, and a memory 2230.
  • the processor 2210, the transceiver 2220 and the memory 2230 are connected through an internal connection path.
  • the memory 2230 is used to store instructions.
  • the processor 2210 is used to execute the instructions stored in the memory 2230, and the transceiver 2220 receives/sends some parameters.
  • the memory 2230 may be coupled with the processor 2210 through an interface, or may be integrated with the processor 2210.
  • the device 2200 may include at least one of the devices 301 to 304 shown in FIG. 3 .
  • transceiver 2220 may include but is not limited to a transceiver device such as an input/output interface to implement communication between the device 2200 and other devices or communication networks.
  • the processor 2210 can use a general-purpose CPU, microprocessor, ASIC, GPU or one or more integrated circuits to execute relevant programs to implement the display control method of the method embodiment of the present application.
  • the processor 2210 may also be an integrated circuit chip with signal processing capabilities.
  • each step of the display control method of the present application can be completed by instructions in the form of hardware integrated logic circuits or software in the processor 2210 .
  • the above-mentioned processor 2210 can also be a general-purpose processor, DSP, ASIC, FPGA or other programmable logic device, discrete gate or transistor logic device, or discrete hardware component.
  • Each method, step and logical block diagram disclosed in the embodiment of this application can be implemented or executed.
  • a general-purpose processor may be a microprocessor or the processor may be any conventional processor, etc.
  • the steps of the method disclosed in conjunction with the embodiments of the present application can be directly implemented by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other mature storage media in this field.
  • the storage medium is located in the memory 2230.
  • the processor 2210 reads the information in the memory 2230 and executes the display control method of the method embodiment of the present application in conjunction with its hardware.
  • the memory 2230 may be a read only memory (ROM), a static storage device, a dynamic storage device or a random access memory (RAM).
  • the transceiver 2220 uses a transceiver device such as but not limited to a transceiver to implement communication between the device 2200 and other devices or communication networks. For example, user location information can be obtained through the transceiver 2220.
  • An embodiment of the present application also provides a mobile carrier, which may include the above device 2000, or the above device 2100, or the above device 2200.
  • the mobile carrier may be the vehicle in the above embodiment.
  • Embodiments of the present application also provide a computer program product.
  • the computer program product includes: computer program code.
  • when the computer program code is run on a computer, it causes the computer to execute the methods in FIGS. 12 to 15.
  • Embodiments of the present application also provide a computer-readable storage medium.
  • the computer-readable medium stores program code or instructions; when the program code or instructions are executed by a processor, the processor executes the methods in FIGS. 12 to 15 described above.
  • An embodiment of the present application also provides a chip, including: at least one processor and a memory.
  • the at least one processor is coupled to the memory, and is configured to read and execute the instructions in the memory to perform the methods in FIGS. 12 to 15 described above.
  • each step of the above method can be completed by hardware integrated logic circuits in the processor or by instructions in the form of software.
  • the methods disclosed in conjunction with the embodiments of this application can be directly executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
  • the software module can be located in a storage medium mature in the art, such as random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, or registers.
  • the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware. To avoid repetition, it will not be described in detail here.
  • "At least one of the following" or similar expressions refer to any combination of the listed items, including any combination of a single item or multiple items.
  • for example, at least one of a, b, or c can mean: a, b, c, a-b, a-c, b-c, or a-b-c, where each of a, b, and c can be singular or plural.
  • the size of the sequence numbers of the above processes does not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of this application.
  • the disclosed systems, devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or can be integrated into another system, or some features can be ignored, or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in the embodiments of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.
  • if the functions are implemented in the form of software functional units and sold or used as an independent product, they can be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application can essentially be embodied in the form of a software product.
  • the computer software product is stored in a storage medium and includes a number of instructions to enable a computer device (which can be a personal computer, a server, or network equipment, etc.) to perform all or part of the steps of the methods described in various embodiments of this application.
  • the aforementioned storage media include: a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disc, or other media that can store program code.

Abstract

This application provides a display control method, a display control apparatus, and a mobile carrier. The method may include: determining a first area in a cockpit where a user is located; and controlling, based on the first area, a first display area to display a first interface element, where the first area corresponds to the first display area. The display control method in the embodiments of this application may be applied to new energy vehicles or intelligent vehicles, and helps improve the welcome experience and the driving/riding experience when a user uses the cockpit.

Description

Display Control Method and Apparatus, and Mobile Carrier
This application claims priority to Chinese Patent Application No. 202210904518.2, filed with the China National Intellectual Property Administration on July 29, 2022 and entitled "Display Control Method and Apparatus, and Mobile Carrier", which is incorporated herein by reference in its entirety.
Technical Field
Embodiments of this application relate to the field of intelligent cockpits, and more specifically, to a display control method and apparatus, and a mobile carrier.
Background
As vehicles become more intelligent and connected, vehicle cockpits are gradually evolving into intelligent cockpits centered on human-machine interaction and featuring multi-screen linkage. In current intelligent cockpits, the multimedia experience provided by in-vehicle displays still needs to be improved.
Summary
Embodiments of this application provide a display control method and apparatus, and a mobile carrier, which help improve the user experience when using a cockpit.
The mobile carrier in this application may include a road vehicle, a watercraft, an aircraft, industrial equipment, agricultural equipment, entertainment equipment, or the like. For example, the mobile carrier may be a vehicle in the broad sense, such as a means of transport (e.g., a commercial vehicle, a passenger vehicle, a motorcycle, a flying car, or a train), an industrial vehicle (e.g., a forklift, a trailer, or a tractor), an engineering vehicle (e.g., an excavator, a bulldozer, or a crane), agricultural equipment (e.g., a mower or a harvester), amusement equipment, or a toy vehicle. The type of the vehicle is not specifically limited in the embodiments of this application. For another example, the mobile carrier may be a means of transport such as an airplane or a ship.
According to a first aspect, a display control method is provided, where the device is disposed in the cockpit of a mobile carrier. The method may be performed by the mobile carrier; or by an on-board terminal of the mobile carrier, such as a head unit; or by a chip or circuit used in the on-board terminal. This is not limited in this application. For ease of description, the following uses execution by the mobile carrier as an example.
The method may include: detecting whether there is a user in a first area of a cockpit, where the cockpit is in a first state; and when a user is detected in the first area, controlling the cockpit to be in a second state.
In the foregoing technical solution, when a user is detected in a certain area of the cockpit, the state of the cockpit is changed. This helps improve the welcome experience when the user uses the cockpit and meets the needs of the user in that area.
In some possible implementations, the cockpit may include one or more in-vehicle displays; the cockpit may further include an in-vehicle fragrance system, an in-vehicle air conditioner, an in-vehicle infotainment system, and the like; or the cockpit may include other devices. This is not specifically limited in the embodiments of this application.
For example, the cockpit being in the first state may include at least one of the following: an in-vehicle display is in an off-screen state; the in-vehicle fragrance system is off, that is, releases no fragrance; the in-vehicle air conditioner is powered off; or the in-vehicle infotainment system is powered off.
For example, when the first state includes an in-vehicle display being in the off-screen state, controlling the cockpit to be in the second state upon detecting a user in the first area may include: controlling the display corresponding to the first area to display interface elements. It should be noted that the display corresponding to the first area may include a display disposed in the first area. For example, when the first area is the driver area of the cockpit, the corresponding display may be a screen at the driver's position, such as the instrument cluster or the central control screen.
For example, when the first state includes the in-vehicle fragrance system being off (releasing no fragrance), controlling the cockpit to be in the second state may include: turning on the fragrance system so that it starts releasing fragrance.
For example, when the first state includes the in-vehicle air conditioner being powered off, controlling the cockpit to be in the second state may include: turning on the air conditioner. In some possible implementations, the air conditioner corresponding to the first area may be turned on. It should be understood that the air conditioner corresponding to the first area may include an air conditioner disposed in the first area; for example, when the first area is the driver area, the corresponding air conditioner may be the one at the driver's position.
For example, when the first state includes the in-vehicle infotainment system being powered off, controlling the cockpit to be in the second state may include: turning on the infotainment system. For example, when turned on, the infotainment system may play audio such as music or a radio channel.
In some possible implementations, upon detecting a user in the first area, controlling the cockpit to switch from the first state to the second state may include one of the foregoing examples, or a combination of two or more of them. In one example, the first state includes the driver-position display being off and the fragrance system being off; the second state may then include turning on the fragrance system and controlling the driver-position display to display interface elements. In another example, the first state includes the driver-position display being off and both the fragrance system and the air conditioner being off; the second state may then include turning on the fragrance system and the air conditioner and controlling the driver-position display to display interface elements.
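The first-state/second-state switch described above can be sketched as a simple state transition. This is a minimal illustration under stated assumptions: the device names, state strings, and the `enter_second_state` function are all hypothetical, not part of the patent.

```python
# Illustrative second-state actions per device type (assumed names, not from the patent).
DEVICE_ON_ACTIONS = {
    "display": "show_interface_elements",   # wake the region's display
    "fragrance": "start_release",           # fragrance system begins releasing
    "air_conditioner": "power_on",
    "infotainment": "power_on",
}

def enter_second_state(first_state):
    """Given a dict of device -> state for the region where a user was detected,
    activate every device that is currently 'off'; other states are kept as-is."""
    return {
        device: DEVICE_ON_ACTIONS.get(device, "on") if state == "off" else state
        for device, state in first_state.items()
    }
```

Combinations fall out naturally: passing both the display and the fragrance system in the `off` state yields the combined second state described in the examples above.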
With reference to the first aspect, in some implementations of the first aspect, whether there is a user in the first area of the cockpit is detected, where a first display area of the cockpit is in an off-screen state and corresponds to the first area; and when a user is detected in the first area, the first display area is controlled to switch from the off-screen state to displaying interface elements.
In some possible implementations, the first display area may be a display in a certain area of the cockpit, for example, the central control screen at the driver's position, the front-passenger entertainment screen, or a screen on the back of a front-seat headrest; alternatively, the first display area may be a region of a display. For example, when the central control screen and the front-passenger entertainment screen are combined into a single long screen, the first display area may be one region of that long screen.
In some possible implementations, the off-screen state may include a black-screen state, for example when the display is powered down; or it may include the screen-off state of a low-power mode such as sleep, standby, or shutdown. The screen-off state may include showing a dormant display interface: for example, using the self-luminous property of the display, part of the screen can be lit to show information such as the clock, date, notifications, or animations, so that the user can view this information while the screen is off. In one example, the cockpit is powered on before the user enters, but the in-vehicle screens may still be powered down; after the user enters the cockpit, the in-vehicle screen, or the region of the screen corresponding to the user's position in the cockpit, is powered on and displays interface elements.
In the foregoing technical solution, interface elements are displayed on the in-vehicle screen, or the screen region, corresponding to the user's area only after the user is detected, which helps save energy and improves the user's interaction experience with the in-vehicle screen. In addition, when users are detected in different areas of the cockpit, different display areas can display different interface elements, meeting the needs of different users.
With reference to the first aspect, in some implementations of the first aspect, the method further includes: controlling, based on physical characteristics of the user, the first display area to display the interface elements.
For example, the physical characteristics include but are not limited to gender, age, and emotion.
In some possible implementations, the type of interface element displayed in the first display area is determined based on the user's gender. For example, when the user is female, the first display area may be controlled to display one or more interface elements preferred by female users; when the user is male, it may display one or more interface elements preferred by male users.
In some possible implementations, the type of interface element displayed in the first display area is determined based on the user's age. For example, for a young user, the first display area may display one or more interface elements preferred by young users; for an elderly user, it may display one or more interface elements preferred by elderly users.
In some possible implementations, the type of interface element displayed in the first display area is determined based on the user's emotion. For example, when the user is in a low mood, the first display area may display one or more entertainment-related interface elements.
In some possible implementations, the foregoing "interface elements preferred by male users", "interface elements preferred by female users", "interface elements preferred by young users", and "interface elements preferred by elderly users" may be set by the user, preset at factory delivery of the mobile carrier, or set in other ways. This is not specifically limited in the embodiments of this application.
In the foregoing technical solution, personalized interface elements can be pushed to different users based on their physical characteristics, which helps improve the user's entertainment experience.
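The trait-based selection above can be sketched as a lookup table keyed by trait. A minimal sketch, assuming hypothetical trait keys and app-icon names; the priority order (emotion, then age, then gender) is an assumption for illustration only.

```python
# Hypothetical preference table; the trait categories mirror the gender/age/emotion
# examples in the text, but the concrete app lists are assumptions.
PREFERENCES = {
    "male": ["game_app", "stock_app"],
    "female": ["shopping_app", "camera_app"],
    "teen": ["education_app", "music_app"],
    "senior": ["calendar", "weather", "radio"],
    "low_mood": ["entertainment_app"],
}

def select_elements(user, table=PREFERENCES, default=("all_apps",)):
    """Return the interface elements for the first matching trait of the user,
    checking emotion first, then age group, then gender; fall back to a default."""
    for trait in (user.get("mood"), user.get("age_group"), user.get("gender")):
        if trait in table:
            return table[trait]
    return list(default)
```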
With reference to the first aspect, in some implementations of the first aspect, the method further includes: controlling, based on identity information of the user, the first display area to display the interface elements.
For example, the identity information includes but is not limited to biometric information stored in the mobile carrier, and accounts. The biometric information includes but is not limited to fingerprints, palm prints, facial information, iris information, and gait information; the account may include account information for logging in to the head-unit system.
In some possible implementations, the interface elements displayed in the first display area are associated with the user's identity information. For example, when the user appears in the driver area, the cockpit displays the interface elements associated with the user on the driver-position display; when the user appears in the front-passenger area, the cockpit displays them on the front-passenger entertainment screen. It should be understood that the interface elements associated with the user may be preset by the user, or may be ranked by the cockpit according to how frequently the user has used applications over a period of time.
In the foregoing technical solution, the type of displayed interface element is determined based on the user's identity information. When the user's position in the cockpit changes, the interface elements associated with the user can be displayed on the display corresponding to the user's new area, which helps improve the user's interaction and driving/riding experience.
With reference to the first aspect, in some implementations of the first aspect, the method further includes: controlling, based on driving state information of the mobile carrier, the first display area to display the interface elements, where the mobile carrier includes the cockpit.
In some possible implementations, when the mobile carrier is in a driving state, the driver-position display area may be controlled to display driving-related interface elements, such as icons of map applications or navigation cards; when the mobile carrier is in a parked state, the driver-position display area may be controlled to display entertainment-related interface elements, such as icons of video applications or music cards.
In some possible implementations, when the mobile carrier is in the driving state, the driving-related interface elements displayed at the driver position may also be determined based on the user's identity information or physical characteristics; likewise, in the parked state, the entertainment-related interface elements may be determined based on the user's identity information or physical characteristics.
It should be understood that in the driving state the driver-position display area may also display interface elements other than driving-related ones, such as icons of communication applications; and in the parked state it may also display interface elements other than entertainment-related ones, such as icons of map applications.
In the foregoing technical solution, while the mobile carrier is driving, mainly driving-related interface elements can be displayed in the driver-position display area, which helps improve driving safety. Displaying different interface elements when the mobile carrier is in different states helps improve the user's interaction experience.
With reference to the first aspect, in some implementations of the first aspect, the interface elements include one or more tabs, and the method further includes: upon detecting an input of the user on a first tab displayed in the first display area, controlling the first display area to display icons of one or more applications, where the one or more tabs include the first tab, and the first tab includes the icons of the one or more applications.
For example, the user's input on the first tab displayed in the first display area may be a tap on the first tab; a voice instruction directed at the first tab, such as "open the first tab"; a preset gesture directed at the first tab; or an operation on the first tab in another form. This is not specifically limited in the embodiments of this application.
In the foregoing technical solution, more application icons are presented in the form of tabs, and the icons in a tab are pushed after the user selects that tab. This provides the user with more selectable application icons while occupying fewer computing resources, helping improve the user experience.
With reference to the first aspect, in some implementations of the first aspect, the icons of the one or more applications included in the first tab are icons of applications of the same category.
For example, the application icons contained in the first tab may be edited by the user through a settings function, for example placing icons of finance, sports, work, and study applications in "Finance", "Sports", "Work", and "Study" tabs respectively. Alternatively, the tab names and the icons included in each tab may be preset at factory delivery of the mobile carrier. This is not specifically limited in the embodiments of this application.
In the foregoing technical solution, displaying categorized tabs makes it convenient for the user to quickly select the needed application, helps save the time needed for selection, and improves the driving/riding experience.
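The tab interaction above amounts to a category-to-icons lookup performed only after the user selects a tab. A minimal sketch, assuming illustrative tab names and app identifiers (none of which are specified in the source):

```python
# Hypothetical tab layout; names follow the "Shopping"/"Entertainment"/
# "Driving Assistant" examples later in the text, icons are assumptions.
TABS = {
    "Shopping": ["shopping_app"],
    "Entertainment": ["video_app", "music_app", "game_app"],
    "Driving Assistant": ["navigation_app", "map_app"],
}

def on_tab_selected(tab_name, tabs=TABS):
    """Return the icons to render after the user selects a tab,
    or None when the input does not match any displayed tab."""
    return tabs.get(tab_name)
```

Because only the selected tab's icons are resolved and rendered, the screen never has to lay out every installed application at once, which matches the stated goal of lower computing-resource use.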
With reference to the first aspect, in some implementations of the first aspect, the method further includes: upon detecting an input of the user on the first display area, controlling the first display area to switch from the off-screen state to displaying the interface elements.
In some possible implementations, after entering the cockpit the user may not want to use the in-vehicle screen, and the screen, or the region of the screen corresponding to the user's position, may be controlled to show a dormant display interface. When the user wants to use the screen, an input on the first display area controls the first display area to display the interface elements.
For example, the dormant display interface may be the interface shown while the in-vehicle screen sleeps, such as a static or dynamic wallpaper, or another form of dormant display interface. This is not specifically limited in the embodiments of this application.
For example, the user's input on the first display area may be a tap on the first display area; a voice instruction directed at it, such as "turn on the first display area"; a preset gesture directed at it; or an operation in another form. This is not specifically limited in the embodiments of this application.
In some possible implementations, the off-screen state of the in-vehicle screen is an unpowered state, and the dormant display interface is a powered state. In some possible implementations, both the off-screen state and the dormant display state may be black screens showing no information; however, it should be understood that in the dormant display state the screen can immediately switch to displaying one or more interfaces in response to the user's tap or voice instruction, whereas in the off-screen state tapping the screen or issuing a voice instruction will not cause any interface to be displayed.
In the foregoing technical solution, whether the in-vehicle screen displays interface elements can be controlled according to the user's selection and/or operation, which both reduces energy consumption and improves the user's interaction experience with the screen.
With reference to the first aspect, in some implementations of the first aspect, the cockpit includes a first display screen, the first display screen includes the first display area and a second display area, the second display area corresponds to a second area of the cockpit, and the method further includes: when no user is detected in the second area, controlling the second display area to be in the off-screen state; and when a user is detected in the second area, controlling the second display area to display interface elements.
In some possible implementations, the first display screen is a long screen extending across the driver area and the front-passenger area; the first display area and the second display area may then be the region of the long screen at the driver area (the first area) and the region at the front-passenger area (the second area), respectively. When there is a user in the driver area, the first display area, i.e., the driver-side region of the long screen, can be controlled to display interface elements; when there is no user in the front-passenger area, the second display area, i.e., the passenger-side region, can be turned off; and when there is a user in the front-passenger area, the second display area can be controlled to display interface elements, which may be the same as or different from those in the first display area. This is not specifically limited in the embodiments of this application.
In the foregoing technical solution, the long in-vehicle screen is controlled by region, which helps save energy while the long screen is in use.
With reference to the first aspect, in some implementations of the first aspect, the interface elements further include at least one of a card, an application icon, a wallpaper, or an animation.
With reference to the first aspect, in some implementations of the first aspect, the cockpit includes an in-vehicle fragrance system, and the cockpit being in the first state includes the fragrance system being in a release-stopped state; controlling the cockpit to be in the second state includes: turning on the fragrance system.
With reference to the first aspect, in some implementations of the first aspect, the cockpit includes an in-vehicle air conditioner, and the cockpit being in the first state includes the air conditioner being powered off; controlling the cockpit to be in the second state includes: turning on the air conditioner.
With reference to the first aspect, in some implementations of the first aspect, the cockpit includes an in-vehicle infotainment system, and the cockpit being in the first state includes the infotainment system being powered off; controlling the cockpit to be in the second state includes: turning on the infotainment system.
In the foregoing technical solution, the fragrance system, air conditioner, and infotainment system are turned on as soon as a user is detected entering the cockpit, which helps further improve the welcome experience.
With reference to the first aspect, in some implementations of the first aspect, the method further includes: obtaining resource occupancy information, where the resource occupancy information represents the resource capacity allocated to the first display area; and controlling, based on the resource occupancy information, the first display area to display the interface elements.
In the foregoing technical solution, when it is detected that little resource capacity is allocated to the first display area, the first display area can be controlled to display fewer interface elements, which helps reduce the resources occupied by displaying in the first display area.
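The per-region control of the long screen can be sketched as two independently switched display areas on one physical panel. A minimal sketch under stated assumptions: the region names (`driver`, `front_passenger`) and the `LongScreen` class are illustrative, not from the patent.

```python
class LongScreen:
    """One physical long screen split into two logical display areas,
    each turned on or off according to occupancy of its cockpit area."""

    def __init__(self):
        # Both display areas start in the off-screen state.
        self.areas = {"driver": "off", "front_passenger": "off"}

    def on_occupancy(self, region, occupied, elements="interface_elements"):
        """Light the region's display area when a user is detected there,
        turn it off otherwise; the other area is left untouched."""
        self.areas[region] = elements if occupied else "off"
        return self.areas[region]
```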
According to a second aspect, a display control method is provided, which may include: upon detecting that a first user is located in a first area of a cockpit, controlling, based on physical characteristics or identity information of the first user, a first display area of the cockpit to display a first interface element, where the first area corresponds to the first display area.
In the foregoing technical solution, the type of interface element can be determined based on the user's physical characteristics or identity information; that is, personalized interface elements are recommended for different users, which helps improve the user's interaction, driving/riding, and entertainment experience.
With reference to the second aspect, in some implementations of the second aspect, the method further includes: upon detecting that the first user is located in a second area of the cockpit, controlling a second display area of the cockpit to display the first interface element, where the second area corresponds to the second display area.
In the foregoing technical solution, when the first user's position changes, the display area corresponding to the user's current position can be controlled to display the first interface element without manual adjustment, which helps improve the driving/riding experience.
With reference to the second aspect, in some implementations of the second aspect, the controlling, based on the physical characteristics or identity information of the first user, the first display area of the cockpit to display the first interface element includes: controlling, based on driving state information of a mobile carrier and the physical characteristics or identity information of the first user, the first display area of the cockpit to display the first interface element, where the mobile carrier includes the cockpit.
In the foregoing technical solution, the type of interface element can further be determined based on the driving state information of the mobile carrier. For example, when the user is at the driver position and the mobile carrier is in the driving state, the interface element is determined to be a driver-assistance-related interface element, which helps improve driving safety. That driver-assistance-related interface element is associated with the user's physical characteristics or identity information and requires no manual selection, improving the intelligence of the mobile carrier and helping improve the driving/riding experience.
With reference to the second aspect, in some implementations of the second aspect, the method further includes: upon detecting that a second user is located in the second area, controlling, based on physical characteristics or identity information of the second user, the second display area to display a second interface element.
With reference to the second aspect, in some implementations of the second aspect, the method further includes: upon detecting that the second user is located in the first area and the first user is located in the second area, controlling the first display area to display the second interface element and the second display area to display the first interface element.
In the foregoing technical solution, after the areas where the first user and the second user are located change, the display areas of the first and second interface elements can be adjusted accordingly without manual adjustment, which helps improve the intelligence of the mobile carrier and thus the driving/riding experience.
With reference to the second aspect, in some implementations of the second aspect, the controlling, based on the physical characteristics or identity information of the first user, the first display area to display the first interface element includes: controlling, based on the physical characteristics or identity information of the first user, the first display area to switch from an off-screen state to displaying the first interface element.
With reference to the second aspect, in some implementations of the second aspect, the first interface element includes one or more tabs, and the method further includes: upon detecting an input of the first user on a first tab displayed in the first display area, controlling the first display area to display icons of one or more applications, where the one or more tabs include the first tab, and the first tab includes the icons of the one or more applications.
With reference to the second aspect, in some implementations of the second aspect, the method further includes: upon detecting an input of the first user on the first display area, controlling the first display area to switch from the off-screen state to displaying the first interface element.
With reference to the second aspect, in some implementations of the second aspect, the cockpit includes a first display screen, the first display screen includes the first display area and a second display area, the second display area corresponds to a second area of the cockpit, and the method further includes: when no user is detected in the second area, controlling the second display area to be in an off-screen state; and when a user is detected in the second area, controlling the second display area to display interface elements.
With reference to the second aspect, in some implementations of the second aspect, the first interface element further includes at least one of a card, an application icon, a wallpaper, or an animation.
According to a third aspect, a display control method is provided, including: determining a first area in a cockpit where a user is located; and controlling, based on the first area, a first display area to display a first interface element, where the first area corresponds to the first display area.
In the foregoing technical solution, when a user is detected in a certain area of the cockpit, the display area corresponding to that area is controlled to display interface elements, which helps improve the welcome experience when the user uses the cockpit and meets the needs of the user in that area.
With reference to the third aspect, in some implementations of the third aspect, the first display area is the central control screen, and the first area is any area in the cockpit.
In some possible implementations, when a user is detected in any area of the cockpit, the central control screen corresponding to the driver area is controlled to display the icons of all applications installed on the mobile carrier.
With reference to the third aspect, in some implementations of the third aspect, the controlling the first display area to display the first interface element includes: controlling, based on physical characteristics or identity information of the user, the first display area to display the first interface element.
In the foregoing technical solution, controlling the first display area to display targeted interface elements based on the user's physical characteristics makes it convenient for the user to see interface elements of interest in the first display area, helping improve the interaction experience. Controlling the first display area to display targeted interface elements based on the user's identity information makes it convenient for the user to see interface elements related to themselves, and thus to operate on those interface elements in the first display area, which helps improve the intelligence of the mobile carrier and the driving/riding experience.
With reference to the third aspect, in some implementations of the third aspect, the first area is the front-passenger area of the cockpit or a rear-row area of the cockpit.
For example, for a 5-seat vehicle, the rear-row area may be the second-row area; for a 7-seat vehicle, it may be the second-row area and/or the third-row area. For other multi-seat vehicles, the rear-row area may further include other rear areas apart from the driver area and the front-passenger area.
With reference to the third aspect, in some implementations of the third aspect, the method further includes: when there is no user in the first area, controlling the first display area to be in an off-screen state.
In the foregoing technical solution, when there is no user in the first area, the in-vehicle screen, or the screen region, corresponding to that area enters the off-screen state, which helps save energy.
With reference to the third aspect, in some implementations of the third aspect, the controlling the first display area to display the first interface element includes: controlling, based on driving state information of a mobile carrier, the first display area to display the first interface element, where the mobile carrier includes the cockpit.
In the foregoing technical solution, while the mobile carrier is driving, mainly driving-related interface elements can be displayed in the driver-position display area, which helps improve driving safety.
With reference to the third aspect, in some implementations of the third aspect, the first interface element includes one or more tabs, and the method further includes: upon detecting an input of the user on a first tab displayed in the first display area, controlling the first display area to display icons of one or more applications, where the one or more tabs include the first tab, and the first tab includes the icons of the one or more applications.
In the foregoing technical solution, more application icons are presented in the form of tabs, and the icons in a tab are pushed after the user selects that tab. This provides the user with more selectable application icons while occupying fewer computing resources, helping improve the user experience.
With reference to the third aspect, in some implementations of the third aspect, the icons of the one or more applications included in the first tab are icons of applications of the same category.
In the foregoing technical solution, displaying categorized tabs makes it convenient for the user to quickly select the needed application, helps save the time needed for selection, and improves the driving/riding experience.
With reference to the third aspect, in some implementations of the third aspect, the cockpit includes a first display screen, the first display screen includes the first display area and a second display area, the second display area corresponds to a second area of the cockpit, and the method further includes: when no user is detected in the second area, controlling the second display area to be in an off-screen state; and when a user is detected in the second area, controlling the second display area to display a second interface element.
With reference to the third aspect, in some implementations of the third aspect, the cockpit includes a third display area corresponding to a third area of the cockpit, and the method further includes: when a user is detected in the third area, controlling the third display area to display a third interface element.
With reference to the third aspect, in some implementations of the third aspect, the first interface element further includes at least one of a card, an application icon, a wallpaper, or an animation.
According to a fourth aspect, a display control apparatus is provided, including: a detection unit, configured to detect whether there is a user in a first area of a cockpit, where the cockpit is in a first state; and a processing unit, configured to control the cockpit to be in a second state when a user is detected in the first area.
With reference to the fourth aspect, in some implementations of the fourth aspect, whether there is a user in the first area of the cockpit is detected, where a first display area of the cockpit is in an off-screen state and corresponds to the first area; and when a user is detected in the first area, the first display area is controlled to switch from the off-screen state to displaying interface elements.
With reference to the fourth aspect, in some implementations of the fourth aspect, the processing unit is specifically configured to: control, based on physical characteristics of the user, the first display area to display the interface elements.
With reference to the fourth aspect, in some implementations of the fourth aspect, the processing unit is specifically configured to: control, based on identity information of the user, the first display area to display the interface elements.
With reference to the fourth aspect, in some implementations of the fourth aspect, the processing unit is specifically configured to: control, based on driving state information of a mobile carrier, the first display area to display the interface elements, where the mobile carrier includes the cockpit.
With reference to the fourth aspect, in some implementations of the fourth aspect, the interface elements include one or more tabs, and the processing unit is further configured to: when the detection unit detects an input of the user on a first tab displayed in the first display area, control the first display area to display icons of one or more applications, where the one or more tabs include the first tab, and the first tab includes the icons of the one or more applications.
With reference to the fourth aspect, in some implementations of the fourth aspect, the icons of the one or more applications included in the first tab are icons of applications of the same category.
With reference to the fourth aspect, in some implementations of the fourth aspect, the processing unit is specifically configured to: when the detection unit detects an input of the user on the first display area, control the first display area to switch from the off-screen state to displaying the interface elements.
With reference to the fourth aspect, in some implementations of the fourth aspect, the cockpit includes a first display screen, the first display screen includes the first display area and a second display area, and the second display area corresponds to a second area of the cockpit; the processing unit is further configured to: when the detection unit detects no user in the second area, control the second display area to be in the off-screen state; and when the detection unit detects a user in the second area, control the second display area to display interface elements.
With reference to the fourth aspect, in some implementations of the fourth aspect, the interface elements further include at least one of a card, an application icon, a wallpaper, or an animation.
With reference to the fourth aspect, in some implementations of the fourth aspect, the cockpit includes an in-vehicle fragrance system, and the cockpit being in the first state includes the fragrance system being in a release-stopped state; the processing unit is specifically configured to: turn on the fragrance system.
With reference to the fourth aspect, in some implementations of the fourth aspect, the cockpit includes an in-vehicle air conditioner, and the cockpit being in the first state includes the air conditioner being powered off; the processing unit is specifically configured to: turn on the air conditioner.
With reference to the fourth aspect, in some implementations of the fourth aspect, the cockpit includes an in-vehicle infotainment system, and the cockpit being in the first state includes the infotainment system being powered off; the processing unit is specifically configured to: turn on the infotainment system.
With reference to the fourth aspect, in some implementations of the fourth aspect, the apparatus further includes: an obtaining unit, configured to obtain resource occupancy information, where the resource occupancy information represents the resource capacity allocated to the first display area; and the processing unit is further configured to control, based on the resource occupancy information, the first display area to display the interface elements.
According to a fifth aspect, a display control apparatus is provided, which may include: a processing unit, configured to: when a detection unit detects that a first user is located in a first area of a cockpit, control, based on physical characteristics or identity information of the first user, a first display area of the cockpit to display a first interface element, where the first area corresponds to the first display area.
With reference to the fifth aspect, in some implementations of the fifth aspect, the processing unit is further configured to: when the detection unit detects that the first user is located in a second area of the cockpit, control a second display area of the cockpit to display the first interface element, where the second area corresponds to the second display area.
With reference to the fifth aspect, in some implementations of the fifth aspect, the processing unit is further configured to: control, based on driving state information of a mobile carrier and the physical characteristics or identity information of the first user, the first display area of the cockpit to display the first interface element, where the mobile carrier includes the cockpit.
With reference to the fifth aspect, in some implementations of the fifth aspect, the processing unit is further configured to: when the detection unit detects that a second user is located in the second area, control, based on physical characteristics or identity information of the second user, the second display area to display a second interface element.
With reference to the fifth aspect, in some implementations of the fifth aspect, the processing unit is further configured to: when the detection unit detects that the second user is located in the first area and the first user is located in the second area, control the first display area to display the second interface element and the second display area to display the first interface element.
With reference to the fifth aspect, in some implementations of the fifth aspect, the processing unit is specifically configured to: control, based on the physical characteristics or identity information of the first user, the first display area to switch from an off-screen state to displaying the first interface element.
With reference to the fifth aspect, in some implementations of the fifth aspect, the first interface element includes one or more tabs, and the processing unit is specifically configured to: upon detecting an input of the first user on a first tab displayed in the first display area, control the first display area to display icons of one or more applications, where the one or more tabs include the first tab, and the first tab includes the icons of the one or more applications.
With reference to the fifth aspect, in some implementations of the fifth aspect, the processing unit is further configured to: upon detecting an input of the first user on the first display area, control the first display area to switch from the off-screen state to displaying the first interface element.
With reference to the fifth aspect, in some implementations of the fifth aspect, the cockpit includes a first display screen, the first display screen includes the first display area and a second display area, and the second display area corresponds to a second area of the cockpit; the processing unit is further configured to: when no user is detected in the second area, control the second display area to be in an off-screen state; and when a user is detected in the second area, control the second display area to display interface elements.
With reference to the fifth aspect, in some implementations of the fifth aspect, the first interface element further includes at least one of a card, an application icon, a wallpaper, or an animation.
According to a sixth aspect, a display control apparatus is provided, including: a determining unit, configured to determine a first area in a cockpit where a user is located; and a processing unit, configured to control, based on the first area, a first display area to display a first interface element, where the first area corresponds to the first display area.
With reference to the sixth aspect, in some implementations of the sixth aspect, the first display area is the central control screen, and the first area is any area in the cockpit.
With reference to the sixth aspect, in some implementations of the sixth aspect, the processing unit is configured to: control, based on physical characteristics or identity information of the user, the first display area to display the first interface element.
With reference to the sixth aspect, in some implementations of the sixth aspect, the first area is the front-passenger area of the cockpit or a rear-row area of the cockpit.
With reference to the sixth aspect, in some implementations of the sixth aspect, the processing unit is further configured to: when there is no user in the first area, control the first display area to be in an off-screen state.
With reference to the sixth aspect, in some implementations of the sixth aspect, the processing unit is configured to: control, based on driving state information of a mobile carrier, the first display area to display the first interface element, where the mobile carrier includes the cockpit.
With reference to the sixth aspect, in some implementations of the sixth aspect, the apparatus further includes a detection unit, and the first interface element includes one or more tabs; the processing unit is further configured to: when the detection unit detects an input of the user on a first tab displayed in the first display area, control the first display area to display icons of one or more applications, where the one or more tabs include the first tab, and the first tab includes the icons of the one or more applications.
With reference to the sixth aspect, in some implementations of the sixth aspect, the icons of the one or more applications included in the first tab are icons of applications of the same category.
With reference to the sixth aspect, in some implementations of the sixth aspect, the apparatus further includes a detection unit; the cockpit includes a first display screen, the first display screen includes the first display area and a second display area, and the second display area corresponds to a second area of the cockpit; the processing unit is further configured to: when the detection unit detects no user in the second area, control the second display area to be in an off-screen state; and when the detection unit detects a user in the second area, control the second display area to display a second interface element.
With reference to the sixth aspect, in some implementations of the sixth aspect, the apparatus further includes a detection unit; the cockpit includes a third display area corresponding to a third area of the cockpit; the processing unit is configured to: when the detection unit detects a user in the third area, control the third display area to display a third interface element.
With reference to the sixth aspect, in some implementations of the sixth aspect, the first interface element further includes at least one of a card, an application icon, a wallpaper, or an animation.
According to a seventh aspect, a display control apparatus is provided, including: a memory, configured to store a program; and a processor, configured to execute the program stored in the memory, where when the program stored in the memory is executed, the processor is configured to perform the method in any possible implementation of the first aspect or the third aspect.
According to an eighth aspect, a mobile carrier is provided, including the apparatus in any possible implementation of the fourth aspect to the seventh aspect.
With reference to the eighth aspect, in some implementations of the eighth aspect, the mobile carrier is a vehicle.
According to a ninth aspect, a computer program product is provided. The computer program product includes computer program code; when the computer program code is run on a computer, the computer is caused to perform the method in any possible implementation of the first aspect or the third aspect.
It should be noted that all or part of the computer program code may be stored on a first storage medium, where the first storage medium may be packaged together with the processor or packaged separately from the processor. This is not specifically limited in the embodiments of this application.
According to a tenth aspect, a computer-readable medium is provided. The computer-readable medium stores instructions; when the instructions are executed by a processor, the processor is caused to implement the method in any possible implementation of the first aspect or the third aspect.
According to an eleventh aspect, a chip is provided. The chip includes a processor, configured to invoke a computer program or computer instructions stored in a memory, to cause the processor to perform the method in any possible implementation of the first aspect or the third aspect.
With reference to the eleventh aspect, in a possible implementation, the processor is coupled to the memory through an interface.
With reference to the eleventh aspect, in a possible implementation, the chip system further includes a memory, and the memory stores the computer program or computer instructions.
With the display control method and apparatus and the mobile carrier provided in the embodiments of this application, the state of the cockpit can be changed when a user is detected in the cockpit, which helps improve the welcome experience when the user uses the cockpit and can also reduce the energy consumption of devices in the cockpit. In current intelligent cockpits, each in-vehicle screen supports displaying only a small number of applications (application, App) and/or cards, and the displayed applications and cards are fixed, leaving the user with little choice in actual use and a poor driving/riding experience. In the embodiments of this application, when a user is detected in a certain area of the cockpit, the display in that area, or the display region corresponding to that area, is controlled to display interface elements, which helps improve the welcome experience. Further, when users are detected in different areas of the cockpit, different display areas can display different interface elements to meet the needs of different users. Specifically, the type of interface element can be determined based on the user's physical characteristics or identity information, and can further be determined in combination with the driving state of the mobile carrier; that is, personalized interface elements are recommended for different users, further improving the interaction, driving/riding, and entertainment experience. After entering the cockpit, the user may not want to use the in-vehicle screen; in that case the screen, or the screen region corresponding to the user's position, can be controlled to show a dormant display interface, and when the user wants to use the screen, an input on the first display area controls the first display area to display the interface elements. That is, whether the in-vehicle screen displays interface elements can be controlled according to the user's selection and/or operation, which both reduces energy consumption and improves the interaction experience with the in-vehicle screen. When there are multiple users in the cockpit, different interface elements can be displayed on the displays corresponding to the positions of different users based on the users' physical characteristics or identity information and/or the driving state of the mobile carrier, helping provide different choices for different users. When the interface elements include one or more tabs, the specific type of application icons to display can be determined based on the tab selected by the user, providing more selectable application icons at a lower computing-resource cost and helping improve the user experience.
Brief Description of Drawings
FIG. 1 is a functional block diagram of a vehicle according to an embodiment of this application.
FIG. 2 is a schematic diagram of a vehicle cockpit scenario according to an embodiment of this application.
FIG. 3 is a schematic block diagram of a display control system according to an embodiment of this application.
FIG. 4 is another schematic block diagram of a display control system according to an embodiment of this application.
FIG. 5 is a schematic diagram of an application scenario of a display control method according to an embodiment of this application.
FIG. 6 is another schematic diagram of an application scenario of a display control method according to an embodiment of this application.
FIG. 7 is another schematic diagram of an application scenario of a display control method according to an embodiment of this application.
FIG. 8 is another schematic diagram of an application scenario of a display control method according to an embodiment of this application.
FIG. 9 is another schematic diagram of an application scenario of a display control method according to an embodiment of this application.
FIG. 10 is another schematic diagram of an application scenario of a display control method according to an embodiment of this application.
FIG. 11 is another schematic diagram of an application scenario of a display control method according to an embodiment of this application.
FIG. 12 is a schematic flowchart of a display control method according to an embodiment of this application.
FIG. 13 is a schematic flowchart of a display control method according to an embodiment of this application.
FIG. 14 is a schematic flowchart of a display control method according to an embodiment of this application.
FIG. 15 is a schematic flowchart of a display control method according to an embodiment of this application.
FIG. 16 is a schematic block diagram of a display control apparatus according to an embodiment of this application.
FIG. 17 is a schematic block diagram of a display control apparatus according to an embodiment of this application.
FIG. 18 is a schematic block diagram of a display control apparatus according to an embodiment of this application.
Detailed Description of Embodiments
The following describes the technical solutions in the embodiments of this application with reference to the accompanying drawings.
FIG. 1 is a functional block diagram of a vehicle 100 according to an embodiment of this application. The vehicle 100 may include a perception system 120, a display apparatus 130, and a computing platform 150. The perception system 120 may include one or more sensors that sense information about the environment around the vehicle 100. For example, the perception system 120 may include a positioning system, which may be the global positioning system (GPS), the BeiDou system, or another positioning system, and one or more of an inertial measurement unit (IMU), a lidar, a millimeter-wave radar, an ultrasonic radar, and a camera apparatus. The perception system 120 may further include a pressure sensor disposed under a seat to detect whether there is a user on the seat, and may further include an acoustic sensor to detect audio information in the cockpit.
Some or all functions of the vehicle 100 may be controlled by the computing platform 150. The computing platform 150 may include one or more processors, for example processors 151 to 15n (n is a positive integer). A processor is a circuit with signal processing capability. In one implementation, the processor may be a circuit with instruction fetching and execution capability, such as a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU, which can be understood as a kind of microprocessor), or a digital signal processor (DSP). In another implementation, the processor may implement certain functions through the logical relationships of hardware circuits, where the logical relationships are fixed or reconfigurable, for example a hardware circuit implemented as an application-specific integrated circuit (ASIC) or a programmable logic device (PLD), such as a field programmable gate array (FPGA). For a reconfigurable hardware circuit, the process in which the processor loads a configuration file to configure the hardware circuit can be understood as the process in which the processor loads instructions to implement the functions of some or all of the foregoing units. In addition, the processor may be a hardware circuit designed for artificial intelligence, which can be understood as a kind of ASIC, such as a neural network processing unit (NPU), a tensor processing unit (TPU), or a deep learning processing unit (DPU). The computing platform 150 may further include a memory, configured to store instructions; some or all of the processors 151 to 15n may invoke and execute the instructions in the memory to implement corresponding functions.
Display apparatuses 130 in the cockpit fall mainly into two classes: in-vehicle displays and projection displays, such as a head up display (HUD). An in-vehicle display is a physical display and an important part of the in-vehicle infotainment system. Multiple displays may be disposed in the cockpit, such as a digital instrument display, a central control screen, a display in front of the passenger in the front-passenger seat (also called the front-row passenger), a display in front of the left rear passenger, and a display in front of the right rear passenger; even a window can serve as a display. A head-up display, also called a head-up display system, is mainly used to display driving information such as speed and navigation on a display device (for example, the windshield) in front of the driver, to reduce the driver's gaze-shift time, avoid pupil changes caused by gaze shifts, and improve driving safety and comfort. HUDs include, for example, combiner HUD (C-HUD), windshield HUD (W-HUD), and augmented reality HUD (AR-HUD) systems.
In this embodiment of this application, the processor may obtain the seat pressure information, the user's facial information, the audio information, and the like detected by the perception system 120, and then control, based on at least one of the seat pressure information, the facial information, and the audio information, the application icons and/or cards displayed by the display apparatus 130. In some possible implementations, the foregoing seat pressure information, facial information, audio information, and the like may also be stored as data in the memory of the computing platform 150.
It should be understood that the foregoing operations may be performed by the same processor, or by one or more processors. This is not specifically limited in the embodiments of this application.
FIG. 2 is a schematic diagram of a vehicle cockpit scenario according to an embodiment of this application. One or more in-vehicle displays (or in-vehicle screens) are disposed inside the intelligent cockpit, including but not limited to a display 201 (also called the central control screen), a display 202 (also called the front-passenger entertainment screen), a display 203 (also called the screen on the back of the driver's headrest), a display 204 (also called the screen on the back of the front passenger's headrest), and an instrument screen. In some possible implementations, the display 201 may also be a long screen extending to the front-passenger area. One or more cameras may also be installed in the cockpit shown in FIG. 2 to capture images inside or outside the cabin, for example, the camera of a driver monitor system (DMS), the camera of a cabin monitor system (CMS), and the camera of a dashcam. The cameras used to capture the inside and the outside of the cabin may be the same camera or different cameras. In addition, one or more pressure sensors and acoustic sensors are disposed in the cockpit to monitor whether there is a user inside the cockpit and the user's position. Further, the displays 201 to 204 can display a graphical user interface (GUI), and the GUI may include icons of one or more applications and/or one or more cards.
It should be understood that the display control methods in the following embodiments are described using the 5-seat vehicle shown in FIG. 2 as an example, but the embodiments of this application are not limited thereto. For example, for a 7-seat sport/suburban utility vehicle (SUV), the cockpit may include a central control screen, a front-passenger entertainment screen, a screen on the back of the driver's headrest, a screen on the back of the front passenger's headrest, and entertainment screens for the left and right third-row areas. For another example, for a coach, the cockpit may include a front-row entertainment screen and a rear-row entertainment screen; or the cockpit may include a display for the driving area and entertainment screens for the passenger area. In one implementation, the passenger-area entertainment screens may also be mounted on the cockpit ceiling.
FIG. 3 is a schematic block diagram of a display control system according to an embodiment of this application. As shown in (a) of FIG. 3, the system includes an application deployment module, an apparatus 301, and multiple display devices 351 to 35n controlled by the apparatus 301. The application deployment module can deploy all applications (and/or cards). In one example, the application deployment module may store the correspondence between each screen in the cockpit and the application icons (and/or cards) it displays; for example, the screen 201 may display the icons (and/or cards) of all applications, while the screens 202 to 204 may display the icons (and/or cards) of some applications. Table 1 shows the correspondence between the cockpit screens and the application icons they display. In another example, the application deployment module may further store the correspondence between users' physical characteristics or identity information and applications (and/or cards), for example, which applications (and/or cards) correspond to male, female, young, or elderly users, and which applications (and/or cards) correspond to the identity information of user A. Table 2 and Table 3 respectively show the correspondences between physical characteristics and applications and between identity information and applications. For example, the application deployment module may include one or more processors in the computing platform 150 shown in FIG. 1. The apparatus 301 includes a detection module and a processing module. The detection module may include one or more camera apparatuses and one or more sensors in the perception system 120 shown in FIG. 1; the processing module may include one or more processors in the computing platform 150 shown in FIG. 1; and the display devices 351 to 35n may include one or more of the display apparatuses 130 in FIG. 1, for example, one or more of the screens 201 to 204 shown in FIG. 2. In a specific implementation process, the processing module determines, based on the seat pressure information, the user's facial information, the audio information, and the like, whether there is a user inside the cockpit and the user's position, and then controls the display device at the user's position to display application icons and/or cards, where which application icons a display device displays may be determined according to the correspondence stored by the application deployment module. Optionally, the processing module may determine the user's age, gender, and the like based on the user's facial information and/or audio information, and then determine the type of application icons (and/or the type of cards) displayed by the display device according to the correspondence, stored by the application deployment module, between age, gender, and the like and applications. In some possible implementations, the display control system may further include an apparatus 302 and an apparatus 303, as shown in (b) of FIG. 3, where the apparatus 302 may control the display device 351 and the apparatus 303 may control the display devices 352 to 35n. It should be understood that which application icons (and/or cards) the apparatus 302 controls the display device 351 to display may be determined in combination with the information detected by the detection module and the correspondence stored by the application deployment module. In some possible implementations, the display control system may further include an apparatus 304, as shown in (c) of FIG. 3. In addition to controlling the display devices 351 to 35n, the apparatus 304 may control other in-vehicle devices, including but not limited to the in-vehicle fragrance system, the in-vehicle air conditioner, and the in-vehicle infotainment system. For example, when the apparatus 304 controls the display devices 351 to 35n, the type of application icons and/or cards they display may be determined according to the correspondence stored by the application deployment module. In some possible implementations, the other in-vehicle devices mentioned in the foregoing embodiments may also be controlled by an apparatus different from the apparatuses 301 to 304 that control the display devices; this is not specifically limited in the embodiments of this application. It should be understood that the foregoing modules and apparatuses are only examples; in actual applications, modules and apparatuses may be added or removed as actually required.
Table 1

Table 2
Table 3
It should be noted that the foregoing "all applications" may be understood as all the applications installed on the vehicle.
For example, the "identity information" in the embodiments of this application includes but is not limited to biometric information stored in the vehicle, and accounts. The biometric information includes but is not limited to fingerprints, palm prints, facial information, iris information, and gait information; the account may include account information for logging in to the head-unit system.
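The correspondences held by the application deployment module can be sketched as a small lookup store. This is a hypothetical illustration in the spirit of Tables 1 to 3, whose actual contents are not reproduced in this text; the class name, screen IDs, and trait keys are all assumptions.

```python
class AppDeployment:
    """Toy store for the screen-to-icons and trait-to-icons correspondences
    that the application deployment module is described as maintaining."""

    def __init__(self, screen_apps=None, trait_apps=None):
        self.screen_apps = screen_apps or {}  # screen id -> icons it may display
        self.trait_apps = trait_apps or {}    # trait or identity -> preferred icons

    def icons_for(self, screen, trait=None):
        """Icons for a screen, narrowed to the user's preferences when the
        user's trait or identity is known; otherwise the screen's full set."""
        icons = self.screen_apps.get(screen, [])
        preferred = self.trait_apps.get(trait)
        if preferred is None:
            return icons
        return [icon for icon in icons if icon in preferred]
```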
FIG. 4 is a schematic diagram of a display control system architecture according to an embodiment of this application. For example, the apparatus 301 shown in (a) of FIG. 3 (or the apparatus 304 shown in (c) of FIG. 3) may be a system-on-a-chip (SOC). The apparatus 301 can use multiple virtual machines to emulate the hardware functions required by different types of operating systems, so that different types of operating systems run in the apparatus 301, and can manage the multiple operating systems through a virtual machine manager. For example, virtual machine 1 may run a real time operation system (RTOS), and virtual machine 2 may run a guest Linux operation system. The apparatus 301 can allocate appropriate resource shares, such as CPU, memory, and cache, to the different operating systems through the virtual machine manager. Further, based on a lightweight framework/library, application domain 1 can run on virtual machine 1; application domain 1 may include instrument-domain applications, and the instrument screen can, for example, display the applications of the instrument domain. Based on a framework/library, application domain 2 can run on virtual machine 2; application domain 2 may include in-vehicle infotainment (IVI) applications. In some possible implementations, the programs running in application domain 1 and application domain 2 may be determined by the application deployment module.
FIG. 5 is a schematic diagram of an application scenario of a display control method according to an embodiment of this application. As shown in (a) of FIG. 5, when the vehicle detects user A in the driver area, the vehicle can control the display 201 to display application icons. Since no user is present in the front-passenger area or the rear rows, the displays 202 to 204 can be turned off; or, as shown in (a) of FIG. 5, they display only information such as the time and date. When no user is present in the driver area, the display 201 can be off or display only information such as the time and date; when user B is detected on the rear-left seat, the display 203 can be controlled to display application icons, as shown in (b) of FIG. 5.
In some possible implementations, when the vehicle detects user A in the driver area, the vehicle can control the display 201 to display the icons of all applications, where "all applications" may be all the applications installed on the vehicle.
In some possible implementations, when a user is detected at any position in the vehicle, the vehicle can control the display 201 to display the icons of all applications.
In some possible implementations, when the vehicle detects a user at a seat, it can control the screen at that position to show a dormant display interface, such as an always on display (AOD), or another dormant display interface such as a static or dynamic wallpaper, and display application icons after detecting the user's tap on the screen. For example, as shown in (c) of FIG. 5, when user B is detected on the rear-left seat, the display 203 is controlled to show dormant display information as shown at 301 (for example, a static or dynamic wallpaper, the current time, date, region, and weather); after user B taps the screen, application icons are displayed as shown at 302.
In some possible implementations, the content displayed on the screen can also be controlled by detecting whether the user's gaze rests on the screen. For example, when the user's gaze is detected on the screen, the screen is controlled to show the dormant display information; when the user's gaze has left the screen for more than a preset duration, the screen is turned off. For example, the preset duration may be 5 seconds, 10 seconds, or another duration. This is not specifically limited in the embodiments of this application.
With the display control method provided in this embodiment of this application, no applications are pushed when no user is present at the seat corresponding to an in-vehicle screen, which helps save energy. In some scenarios, this helps allocate more computing resources to the in-vehicle screens corresponding to occupied seats, giving users a better driving/riding experience. In addition, pushing applications only after the user enters the vehicle cockpit also helps improve the human-machine interaction experience while using the vehicle.
In some possible implementations, when the vehicle detects user A in the driver area, the vehicle can control the display 201 to display one or more cards, as shown in (a) of FIG. 6; screens whose seats are unoccupied can be turned off or show dormant display information. Alternatively, when the vehicle detects user B on the rear-left seat, it can control the display 203 to display one or more cards, as shown in (b) of FIG. 6; screens whose seats are unoccupied can likewise be turned off or show dormant display information. Alternatively, when the vehicle detects a user at a seat, it can control the screen at that position to display a static or dynamic wallpaper and display one or more cards after detecting the user's tap on the screen. As shown in (c) of FIG. 6, when the vehicle detects user B on the rear-left seat, it controls the display 203 to show dormant display information as shown at 401, including a static or dynamic wallpaper, the current time, date, region, weather, and the like; after user B taps the screen, cards are displayed as shown at 402.
For example, the cards may be associated with certain applications installed on the vehicle. For example, a card may be associated with the in-vehicle music application. The card may show the singer's name, the lyrics, and the playback progress bar for a song, as well as a like control, a control for switching to the previous song, a pause/play control, a control for switching to the next song, and so on. When the vehicle detects the user's tap on the card, the vehicle can display the display interface of the in-vehicle music application on the in-vehicle screen.
A card may show text information and controls, or only an application icon. For example, a card may show only the icon of the in-vehicle music application.
A card may also be associated with some local functions of the vehicle. For example, a card may be associated with the vehicle's remaining battery level and the corresponding remaining range. When the vehicle detects the user's tap on the card, it can display on the in-vehicle screen an interface showing the remaining battery level and the corresponding remaining range.
A card may also be associated with the display interface of certain functions of an application. For example, a card may be associated with the payment function of a payment application. When the vehicle detects the user tapping the card, it can directly display the interface related to the payment function instead of the home page of the payment application.
A card may also be associated with a display list of multiple applications. When the vehicle detects the user's tap on the card, it can display the display interfaces of multiple applications installed on the vehicle on the in-vehicle screen.
The foregoing cards may be displayed on the in-vehicle screen through the user's settings. For example, the user can edit the cards they want shown in the GUI through a settings function. Alternatively, the cards may be preset at factory delivery of the vehicle. This is not specifically limited in the embodiments of this application.
The foregoing GUI is described using only 4 cards as an example; the GUI may include more or fewer cards. This is not limited in the embodiments of this application.
In some possible implementations, when the vehicle detects a user at a seat, it can control the screen corresponding to that seat to display one or more tabs, where each tab includes icons of one or more applications, and the user can tap anywhere on a tab to select it. When the user's tap is detected, the in-vehicle screen displays the application icons included in that tab. For example, as shown in (a) of FIG. 7, when the vehicle detects a user in the front-passenger area, it controls the display 202 to display tab 1, tab 2, and tab 3 as shown at 601; after the user taps tab 3, the vehicle controls the display 202 to display the application icons included in tab 3, as shown at 602.
The application icons in the foregoing tabs may be displayed on the in-vehicle screen through the user's settings. For example, the user can edit the specific name of a tab and the application icons it includes through a settings function. Alternatively, the tab names and/or the icons they include may be preset at factory delivery of the vehicle. This is not specifically limited in the embodiments of this application. For example, icons of shopping applications can be placed in a "Shopping" tab; icons of video, music, and game applications in an "Entertainment" tab; and icons of navigation and map applications in a "Driving Assistant" tab, as shown in (b) of FIG. 7. After the user is detected tapping the "Entertainment" tab shown at 603, the vehicle controls the display 202 to display the application icons included in the "Entertainment" tab, as shown at 604.
In some possible implementations, as shown in (c) of FIG. 7, multiple tabs can be displayed in a scrolling form: as the user's finger slides left or right on the in-vehicle screen, one tab is always shown on top of the others. As shown at 605, the "Driving Assistant" tab is shown at the top of the screen; when the user's finger is detected sliding right on the screen, the display switches so that the "Entertainment" tab is shown at the top, as shown at 606. After the user is detected tapping the "Entertainment" tab, the screen displays the application icons included in the "Entertainment" tab, as shown at 607.
In some possible implementations, multiple tabs can also be displayed in a tiled form, as shown in (d) of FIG. 7: sliding the finger left or right on the in-vehicle screen switches the tab shown in the middle of the screen. As shown at 608, the "Driving Assistant" tab is shown in the middle of the screen; when the user's finger is detected sliding left, the display switches so that the "Entertainment" tab is shown in the middle, as shown at 609. After the user is detected tapping the "Entertainment" tab, the screen displays the application icons included in the "Entertainment" tab, as shown at 610.
It should be understood that when multiple tabs are displayed in the tiled or scrolling form, they can also be arranged vertically on the in-vehicle screen; sliding the finger up or down can then switch the tab shown at the top or in the middle of the screen. The tabs can also be displayed on the screen in other forms. This is not specifically limited in the embodiments of this application.
The foregoing tab names and the application icons included in each tab may be displayed on the in-vehicle screen through the user's settings. For example, the user can edit them through a settings function, setting tabs such as "Finance", "Sports", "Work", and "Study". Alternatively, the tab names and the icons included in each tab may be preset at factory delivery of the vehicle. This is not specifically limited in the embodiments of this application.
With the display control method provided in this embodiment of this application, more application icons are presented in the form of tabs; after the user selects a specific tab by tapping, the application icons in that tab are pushed. This provides the user with more choices while occupying fewer computing resources, helping improve the user experience.
In the display control method provided in the embodiments of this application, the type of applications or cards displayed by a display device can be determined based on the real-time state of the vehicle (for example, driving or parked), and can also be determined based on the user's gender, age, and the like.
For example, as shown in (a) of FIG. 8, while the vehicle is driving, the display 201 can be controlled to display icons of driving-related applications (such as navigation and map applications) and icons of other commonly used applications (such as music and communication applications), while the displays 202 to 204 display icons of entertainment-related applications, such as video, music, and game applications. When the vehicle is parked, the display 201 can be controlled to display icons of entertainment-related applications, as shown in (b) of FIG. 8. In some possible implementations, sliding a finger left or right on the in-vehicle screen switches the application icons displayed. For example, while the vehicle is driving, the first page of the display 201 can show icons of driving applications; when the vehicle is parked, the first page of the display 201 can show icons of entertainment applications. When the user's finger is detected sliding left, the display 201 is controlled to show the second page of application icons. In some possible implementations, after the vehicle changes from the driving state to the parked state and has remained parked for a fixed duration, the first page of the display 201 is controlled to show icons of entertainment applications. For example, the fixed duration may be 30 seconds, 60 seconds, or another duration. This is not specifically limited in the embodiments of this application.
For example, as shown in (a) of FIG. 9, when a user is detected at the position corresponding to an in-vehicle screen, if the user is male user E, icons of applications preferred by male users are pushed, for example, icons of game applications (such as a game app) and finance applications (such as a stock app); if the user is female user F, icons of applications preferred by female users are pushed, such as icons of shopping applications (such as a shopping app) and photo-editing applications (such as a camera app). After male user E and female user F swap positions, the in-vehicle screens at the corresponding positions also switch their application icons, as shown in (b) of FIG. 9.
For example, as shown in (a) of FIG. 10, when a user is detected at the position corresponding to an in-vehicle screen, if the user is young user G, icons of applications preferred by young users are pushed, for example, icons of learning applications (such as fun English, a calculator, and online teaching) and entertainment applications (such as a music app); if the user is elderly user H, icons of applications preferred by elderly users are pushed, such as calendar, weather, and radio icons. After young user G and elderly user H swap positions, the in-vehicle screens at the corresponding positions also switch their application icons, as shown in (b) of FIG. 10.
It should be understood that the foregoing "applications preferred by male users", "applications preferred by female users", "applications preferred by young users", and "applications preferred by elderly users" may be set by the user, preset at factory delivery of the vehicle, or set in other ways. This is not specifically limited in the embodiments of this application.
With the display control method provided in the embodiments of this application, the application icons or card types displayed by a display device can be determined based on the vehicle's real-time state and the user's gender, age, and the like, helping improve the user's human-machine interaction experience.
In some possible implementations, the central control screen and the front-passenger entertainment screen in the vehicle cockpit may be a single screen, as shown in FIG. 11. The screen can be divided into two display areas: display area 1 and display area 2. Display area 1 may be the display area near the driver, and display area 2 the display area near the front-passenger area. Display area 1 and display area 2 may display the same application icons or cards, or different ones. When the user's finger slides within display area 1, the content displayed in display area 1 is switched; when the finger slides within display area 2, the content displayed in display area 2 is switched; switching the content in the two areas does not affect each other.
For example, display area 1 and display area 2 of the in-vehicle screen both display the first page of application icons. As shown at 901 in (a) of FIG. 11, the first page of display area 1 can show icons of applications commonly used while driving, including icons of navigation assistant 1, navigation assistant 2, a map application, a music application, a contacts application, and a phone application; the first page of display area 2 can show icons of commonly used entertainment applications, including Huawei Video, a browser application, a shopping app, an app store, a game app, a music app, a stock app, and a camera app. After front-passenger user C's finger slides left within display area 2, display area 2 shows the second page of application icons while display area 1 keeps showing the first page, as shown at 902. Alternatively, after driver user A's finger slides left within display area 1, display area 1 shows the second page of application icons while display area 2 keeps showing the first page, as shown at 904 in (b) of FIG. 11. In some possible implementations, when there is no user in the front-passenger seat, the front-passenger entertainment screen can be turned off or show a dormant display interface, as shown in (c) of FIG. 11; when there is no user in the driver seat, the driver screen can be turned off or show a dormant display interface, as shown in (d) of FIG. 11. It should be understood that in the embodiments of this application, the type of application icons or cards displayed in display area 1 and/or display area 2 can also be determined based on the vehicle's real-time state (for example, driving or parked), or based on the user's gender, age, and the like. For example, while the vehicle is driving, icons of driving applications can be shown in display area 1; when the vehicle is parked, icons of entertainment applications can be shown in display area 1. In some possible implementations, while the vehicle is driving, the first page of display area 1 can show icons of driving applications, as shown at 901; when the vehicle is parked, the first page of display area 1 can show icons of entertainment applications, as shown at 903.
It should be noted that the number of application icons and/or cards displayed in the GUIs shown in FIG. 5 to FIG. 11 may be determined according to the system's resource occupancy.
The following illustrates by example how the vehicle determines whether there is a user in the cockpit and the user's position in the cockpit.
In the embodiments of this application, the vehicle can detect through sensors whether a user is present at a seat, including but not limited to: detecting through an acoustic sensor whether a user is present at a seat; detecting through a camera apparatus or an in-cabin vision sensor whether a user is present at a seat; and detecting through a pressure sensor disposed at the seat whether a user is present at the seat.
For example, detecting through an acoustic sensor whether a user is present at a seat may specifically be: the vehicle can determine, based on the audio information acquired by the acoustic sensor, whether there is a user in the cockpit and the user's actual position. The audio information may be obtained by excluding various invalid audio information, such as audio whose volume is too low, from the audio collected inside the vehicle. The sound source position may be the position of the sound source corresponding to the audio information; it may be a position relative to the in-vehicle screen based on sound source localization, or specific position coordinates. This is not specifically limited in the embodiments of this application.
For example, the sound source position can be determined from the audio information collected by multiple acoustic sensors based on the time difference of arrival (TDOA) principle. For example, acoustic sensors A and B each detect audio emitted from a sound source S, where the signal from S reaches sensor A at time t1 and sensor B at time t2, so the time difference is dt = |t1 - t2|. Let the distance from S to sensor A be AS, the distance from S to sensor B be BS, and the speed of sound be c; then dt = t1 - t2 = AS/c - BS/c. Then, based on the distance a between the two sensors and choosing one of them as the reference point, the position of the sound source can be determined.
In some possible implementations, the audio information may be speech containing a specific wake word, for example, "turn on the head unit" or "turn on the smart screen". The acoustic sensor may be an ultrasonic transceiver integrated into or mounted on the in-vehicle screen, or an ultrasonic transceiver installed inside the vehicle cockpit (including but not limited to a microphone sensor or a microphone sensor array).
For example, detecting through a camera apparatus or an in-cabin vision sensor whether a user is present at a seat may specifically be: acquiring the user's facial information through the camera apparatus or in-cabin vision sensor, and then determining, based on the facial information, whether there is a user in the cockpit and the user's actual position. The camera apparatus or in-cabin vision sensor includes but is not limited to a camera sensor integrated into or mounted on the in-vehicle screen, or a camera sensor installed inside the vehicle cockpit, for example, a red green blue (RGB) camera, a red green blue-infrared radiation (RGB-IR) camera, or a time of flight (TOF) camera.
In some possible implementations, after the user's facial information is acquired, the user's gender, age, and the like can be determined through algorithms such as face attribute recognition and face gender classification. For example, after the facial information is acquired, a gaze estimation algorithm or a gaze tracking algorithm can also be used to determine whether the user's gaze is focused on the in-vehicle screen, and thereby control the content displayed on the screen.
In some possible implementations, whether a user is present at a seat can also be detected through a lidar integrated into or mounted on the in-vehicle screen or installed inside the cockpit, a radio transceiver on the in-vehicle screen or at its edge (including but not limited to millimeter-wave or centimeter-wave radar), an infrared sensing apparatus integrated into the in-vehicle screen or at its edge (including but not limited to an infrared rangefinder or a laser rangefinder), an eye tracker, or the like.
For example, detecting through a pressure sensor disposed at the seat whether a user is present may specifically be: when the pressure at a seat is greater than or equal to a preset threshold, confirming that a user is seated there. For example, the preset threshold may be 100 newtons (N), or 200 N, or another value. This is not specifically limited in the embodiments of this application.
It should be understood that when detecting whether there is a user in the cockpit and the user's actual position, any one of the foregoing methods may be used, a combination of the foregoing methods may be used, or other methods may be used. This is not specifically limited in the embodiments of this application.
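The TDOA relation above (dt = t1 - t2 = AS/c - BS/c, with baseline a) can be worked out concretely in one dimension: if the source lies on the segment between the two sensors, then AS + BS = a, so AS = (a + c·(t1 - t2)) / 2. A minimal sketch under that stated assumption; the function name and the on-segment restriction are illustrative, not from the patent.

```python
SPEED_OF_SOUND = 343.0  # m/s, approximate speed of sound in air

def locate_source_1d(pos_a, pos_b, t1, t2, c=SPEED_OF_SOUND):
    """One-dimensional TDOA estimate with sensors A and B at pos_a < pos_b
    and the source assumed on the segment between them.
    From AS - BS = c*(t1 - t2) and AS + BS = a, AS = (a + c*(t1 - t2)) / 2."""
    a = pos_b - pos_a                    # baseline distance between the sensors
    dist_a = (a + c * (t1 - t2)) / 2.0   # distance from sensor A to the source
    return pos_a + dist_a
```

Real in-cabin localization would use more than two microphones and solve in 2-D or 3-D, but the algebra per sensor pair is the same.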
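Combining the detection methods above can be sketched as a simple fusion rule over whichever signals happen to be available. The 100 N threshold follows the example in the text; everything else (function name, the any-signal rule) is an assumption for illustration.

```python
def seat_occupied(pressure_n=None, face_detected=None, voice_in_seat=None,
                  pressure_threshold=100.0):
    """Treat the seat as occupied if any available sensor reports a user.
    Signals that were not measured are passed as None and simply ignored."""
    signals = []
    if pressure_n is not None:
        signals.append(pressure_n >= pressure_threshold)   # pressure sensor
    if face_detected is not None:
        signals.append(bool(face_detected))                # camera / vision sensor
    if voice_in_seat is not None:
        signals.append(bool(voice_in_seat))                # acoustic localization
    return any(signals)
```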
图12示出了本申请实施例提供的控制显示的方法1100的示意性流程图。该方法1100可以应用于图1所示的车辆100中,该方法也可以由图3所示的系统执行。图12示出的控制显示的方法的步骤或操作仅为示例性说明,本申请实施例还可以执行其他操作或者图12中的各个操作的变形。该方法1100包括:
S1110,检测座舱的第一区域是否有用户,其中,该座舱处于第一状态。
示例性地,检测座舱的第一区域是否有用户的具体方法可以参考上述实施例中的描述,在此不在赘述。
示例性地,该座舱可以包括上述实施例中的显示屏201至204中的至少一个;或者,该座舱还可以包括上述实施例中的车载香氛、车载空调、车载影音系统等;或者,该座舱还可以包括其他设备,本申请实施例对此不作具体限定。
示例性地,座舱处于第一状态可以包括如下至少一项:车载显示屏(如显示屏201至204)处于熄屏状态;车载香氛处于关闭状态,即车载香氛不释放香氛;车载空调处于关机状态;车载影音系统处于关机状态。
S1120,在该第一区域检测到用户时,控制该座舱处于第二状态。
示例性地,座舱处于第一状态包括车载显示屏(如显示屏201至204)处于熄屏状态时,控制该座舱处于第二状态可以包括:控制与第一区域对应的显示屏显示一个或多个应用程序的图标和/或卡片。其中,控制与第一区域对应的显示屏显示一个或多个应用程序的图标和/或卡片的具体方法可以参考上述实施例中的描述,在此不再赘述。需要说明的是,与第一区域对应的显示屏可以包括设置于第一区域的显示屏,示例性地,第一区域为座舱主驾驶区域时,则与第一区域对应的显示屏可以为主驾驶处屏幕如仪表盘、中控屏。
示例性地,座舱处于第一状态包括车载香氛处于关闭状态,即车载香氛不释放香氛;控制该座舱处于第二状态可以包括:控制车载香氛开启。即车载香氛开始释放香氛,其中,释放的香氛类型可以为设置的默认类型,或者也可以为最近一次使用的类型,或者也可以为其他类型,本申请实施例对此不作具体限定。
示例性地,座舱处于第一状态包括车载空调处于关机状态;控制该座舱处于第二状态可以包括:控制车载空调开启。在一些可能的实现方式中,可以控制与第一区域对应的车载空调开启。应理解,与第一区域对应的车载空调可以包括设置于第一区域的车载空调,示例性地,第一区域为座舱主驾驶区域时,则与第一区域对应的车载空调可以为主驾驶处空调。
示例性地,座舱处于第一状态包括车载影音系统处于关机状态;控制该座舱处于第二状态可以包括:控制车载影音系统开启。示例性地,车载影音系统开启时,可以播放音乐或者收音机频道等音频。
在一些可能的实现方式中,在检测到第一区域有用户时,控制该座舱由第一状态切换至第二状态可以包括上述示例中的一个,或者也可以为上述两个及以上示例的结合。一示例,座舱处于主驾驶处显示屏熄屏以及车载香氛处于关闭状态(第一状态)时,检测到主驾驶区域有用户,则控制车载香氛开启且控制主驾驶处显示屏显示一个或多个应用程序的图标和/或卡片(第二状态)。再一示例,座舱处于主驾驶处显示屏熄屏以及车载香氛和车载空调处于关闭状态(第一状态)时,检测到主驾驶区域有用户,则控制车载香氛和车载空调开启且控制主驾驶处显示屏显示一个或多个应用程序的图标和/或卡片(第二状态)。
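上述"检测到用户后由第一状态切换至第二状态"的联动控制可以概括为如下示意代码(Python,设备名称与状态字符串均为假设的简化数据结构,仅用于说明多设备联动的组合方式):

```python
def on_user_detected(cabin, zone):
    """检测到zone区域有用户时,将处于第一状态的各设备切换至第二状态(示意)。"""
    if cabin["screens"].get(zone) == "off":
        cabin["screens"][zone] = "icons_and_cards"  # 对应显示屏显示图标和/或卡片
    if cabin.get("fragrance") == "off":
        cabin["fragrance"] = "on"                   # 车载香氛开始释放香氛
    if cabin["ac"].get(zone) == "off":
        cabin["ac"][zone] = "on"                    # 开启对应区域的车载空调
    return cabin
```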
本申请实施例提供的控制显示的方法,能够在检测到座舱中有用户时,控制座舱状态变化,有助于提升用户使用座舱时的迎宾体验。
图13示出了本申请实施例提供的控制显示的方法1300的示意性流程图。该方法1300可以应用于图1所示的车辆100中,该方法也可以由图3所示的系统执行。图13示出的控制显示的方法的步骤或操作仅为示例性说明,本申请实施例还可以执行其他操作或者图13中的各个操作的变形。该方法1300包括:
S1310,检测座舱的第一区域是否有用户,其中,该座舱的第一显示区域处于熄屏状态,该第一显示区域与该第一区域相对应。
示例性地,检测座舱的第一区域是否有用户的具体方法可以参考上述实施例中的描述,在此不再赘述。
示例性地,该第一区域可以为上述实施例中的主驾驶区域,则第一显示区域可以为上述实施例中主驾驶处屏幕,如图2中所示屏幕201;该第一区域可以为上述实施例中的副驾驶区域,则第一显示区域可以为上述实施例中副驾驶处屏幕,如图2中所示屏幕202;该第一区域可以为上述实施例中的车辆的第二排左侧区域,则第一显示区域可以为上述实施例中主驾驶头枕后部屏幕,如图2中所示屏幕203;该第一区域可以为上述实施例中的车辆的第二排右侧区域,则第一显示区域可以为上述实施例中副驾驶头枕后部屏幕,如图2中所示屏幕204。
S1320,在该第一区域检测到用户时,控制该第一显示区域由该熄屏状态切换为显示界面元素。
示例性地,该界面元素可以为上述实施例中卡片、应用程序的图标、壁纸、动画中的至少一种,或者也可以为其他在车载屏幕上显示的内容。
示例性地,控制该第一显示区域由该熄屏状态切换为显示界面元素的方法流程可以参考上述实施例中的描述。
例如,如图5中的(a)和(b)所示,在主驾驶区域检测到用户时,控制与主驾驶区域对应的中控屏由熄屏界面或休眠显示界面切换为显示界面元素;在检测到后排左侧座椅有用户时,控制与后排左侧座椅对应的主驾驶头枕后部屏幕由熄屏界面或休眠显示界面切换为显示界面元素。
在一些可能的实现方式中,在主驾驶区域检测到用户时,控制与主驾驶区域对应的中控屏由熄屏界面或休眠显示界面切换为显示车辆安装的所有应用程序的图标。
可选地,还可以结合用户的人体特征或身份信息确定显示的界面元素的类型。
例如,如图9或图10所示,在检测到主驾驶头枕后部屏幕对应的位置处有用户在位时,若用户为男性用户(或少年用户),则控制主驾驶头枕后部屏幕由熄屏界面或休眠显示界面,显示男性(或少年用户)偏好的应用程序的图标;若用户为女性用户(或老年用户),则控制主驾驶头枕后部屏幕由熄屏界面或休眠显示界面,显示女性(或老年用户)偏好的应用程序的图标。
示例性地,结合表3所示的用户的身份信息和应用程序之间的对应关系,确定显示的应用程序的图标的类型。例如,在主驾驶区域检测到用户A,则可以控制与主驾驶区域对应的中控屏由熄屏界面或休眠显示界面,显示应用程序1的图标、应用程序2的图标和应用程序3的图标。
在一些可能的实现方式中,第一显示区域为主驾驶处屏幕,如图2中所示屏幕201时,第一区域可以为车辆的座舱内的全部区域。进一步地,在座舱内的任一区域检测到用户时,即控制主驾驶处屏幕由熄屏界面或休眠显示界面切换为显示界面元素。在一些可能的实现方式中,在座舱内的任一区域检测到用户时,即控制主驾驶处屏幕由熄屏界面或休眠显示界面切换为显示车辆安装的所有应用程序的图标。
在一些可能的实现方式中,在第一区域为副驾驶区域、车辆的第二排左侧区域、车辆的第二排右侧区域中的一个时,第一显示区域除了与上述各区域分别对应的屏幕202、203、204以外,还包括屏幕201。即第一区域为副驾驶区域时,则第一显示区域为屏幕201和屏幕202;第一区域为车辆的第二排左侧区域时,则第一显示区域为屏幕201和屏幕203;第一区域为车辆的第二排右侧区域时,则第一显示区域为屏幕201和屏幕204。
示例性地,检测到副驾驶区域有用户时,控制屏幕201和屏幕202由熄屏界面或休眠显示界面切换为显示界面元素;对于车辆的第二排左侧区域检测到用户时,控制屏幕201和屏幕203由熄屏界面或休眠显示界面切换为显示界面元素;对于车辆的第二排右侧区域检测到用户时,控制屏幕201和屏幕204由熄屏界面或休眠显示界面切换为显示界面元素。其中,屏幕201可以显示车辆安装的所有应用程序的图标,屏幕202至屏幕204显示的界面元素可以根据用户的人体特征或身份信息确定。
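上述区域与屏幕的对应关系可以用一个简单的映射表示意如下(Python,屏幕编号沿用图2中的201至204,区域名与数据结构均为假设):

```python
# 第一区域与需要唤醒的显示区域(屏幕)的对应关系示意
REGION_SCREENS = {
    "driver":     ["201"],          # 主驾驶区域:中控屏
    "passenger":  ["201", "202"],   # 副驾驶区域:中控屏+副驾驶处屏幕
    "rear_left":  ["201", "203"],   # 第二排左侧:中控屏+主驾驶头枕后部屏幕
    "rear_right": ["201", "204"],   # 第二排右侧:中控屏+副驾驶头枕后部屏幕
}

def screens_to_wake(region):
    """返回检测到用户后需要由熄屏切换为显示界面元素的屏幕列表。"""
    return REGION_SCREENS.get(region, [])
```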
本申请实施例提供的一种控制显示的方法,在检测到用户后,通过该用户所在区域对应的车载屏幕或车载屏幕的区域显示界面元素,有助于节省能耗,提高用户在使用车载屏幕时的交互体验。此外,座舱的不同区域均检测到用户时,不同显示区域上可以分别显示不同的界面元素,可以满足不同用户的需求。
图14示出了本申请实施例提供的控制显示的方法1400的示意性流程图。该方法1400可以应用于图1所示的车辆100中,该方法也可以由图3所示的系统执行。图14示出的控制显示的方法的步骤或操作仅为示例性说明,本申请实施例还可以执行其他操作或者图14中的各个操作的变形。该方法1400包括:
S1410,检测第一用户位于座舱的第一区域。
示例性地,检测座舱的第一区域是否有用户的具体方法可以参考上述实施例中的描述,在此不再赘述。
示例性地,该第一区域可以为上述实施例中的主驾驶区域,则第一显示区域可以为上述实施例中主驾驶处屏幕,如图2中所示屏幕201。应理解,第一区域也可以为座舱内的其他区域,第一显示区域也可以为其他与第一区域对应的显示屏或显示屏上的某个区域,具体对应关系可以参考上述实施例中的描述,在此不再赘述。
S1420,根据该第一用户的人体特征或身份信息控制该座舱的第一显示区域显示第一界面元素,该第一区域与该第一显示区域相对应。
示例性地,该人体特征可以如上述实施例中的描述,包括但不限于性别、年龄、情绪等。
示例性地,该身份信息可以如上述实施例中的描述,包括但不限于保存在车辆中的生物特征信息,以及账号等。其中,生物特征信息包括但不限于指纹、掌纹、面容信息、虹膜信息、步态信息等;账号可以包括登录车机系统的账号信息等。
示例性地,根据该第一用户的人体特征控制该座舱的第一显示区域显示第一界面元素的具体方法可以参考图9及图10对应的实施例。
例如,如图9或图10所示,在检测到主驾驶头枕后部屏幕对应的位置处有用户在位时,若用户为男性用户(或少年用户),则控制主驾驶头枕后部屏幕显示男性(或少年用户)偏好的应用程序的图标;若用户为女性用户(或老年用户),则控制主驾驶头枕后部屏幕显示女性(或老年用户)偏好的应用程序的图标。
示例性地,根据该第一用户的身份信息控制该座舱的第一显示区域显示第一界面元素的具体方法可以参考上述实施例中的描述。
例如,结合表3所示的用户的身份信息和应用程序之间的对应关系,确定显示的应用程序的图标的类型。例如,在主驾驶区域检测到用户A,则可以控制与主驾驶区域对应的中控屏显示应用程序1的图标、应用程序2的图标和应用程序3的图标。
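结合表3的"身份信息—应用程序"对应关系,图标类型的确定可以示意如下(Python,用户与应用程序名称均为假设的占位示例):

```python
# 表3所示对应关系的示意:身份信息 -> 偏好的应用程序图标列表
USER_APPS = {
    "用户A": ["应用程序1", "应用程序2", "应用程序3"],
    "用户B": ["应用程序4", "应用程序5"],
}

def icons_for_user(user_id, default=()):
    """根据用户身份信息确定显示的应用程序图标;未识别的用户返回默认列表。"""
    return USER_APPS.get(user_id, list(default))
```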
本申请实施例提供的一种控制显示的方法,可以根据用户的人体特征或身份信息确定界面元素的类型,即根据不同用户进行个性化界面元素推荐,有助于提升用户的交互体验、驾乘体验和娱乐体验。
在一些可能的实现方式中,在第一区域为副驾驶区域、车辆的第二排左侧区域、车辆的第二排右侧区域中的一个时,第一显示区域除了与上述各区域分别对应的屏幕202、203、204以外,还包括屏幕201。即第一区域为副驾驶区域时,则第一显示区域为屏幕201和屏幕202;第一区域为车辆的第二排左侧区域时,则第一显示区域为屏幕201和屏幕203;第一区域为车辆的第二排右侧区域时,则第一显示区域为屏幕201和屏幕204。
示例性地,检测到副驾驶区域有用户时,控制屏幕201和屏幕202均显示界面元素;对于车辆的第二排左侧区域检测到用户时,控制屏幕201和屏幕203显示界面元素;对于车辆的第二排右侧区域检测到用户时,控制屏幕201和屏幕204显示界面元素。其中,屏幕201可以显示车辆安装的所有应用程序的图标,屏幕202至屏幕204显示的界面元素可以根据用户的人体特征或身份信息确定。
可选地,在检测到该第一用户位于该座舱中的第二区域时,控制该座舱的第二显示区域显示该第一界面元素,该第二区域与该第二显示区域相对应。
例如,如图9所示,在男性用户由车辆的第二排左侧移动到车辆的第二排右侧时,控制副驾驶头枕后部屏幕显示男性偏好的应用程序图标。
可选地,还可以根据车辆的行驶状态信息、该第一用户的人体特征或身份信息控制该座舱的第一显示区域显示第一界面元素,该车辆包括该座舱。
例如,如图8所示,在第一用户位于主驾驶区域时,在车辆行驶时,可以控制与主驾驶对应的屏幕(如中控屏)显示导航类或地图类等与行驶相关的应用程序的图标,上述应用程序的图标可以是根据第一用户的人体特征或身份信息确定的。在车辆处于驻车状态时,可以控制主驾驶对应的屏幕(如中控屏)显示娱乐类应用程序的图标,上述应用程序的图标可以是根据第一用户的人体特征或身份信息确定的。
可选地,该方法还包括:在检测到第二用户位于该第二区域时,根据该第二用户的人体特征或身份信息控制该第二显示区域显示第二界面元素。
在一些可能的实现方式中,在该第一用户和该第二用户相同时,该第一界面元素和该第二界面元素相同。
可选地,该方法还包括:在检测到该第二用户位于该第一区域且该第一用户位于该第二区域时,控制该第一显示区域显示该第二界面元素且控制该第二显示区域显示该第一界面元素。
可选地,该方法还包括:根据该第一用户的人体特征或身份信息,控制该第一显示区域由熄屏状态切换至显示第一界面元素。
可选地,该第一界面元素包括一个或多个页签,该方法还包括:在检测到该第一用户针对该第一显示区域显示的第一页签的输入时,控制该第一显示区域显示一个或多个应用程序的图标;其中,该一个或多个页签包括该第一页签,该第一页签包括该一个或多个应用程序的图标。
例如,如图6所示,在检测到副驾驶区域有用户时,控制副驾娱乐屏显示一个或多个页签,在用户点击其中一个页签后,显示该页签包括的一个或多个应用程序的图标。
可选地,在检测到该第一用户针对该第一显示区域的输入时,控制该第一显示区域由该熄屏状态切换至显示该第一界面元素。
可选地,该座舱包括第一显示屏,该第一显示屏包括该第一显示区域和第二显示区域,该第二显示区域与该座舱的第二区域相对应,该方法还包括:在该第二区域未检测到用户时,控制该第二显示区域处于熄屏状态;在该第二区域检测到用户时,控制该第二显示区域显示界面元素。
例如,如图11所示,在主驾驶区域和副驾驶区域均检测到用户时,控制显示区域1和显示区域2显示界面元素;在主驾驶区域检测到用户,而副驾驶区域未检测到用户时,控制显示区域1显示界面元素,控制显示区域2熄屏;在副驾驶区域检测到用户,而主驾驶区域未检测到用户时,控制显示区域2显示界面元素,控制显示区域1熄屏。
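图11所示"同一显示屏上按对应座舱区域有无用户分别控制各显示区域亮/熄"的逻辑可以示意如下(Python,显示区域名与状态字符串均为假设):

```python
def update_display_areas(occupied):
    """occupied: {显示区域名: 对应座舱区域是否检测到用户}。
    有用户的显示区域显示界面元素,无用户的显示区域熄屏(示意)。
    """
    return {area: ("show_elements" if present else "screen_off")
            for area, present in occupied.items()}
```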
图15示出了本申请实施例提供的控制显示的方法1500的示意性流程图。该方法1500可以应用于图1所示的车辆100中,该方法也可以由图3所示的系统执行。图15示出的控制显示的方法的步骤或操作仅为示例性说明,本申请实施例还可以执行其他操作或者图15中的各个操作的变形。该方法1500包括:
S1510,确定用户在座舱内的第一区域。
示例性地,确定用户在座舱内的第一区域的具体方法可以参考上述实施例中的描述,在此不再赘述。
S1520,根据该第一区域,控制第一显示区域显示第一界面元素,该第一区域与该第一显示区域相对应。
示例性地,“第一区域与第一显示区域相对应”可以包括:该第一区域可以为上述实施例中的主驾驶区域,则第一显示区域可以为上述实施例中主驾驶处屏幕(如中控屏),如图2中所示屏幕201,即主驾驶区域与主驾驶处屏幕相对应;该第一区域可以为上述实施例中的副驾驶区域,则第一显示区域可以为上述实施例中副驾驶处屏幕,如图2中所示屏幕202,即副驾驶区域与副驾驶处屏幕相对应;该第一区域可以为上述实施例中的车辆的第二排左侧区域,则第一显示区域可以为上述实施例中主驾驶头枕后部屏幕,如图2中所示屏幕203,即第二排左侧区域与主驾驶头枕后部屏幕相对应;该第一区域可以为上述实施例中的车辆的第二排右侧区域,则第一显示区域可以为上述实施例中副驾驶头枕后部屏幕,如图2中所示屏幕204,即第二排右侧区域与副驾驶头枕后部屏幕相对应。
可选地,在该第一显示区域为中控屏时,该第一区域为该座舱内的任一区域。即,座舱的全部区域与中控屏相对应。例如,在主驾驶区域、副驾驶区域、后排区域中的任一区域有用户时,控制中控屏显示第一界面元素。
在一些可能的实现方式中,在第一区域为副驾驶区域、车辆的第二排左侧区域、车辆的第二排右侧区域中的一个时,第一显示区域除了与上述各区域分别对应的屏幕202、203、204以外,还包括屏幕201。即第一区域为副驾驶区域时,则与第一区域相对应的第一显示区域包括屏幕201和屏幕202;第一区域为车辆的第二排左侧区域时,则与第一区域相对应的第一显示区域为屏幕201和屏幕203;第一区域为车辆的第二排右侧区域时,则与第一区域相对应的第一显示区域为屏幕201和屏幕204。
在一些可能的实现方式中,在座舱内的任一区域检测到用户时,控制与主驾驶区域对应的中控屏显示车辆安装的所有应用程序的图标。
可选地,在该第一区域为该座舱内的副驾驶区域或该座舱内的后排区域时,该控制第一显示区域显示第一界面元素,包括:根据该用户的人体特征控制该第一显示区域显示该第一界面元素。
示例性地,根据该第一用户的人体特征控制该座舱的第一显示区域显示第一界面元素的具体方法可以参考图9及图10对应的实施例。
示例性地,该后排区域可以包括上述实施例中的第二排左侧区域和第二排右侧区域。在车辆为7座SUV时,上述后排区域可以为第二排区域和/或第三排区域。对于其他多座车辆,后排区域还可以包括除主驾驶区域和副驾驶区域以外的其他后排区域。
可选地,在该第一区域为该座舱内的副驾驶区域或该座舱内的后排区域时,该控制第一显示区域显示第一界面元素,包括:根据该用户的身份信息控制该第一显示区域显示该第一界面元素。
示例性地,根据该第一用户的身份信息控制该座舱的第一显示区域显示第一界面元素的具体方法可以参考上述实施例中的描述。
可选地,该方法还包括:在该第一区域没有用户时,控制该第一显示区域处于熄屏状态。
可选地,该控制第一显示区域显示第一界面元素,包括:根据移动载体的行驶状态信息控制该第一显示区域显示该第一界面元素,该移动载体包括该座舱。
可选地,该第一界面元素包括一个或多个页签,该方法还包括:在检测到该用户针对该第一显示区域显示的第一页签的输入时,控制该第一显示区域显示一个或多个应用程序的图标;其中,该一个或多个页签包括该第一页签,该第一页签包括该一个或多个应用程序的图标。
可选地,该第一页签包括的该一个或多个应用程序的图标为同一类别的应用程序的图标。
可选地,该座舱包括第一显示屏,该第一显示屏包括该第一显示区域和第二显示区域,该第二显示区域与该座舱的第二区域相对应,该方法还包括:在该第二区域未检测到用户时,控制该第二显示区域处于熄屏状态;在该第二区域检测到用户时,控制该第二显示区域显示第二界面元素。
在一些可能的实现方式中,第一界面元素与第二界面元素可以为相同界面元素。
可选地,该座舱包括第三显示区域,该第三显示区域与该座舱的第三区域相对应,该方法还包括:在该第三区域检测到用户时,控制该第三显示区域显示第三界面元素。
以第一区域为主驾驶区域、第三区域为副驾驶区域为例,如图8中的(a)所示,在主驾驶区域检测到用户A时,控制中控屏(如显示屏201)显示第一界面元素;在副驾驶区域检测到用户C时,控制副驾驶处屏幕(如显示屏202)显示第三界面元素。
以第一区域为主驾驶区域、第三区域为第二排右侧区域为例,如图8中的(b)所示,在主驾驶区域检测到用户A时,控制中控屏(如显示屏201)显示第一界面元素;在第二排右侧区域检测到用户D时,控制副驾驶头枕后部屏幕(如显示屏204)显示第三界面元素。
在一些可能的实现方式中,第一界面元素与第三界面元素可以为相同界面元素。
可选地,该第一界面元素还包括卡片、应用程序的图标、壁纸、动画中的至少一种。
在本申请的各个实施例中,如果没有特殊说明以及逻辑冲突,各个实施例之间的术语和/或描述具有一致性、且可以相互引用,不同的实施例中的技术特征根据其内在的逻辑关系可以组合形成新的实施例。
上文中结合图5至图15详细说明了本申请实施例提供的方法。下面将结合图16至图18详细说明本申请实施例提供的装置。应理解,装置实施例的描述与方法实施例的描述相互对应,因此,未详细描述的内容可以参见上文方法实施例,为了简洁,这里不再赘述。
图16示出了本申请实施例提供的一种控制显示的装置2000的示意性框图,该装置2000包括检测单元2010和处理单元2020。检测单元2010可以用于进行检测,和/或实现相应的通信功能,处理单元2020用于进行数据处理。需要说明的是,该装置2000可以包括图3所示的系统中的装置301至装置304中的一个或多个,或者,还可以包括除装置301至装置304以外的控制其他设备(如车载香氛、车载空调、车载影音系统)的装置,本申请实施例对此不作具体限定。
可选地,该装置2000还可以包括存储单元,该存储单元可以用于存储指令和/或数据,处理单元2020可以读取存储单元中的指令和/或数据,以使得装置实现前述方法实施例。
该装置2000可以包括用于执行图12至图14中的方法的单元。并且,该装置2000中的各单元和上述其他操作和/或功能分别为了实现图12至图14中的方法实施例的相应流程。
其中,当该装置2000用于执行图12中的方法1100时,检测单元2010可用于执行方法1100中的S1110,处理单元2020可用于执行方法1100中的S1120。
具体地,该装置2000包括:检测单元2010,用于检测座舱的第一区域是否有用户,其中,该座舱处于第一状态;处理单元2020,用于在该第一区域检测到用户时,控制该座舱处于第二状态。
在一些可能的实现方式中,该座舱包括车载香氛,该座舱处于第一状态,包括该车载香氛处于停止释放状态;该处理单元2020具体用于:控制该车载香氛开启。
在一些可能的实现方式中,该座舱包括车载空调,该座舱处于第一状态,包括该车载空调处于关机状态;该处理单元2020具体用于:控制该车载空调开启。
在一些可能的实现方式中,该座舱包括车载影音系统,该座舱处于第一状态,包括该车载影音系统处于关机状态;该处理单元2020具体用于:控制该车载影音系统开启。
在一些可能的实现方式中,该检测单元2010还用于检测座舱的第一区域是否有用户,其中,该座舱的第一显示区域处于熄屏状态,该第一显示区域与该第一区域相对应;该处理单元2020,用于在该第一区域检测到用户时,控制该第一显示区域由该熄屏状态切换为显示界面元素。
在一些可能的实现方式中,该处理单元2020具体用于:根据该用户的人体特征控制该第一显示区域显示该界面元素。
在一些可能的实现方式中,该处理单元2020具体用于:根据该用户的身份信息控制该第一显示区域显示该界面元素。
在一些可能的实现方式中,该处理单元2020具体用于:根据移动载体的行驶状态信息控制该第一显示区域显示该界面元素,该移动载体包括该座舱。
在一些可能的实现方式中,该界面元素包括一个或多个页签,该处理单元2020还用于:在该检测单元2010检测到该用户针对该第一显示区域显示的第一页签的输入时,控制该第一显示区域显示一个或多个应用程序的图标;其中,该一个或多个页签包括该第一页签,该第一页签包括该一个或多个应用程序的图标。
在一些可能的实现方式中,该第一页签包括的该一个或多个应用程序为同一类别的应用程序。
在一些可能的实现方式中,该处理单元2020具体用于:在该检测单元2010检测到该用户针对该第一显示区域的输入时,控制该第一显示区域由该熄屏状态切换至显示该界面元素。
在一些可能的实现方式中,该座舱包括第一显示屏,该第一显示屏包括该第一显示区域和第二显示区域,该第二显示区域与该座舱的第二区域相对应,该处理单元2020还用于:在该检测单元2010在该第二区域未检测到用户时,控制该第二显示区域处于熄屏状态;在该检测单元2010在该第二区域检测到用户时,控制该第二显示区域显示界面元素。
在一些可能的实现方式中,该界面元素还包括卡片、应用程序的图标、壁纸、动画中的至少一种。
在一些可能的实现方式中,该装置2000还可以用于执行图13中的方法1300,当该装置2000用于执行图13中的方法1300时,检测单元2010可用于执行方法1300中的S1310,处理单元2020可用于执行方法1300中的S1320。
具体地,该装置2000包括:检测单元2010,检测座舱的第一区域是否有用户,其中,该座舱的第一显示区域处于熄屏状态,该第一显示区域与该第一区域相对应;处理单元2020,用于在该第一区域检测到用户时,控制该第一显示区域由该熄屏状态切换为显示界面元素。
在一些可能的实现方式中,该处理单元2020具体用于:根据该用户的人体特征控制该第一显示区域显示该界面元素。
在一些可能的实现方式中,该处理单元2020具体用于:根据该用户的身份信息控制该第一显示区域显示该界面元素。
在一些可能的实现方式中,该处理单元2020具体用于:根据移动载体的行驶状态信息控制该第一显示区域显示该界面元素,该移动载体包括该座舱。
在一些可能的实现方式中,该界面元素包括一个或多个页签,该处理单元2020还用于:在该检测单元2010检测到该用户针对该第一显示区域显示的第一页签的输入时,控制该第一显示区域显示一个或多个应用程序的图标;其中,该一个或多个页签包括该第一页签,该第一页签包括该一个或多个应用程序的图标。
在一些可能的实现方式中,该第一页签包括的该一个或多个应用程序为同一类别的应用程序。
在一些可能的实现方式中,该处理单元2020具体用于:在该检测单元2010检测到该用户针对该第一显示区域的输入时,控制该第一显示区域由该熄屏状态切换至显示该界面元素。
在一些可能的实现方式中,该座舱包括第一显示屏,该第一显示屏包括该第一显示区域和第二显示区域,该第二显示区域与该座舱的第二区域相对应,该处理单元2020还用于:在该检测单元2010在该第二区域未检测到用户时,控制该第二显示区域处于熄屏状态;在该检测单元2010在该第二区域检测到用户时,控制该第二显示区域显示界面元素。
在一些可能的实现方式中,该界面元素还包括卡片、应用程序的图标、壁纸、动画中的至少一种。
在一些可能的实现方式中,该装置还包括:获取单元,用于获取资源占用信息,该资源占用信息用于表征分配给该第一显示区域的资源容量;该处理单元2020还用于根据该资源占用信息控制该第一显示区域显示该界面元素。
在一些可能的实现方式中,该装置2000还可以用于执行图14中的方法1400,当该装置2000用于执行图14中的方法1400时,检测单元2010可用于执行方法1400中的S1410,处理单元2020可用于执行方法1400中的S1420。
具体地,该装置2000包括:检测单元2010,用于检测第一用户位于座舱中的第一区域;处理单元2020,用于根据该第一用户的人体特征或身份信息控制该座舱的第一显示区域显示第一界面元素,该第一区域与该第一显示区域相对应。
在一些可能的实现方式中,该处理单元2020还用于:在该检测单元2010检测到该第一用户位于该座舱中的第二区域时,控制该座舱的第二显示区域显示该第一界面元素,该第二区域与该第二显示区域相对应。
在一些可能的实现方式中,该处理单元2020还用于:根据移动载体的行驶状态信息、该第一用户的人体特征或身份信息控制该座舱的第一显示区域显示第一界面元素,该移动载体包括该座舱。
在一些可能的实现方式中,该处理单元2020还用于:在该检测单元2010检测到第二用户位于该第二区域时,根据该第二用户的人体特征或身份信息控制该第二显示区域显示第二界面元素。
在一些可能的实现方式中,该处理单元2020还用于:在该检测单元2010检测到该第二用户位于该第一区域且该第一用户位于该第二区域时,控制该第一显示区域显示该第二界面元素且控制该第二显示区域显示该第一界面元素。
在一些可能的实现方式中,该处理单元2020具体用于:根据该第一用户的人体特征或身份信息,控制该第一显示区域由熄屏状态切换至显示第一界面元素。
在一些可能的实现方式中,该第一界面元素包括一个或多个页签,该处理单元2020具体用于:在检测到该第一用户针对该第一显示区域显示的第一页签的输入时,控制该第一显示区域显示一个或多个应用程序的图标;其中,该一个或多个页签包括该第一页签,该第一页签包括该一个或多个应用程序的图标。
在一些可能的实现方式中,该处理单元2020还用于:在检测到该第一用户针对该第一显示区域的输入时,控制该第一显示区域由该熄屏状态切换至显示该第一界面元素。
在一些可能的实现方式中,该座舱包括第一显示屏,该第一显示屏包括该第一显示区域和第二显示区域,该第二显示区域与该座舱的第二区域相对应,该处理单元2020还用于:在该第二区域未检测到用户时,控制该第二显示区域处于熄屏状态;在该第二区域检测到用户时,控制该第二显示区域显示界面元素。
在一些可能的实现方式中,该第一界面元素还包括卡片、应用程序的图标、壁纸、动画中的至少一种。
图17示出了本申请实施例提供的一种控制显示的装置2100的示意性框图,该装置2100包括确定单元2110和处理单元2120。
该装置2100可以包括用于执行图15中的方法的单元。并且,该装置2100中的各单元和上述其他操作和/或功能分别为了实现图15中的方法实施例的相应流程。
该装置2100包括:确定单元2110,用于确定用户在座舱内的第一区域;处理单元2120,用于根据该第一区域,控制第一显示区域显示界面元素,该第一区域与该第一显示区域相对应。
可选地,该第一区域为该座舱内的任一区域,该第一显示区域为中控屏。
可选地,该处理单元2120具体用于:根据该用户的人体特征或身份信息控制该第一显示区域显示该界面元素。
可选地,该第一区域为该座舱内的副驾驶区域或该座舱内的后排区域。
可选地,该处理单元2120还用于:在该第一区域没有用户时,控制该第一显示区域处于熄屏状态。
可选地,该处理单元2120具体用于:根据移动载体的行驶状态信息控制该第一显示区域显示该界面元素,该移动载体包括该座舱。
可选地,该装置2100还包括检测单元,该界面元素包括一个或多个页签,该处理单元2120还用于:在该检测单元检测到该用户针对该第一显示区域显示的第一页签的输入时,控制该第一显示区域显示一个或多个应用程序的图标;其中,该一个或多个页签包括该第一页签,该第一页签包括该一个或多个应用程序的图标。
可选地,该第一页签包括的该一个或多个应用程序的图标为同一类别的应用程序的图标。
可选地,该座舱包括第一显示屏,该第一显示屏包括该第一显示区域和第二显示区域,该第二显示区域与该座舱的第二区域相对应,该处理单元2120还用于:在该检测单元在该第二区域未检测到用户时,控制该第二显示区域处于熄屏状态;在该检测单元在该第二区域检测到用户时,控制该第二显示区域显示界面元素。
可选地,该装置还包括检测单元,该座舱包括第三显示区域,该第三显示区域与该座舱的第三区域相对应,该处理单元2120具体用于:在该检测单元在该第三区域检测到用户时,控制该第三显示区域显示界面元素。
可选地,该界面元素还包括卡片、应用程序的图标、壁纸、动画中的至少一种。
应理解,以上装置中各单元的划分仅是一种逻辑功能的划分,实际实现时可以全部或部分集成到一个物理实体上,也可以物理上分开。此外,装置中的单元可以以处理器调用软件的形式实现;例如装置包括处理器,处理器与存储器连接,存储器中存储有指令,处理器调用存储器中存储的指令,以实现以上任一种方法或实现该装置各单元的功能,其中处理器例如为通用处理器,例如CPU或微处理器,存储器为装置内的存储器或装置外的存储器。或者,装置中的单元可以以硬件电路的形式实现,可以通过对硬件电路的设计实现部分或全部单元的功能,该硬件电路可以理解为一个或多个处理器;例如,在一种实现中,该硬件电路为ASIC,通过对电路内元件逻辑关系的设计,实现以上部分或全部单元的功能;再如,在另一种实现中,该硬件电路可以通过PLD实现,以FPGA为例,其可以包括大量逻辑门电路,通过配置文件来配置逻辑门电路之间的连接关系,从而实现以上部分或全部单元的功能。以上装置的所有单元可以全部通过处理器调用软件的形式实现,或全部通过硬件电路的形式实现,或部分通过处理器调用软件的形式实现,剩余部分通过硬件电路的形式实现。
在本申请实施例中,处理器是一种具有信号的处理能力的电路,在一种实现中,处理器可以是具有指令读取与运行能力的电路,例如CPU、微处理器、GPU、或DSP等;在另一种实现中,处理器可以通过硬件电路的逻辑关系实现一定功能,该硬件电路的逻辑关系是固定的或可以重构的,例如处理器为ASIC或PLD实现的硬件电路,例如FPGA。在可重构的硬件电路中,处理器加载配置文档,实现硬件电路配置的过程,可以理解为处理器加载指令,以实现以上部分或全部单元的功能的过程。此外,还可以是针对人工智能设计的硬件电路,其可以理解为一种ASIC,例如NPU、TPU、DPU等。
可见,以上装置中的各单元可以是被配置成实施以上方法的一个或多个处理器(或处理电路),例如:CPU、GPU、NPU、TPU、DPU、微处理器、DSP、ASIC、FPGA,或这些处理器形式中至少两种的组合。
此外,以上装置中的各单元可以全部或部分可以集成在一起,或者可以独立实现。在一种实现中,这些单元集成在一起,以SOC的形式实现。该SOC中可以包括至少一个处理器,用于实现以上任一种方法或实现该装置各单元的功能,该至少一个处理器的种类可以不同,例如包括CPU和FPGA,CPU和人工智能处理器,CPU和GPU等。
在具体实现过程中,上述检测单元2010和处理单元2020所执行的各项操作可以由同一个处理器执行,或者,也可以由不同的处理器执行,例如分别由多个处理器执行。一示例,一个或多个处理器可以与图1中的感知系统120中一个或多个传感器相连接,从一个或多个传感器中获取用户在座舱内的位置信息并进行处理。或者,一个或多个处理器还可以与显示装置130中的一个或多个显示设备相连接,进而控制显示设备所显示应用程序的图标和/或卡片。示例性地,在具体实现过程中,上述一个或多个处理器可以设置在车机中的处理器,或者也可以为设置在其他车载终端中的处理器。示例性地,在具体实现过程中,上述装置2000或装置2100可以为设置在车机或者其他车载终端中的芯片。示例性地,在具体实现过程中,上述装置2000或装置2100可以为设置在车辆中的如图1所示的计算平台150。在一些可能的实现方式中,该装置2000或装置2100可以包括图3所示的装置301至装置304中的至少一个装置。
本申请实施例还提供了一种装置,该装置包括处理单元和存储单元,其中存储单元用于存储指令,处理单元执行存储单元所存储的指令,以使该装置执行上述实施例执行的方法或者步骤。
可选地,在具体实现过程中,上述处理单元可以包括图1所示的处理器151-15n中的至少一个;上述确定单元可以包括图1所示的处理器151-15n中的至少一个。上述检测单元可以为图1所示的感知系统120中某个传感器,或者也可以为图1所示的处理器151-15n。
图18是本申请实施例的一种控制显示的装置的示意性框图。图18所示的控制显示的装置2200可以包括:处理器2210、收发器2220以及存储器2230。其中,处理器2210、收发器2220以及存储器2230通过内部连接通路相连,该存储器2230用于存储指令,该处理器2210用于执行该存储器2230存储的指令,通过收发器2220接收/发送部分参数。可选地,存储器2230既可以和处理器2210通过接口耦合,也可以和处理器2210集成在一起。
在一些可能的实现方式中,该装置2200可以包括图3所示的装置301至装置304中的至少一个装置。
需要说明的是,上述收发器2220可以包括但不限于输入/输出接口(input/output interface)一类的收发装置,来实现装置2200与其他设备或通信网络之间的通信。
处理器2210可以采用通用的CPU,微处理器,ASIC,GPU或者一个或多个集成电路,用于执行相关程序,以实现本申请方法实施例的控制显示的方法。处理器2210还可以是一种集成电路芯片,具有信号的处理能力。在具体实现过程中,本申请的控制显示的方法的各个步骤可以通过处理器2210中的硬件的集成逻辑电路或者软件形式的指令完成。上述处理器2210还可以是通用处理器、DSP、ASIC、FPGA或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。可以实现或者执行本申请实施例中的公开的各方法、步骤及逻辑框图。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。结合本申请实施例所公开的方法的步骤可以直接体现为硬件译码处理器执行完成,或者用译码处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器,闪存、只读存储器,可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器2230,处理器2210读取存储器2230中的信息,结合其硬件执行本申请方法实施例的控制显示的方法。
存储器2230可以是只读存储器(read only memory,ROM),静态存储设备,动态存储设备或者随机存取存储器(random access memory,RAM)。
收发器2220使用例如但不限于收发器一类的收发装置,来实现装置2200与其他设备或通信网络之间的通信。例如,可以通过收发器2220获取用户位置信息。
本申请实施例还提供一种移动载体,该移动载体可以包括上述装置2000,或者上述装置2100,或者上述装置2200。
示例性地,该移动载体可以为上述实施例中的车辆。
本申请实施例还提供了一种计算机程序产品,该计算机程序产品包括:计算机程序代码,当该计算机程序代码在计算机上运行时,使得计算机执行上述图12至图15中的方法。
本申请实施例还提供一种计算机可读存储介质,该计算机可读介质存储有程序代码或指令,当该计算机程序代码或指令被计算机的处理器执行时,使得该处理器实现上述图12至图15中的方法。
本申请实施例还提供一种芯片,包括:至少一个处理器和存储器,该至少一个处理器与该存储器耦合,用于读取并执行该存储器中的指令,以执行上述图12至图15中的方法。
应理解,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程以及有益效果,可以参考前述方法实施例中的对应过程,在此不再赘述。
在实现过程中,上述方法的各步骤可以通过处理器中的硬件的集成逻辑电路或者软件形式的指令完成。结合本申请实施例所公开的方法可以直接体现为硬件处理器执行完成,或者用处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器、闪存、只读存储器、可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器,处理器读取存储器中的信息,结合其硬件完成上述方法的步骤。为避免重复,这里不再详细描述。
本申请实施例中所使用的术语只是为了描述特定实施例的目的,而并非旨在作为对本申请的限制。如在本申请的说明书和所附权利要求书中所使用的那样,单数表达形式“一个”、“一种”、“所述”、“上述”、“该”和“这一”旨在也包括例如“一个或多个”这种表达形式,除非其上下文中明确地有相反指示。还应当理解,在本申请以下各实施例中,“至少一个”、“一个或多个”是指一个、两个或两个以上。术语“和/或”,用于描述关联对象的关联关系,表示可以存在三种关系;例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A、B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一项(个)”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b,或c中的至少一项(个),可以表示:a,b,c,a-b,a-c,b-c,或a-b-c,其中a,b,c可以是单个,也可以是多个。
在本说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。
应理解,在本申请的各种实施例中,上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、ROM、RAM、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (27)

  1. 一种控制显示的方法,其特征在于,包括:
    确定用户在座舱内的第一区域;
    根据所述第一区域,控制第一显示区域显示第一界面元素,所述第一区域与所述第一显示区域相对应。
  2. 根据权利要求1所述的方法,其特征在于,所述第一显示区域为中控屏,所述第一区域为所述座舱内的任一区域。
  3. 根据权利要求1所述的方法,其特征在于,所述控制第一显示区域显示第一界面元素,包括:
    根据所述用户的人体特征或身份信息控制所述第一显示区域显示所述第一界面元素。
  4. 根据权利要求3所述的方法,其特征在于,所述第一区域为所述座舱内的副驾驶区域或所述座舱内的后排区域。
  5. 根据权利要求1至4中任一项所述的方法,所述方法还包括:
    在所述第一区域没有用户时,控制所述第一显示区域处于熄屏状态。
  6. 根据权利要求1至5中任一项所述的方法,其特征在于,所述控制第一显示区域显示第一界面元素,包括:
    根据移动载体的行驶状态信息控制所述第一显示区域显示所述第一界面元素,所述移动载体包括所述座舱。
  7. 根据权利要求1至6中任一项所述的方法,其特征在于,所述第一界面元素包括一个或多个页签,所述方法还包括:
    在检测到所述用户针对所述第一显示区域显示的第一页签的输入时,控制所述第一显示区域显示一个或多个应用程序的图标;
    其中,所述一个或多个页签包括所述第一页签,所述第一页签包括所述一个或多个应用程序的图标。
  8. 根据权利要求7所述的方法,其特征在于,所述第一页签包括的所述一个或多个应用程序的图标为同一类别的应用程序的图标。
  9. 根据权利要求1至8中任一项所述的方法,其特征在于,所述座舱包括第一显示屏,所述第一显示屏包括所述第一显示区域和第二显示区域,所述第二显示区域与所述座舱的第二区域相对应,所述方法还包括:
    在所述第二区域未检测到用户时,控制所述第二显示区域处于熄屏状态;
    在所述第二区域检测到用户时,控制所述第二显示区域显示第二界面元素。
  10. 根据权利要求1至9中任一项所述的方法,其特征在于,所述座舱包括第三显示区域,所述第三显示区域与所述座舱的第三区域相对应,所述方法还包括:
    在所述第三区域检测到用户时,控制所述第三显示区域显示第三界面元素。
  11. 根据权利要求1至10中任一项所述的方法,其特征在于,所述第一界面元素包括卡片、应用程序的图标、壁纸、和动画中的至少一种。
  12. 一种控制显示的装置,其特征在于,包括:
    确定单元,用于确定用户在座舱内的第一区域;
    处理单元,用于根据所述第一区域,控制第一显示区域显示第一界面元素,所述第一区域与所述第一显示区域相对应。
  13. 根据权利要求12所述的装置,其特征在于,所述第一显示区域为中控屏,所述第一区域为所述座舱内的任一区域。
  14. 根据权利要求12所述的装置,其特征在于,所述处理单元用于:
    根据所述用户的人体特征或身份信息控制所述第一显示区域显示所述第一界面元素。
  15. 根据权利要求14所述的装置,其特征在于,所述第一区域为所述座舱内的副驾驶区域或所述座舱内的后排区域。
  16. 根据权利要求12至15中任一项所述的装置,所述处理单元还用于:
    在所述第一区域没有用户时,控制所述第一显示区域处于熄屏状态。
  17. 根据权利要求12至16中任一项所述的装置,其特征在于,所述处理单元用于:
    根据移动载体的行驶状态信息控制所述第一显示区域显示所述第一界面元素,所述移动载体包括所述座舱。
  18. 根据权利要求12至17中任一项所述的装置,其特征在于,所述装置还包括检测单元,所述第一界面元素包括一个或多个页签,所述处理单元还用于:
    在所述检测单元检测到所述用户针对所述第一显示区域显示的第一页签的输入时,控制所述第一显示区域显示一个或多个应用程序的图标;
    其中,所述一个或多个页签包括所述第一页签,所述第一页签包括所述一个或多个应用程序的图标。
  19. 根据权利要求18所述的装置,其特征在于,所述第一页签包括的所述一个或多个应用程序的图标为同一类别的应用程序的图标。
  20. 根据权利要求12至19中任一项所述的装置,其特征在于,所述装置还包括检测单元,所述座舱包括第一显示屏,所述第一显示屏包括所述第一显示区域和第二显示区域,所述第二显示区域与所述座舱的第二区域相对应,所述处理单元还用于:
    在所述检测单元在所述第二区域未检测到用户时,控制所述第二显示区域处于熄屏状态;
    在所述检测单元在所述第二区域检测到用户时,控制所述第二显示区域显示第二界面元素。
  21. 根据权利要求12至20中任一项所述的装置,其特征在于,所述装置还包括检测单元,所述座舱包括第三显示区域,所述第三显示区域与所述座舱的第三区域相对应,所述处理单元还用于:
    在所述检测单元在所述第三区域检测到用户时,控制所述第三显示区域显示第三界面元素。
  22. 根据权利要求12至21中任一项所述的装置,其特征在于,所述第一界面元素包括卡片、应用程序的图标、壁纸、和动画中的至少一种。
  23. 一种控制显示的装置,其特征在于,包括:
    存储器,用于存储计算机程序;
    处理器,用于执行所述存储器中存储的计算机程序,以使得所述装置执行如权利要求1至11中任一项所述的方法。
  24. 一种移动载体,其特征在于,包括如权利要求12至23中任一项所述的装置。
  25. 根据权利要求24所述的移动载体,其特征在于,所述移动载体为车辆。
  26. 一种计算机可读存储介质,其特征在于,其上存储有指令,所述指令被处理器执行时,以使得处理器实现如权利要求1至11中任一项所述的方法。
  27. 一种芯片,其特征在于,所述芯片包括处理器与数据接口,所述处理器通过所述数据接口读取存储器上存储的指令,以执行如权利要求1至11中任一项所述的方法。
PCT/CN2023/091297 2022-07-29 2023-04-27 控制显示的方法、装置和移动载体 WO2024021720A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210904518.2A CN115309285A (zh) 2022-07-29 2022-07-29 控制显示的方法、装置和移动载体
CN202210904518.2 2022-07-29

Publications (1)

Publication Number Publication Date
WO2024021720A1 true WO2024021720A1 (zh) 2024-02-01

Family

ID=83859009

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/091297 WO2024021720A1 (zh) 2022-07-29 2023-04-27 控制显示的方法、装置和移动载体

Country Status (2)

Country Link
CN (1) CN115309285A (zh)
WO (1) WO2024021720A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115309285A (zh) * 2022-07-29 2022-11-08 华为技术有限公司 控制显示的方法、装置和移动载体

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105966334A (zh) * 2015-10-12 2016-09-28 乐卡汽车智能科技(北京)有限公司 一种控制车载信息娱乐系统运行的方法和装置
CN215284684U (zh) * 2021-04-12 2021-12-24 广州汽车集团股份有限公司 座舱娱乐系统及车辆
CN114415926A (zh) * 2021-12-31 2022-04-29 上海洛轲智能科技有限公司 一种智能座舱显示触摸屏的应用布局显示方法及装置
CN114407735A (zh) * 2022-02-17 2022-04-29 芜湖雄狮汽车科技有限公司 汽车座舱的控制方法、装置、车辆及存储介质
CN115309285A (zh) * 2022-07-29 2022-11-08 华为技术有限公司 控制显示的方法、装置和移动载体

Also Published As

Publication number Publication date
CN115309285A (zh) 2022-11-08

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23844935

Country of ref document: EP

Kind code of ref document: A1