WO2024051569A1 - A Scenario Display Method and Electronic Device

A Scenario Display Method and Electronic Device

Info

Publication number
WO2024051569A1
Authority
WO
WIPO (PCT)
Prior art keywords
scenario
target
user
electronic device
vehicle
Prior art date
Application number
PCT/CN2023/116118
Other languages
English (en)
French (fr)
Inventor
王晋
郭浩
陈雨
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2024051569A1


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/22Display screens
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/28Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/037Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Definitions

  • the present application relates to the field of terminal technology, and in particular, to a scenario display method and an electronic device.
  • the cockpit scene is a human-computer interaction scene in the vehicle cockpit.
  • the main purpose of the smart cockpit is to integrate driving information and entertainment information, and use the data processing capabilities of on-board equipment to provide users with an efficient and intuitive driving experience.
  • Smart cockpit technology is mainly used to analyze users' driving needs and entertainment needs to improve users' driving experience. While meeting users' basic driving needs, how to provide users with a better interactive experience has also become an important research and development direction for smart cockpit technology.
  • This application provides a scene display method and electronic device to improve the interactivity between the electronic device and the user and improve the user experience.
  • this application provides a scenario presentation method, which can be applied to electronic devices.
  • the method includes: the electronic device obtains the current time; the electronic device determines that the current time satisfies the triggering condition of the target scenario, and executes the response action corresponding to the target scenario according to the scenario content of the target scenario; wherein the target scenario is any one of at least one scenario to be displayed stored in the electronic device.
  • when the electronic device determines that the current time meets the triggering conditions of the target scenario, it can execute the response action corresponding to the target scenario to display the target scenario, improving the interactivity between the electronic device and the user and thereby enhancing the user experience.
  • the electronic device can be a vehicle-mounted device, and the vehicle-mounted device can display the target scenario when it is determined that the current time meets the trigger conditions corresponding to the target scenario, providing users with a sense of surprise and improving the user's driving experience.
  • the triggering conditions of the target scenario include effective time and effective frequency; determining that the current time meets the triggering conditions of the target scenario includes: determining that the current time is within the range of the effective time, and the The device status of the electronic device is consistent with the device status corresponding to the effective frequency.
  • the electronic device can set the effective time and effective frequency of the target scenario.
  • the electronic device can display the target scenario according to the effective frequency within the effective time.
  • the effective time can be the time associated with the target scenario. For example, when the target scenario is a holiday egg, the effective time of the target scenario can be the day of the holiday, thereby providing users with a surprising experience.
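  • purely as an illustrative sketch of how such a scenario and its trigger condition might be modeled in code, the following Java snippet defines a minimal data structure; the class, field, and enum names (Scenario, TriggerCondition, EffectiveFrequency) are hypothetical and are not taken from this application.

```java
import java.time.LocalDateTime;

// Hypothetical effective-frequency values: display on every power-on,
// on every power-off, or only once (the last imposes no device-status requirement).
enum EffectiveFrequency { EVERY_POWER_ON, EVERY_POWER_OFF, ONLY_ONCE }

// Hypothetical trigger condition: an effective time range plus an effective frequency.
class TriggerCondition {
    LocalDateTime effectiveStart;  // start of the effective time
    LocalDateTime effectiveEnd;    // end of the effective time
    EffectiveFrequency frequency;  // effective frequency
}

// Hypothetical scenario to be displayed: scenario content plus its trigger condition.
class Scenario {
    String name;              // e.g. a holiday egg for a given festival
    String multimediaUri;     // picture/animation/video shown on the display screen
    String musicUri;          // background music
    int ambientLightColor;    // setting parameter for the in-car ambient light
    TriggerCondition trigger; // when the scenario should be displayed
}
```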
  • after executing the response action corresponding to the target scenario, the method further includes: in response to the user's first operation, or upon determining that the display duration of the target scenario is longer than a set duration, ending the display of the target scenario.
  • the first operation may be a click operation, a double-click operation, a sliding operation, a long press operation, etc. that are applied to the display screen.
  • the first operation may also be an operation of switching forward gear.
  • when the electronic device displays the target scenario, the user can trigger the first operation to make the electronic device end the display of the target scenario; or the electronic device can preset a set duration for the target scenario, and when the electronic device determines that the display duration of the target scenario is longer than the set duration, it can end the display of the target scenario, thereby flexibly ending the display of the target scenario.
  • the method further includes: in response to the user's second operation, displaying a scenario interface, the scenario interface including a plurality of candidate scenarios; in response to the user's third operation, taking at least one candidate scenario corresponding to the third operation as the at least one scenario to be displayed.
  • the second operation is used to trigger the electronic device to display the scenario interface.
  • the second operation may be a click operation acting on a control corresponding to the scenario interface.
  • the third operation is used to select at least one candidate scenario among multiple candidate scenarios.
  • the third operation may be a click operation on an icon corresponding to at least one candidate scenario.
  • the electronic device is a vehicle-mounted device; the scenario content of the target scenario includes multimedia content and/or setting parameters corresponding to in-vehicle facilities.
  • the electronic device can be a vehicle-mounted device, and the scenario display method provided in this application can be applied to smart cockpit scenarios.
  • the scenario content of the target scenario may include multimedia content and/or setting parameters corresponding to the in-vehicle facilities, so that the vehicle-mounted device can provide the user with an immersive scenario experience when displaying the target scenario.
  • the type of the at least one scenario to be displayed includes a holiday egg type, a user information customization type, a vehicle information customization type, and a user-defined type.
  • electronic devices can store a variety of scenarios to meet the needs of displaying target scenarios in a variety of scenarios.
  • electronic devices can display holiday eggs in festival scenarios, display user information customization type scenarios in user anniversary scenarios, and so on, to enhance the user experience.
  • when the type of the target scenario is a user-defined type, before determining that the current time meets the triggering conditions of the target scenario and executing the response action corresponding to the target scenario according to the scenario content of the target scenario, the method further includes: in response to the user's fourth operation, displaying a creation interface for a user-defined type scenario, the creation interface being used to create the target scenario; and determining and storing the scenario content and trigger conditions of the target scenario according to the user's operations.
  • the fourth operation is used to trigger the electronic device to create a new user-defined type scenario.
  • the fourth operation may be a click operation acting on the newly created scenario control.
  • electronic devices can create user-defined types of scenarios based on user operations, so that users can create new scenarios according to their own needs to meet their personalized needs and improve user experience.
  • the method further includes: according to the scenario types of multiple first scenarios and a preset priority order of the scenario types, using the first scenario whose scenario type has the highest priority as the target scenario; wherein the multiple first scenarios belong to the multiple scenarios to be displayed.
  • the electronic device can select, according to the priority order of the scenario types, the first scenario with the highest priority among the multiple first scenarios as the target scenario, preventing the electronic device from affecting the user experience by continuously displaying multiple scenarios.
  • the response action corresponding to the target scenario includes at least one of the following: displaying multimedia content on the display screen of the electronic device; playing music; adjusting the ambient lighting in the car; adjusting the seat; turning on/off the in-car fragrance; adjusting the in-car air conditioning; opening/closing the sunroof; opening/closing the trunk; turning on/off the lights.
  • the in-car facilities can be adjusted to provide users with an immersive scenario experience and improve the user experience.
  • the multimedia content includes at least one of the following: multimedia content preset by the operating system of the electronic device; multimedia content obtained by the electronic device from a server; multimedia content captured by the electronic device; and multimedia content received by the electronic device from other electronic devices.
  • multimedia content can come from multiple sources, thereby providing rich multimedia content for electronic devices.
  • executing the response action corresponding to the target scenario according to the scenario content of the target scenario includes: displaying the multimedia content on the display screen of the electronic device, where the multimedia content is located on a system layer, and the system layer is located above the background layer and the application layer.
  • electronic devices can provide scenario display services at the system level without affecting the background display or application display of the electronic device.
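  • a minimal sketch of the layering idea, assuming a toy compositor model: the layer names and the Compositor class below are hypothetical and only illustrate that the scenario's multimedia content is drawn on a system layer stacked above the background layer and the application layer, so neither is affected.

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical compositor: layers later in the list are drawn on top of earlier ones.
class Compositor {
    // Bottom-to-top drawing order; the scenario content lives on the "system" layer.
    private final List<String> layers = Arrays.asList("background", "application", "system");

    void drawFrame() {
        for (String layer : layers) {
            System.out.println("draw layer: " + layer); // apps stay intact underneath the scenario
        }
    }
}
```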
  • the present application provides an electronic device, which includes a plurality of functional modules; the plurality of functional modules interact to implement the method shown in the first aspect and its respective embodiments.
  • the multiple functional modules can be implemented based on software, hardware, or a combination of software and hardware, and the multiple functional modules can be arbitrarily combined or divided based on specific implementation.
  • the present application provides an electronic device, including at least one processor and at least one memory.
  • Computer program instructions are stored in the at least one memory.
  • when the computer program instructions are executed by the at least one processor, the at least one processor executes the methods shown in the first aspect and its respective embodiments.
  • the present application also provides a computer program product.
  • when the computer program product is run on a computer, it causes the computer to execute the methods shown in any of the above aspects and their respective embodiments.
  • the present application also provides a computer-readable storage medium.
  • a computer program is stored in the computer-readable storage medium.
  • when the computer program is executed by a computer, it causes the computer to execute the methods shown in any of the above aspects and their respective embodiments.
  • this application also provides a chip, which is used to read a computer program stored in a memory and execute the methods shown in any of the above aspects and their respective embodiments.
  • the present application also provides a chip system.
  • the chip system includes a processor and is used to support a computer device to implement any of the above aspects and the methods shown in each embodiment.
  • the chip system further includes a memory, and the memory is used to store necessary programs and data of the computer device.
  • the chip system can be composed of chips or include chips and other discrete devices.
  • Figure 1 is a schematic diagram of a vehicle-mounted device displaying a homepage on a display screen according to an embodiment of the present application
  • Figure 2 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • Figure 3 is a software structure block diagram of an electronic device provided by an embodiment of the present application.
  • Figure 4 is a schematic diagram of a holiday egg scene provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of a user information customization scenario provided by an embodiment of the present application.
  • Figure 6 is a schematic diagram of a vehicle information type scenario provided by an embodiment of the present application.
  • Figure 7 is a schematic diagram of a video generated based on pictures provided by an embodiment of the present application.
  • Figure 8 is a schematic diagram of a scenario interface provided by an embodiment of the present application.
  • Figure 9 is a schematic diagram of a background selection interface provided by an embodiment of the present application.
  • Figure 10 is a schematic diagram of a text editing interface provided by an embodiment of the present application.
  • Figure 11 is a schematic diagram of a music selection interface provided by an embodiment of the present application.
  • Figure 12 is a schematic diagram of an ambient light setting interface provided by an embodiment of the present application.
  • Figure 13 is an effective time setting interface provided by an embodiment of the present application.
  • Figure 14 is a schematic diagram of a preview interface of a user-defined scene provided by an embodiment of the present application.
  • Figure 15 is a flow chart of a scenario presentation method provided by an embodiment of the present application.
  • Figure 16 is a schematic diagram of ending the display target scenario provided by the embodiment of the present application.
  • Figure 17 is a flow chart of a scenario presentation method provided by an embodiment of the present application.
  • At least one refers to one or more, and “multiple” refers to two or more.
  • “And/or” describes the relationship between associated objects, indicating that there can be three relationships. For example, A and/or B can mean: A exists alone, A and B exist simultaneously, and B exists alone, where A, B can be singular or plural. The character “/” generally indicates that the related objects are in an “or” relationship. "At least one (item) of the following” or similar expressions thereof refers to any combination of these items, including any combination of single item (items) or plural items (items).
  • At least one of a, b or c can mean: a, b, c, a and b, a and c, b and c, or a, b and c, where a, b, and c can be singular or plural.
  • the cockpit scene is a human-computer interaction scene in the vehicle cockpit.
  • the main purpose of the smart cockpit is to integrate driving information and entertainment information, and use the data processing capabilities of on-board equipment to provide users with an efficient and intuitive driving experience.
  • the vehicle-mounted device can control the air conditioner, seats, etc. on the vehicle.
  • the vehicle-mounted device can also include a display screen.
  • the vehicle-mounted device can display content on the display screen to assist the user in driving the vehicle or provide entertainment content for the user.
  • Figure 1 is a schematic diagram of a vehicle-mounted device displaying a home page on a display screen according to an embodiment of the present application.
  • the vehicle-mounted device includes a display screen, and the vehicle-mounted device can display a home page on the display screen.
  • the home page can include display content such as vehicle settings, applications, navigation, current time, music, etc., and can also include user information such as flight information.
  • at the bottom of the interface are controls corresponding to commonly used in-car facilities.
  • the user can click the control corresponding to each in-car facility to control the vehicle-mounted device to perform the corresponding response action. For example, the user clicks the air-conditioning control to trigger the vehicle-mounted device to turn the in-vehicle air conditioning on or off.
  • Smart cockpit technology is mainly used to analyze users' driving needs and entertainment needs to improve users' driving experience. While meeting users' basic driving needs, how to provide users with a better interactive experience has also become an important research and development direction for smart cockpit technology.
  • this application provides a scenario display method and device.
  • the electronic device obtains the current time, determines that the current time satisfies the trigger condition corresponding to the target scenario, and executes the response action of the target scenario according to the scenario content of the target scenario.
  • the target scenario is any one of at least one scenario to be displayed stored in the electronic device, and the response action of the target scenario may include displaying, on the display screen, the multimedia content in the scenario content of the target scenario.
  • when the electronic device determines that the current time meets the triggering conditions of the target scenario, it can execute the response action corresponding to the target scenario, improving the interactivity between the electronic device and the user and enhancing the user experience.
  • the electronic device can be a vehicle-mounted device, and the vehicle-mounted device can display the target scenario when it is determined that the current time meets the trigger conditions corresponding to the target scenario, providing users with a sense of surprise and improving the user's driving experience.
  • the electronic device may be a vehicle-mounted device, or may also be a tablet computer, a mobile phone, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a wearable device, etc.
  • FIG. 2 is a schematic structural diagram of an electronic device 100 provided by an embodiment of the present application.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, and a battery 142 , Antenna 1, Antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone interface 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193 , display screen 194, and subscriber identification module (subscriber identification module, SIM) card interface 195, etc.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have been recently used or recycled by processor 110 . If the processor 110 needs to use the instructions or data again, it can be called directly from the memory. Repeated access is avoided and the waiting time of the processor 110 is reduced, thus improving the efficiency of the system.
  • the USB interface 130 is an interface that complies with the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices.
  • the charging management module 140 is used to receive charging input from the charger.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, internal memory 121, external memory, display screen 194, camera 193, wireless communication module 160, etc.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization. For example: Antenna 1 can be reused as a diversity antenna for a wireless LAN. In other embodiments, antennas may be used in conjunction with tuning switches.
  • the mobile communication module 150 can provide solutions for wireless communication including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be disposed in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the wireless communication module 160 can provide applications on the electronic device 100 including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) network), Bluetooth (bluetooth, BT), and global navigation satellites. Wireless communication solutions such as global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared technology (infrared, IR), etc.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the Beidou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS) and/or the satellite based augmentation systems (SBAS).
  • the display screen 194 is used to display a display interface of an application, such as displaying a display page of an application installed on the electronic device 100 .
  • Display 194 includes a display panel.
  • the display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • Camera 193 is used to capture still images or video.
  • the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other format image signals.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the electronic device 100 .
  • the internal memory 121 may include a program storage area and a data storage area.
  • the stored program area can store an operating system, software code of at least one application program, etc.
  • the storage data area may store data generated during use of the electronic device 100 (such as captured images, recorded videos, etc.).
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, universal flash storage (UFS), etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function. For example, save pictures, videos, etc. files on an external memory card.
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the sensor module 180 may include a pressure sensor 180A, an acceleration sensor 180B, a touch sensor 180C, etc.
  • the pressure sensor 180A is used to sense pressure signals and can convert the pressure signals into electrical signals.
  • pressure sensor 180A may be disposed on display screen 194 .
  • the touch sensor 180C is also known as a "touch panel".
  • the touch sensor 180C can be disposed on the display screen 194.
  • the touch sensor 180C and the display screen 194 form a touch screen, which is also called a "touch screen”.
  • the touch sensor 180C is used to detect a touch operation on or near the touch sensor 180C.
  • the touch sensor can pass the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through display screen 194 .
  • the touch sensor 180C may also be disposed on the surface of the electronic device 100 at a location different from that of the display screen 194 .
  • the buttons 190 include a power button, a volume button, etc.
  • Key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for vibration prompts for incoming calls and can also be used for touch vibration feedback.
  • touch operations for different applications can correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also be customized.
  • the indicator 192 can be an indicator light, which can be used to indicate charging status, power changes, or can be used to indicate messages, missed calls, Notifications etc.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be connected to and separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the structure shown in FIG. 2 does not constitute a specific limitation on the electronic device 100.
  • the electronic device may include more or fewer components than shown in the figure, some components may be combined or split, or a different component arrangement may be used.
  • the combination/connection relationship between the components in Figure 2 can also be adjusted and modified.
  • Figure 3 is a software structure block diagram of an electronic device provided by an embodiment of the present application.
  • the software structure of electronic equipment can be a layered architecture.
  • the software can be divided into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the operating system is divided into four layers, from top to bottom: application layer, application framework layer (framework, FWK), runtime (runtime) and system library, and kernel layer.
  • the application layer can include a series of application packages. As shown in Figure 3, the application layer can include cameras, settings, skin modules, user interface (UI), third-party applications, etc. Among them, third-party applications can include gallery, calendar, calls, maps, navigation, WLAN, Bluetooth, music, video, short messages, etc.
  • the application layer may include a target installation package of a target application that the electronic device requests to download from the server, and the function files and layout files in the target installation package are adapted to the electronic device.
  • the application framework layer provides an application programming interface (API) and programming framework for applications in the application layer.
  • the application framework layer can include some predefined functions. As shown in Figure 3, the application framework layer can include window manager, content provider, view system, phone manager, resource manager, and notification manager.
  • a window manager is used to manage window programs.
  • the window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make this data accessible to applications. Said data can include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, etc.
  • a view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • Telephone managers are used to provide communication functions of electronic devices. For example, call status management (including connected, hung up, etc.).
  • the resource manager provides various resources to applications, such as localized strings, icons, pictures, layout files, video files, etc.
  • the notification manager allows applications to display notification information in the status bar, which can be used to convey notification-type messages and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of charts or scroll-bar text, such as notifications for applications running in the background, or notifications that appear on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a beep sounds, the electronic device vibrates, or the indicator light flashes.
  • the runtime includes core libraries and virtual machines.
  • the runtime is responsible for the scheduling and management of the operating system.
  • the core library contains two parts: one part is the function libraries that the Java language needs to call, and the other part is the core library of the operating system.
  • the application layer and application framework layer run in virtual machines.
  • the virtual machine converts the Java files of the application layer and the application framework layer into binary files and executes them.
  • the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
  • System libraries can include multiple functional modules. For example: surface manager (surface manager), media libraries (media libraries), three-dimensional graphics processing libraries (for example: OpenGL ES), two-dimensional graphics engines (for example: SGL), image processing libraries, etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition, and layer processing.
  • 2D Graphics Engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • the hardware layer can include various types of sensors, such as acceleration sensors, gyroscope sensors, touch sensors, etc.
  • the structure shown in Figure 2 and Figure 3 is only an example of the electronic device provided by the embodiment of the present application, and cannot limit the electronic device provided by the embodiment of the present application.
  • the electronic device may include more or fewer devices or modules than the structures shown in Figure 2 or Figure 3.
  • the following uses vehicle-mounted equipment as an example to introduce the scenario display method provided by the embodiment of the present application.
  • scenarios can also be called smart scenarios.
  • the vehicle-mounted device can store multiple scenarios, and each scenario can be a preset mode.
  • Each scenario can include multimedia content, setting parameters corresponding to multiple facilities, and other scenario content.
  • In-vehicle devices can display scenarios to adjust in-car facilities to provide users with a better driving experience.
  • the multimedia content that can be displayed by the vehicle-mounted device can be multimedia content preset by the operating system of the vehicle-mounted device, multimedia content downloaded from a cloud server to the local storage of the vehicle-mounted device, or multimedia content transmitted to the vehicle-mounted device by the user using another device.
  • the multimedia content of the vehicle-mounted device can also be multimedia content captured by the vehicle-mounted device through the camera in the vehicle. In other words, this application does not limit the source of the multimedia content that the vehicle-mounted device can display.
  • the vehicle-mounted device can also request the cloud server to update the multimedia content corresponding to the scenario based on a set period.
  • the vehicle-mounted device can also store the trigger conditions of each scenario, which can include effective time, effective frequency, etc.
  • the effective time can be the time period between the start time and the end time of the scenario.
  • for example, the effective time can be from 0:00 on August 1 to 24:00 on August 1, which means that the effective time of the scenario is all day on August 1.
  • the effective frequency can correspond to the device status.
  • the effective frequency can be every time it is powered on, every time it is powered off, only once, etc.
  • for example, when the effective frequency is every power-on, the corresponding device status is the power-on state; when the effective frequency is every power-off, the corresponding device status is the power-off state; and when the effective frequency is only once, there is no requirement on the device status.
  • for example, if the time trigger condition of scenario A is to display once on August 1st, the vehicle-mounted device will display scenario A once when it determines that the current time is within August 1st and the device status of the vehicle-mounted device is online.
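  • as a hedged sketch of the check just described (current time within the effective time, and device status consistent with the effective frequency), reusing the hypothetical TriggerCondition and EffectiveFrequency types from the earlier sketch; the DeviceStatus enum and the alreadyDisplayed flag (used for the "only once" frequency) are likewise assumptions for illustration only.

```java
import java.time.LocalDateTime;

// Hypothetical device states described in this application: powered on, online, powered off.
enum DeviceStatus { POWERED_ON, ONLINE, POWERED_OFF }

class TriggerChecker {
    // Returns true when the current time falls within the effective time and the
    // device status matches the state required by the effective frequency.
    static boolean isTriggered(TriggerCondition t, LocalDateTime now,
                               DeviceStatus status, boolean alreadyDisplayed) {
        boolean inEffectiveTime = !now.isBefore(t.effectiveStart) && !now.isAfter(t.effectiveEnd);
        if (!inEffectiveTime) {
            return false;
        }
        switch (t.frequency) {
            case EVERY_POWER_ON:  return status == DeviceStatus.POWERED_ON;
            case EVERY_POWER_OFF: return status == DeviceStatus.POWERED_OFF;
            case ONLY_ONCE:       return !alreadyDisplayed; // no device-status requirement
            default:              return false;
        }
    }
}
```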
  • the vehicle-mounted device can perform response actions corresponding to the target scenario according to the scenario content of the target scenario to adjust the in-vehicle facilities.
  • the target scenario can be a holiday egg.
  • the multimedia content corresponding to the holiday egg can be displayed on the display screen, and the color temperature, brightness and other parameters of the ambient light in the car can also be adjusted according to the setting parameters in the scenario content.
  • the response actions corresponding to the scenario may include: displaying multimedia content on the display screen, playing music, adjusting the ambient lighting in the car, adjusting the seats, turning on/off the in-car fragrance, adjusting the in-car air conditioning, opening/closing the sunroof, opening/closing the trunk, turning on/off the lights, etc.
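  • the following is an illustrative sketch of how such response actions could be dispatched to the in-vehicle facilities; the CabinFacilities interface and ScenarioPlayer class are hypothetical names, not an API defined by this application.

```java
// Hypothetical facade over the in-vehicle facilities; each method corresponds to
// one of the response actions listed above.
interface CabinFacilities {
    void showOnDisplay(String multimediaUri);
    void playMusic(String musicUri);
    void setAmbientLight(int color, int brightness);
    void adjustSeat(int position);
    void setFragrance(boolean on);
    void setAirConditioner(float temperatureCelsius);
    void setSunroof(boolean open);
    void setTrunk(boolean open);
    void setLights(boolean on);
}

// Executes a scenario's response actions according to its scenario content.
class ScenarioPlayer {
    private final CabinFacilities cabin;

    ScenarioPlayer(CabinFacilities cabin) {
        this.cabin = cabin;
    }

    void display(String multimediaUri, String musicUri, int lightColor, int lightBrightness) {
        cabin.showOnDisplay(multimediaUri);                 // e.g. holiday blessing pictures
        cabin.playMusic(musicUri);                          // background music
        cabin.setAmbientLight(lightColor, lightBrightness); // ambient light setting parameters
    }
}
```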
  • the multimedia content that the vehicle-mounted device can display on the display screen can include text, pictures, animations, videos, etc.
  • the vehicle-mounted device can store at least one scenario to be displayed, and the scenario to be displayed can be multiple types of scenarios.
  • scenario types may include: holiday eggs, user information customization, car information customization, user customization, etc.
  • the vehicle-mounted device can provide the user with multiple candidate scenarios, and the user can select at least one candidate scenario that needs to be displayed among the multiple candidate scenarios as at least one scenario to be displayed.
  • FIG. 4 is a schematic diagram of a scenario interface provided by an embodiment of the present application. Referring to Figure 4, the scenario interface of the vehicle-mounted device may include multiple candidate scenarios that the vehicle-mounted device can display.
  • the user can select at least one candidate scenario that the user wants to display in the interface, and the vehicle-mounted device can retain at least one candidate scenario selected by the user.
  • the vehicle-mounted device may delete candidate scenarios that are not selected by the user.
  • the vehicle-mounted device displays holiday eggs for all holidays by default.
  • the user can select the holiday eggs that the user wants to view in the interface shown in Figure 8.
  • the vehicle-mounted device retains the holiday eggs selected by the user, and displays a holiday egg when the current time meets the trigger conditions of that holiday egg.
  • holiday egg-type scenarios may be generated by the vehicle-mounted device according to festivals or solar terms in the system calendar of the vehicle-mounted device.
  • Figure 5 is a schematic diagram of a holiday egg scene provided by an embodiment of the present application.
  • holiday egg-type scenarios may include scenarios corresponding to dates such as traditional festivals, solar terms, and international festivals.
  • the vehicle-mounted device can implement holiday egg-type scenarios during various festivals or solar terms.
  • the vehicle-mounted device can display the multimedia content corresponding to the festival on the display screen during each festival, providing users with a surprising interactive experience.
  • the user information customized type of scenario may be a scenario generated by the vehicle-mounted device based on user information.
  • the vehicle-mounted device can obtain the user's birthday, anniversary and other information based on the user's information, and generate corresponding scenarios based on the obtained information.
  • the vehicle-mounted device can prompt the user to fill in user information, such as the user's birthday, wedding anniversary, etc., when the user logs in to the user account.
  • Figure 6 is a schematic diagram of a user information customization scenario provided by an embodiment of the present application.
  • the scene generated by the vehicle-mounted device based on the user's birthday information can include displaying blessing messages such as "Happy Birthday” on the display screen, and can also display dynamic multimedia content such as birthday cakes on the display screen.
  • the scene generated by the vehicle-mounted device according to the user's wedding anniversary can include displaying the user's title and anniversary information on the display screen, and can also display dynamic multimedia content such as roses on the display screen.
  • the car information customization type scenario may be a scenario generated by the vehicle-mounted device based on the user's car information.
  • the triggering condition of the car information customization type scenario can be a time trigger condition.
  • the vehicle-mounted device can record the user's car usage time and display the car usage information customization type scenario when the car usage time reaches a preset value.
  • the preset value can be a special number such as 7 days, 30 days, 99 days, or 100 days.
  • the triggering conditions for the vehicle information customization type scenario may also include the driving mileage triggering condition.
  • the vehicle-mounted device can record the vehicle's driving mileage and display the vehicle information customization type scenario when the driving mileage reaches a preset value. For example, referring to Figure 7, the vehicle-mounted device can display the user's car usage information on the display screen when the user's car usage time reaches 100 days.
  • the vehicle-mounted device can display the companion time, companion mileage, poetry and other contents.
  • the interface shown in Figure 7 can also include a "view wonderful moments" control. After the user clicks this control, the vehicle-mounted device can generate a video based on the pictures in the gallery application of the vehicle-mounted device, vehicle driving points, and other information, and show the video on the display screen.
  • Figure 8 is a schematic diagram of a video generated based on pictures provided by an embodiment of the present application. After the user clicks the "View Highlights" control shown in Figure 7, the vehicle-mounted device can display the video shown in Figure 8 on the display screen. content.
  • a user-defined type of scenario can be a scenario preset by the user in the vehicle-mounted device, and the user can customize the triggering conditions and scenario content of the scenario.
  • a variety of system-preset scenario templates can be stored in the vehicle-mounted device.
  • the user can edit the system-preset scenario template and select the scenario content to generate a user-defined scenario;
  • the user can set the scenario content of the user-defined scenario by himself, and the electronic device can also use the scenario content of the user-defined scenario as a template, and upload the template to the Internet to share the template with other users.
  • the user can click the "New Scenario” control to trigger the vehicle-mounted device to create a user-defined type of scenario according to the user settings.
  • the following takes the vehicle-mounted device creating a new "scenario A" as an example for illustration.
  • the vehicle-mounted device can display the scenario content setting interface and the effective time setting interface of the user-defined scenario on the display screen, so that the user can set the scenario content and trigger conditions of the user-defined scenario.
  • FIG 9 is a schematic diagram of a background selection interface provided by an embodiment of the present application.
  • the vehicle-mounted device can display a background selection interface, and the user can select a background in this interface.
  • the user can select a picture preset by the system in the vehicle-mounted device as the background, or the user can also select a picture in the gallery application of the vehicle-mounted device as the background.
  • the user can also select a video stored in the vehicle-mounted device as the multimedia content displayed on the display screen in the user-defined scenario.
  • Figure 10 is a schematic diagram of a text editing interface provided by an embodiment of the present application.
  • the vehicle-mounted device can display a text editing interface, and the user can enter text in the interface, and the text can be displayed on the display screen when the user-defined scenario is implemented.
  • the vehicle-mounted device can also display the music selection interface and the ambient light setting interface.
  • FIG. 11 is a schematic diagram of a music selection interface provided by an embodiment of the present application. Referring to Figure 11, the user can select music on the music selection interface, and the vehicle-mounted device can play the music selected by the user when displaying the user-defined scenario.
  • FIG. 12 is a schematic diagram of an ambient light setting interface provided by an embodiment of the present application. Referring to Figure 12, the vehicle-mounted device can also display an ambient light setting interface, in which the user can set the brightness, color and other parameters of the ambient light in a user-defined scenario.
  • the user can also set parameters for facilities such as fragrance and seats in user-defined scenarios, as well as set parameters for facilities such as car lights, sunroofs, and trunks.
  • the setting parameters, specific display interfaces, and user adjustment methods can be seen in Figure 11 or Figure 12, and will not be repeated here.
  • the vehicle-mounted device can display the effective time setting interface.
  • Figure 13 shows an effective time setting interface provided by an embodiment of the present application.
  • the user can select the effective time of the user-defined scenario in this interface.
  • the effective time is the time period between the start time and the end time shown in Figure 13.
  • the user can also select the effective frequency of the user-defined scenario.
  • the effective frequency is the user-selected frequency shown in Figure 13. For example, if the user sets the effective time of the user-defined scenario from 8:00 to 10:00 on August 1 and the effective frequency to only once, the vehicle-mounted device displays the user-defined scenario once when it determines that the current time is within the effective time period of the user-defined scenario. For another example, if the user sets the effective time of the user-defined scenario from 8:00 to 10:00 on August 1 and the effective frequency to every power-on, the vehicle-mounted device displays the user-defined scenario every time it is powered on within the effective time period.
  • FIG 14 is a schematic diagram of a preview interface of a user-defined scene provided by an embodiment of the present application.
  • the vehicle-mounted device can display a preview of a user-defined scene.
  • the user can click the "Preview” control in the interface shown in Figure 14 to trigger the vehicle-mounted device to display a user-defined scene.
  • in Figures 11 to 13, the vehicle-mounted device first displays the setting interfaces of the in-vehicle facilities and then displays the effective time setting interface; this is only an example and not a limitation. The embodiment of the present application does not restrict the order in which the vehicle-mounted device displays the in-vehicle facility setting interfaces and the effective time setting interface.
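  • assembled together, the choices made in the setting interfaces above (background, text, music, ambient light, effective time and frequency) could be gathered into a record and persisted when the user finishes the creation flow. The sketch below is a minimal illustration under that assumption; UserScenario and ScenarioStore are hypothetical names.

```java
import java.time.LocalDateTime;
import java.util.ArrayList;
import java.util.List;

// Hypothetical record of what the user picked in the new-scenario interfaces.
class UserScenario {
    String backgroundUri;    // picture or video chosen as the background
    String text;             // text entered in the text editing interface
    String musicUri;         // music chosen in the music selection interface
    int ambientLightColor;   // ambient light setting
    LocalDateTime start;     // effective time: start
    LocalDateTime end;       // effective time: end
    String frequency;        // e.g. "only once" or "every power-on"
}

// Hypothetical in-memory store standing in for the vehicle-mounted device's
// storage of scenarios to be displayed.
class ScenarioStore {
    private final List<UserScenario> scenarios = new ArrayList<>();

    // Called when the user completes the creation flow for a user-defined scenario.
    void save(UserScenario scenario) {
        scenarios.add(scenario);
    }

    List<UserScenario> scenariosToDisplay() {
        return scenarios;
    }
}
```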
  • Figure 15 is a flow chart of the scenario display method provided by the embodiment of the present application.
  • an on-vehicle device is used as an example to execute this method. The method includes the following steps:
  • S1501 The vehicle-mounted device determines that the current time meets the trigger conditions corresponding to the target scenario.
  • the target scenario is any scenario among at least one scenario stored in the vehicle-mounted device.
  • the vehicle-mounted equipment includes three states: powered on, online, and powered off.
  • powering on can be the process of turning on the vehicle-mounted equipment when the vehicle is started; online is the process of running the vehicle-mounted equipment.
  • when the vehicle-mounted equipment is online, the vehicle can be in a driving state or a stopped state; powering off is the process of shutting down the vehicle-mounted equipment.
  • when the vehicle-mounted device is in the power-on state, the vehicle-mounted device can display a startup animation on the display screen. The vehicle-mounted device can also determine whether the current time meets the trigger conditions corresponding to the target scenario; when it determines that the current time meets the trigger conditions corresponding to the target scenario, the target scenario can be displayed. For example, assuming that the current time is 8:00, the effective time of the target scenario is 8:00-10:00, and the effective frequency is only once, the vehicle-mounted device determines that the current time meets the trigger conditions corresponding to the target scenario, and the vehicle-mounted device can display the target scenario.
  • the vehicle-mounted equipment can display the target scenario.
  • the vehicle-mounted device can determine whether the current time meets the trigger conditions corresponding to the target scenario.
  • the target scenario can be displayed.
  • the current time is 8:00
  • the vehicle is driving
  • the on-board equipment is online.
  • the effective time of the target scenario is 8:00-10:00, and the effective frequency is only once.
  • the on-board equipment determines that the current time has reached the start time of the effective time period, so the vehicle-mounted device determines that the current time meets the trigger conditions of the target scenario and can display the target scenario.
  • the vehicle is driving and the on-board equipment is online.
  • the effective time of the target scenario is 8:00-10:00, and the effective frequency is every time it is powered off.
  • before shutting down, the vehicle-mounted device determines that the current time is 9:30.
  • the vehicle-mounted device determines that the current time meets the triggering conditions of the target scenario, and the vehicle-mounted device can display the target scenario.
  • the vehicle-mounted device can store multiple types of scenarios, such as holiday eggs, user information customization, vehicle information customization, and user-defined scenarios introduced in the above embodiments of this application.
  • when the vehicle-mounted device determines that the current time meets the trigger conditions corresponding to multiple scenarios, it can select one of the multiple scenarios as the target scenario according to a preset priority.
  • for example, the preset priority order can be: user-defined type, user information customization type, vehicle information customization type, and holiday egg type.
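The priority-based selection just described could, as a hypothetical sketch, look like the following; the ScenarioType constants mirror the four scenario types of this application, while the class and method names are illustrative assumptions:

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

enum ScenarioType { USER_DEFINED, USER_INFO, VEHICLE_INFO, HOLIDAY_EGG }

class ScenarioSelector {
    // Lower index means higher priority, matching the preset order described above.
    private static final List<ScenarioType> PRIORITY = List.of(
            ScenarioType.USER_DEFINED, ScenarioType.USER_INFO,
            ScenarioType.VEHICLE_INFO, ScenarioType.HOLIDAY_EGG);

    // Picks the matching scenario type with the highest priority, if any match at all.
    static Optional<ScenarioType> pickTarget(List<ScenarioType> matching) {
        return matching.stream()
                .min(Comparator.comparingInt(PRIORITY::indexOf));
    }
}
```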
  • S1502 The vehicle-mounted device executes the response action corresponding to the target scenario.
  • when the vehicle-mounted device determines that the current time satisfies the trigger condition corresponding to the target scenario, the vehicle-mounted device can implement the target scenario; specifically, the vehicle-mounted device can execute the response action corresponding to the target scenario.
  • the response action corresponding to the target scenario may include at least one of the following: displaying multimedia content corresponding to the target scenario on the display screen, playing music, adjusting the ambient lighting in the car, adjusting the seats, turning on/off the in-car fragrance, adjusting the air conditioning in the car, opening/closing the sunroof, opening/closing the trunk, and turning on/off the lights.
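A hypothetical dispatcher for these response actions might look like the sketch below; the CabinControllers interface is an assumed facade rather than a real vehicle API, and only a few of the listed actions are shown:

```java
class ResponseExecutor {
    // Assumed facade over the in-vehicle facilities; not a real vehicle API.
    interface CabinControllers {
        void showMultimedia(String content);
        void playMusic(String track);
        void setAmbientLight(int color, int brightness);
        void setFragrance(boolean on);
    }

    private final CabinControllers cabin;

    ResponseExecutor(CabinControllers cabin) {
        this.cabin = cabin;
    }

    // Executes only the response actions actually carried by the scenario content.
    void execute(String multimedia, String music,
                 Integer lightColor, Integer lightBrightness, Boolean fragranceOn) {
        if (multimedia != null) cabin.showMultimedia(multimedia);
        if (music != null) cabin.playMusic(music);
        if (lightColor != null && lightBrightness != null) {
            cabin.setAmbientLight(lightColor, lightBrightness);
        }
        if (fragranceOn != null) cabin.setFragrance(fragranceOn);
    }
}
```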
  • for example, the target scenario is the holiday egg for the Mid-Autumn Festival.
  • the scenario content of the Mid-Autumn Festival holiday egg includes: Mid-Autumn Festival blessing pictures, background music, and ambient light setting parameters.
  • the trigger conditions corresponding to the Mid-Autumn Festival holiday egg include: the effective time is the entire day of the fifteenth day of the eighth lunar month, and the effective frequency is only once.
  • after determining that the current time meets these trigger conditions, the vehicle-mounted device displays the Mid-Autumn Festival blessing pictures on the display screen, plays the background music, and adjusts the ambient light according to the ambient light setting parameters.
  • when the vehicle-mounted device determines that the vehicle is in a driving state and the display screen of the vehicle-mounted device is displaying navigation routes or other content that assists the user in driving, the vehicle-mounted device can, when displaying the target scenario, display the multimedia content contained in the target scenario in a small window on the display screen.
  • the small window can be a floating window on the display screen, thereby ensuring that displaying the target scenario does not affect the user's driving of the vehicle and improving the user experience.
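A minimal sketch of this windowing decision, with assumed names, could be:

```java
class WindowPolicy {
    enum DisplayMode { FULL_SCREEN, FLOATING_SMALL_WINDOW }

    // While the vehicle is driving and driver-assistance content (e.g. a navigation
    // route) occupies the screen, the scenario multimedia is shown in a small
    // floating window so that it does not cover that content.
    static DisplayMode chooseMode(boolean vehicleDriving, boolean assistanceContentShown) {
        return (vehicleDriving && assistanceContentShown)
                ? DisplayMode.FLOATING_SMALL_WINDOW
                : DisplayMode.FULL_SCREEN;
    }
}
```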
  • when the vehicle-mounted device displays the target scenario and the target scenario includes multimedia content, the vehicle-mounted device can display the multimedia content on the display screen at the system level; that is, the layer displaying the multimedia content can be located above the background layer and the application layer.
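As an assumed illustration of this layering, the z-order relationship could be modeled as:

```java
class LayerStack {
    // Higher z value is composited on top; the scenario's multimedia layer sits
    // above both the background layer and the application layer.
    enum Layer {
        BACKGROUND(0), APPLICATION(1), SCENARIO_MULTIMEDIA(2);

        final int z;
        Layer(int z) { this.z = z; }
    }

    static boolean isAbove(Layer a, Layer b) {
        return a.z > b.z;
    }
}
```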
  • S1503: In response to the user's first operation, or upon determining that the display duration of the target scenario is longer than the set duration, the vehicle-mounted device ends displaying the target scenario.
  • the first operation may be a single-click operation, a double-click operation, a sliding operation, a long-press operation, etc., acting on the display screen.
  • the first operation may also be an operation of switching forward gear.
  • when the vehicle-mounted device displays the target scenario, the user can trigger the first operation to end the display of the target scenario. For example, the user can click the display screen of the vehicle-mounted device, and the vehicle-mounted device ends displaying the target scenario.
  • alternatively, the vehicle-mounted device can set the display duration of the target scenario to a set duration and start timing when it begins to display the target scenario; when the vehicle-mounted device determines that the display duration of the target scenario is longer than the set duration, it ends displaying the target scenario.
  • the vehicle-mounted device can set different end-trigger methods for the target scenario according to the vehicle state. For example, when the vehicle is driving, the vehicle-mounted device can end displaying the target scenario when it determines that the display duration of the target scenario is longer than the set duration, without requiring the user to manually trigger the end of the display. For another example, when the vehicle is in a stopped state, the vehicle-mounted device can end displaying the target scenario after the user triggers the first operation, making it convenient for the user to view the scenario content. It can be understood that, in specific implementations, the end-trigger method of the target scenario can be flexibly adjusted according to different application scenarios, which is not limited in the embodiments of the present application.
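A minimal sketch of such an end-trigger policy, under the assumption that the vehicle state and the elapsed display time are already known, could be:

```java
class EndDisplayPolicy {
    // While driving, end the display automatically once the set duration elapses;
    // while stopped, wait for the user's first operation (tap, swipe, gear change, ...).
    static boolean shouldEnd(boolean vehicleDriving, boolean firstOperationReceived,
                             long shownMillis, long setDurationMillis) {
        if (vehicleDriving) {
            return shownMillis > setDurationMillis;
        }
        return firstOperationReceived;
    }
}
```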
  • after ending the display of the target scenario, the vehicle-mounted device can restore the in-vehicle facilities to their initial state.
  • the initial state is the state of each in-vehicle facility before the vehicle-mounted device displays the target scenario.
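A hypothetical snapshot-and-restore helper for the in-vehicle facility settings could be sketched as follows; the string-keyed map of facility settings is an assumption made for illustration:

```java
import java.util.HashMap;
import java.util.Map;

class FacilityStateKeeper {
    private Map<String, Object> initialState = new HashMap<>();

    // Record the state of each in-vehicle facility before the scenario is shown.
    void snapshot(Map<String, Object> currentFacilityState) {
        initialState = new HashMap<>(currentFacilityState);
    }

    // Return the recorded settings so each facility can be reset after the scenario ends.
    Map<String, Object> restore() {
        return new HashMap<>(initialState);
    }
}
```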
  • Figure 16 is a schematic diagram of ending the display of the target scenario provided by an embodiment of the present application. Referring to Figure 16, when the vehicle-mounted device displays the target scenario, it displays the picture shown in (a) in Figure 16 on the display screen; after the vehicle-mounted device finishes displaying the target scenario, it can display the homepage shown in (b) in Figure 16 on the display screen.
  • this application also provides a scenario display method, which can be executed by an electronic device, and the electronic device can have the structure shown in Figure 2 and/or Figure 3.
  • Figure 17 is a flow chart of a scenario display method provided by an embodiment of the present application. Referring to Figure 17, the method includes the following steps:
  • S1701 The electronic device obtains the current time.
  • S1702 The electronic device determines that the current time meets the triggering conditions of the target scenario.
  • S1703 The electronic device executes the response action corresponding to the target scenario according to the scenario content of the target scenario.
  • the target scenario is any one of at least one scenario to be displayed stored in the electronic device.
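Taken together, steps S1701 to S1703 could be sketched, purely for illustration and with assumed interfaces, as:

```java
import java.time.LocalDateTime;
import java.util.List;

class ScenarioDisplayFlow {
    interface StoredScenario {
        boolean triggerMet(LocalDateTime now);  // corresponds to S1702
        void executeResponseActions();          // corresponds to S1703
    }

    static void run(List<StoredScenario> scenariosToDisplay) {
        LocalDateTime now = LocalDateTime.now(); // corresponds to S1701
        for (StoredScenario scenario : scenariosToDisplay) {
            if (scenario.triggerMet(now)) {
                scenario.executeResponseActions();
                break; // one target scenario is displayed
            }
        }
    }
}
```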
  • this application also provides an electronic device, which includes multiple functional modules; the multiple functional modules interact to implement the methods described in the embodiments of this application.
  • the multiple functional modules can be implemented based on software, hardware, or a combination of software and hardware, and the multiple functional modules can be arbitrarily combined or divided based on specific implementation.
  • this application also provides an electronic device.
  • the electronic device includes at least one processor and at least one memory, and computer program instructions are stored in the at least one memory; when the electronic device is running, the at least one processor executes the methods described in the embodiments of this application.
  • this application also provides a computer program, which when the computer program is run on a computer, causes the computer to execute the methods described in the embodiments of this application.
  • the present application also provides a computer-readable storage medium.
  • a computer program is stored in the computer-readable storage medium.
  • when the computer program is executed by a computer, the computer is caused to execute the methods described in the embodiments of the present application.
  • this application also provides a chip, which is used to read the computer program stored in the memory and implement the methods described in the embodiments of this application.
  • this application provides a chip system.
  • the chip system includes a processor and is used to support a computer device to implement the methods described in the embodiments of this application.
  • the chip system further includes a memory, and the memory is used to store necessary programs and data of the computer device.
  • the chip system may be composed of chips, or may include chips and other discrete devices.
  • embodiments of the present application may be provided as methods, systems, or computer program products. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment that combines software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means, where the instruction means implement the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions may also be loaded onto a computer or other programmable data processing device, causing a series of operating steps to be performed on the computer or other programmable device to produce computer-implemented processing, so that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Software Systems (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

一种情景展示方法,在该方法中,电子设备(100)获取当前时间。电子设备确定当前时间满足目标情景的触发条件,根据目标情景的情景内容执行目标情景对应的响应动作。其中,目标情景为电子设备存储的至少一个待展示情景中的任一个情景。通过该方法,电子设备可以在确定当前时间满足目标情景的触发条件时,执行目标情景对应的响应动作以展示目标情景,提升电子设备与用户之间的交互性,进而提升用户体验。还包括一种电子设备。

Description

一种情景展示方法及电子设备
相关申请的交叉引用
本申请要求在2022年09月05日提交中华人民共和国知识产权局、申请号为202211080952.X、发明名称为“一种情景展示方法及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端技术领域,尤其涉及一种情景展示方法及电子设备。
背景技术
座舱场景为车辆驾驶舱中的人机交互场景,智能座舱的主要目的是将驾驶信息和娱乐信息集成,利用车载设备处理数据的能力,为用户提供高效的、直观的驾驶体验。
智能座舱技术主要用于分析用户的驾驶需求和娱乐需求,以提升用户的驾乘体验。而在满足用户的基本驾乘需求的同时,如何为用户提供更优的交互体验也成为智能座舱技术的一个重要研发方向。
发明内容
本申请提供一种情景展示方法及电子设备,用以提升电子设备与用户之间的交互性,提升用户体验。
第一方面,本申请提供一种情景展示方法,该方法可以应用于电子设备。该方法包括:电子设备获取当前时间;电子设备确定当前时间满足目标情景的触发条件,根据所述目标情景的情景内容执行所述目标情景对应的响应动作;其中,所述目标情景为所述电子设备存储的至少一个待展示情景中的任一个情景。
在以上方法中,电子设备可以在确定当前时间满足目标情景的触发条件时,执行目标情景对应的响应动作以展示目标情景,提升电子设备与用户之间的交互性,进而提升用户体验。将该方法应用于智能座舱场景中时,电子设备可以为车载设备,车载设备可以在确定当前时间满足目标情景对应的触发条件时展示目标情景,为用户提供惊喜感,提升用户的驾乘体验。
在一个可能的设计中,所述目标情景的触发条件包括生效时间和生效频率;所述确定当前时间满足目标情景的触发条件,包括:确定当前时间处于所述生效时间的范围中,且所述电子设备的设备状态与所述生效频率对应的设备状态一致。
通过该设计,电子设备可以设置目标情景的生效时间和生效频率,电子设备可以在生效时间内根据生效频率展示目标情景,生效时间可以为与目标情景关联的时间,如目标情景为节日彩蛋时,目标情景的生效时间可以为节日当天,从而为用户提供惊喜体验。
在一个可能的设计中,在所述执行所述目标情景对应的响应动作之后,所述方法还包括:响应于用户的第一操作或确定所述目标情景的展示时长大于设定时长,结束展示所述目标情景。
可选的,第一操作可以为作用于显示屏的单击操作、双击操作、滑动操作、长按操作等,当电子设备为车载设备时,第一操作也可以为切换前进挡的操作。
通过该设计,电子设备在展示目标情景时,用户可以触发第一操作以结束电子设备展示目标情景;或者电子设备可以为目标情景预设设定时长,电子设备确定目标情景的展示时长大于设定时长时,可以结束展示目标情景,从而灵活实现结束展示目标情景。
在一个可能的设计中,所述方法还包括:响应于用户的第二操作,显示情景界面,所述情景界面中包括多个候选情景;响应于用户的第三操作,将所述第三操作对应的至少一个候选情景作为所述至少一个待展示情景。
可选的,第二操作用于触发电子设备显示情景界面,例如第二操作可以为作用于情景界面对应的控件的点击操作。第三操作用于在多个候选情景中选择至少一个候选情景,例如第三操作可以为作用于至少一个候选情景对应的图标的点击操作。
通过该设计,用户可以自己选择待展示情景,从而电子设备保留用户希望显示的情景,贴近用户需求,进而提升用户体验。
在一个可能的设计中,所述电子设备为车载设备;所述目标情景的情景内容包括多媒体内容和/或 车内设施对应的设置参数。
通过该设计,电子设备可以为车载设备,本申请提供的情景展示方法可以应用于智能座舱场景。此时目标情景的情景内容可以包括多媒体内容和/或车内设施对应的设置参数,从而车载设备展示目标情景时可以为用户提供沉浸式的情景体验。
在一个可能的设计中,所述至少一个待展示情景的类型包括节日彩蛋类型、用户信息定制类型、用车信息定制类型以及用户自定义类型。
通过该设计,电子设备可以存储多样的情景,满足多种场景下展示目标情景的需求,如节日场景下电子设备可以展示节日彩蛋,用户纪念日场景下可以展示用户信息定制类型情景等,提升用户体验。
在一个可能的设计中,所述目标情景的类型为用户自定义类型时,在确定当前时间满足目标情景的触发条件,根据所述目标情景的情景内容执行所述目标情景对应的响应动作之前,所述方法还包括:响应于用户的第四操作,显示用户自定义类型情景的新建界面,所述新建界面用于创建所述目标情景;根据用户操作确定并存储所述目标情景的情景内容和所述目标情景的触发条件。
可选的,第四操作用于触发电子设备新建用户自定义类型情景,例如第四操作可以为作用于新建情景控件的点击操作。
通过该设计,电子设备可以根据用户操作新建用户自定义类型的情景,从而用户可以根据自己的需求新建情景,满足用户个性化需求,提升用户体验。
在一个可能的设计中,若当前时间满足多个第一情景的触发条件,在所述根据所述目标情景的情景内容执行所述目标情景对应的响应动作之前,所述方法还包括:根据所述多个第一情景的情景类型和预设的情景类型的优先级顺序,将情景类型的优先级最高的第一情景作为所述目标情景;其中,多个第一情景属于所述多个待展示情景。
通过该设计,若当前时间满足电子设备中存储的多个第一情景的触发条件时,电子设备可以根据情景类型的优先级顺序选择多个第一情景中优先级最高的第一情景作为目标情景,防止电子设备连续展示多个情景影响用户体验。
在一个可能的设计中,所述目标情景对应的响应动作包括以下至少一项:在所述电子设备的显示屏上显示多媒体内容;播放音乐;调整车内氛围灯;调整座椅;打开/关闭车内香氛;调整车内空调;打开/关闭天窗;打开/关闭后备箱;打开/关闭车灯。
通过该设计,在智能座舱场景中,电子设备执行目标情景对应的响应动作时,可以对车内设施进行调整,为用户提供沉浸式的情景体验,提升用户体验。
在一个可能的设计中,所述多媒体内容包括以下至少一项:所述电子设备的操作系统预置的多媒体内容;所述电子设备从服务器获取的多媒体内容;所述电子设备拍摄得到的多媒体内容;所述电子设备接收其它电子设备传输的多媒体内容。
通过该设计,多媒体内容可以有多种来源,进而为电子设备提供丰富的多媒体内容。
在一个可能的设计中,所述目标情景的情景内容包括多媒体内容时,所述根据所述目标情景的情景内容执行所述目标情景对应的响应动作,包括:在所述电子设备的显示屏上显示所述多媒体内容,所述多媒体内容位于系统图层,所述系统图层位于背景图层和应用图层之上。
通过该设计,电子设备可以在系统层级提供情景展示服务,并不影响电子设备的背景显示或应用显示。
第二方面,本申请提供一种电子设备,所述电子设备包括多个功能模块;所述多个功能模块相互作用,实现上述第一方面及其各实施方式所示的方法。所述多个功能模块可以基于软件、硬件或软件和硬件的结合实现,且所述多个功能模块可以基于具体实现进行任意组合或分割。
第三方面,本申请提供一种电子设备,包括至少一个处理器和至少一个存储器,所述至少一个存储器中存储计算机程序指令,所述电子设备运行时,所述至少一个处理器执行上述第一方面及其各实施方式所示的方法。
第四方面,本申请还提供一种计算机程序产品,当所述计算机程序产品在计算机上运行时,使得所述计算机执行上述任一方面及其各实施方式所示的方法。
第五方面,本申请还提供一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机程序,当所述计算机程序被计算机执行时,使得所述计算机执行上述任一方面及其各实施方式所示的方法。
第六方面,本申请还提供一种芯片,所述芯片用于读取存储器中存储的计算机程序,执行上述任一 方面及其各实施方式所示的方法。
第七方面,本申请还提供一种芯片系统,该芯片系统包括处理器,用于支持计算机装置实现上述任一方面及其各实施方式所示的方法。在一种可能的设计中,所述芯片系统还包括存储器,所述存储器用于保存该计算机装置必要的程序和数据。该芯片系统可以由芯片构成,也可以包含芯片和其他分立器件。
附图说明
图1为本申请实施例提供的一种车载设备在显示屏上显示首页的示意图;
图2为本申请实施例提供的一种电子设备的结构示意图;
图3为本申请实施例提供的一种电子设备的软件结构框图;
图4为本申请实施例提供的一种节日彩蛋类的情景的示意图;
图5为本申请实施例提供的一种用户信息定制类情景的示意图;
图6为本申请实施例提供的一种用车信息类型的情景示意图;
图7为本申请实施例提供的一种根据图片生成的视频的示意图;
图8为本申请实施例提供的一种情景界面的示意图;
图9为本申请实施例提供的一种背景选择界面的示意图;
图10为本申请实施例提供的一种文本编辑界面的示意图;
图11为本申请实施例提供的一种音乐选择界面的示意图;
图12为本申请实施例提供的一种氛围灯设置界面的示意图;
图13为本申请实施例提供的一种生效时间设置界面;
图14为本申请实施例提供的一种用户自定义场景的预览界面的示意图;
图15为本申请实施例提供的一种情景展示方法的流程图;
图16为本申请实施例提供的一种结束展示目标情景的示意图;
图17为本申请实施例提供的一种情景展示方法的流程图。
具体实施方式
为了使本申请实施例的目的、技术方案和优点更加清楚,下面将结合附图对本申请实施例作进一步地详细描述。其中,在本申请实施例的描述中,以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。
应理解,本申请实施例中“至少一个”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A、B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一(项)个”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a、b或c中的至少一项(个),可以表示:a,b,c,a和b,a和c,b和c,或a、b和c,其中a、b、c可以是单个,也可以是多个。
座舱场景为车辆驾驶舱中的人机交互场景,智能座舱的主要目的是将驾驶信息和娱乐信息集成,利用车载设备处理数据的能力,为用户提供高效的、直观的驾驶体验。
智能座舱技术主要用于分析用户的驾驶需求和娱乐需求,以提升用户的驾乘体验。如车载设备可以控制车辆上的空调、座椅等,车载设备还可以包括显示屏,车载设备可以在显示屏上显示内容以辅助用户驾驶车辆或为用户提供娱乐内容。如图1为本申请实施例提供的一种车载设备在显示屏上显示首页的示意图。参考图1,车载设备包括显示屏,车载设备可以在显示屏上显示首页,如图1中首页上可以包括车辆设置、应用、导航、当前时间、音乐等显示内容,还可以包括航班信息等用户信息。参考图1所示的首页,该界面的底端为常用的车内设施对应的控件,用户可以点击各个车内设施对应的控件以控制车载设备执行对应的响应动作,如用户点击空调控件以触发车载设备打开或关闭车内空调。
智能座舱技术主要用于分析用户的驾驶需求和娱乐需求,以提升用户的驾乘体验。而在满足用户的基本驾乘需求的同时,如何为用户提供更优的交互体验也成为智能座舱技术的一个重要研发方向。
基于以上介绍,本申请提供一种展示方法及设备。在该方法中,电子设备获取当前时间,确定当前时间满足目标情景对应的触发条件,根据目标情景的情景内容执行目标情景的响应动作。其中,目标情 景为电子设备存储的至少一个待展示情景中的任一个情景,目标情景的响应动作可以包括在显示屏上显示目标情景的情景内容中的多媒体内容。
通过该方法,电子设备可以在确定当前时间满足目标情景的触发条件时,执行目标情景对应的响应动作,以提升电子设备与用户之间的交互性,提升用户体验。将该方法应用于智能座舱场景中时,电子设备可以为车载设备,车载设备可以在确定当前时间满足目标情景对应的触发条件时展示目标情景,为用户提供惊喜感,提升用户的驾乘体验。
以下介绍电子设备、和用于使用这样的电子设备的实施例。本申请实施例中电子设备可以为车载设备,电子设备还可以为平板电脑、手机、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)、可穿戴设备等,本申请实施例对电子设备的具体类型不作任何限制。
图2为本申请实施例提供的一种电子设备100的结构示意图。如图2所示,电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。充电管理模块140用于从充电器接收充电输入。电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,摄像头193,和无线通信模块160等供电。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调 频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
显示屏194用于显示应用的显示界面,例如显示电子设备100上安装的应用的显示页面等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,以及至少一个应用程序的软件代码等。存储数据区可存储电子设备100使用过程中所产生的数据(例如拍摄的图像、录制的视频等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将图片,视频等文件保存在外部存储卡中。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
其中,传感器模块180可以包括压力传感器180A,加速度传感器180B,触摸传感器180C等。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。
触摸传感器180C,也称“触控面板”。触摸传感器180C可以设置于显示屏194,由触摸传感器180C与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180C用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180C也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电, 通知等。SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现与电子设备100的接触和分离。
可以理解的是,图2所示的部件并不构成对电子设备100的具体限定,电子设备还可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。此外,图2中的部件之间的组合/连接关系也是可以调整修改的。
图3为本申请实施例提供的一种电子设备的软件结构框图。如图3所示,电子设备的软件结构可以是分层架构,例如可以将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将操作系统分为四层,从上至下分别为应用程序层,应用程序框架层(framework,FWK),运行时(runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包(application package)。如图3所示,应用程序层可以包括相机、设置、皮肤模块、用户界面(user interface,UI)、三方应用程序等。其中,三方应用程序可以包括图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等。在本申请实施例中,应用程序层可以包括电子设备从服务器请求下载的目标应用的目标安装包,该目标安装包中的功能文件和布局文件适配于电子设备。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层可以包括一些预先定义的函数。如图3所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
运行时包括核心库和虚拟机。运行时负责操作系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是操作系统的核心库。应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(media libraries),三维图形处理库(例如:OpenGL ES),二维图形引擎(例如:SGL)、图像处理库等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
硬件层可以包括各类传感器,例如加速度传感器、陀螺仪传感器、触摸传感器等。
需要说明的是,图2和图3所示的结构仅作为本申请实施例提供的电子设备的一种示例,并不能对本申请实施例提供的电子设备进行任何限定,具体实施中,电子设备可以具有比图2或图3所示的结构中更多或更少的器件或模块。
下面以车载设备为例,对本申请实施例提供的情景展示方法进行介绍。
首先对本申请实施例中涉及的情景进行介绍:在本申请实施例中,情景又可以称为智慧场景,车载设备可以存储多个情景,每个情景可以为一种预设的模式。每个情景可以包括多媒体内容、多种设施对应的设置参数等情景内容。车载设备可以展示情景以调整车内设施,从而为用户提供更优的驾乘体验。
可选的,车载设备可显示的多媒体内容可以为车载设备的操作系统预置的多媒体内容,也可以为车载设备从云端服务器下载到车载设备本地的多媒体内容,还可以为用户使用其它设备传输到车载设备的多媒体内容,还可以为车载设备通过车内摄像头拍摄得到的多媒体内容。也就是说,本申请对车载设备可显示的多媒体内容的来源并不进行限定。可选地,当情景内容中的多媒体内容为车载设备从云端服务器下载得到的多媒体内容时,车载设备还可以基于设定周期向云端服务器请求更新该情景对应的多媒体内容。
车载设备还可以存储每个情景的触发条件,该触发条件可以包括生效时间、生效频率等。其中,生效时间可以为情景的开始时刻和结束时刻之间的时间段,如生效时间可以为8月1日0:00至8月1日24:00,表示该情景的生效时间为8月1日全天。生效频率可以对应设备状态,如生效频率可以为每次上电、每次下电、仅一次等。其中,生效频率为每次上电时,对应的设备状态为上电状态;生效频率为每次下电时,对应的设备状态为下电状态;生效频率为仅一次时,对设备状态没有要求。例如情景A的时间触发条件为8月1日当天展示1次,则车载设备在确定当前时间处于8月1日的范围内,且车载设备的设备状态为在线状态时,展示1次情景A。
当车载设备展示目标情景时,车载设备可以根据目标情景的情景内容执行目标情景对应的响应动作,以调整车内设施。例如,目标情景可以为节日彩蛋,车载设备展示目标情景时,可以在显示屏显示节日彩蛋对应的多媒体内容,还可以根据情景内容中的设置参数调整车内氛围灯的色温、亮度等参数。
在一些示例中,情景对应的响应动作可以包括:在显示屏上显示多媒体内容、播放音乐、调整车内氛围灯、调整座椅、打开/关闭车内香氛、调整车内空调、打开/关闭天窗、打开/关闭后备箱、打开/关闭车灯等。其中,车载设备可以在显示屏上显示的多媒体内容可以包括文本、图片、动图、视频等。
可选地,车载设备可以存储至少一个待展示情景,待展示情景可以为多种类型的情景。如情景类型可以包括:节日彩蛋、用户信息定制、用车信息定制以及用户自定义等。一些实施例中,车载设备可以为用户提供多个候选情景,用户可以在多个候选情景中选择需要希望展示的至少一个候选情景作为至少一个待展示情景。例如,图4为本申请实施例提供的一种情景界面的示意图。参考图4,车载设备的情景界面中可以包括车载设备可展示的多个候选情景,用户可以在该界面中选择用户想要展示的至少一个候选情景,车载设备可以保留用户选择的至少一个候选情景作为至少一个待展示情景,车载设备可以删除未被用户选择的候选情景。例如车载设备默认显示所有节日的节日彩蛋,用户可以在图8所示的界面中选择用户想要查看的节日彩蛋,车载设备保留用户选择的节日彩蛋,并在当前时间满足节日彩蛋的触发条件时,展示节日彩蛋。
下面分别对每种类型的情景进行进一步介绍:
在一些示例中,节日彩蛋类型的情景可以为车载设备根据车载设备的系统日历中的节日或节气生成的。如图5为本申请实施例提供的一种节日彩蛋类的情景的示意图。参考图5,节日彩蛋类型的情景可以包括传统节日、节气、国际节日等日期对应的情景。车载设备可以在各个节日或节气实施节日彩蛋类型的情景,如车载设备可以在每个节日在显示屏上显示该节日对应的多媒体内容,为用户提供惊喜的交互体验。
在一些示例中,用户信息定制类型的情景可以为车载设备根据用户信息生成的情景。例如,车载设备可以根据用户信息获取用户的生日、纪念日等信息,并根据获取到的信息生成对应的情景。可选地,车载设备可以在用户登录用户账户时,提示用户填写用户信息,如用户的生日、结婚纪念日等。如图6为本申请实施例提供的一种用户信息定制类情景的示意图。参考图6中的(a),车载设备根据用户生日信息生成的情景可以包括在显示屏显示“生日快乐”等祝福信息,还可以在显示屏显示生日蛋糕等动态多媒体内容。参考图6中的(b),车载设备根据用户结婚纪念日生成的情景可以包括在显示屏显示用户称呼以及周年信息,还可以在显示屏显示玫瑰花等动态多媒体内容。
在一些示例中,用车信息定制类型的情景可以为车载设备根据用户的用车信息生成的情景。用车信息定制类型的情景的触发条件可以为时间触发条件,如车载设备可以记录用户的用车时长,并在用车时长达到预设数值时展示用车信息定制类型的情景。例如预设数值可以为7天、30天、99天、100天等 特殊数字。另外,用车信息定制类型的情景的触发条件还可以包括行驶里程触发条件,如车载设备可以记录车辆的行驶里程数,并在行驶里程数达到预设数值时展示用车信息定制类型的情景。例如,图7为本申请实施例提供的一种用车信息类型的情景示意图。参考图7,车载设备可以在用户用车时长为100时,在显示屏显示用户的用车信息,如图7中车载设备可以显示陪伴时长、陪伴里程、诗词等内容。可选的,图7中所示界面还可以包括“查看精彩瞬间”的控件,用户点击该控件后,车载设备还可以根据车载设备的图库应用中的图片、车辆行驶途经点等信息生成视频,并在显示屏中显示该视频。如图8为本申请实施例提供的一种根据图片生成的视频的示意图,用户在点击图7所示的“查看精彩瞬间”的控件后,车载设备可以在显示屏上显示图8所示的内容。
在一些示例中,用户自定义类型的情景可以为用户在车载设备中预先设置的情景,用户可以自定义该情景的触发条件以及情景内容。可选地,车载设备中可以存储多种系统预置的情景模板,用户在创建用户自定义类型的情景时,可以对系统预置的情景模板进行编辑,选择情景内容以生成用户自定义情景;或者用户可以自行设定用户自定义情景的情景内容,电子设备还可以将用户自定义情景的情景内容作为模板,并将该模板上传至互联网,以将该模板分享给其它用户。
下面以一个示例对本申请实施例中生成用户自定义类型情景的方式进行进一步介绍。
参见上述实施例中的图4所示的情景界面,用户可以点击“新建情景”的控件,触发车载设备根据用户设置新建一个用户自定义类型的情景,如下述实施例中以车载设备新建“情景A”为例进行说明。车载设备可以在显示屏上显示用户自定义情景的情景内容设置界面以及生效时间设置界面,以供用户设置用户自定义情景的情景内容和触发条件。
图9为本申请实施例提供的一种背景选择界面的示意图。参考图9,车载设备可以显示背景选择界面,用户可以在该界面中选择背景,如用户可以选择车载设备中系统预置的图片作为背景,或者用户也可以选择车载设备的图库应用中的图片作为背景,用户也可以选择车载设备存储的视频作为用户自定义情景中显示屏显示的多媒体内容。
图10为本申请实施例提供的一种文本编辑界面的示意图。参考图10,在用户选择背景后,车载设备可以显示文本编辑界面,用户可以在该界面中输入文本,该文本可以在用户自定义情景实施时显示在显示屏上。
类似的,车载设备还可以显示音乐选择界面和氛围灯设置界面。例如图11为本申请实施例提供的一种音乐选择界面的示意图。参考图11,用户可以在音乐选择界面选择音乐,车载设备展示用户自定义情景时可以播放用户选择的音乐。又例如图12为本申请实施例提供的一种氛围灯设置界面的示意图。参考图12,车载设备还可以显示氛围灯设置界面,用户可以在该界面中设置用户自定义情景中氛围灯的亮度、颜色等参数。
需要说明的是,参考图10所示的文本编辑界面,可选地,用户还可以设置用户自定义情景中香氛、座椅等设施的参数,以及设置车灯、天窗、后备箱等设施的参数,具体显示界面以及用户调整方式可以参见图11或图12,此处不再一一赘述。
在一些示例中,用户完成设置用户自定义情景中各个设施的参数后,车载设备可以显示生效时间设置界面。如图13为本申请实施例提供的一种生效时间设置界面。参考图13,用户可以在该界面中选择用户自定义情景的生效时间,如生效时间为图13所示的开始时间与结束时间之间的时间段。用户还可以选择用户自定义情景的生效频率,如生效频率为图13所示的用户选择的周期。例如,用户设置用户自定义情景的生效时间为8月1日8:00至8月1日10:00,生效频率为仅一次,则车载设备在确定当前时间处于用户自定义情景的生效时间段内时,展示一次用户自定义情景。又例如,用户设置用户自定义情景的生效时间为8月1日8:00至8月1日10:00,生效频率为每次上电,则车载设备在确定当前时间处于用户自定义情景的生效时间段内时,在车载设备每次上电时展示一次用户自定义情景。
用户在完成设置用户自定义情景中各个设施的参数和生效时间后,点击图13所示的界面中的“确定”控件,完成设置该用户自定义场景。图14为本申请实施例提供的一种用户自定义场景的预览界面的示意图。参考图14,车载设备可以显示用户自定义场景的预览图,用户可以点击图14所示的界面中的“预览”控件,触发车载设备展示一次用户自定义场景。
需要说明的是,图11-图13中示出的车载设备先显示车内设施的设置界面,再显示生效时间的设置界面仅作为一种示例而非限定,本申请实施例对车载设备显示车内设施的设置界面和生效时间的设置界面的顺序并不做任何限定。
下面对本申请实施例提供的情景展示方法的流程进行进一步介绍,图15为本申请实施例提供的一种情景展示方法的流程图。在图15中以车载设备执行该方法为例,该方法包括以下步骤:
S1501:车载设备确定当前时间满足目标情景对应的触发条件。
可选的,目标情景为车载设备中存储的至少一个情景中的任一个情景。
在本申请实施例中,车载设备包括三种状态:上电、在线、下电。其中,上电可以为车辆启动时车载设备开机的过程;在线为车载设备运行的过程,车载设备处于在线状态时,车辆可以为行驶状态也可以为停止状态;下电为车载设备关机的过程。
当车载设备处于上电状态时,车载设备可以在显示屏上显示开机动画,车载设备还可以判断当前时间是否满足目标情景对应的触发条件,当车载设备确定当前时间满足目标情景对应的触发条件时,可以实施目标情景。例如,假设当前时间为8:00,目标情景的生效时间为8:00-10:00,生效频率为仅一次,则车载设备确定当前时间满足目标情景对应的触发条件,车载设备可以展示目标情景。又例如,假设当前时间为8:00,目标情景的生效时间为8:00-10:00,生效频率为每次上电,由于当前场景为车载设备上电,则车载设备可以展示目标情景。
当车载设备处于在线状态时,车载设备可以判断当前时间是否满足目标情景对应的触发条件,当车载设备确定当前时间满足目标情景对应的触发条件时,可以展示目标情景。例如,当前时间为8:00,车辆在行驶中,车载设备处于在线状态,目标情景的生效时间为8:00-10:00,生效频率为仅一次,车载设备确定当前时间到达生效时间中的开始时间,则车载设备确定当前时间满足目标情景的触发条件,车载设备可以展示目标情景。又例如,车辆在行驶中,车载设备处于在线状态,目标情景的生效时间为8:00-10:00,生效频率为每次下电,车载设备在关闭之前,确定当前时间为9:30,则车载设备确定当前时间满足目标情景的触发条件,车载设备可以展示目标情景。
在本申请实施例一种可能的场景中,车载设备可以存储多种类型的情景,如本申请上述实施例介绍的节日彩蛋、用户信息定制、用车信息定制以及用户自定义等类型的情景,车载设备在确定当前时间满足多个情景对应的触发条件时,可以根据预设的优先级选择多个情景中的一个情景作为目标情景,例如预设的优先级顺序可以为用户自定义类型、用户信息定制类型、用车信息定制类型、节日彩蛋类型的顺序。
S1502:车载设备执行目标情景对应的响应动作。
在本申请实施例中,车载设备在确定当前时间满足目标情景对应的触发条件时,车载设备可以实施目标情景,具体的,车载设备可以执行目标情景对应的响应动作。如前述实施例介绍,目标情景对应的响应动作可以包括以下至少一项:在显示屏上显示目标情景对应的多媒体内容、播放音乐、调整车内氛围灯、调整座椅、打开/关闭车内香氛、调整车内空调、打开/关闭天窗、打开/关闭后备箱、打开/关闭车灯。
举例来说,目标情景为中秋节的节日彩蛋。中秋节的节日彩蛋包含的情景内容有:中秋节祝福图片、背景音乐、氛围灯的设置参数。中秋节的节日彩蛋对应的触发条件包括:生效时间为农历八月十五全天,生效频率为仅一次。车载设备在确定当前时间满足中秋节的节日彩蛋对应的触发条件后,在显示屏上显示中秋节祝福图片,播放背景音乐,并根据氛围灯的设置参数调整氛围灯。
在本申请实施例一些可能的应用场景中,当车载设备确定车辆处于行驶状态,且车载设备的显示屏上正在显示导航路线等辅助用户驾驶的内容时,车载设备在展示目标情景时,可以将目标情景包含的多媒体内容显示在显示屏的小窗口中,该小窗口可以为显示屏上的一个悬浮窗口,从而保证展示目标情景不会影响用户驾驶车辆,提升用户体验。
可选地,本申请实施例中车载设备在展示目标情景时,若目标情景包括多媒体内容,车载设备在显示屏上显示多媒体内容时,可以在系统层级上显示多媒体内容,也就是说,用于显示多媒体内容的图层可以位于背景图层、应用图层之上。
S1503:车载设备响应用户的第一操作或车载设备确定目标情景的展示时长大于设定时长,结束展示目标情景。
可选的,第一操作可以为作用于显示屏的单击操作、双击操作、滑动操作、长按操作等,第一操作也可以为切换前进挡的操作。
一种可选的实施方式中,车载设备在展示目标情景时,用户可以触发第一操作以结束目标情景的展示,如用户可以单击车载设备的显示屏,车载设备结束展示目标情景。
另一种可选的实施方式中,车载设备可以设置目标情景的展示时长为设定时长,并在开始展示目标情景时计时,车载设备确定目标情景的展示时长大于设定时长时,结束展示目标情景。
在一些示例中,车载设备可以根据车辆状态为目标情景设置不同的结束触发方式,例如当车辆在行驶状态时,车载设备可以在确定目标情景的展示时长大于设定时长时结束展示目标情景,而无需用户手动触发结束目标情景的展示。又例如,车辆在停止状态时,车载设备可以在用户触发第一操作后,结束展示目标情景,从而便于用户查看目标情景展示的情景内容。可以理解的是,具体实施中可以根据不同应用场景灵活调整目标情景的结束触发方式,本申请实施例对此不做限定。
在本申请一种可选的实施方式中,车载设备结束展示目标情景后,车载设备可以将车内设施恢复至初始状态,初始状态为车载设备展示目标情景之前各车内设施的状态。例如,图16为本申请实施例提供的一种结束展示目标情景的示意图,参考图16,车载设备展示目标情景时,车载设备在显示屏显示图16中的(a)所示的图片,车载设备结束展示目标情景后,车载设备可以在显示屏上显示图16中的(b)所示的首页。
基于以上实施例,本申请还提供一种情景展示方法,该方法可以由电子设备执行,电子设备可以具有图2和/或图3所示的结构。图17为本申请实施例提供的一种情景展示方法的流程图。参考图17,该方法包括以下步骤:
S1701:电子设备获取当前时间。
S1702:电子设备确定当前时间满足目标情景的触发条件。
S1703:电子设备根据目标情景的情景内容执行目标情景对应的响应动作。
其中,目标情景为电子设备存储的至少一个待展示情景中的任一个情景。
需要说明的是,本申请图17所示的情景展示方法在具体实施时可以参见本申请上述各实施例,重复之处不再赘述。
基于以上实施例,本申请还提供一种电子设备,所述电子设备包括多个功能模块;所述多个功能模块相互作用,实现本申请实施例所描述的各方法。所述多个功能模块可以基于软件、硬件或软件和硬件的结合实现,且所述多个功能模块可以基于具体实现进行任意组合或分割。
基于以上实施例,本申请还提供一种电子设备,该电子设备包括至少一个处理器和至少一个存储器,所述至少一个存储器中存储计算机程序指令,所述电子设备运行时,所述至少一个处理器执行本申请实施例所描述的各方法。
基于以上实施例,本申请还提供一种计算机程序,当所述计算机程序在计算机上运行时,使得所述计算机执行本申请实施例所描述的各方法。
基于以上实施例,本申请还提供一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机程序,当所述计算机程序被计算机执行时,使得所述计算机执行本申请实施例所描述的各方法。
基于以上实施例,本申请还提供了一种芯片,所述芯片用于读取存储器中存储的计算机程序,实现本申请实施例所描述的各方法。
基于以上实施例,本申请提供了一种芯片系统,该芯片系统包括处理器,用于支持计算机装置实现本申请实施例所描述的各方法。在一种可能的设计中,所述芯片系统还包括存储器,所述存储器用于保存该计算机装置必要的程序和数据。该芯片系统,可以由芯片构成,也可以包含芯片和其他分立器件。
本领域内的技术人员应明白,本申请的实施例可提供为方法、系统、或计算机程序产品。因此,本申请可采用完全硬件实施例、完全软件实施例、或结合软件和硬件方面的实施例的形式。而且,本申请可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质(包括但不限于磁盘存储器、CD-ROM、光学存储器等)上实施的计算机程序产品的形式。
本申请是参照根据本申请的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
显然,本领域的技术人员可以对本申请进行各种改动和变型而不脱离本申请的保护范围。这样,倘若本申请的这些修改和变型属于本申请权利要求及其等同技术的范围之内,则本申请也意图包含这些改动和变型在内。

Claims (15)

  1. 一种情景展示方法,其特征在于,应用于电子设备,所述方法包括:
    获取当前时间;
    确定当前时间满足目标情景的触发条件,根据所述目标情景的情景内容执行所述目标情景对应的响应动作;
    其中,所述目标情景为所述电子设备存储的至少一个待展示情景中的任一个情景。
  2. 如权利要求1所述的方法,其特征在于,所述目标情景的触发条件包括生效时间和生效频率;所述确定当前时间满足目标情景的触发条件,包括:
    确定当前时间处于所述生效时间的范围中,且所述电子设备的设备状态与所述生效频率对应的设备状态一致。
  3. 如权利要求1或2所述的方法,其特征在于,在所述执行所述目标情景对应的响应动作之后,所述方法还包括:
    响应于用户的第一操作或确定所述目标情景的展示时长大于设定时长,结束展示所述目标情景。
  4. 如权利要求1-3任一项所述的方法,其特征在于,所述方法还包括:
    响应于用户的第二操作,显示情景界面,所述情景界面中包括多个候选情景;
    响应于用户的第三操作,将所述第三操作对应的至少一个候选情景作为所述至少一个待展示情景。
  5. 如权利要求1-4任一项所述的方法,其特征在于,所述电子设备为车载设备;所述目标情景的情景内容包括多媒体内容和/或车内设施对应的设置参数。
  6. 如权利要求5所述的方法,其特征在于,所述至少一个待展示情景的类型包括节日彩蛋类型、用户信息定制类型、用车信息定制类型以及用户自定义类型。
  7. 如权利要求6所述的方法,其特征在于,所述目标情景的类型为用户自定义类型时,在确定当前时间满足目标情景的触发条件,根据所述目标情景的情景内容执行所述目标情景对应的响应动作之前,所述方法还包括:
    响应于用户的第四操作,显示用户自定义类型情景的新建界面,所述新建界面用于创建所述目标情景;
    根据用户操作确定并存储所述目标情景的情景内容和所述目标情景的触发条件。
  8. 如权利要求6或7所述的方法,其特征在于,若当前时间满足多个第一情景的触发条件,在所述根据所述目标情景的情景内容执行所述目标情景对应的响应动作之前,所述方法还包括:
    根据所述多个第一情景的情景类型和预设的情景类型的优先级顺序,将情景类型的优先级最高的第一情景作为所述目标情景;其中,多个第一情景属于所述多个待展示情景。
  9. 如权利要求5-8任一项所述的方法,其特征在于,所述目标情景对应的响应动作包括以下至少一项:
    在所述电子设备的显示屏上显示多媒体内容;播放音乐;调整车内氛围灯;调整座椅;打开/关闭车内香氛;调整车内空调;打开/关闭天窗;打开/关闭后备箱;打开/关闭车灯。
  10. 如权利要求5-9任一项所述的方法,其特征在于,所述多媒体内容包括以下至少一项:
    所述电子设备的操作系统预置的多媒体内容;
    所述电子设备从服务器获取的多媒体内容;
    所述电子设备拍摄得到的多媒体内容;
    所述电子设备接收其它电子设备传输的多媒体内容。
  11. 如权利要求9或10所述的方法,其特征在于,所述目标情景的情景内容包括多媒体内容时,所述根据所述目标情景的情景内容执行所述目标情景对应的响应动作,包括:
    在所述电子设备的显示屏上显示所述多媒体内容,所述多媒体内容位于系统图层,所述系统图层位于背景图层和应用图层之上。
  12. 一种电子设备,其特征在于,包括至少一个处理器,所述至少一个处理器与至少一个存储器耦合,所述至少一个处理器用于读取所述至少一个存储器所存储的计算机程序,以执行如权利要求1-11中任一所述的方法。
  13. 一种电子设备,其特征在于,包括多个功能模块;所述多个功能模块相互作用,实现如权利要 求1-11中任一所述的方法。
  14. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质中存储有指令,当其在计算机上运行时,使得计算机执行如权利要求1-11中任一所述的方法。
  15. 一种包含指令的计算机程序产品,其特征在于,当所述计算机程序产品在计算机上运行时,使得计算机执行如权利要求1-11中任一所述的方法。
PCT/CN2023/116118 2022-09-05 2023-08-31 一种情景展示方法及电子设备 WO2024051569A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211080952.XA CN117698618A (zh) 2022-09-05 2022-09-05 一种情景展示方法及电子设备
CN202211080952.X 2022-09-05

Publications (1)

Publication Number Publication Date
WO2024051569A1 true WO2024051569A1 (zh) 2024-03-14

Family

ID=90161201

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/116118 WO2024051569A1 (zh) 2022-09-05 2023-08-31 一种情景展示方法及电子设备

Country Status (2)

Country Link
CN (1) CN117698618A (zh)
WO (1) WO2024051569A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110162719A (zh) * 2019-05-27 2019-08-23 广州小鹏汽车科技有限公司 内容推送方法、装置、存储介质及计算机设备、车辆
CN110855826A (zh) * 2019-09-23 2020-02-28 华为技术有限公司 一种原子服务的呈现方法及装置
CN112061049A (zh) * 2020-09-07 2020-12-11 华人运通(上海)云计算科技有限公司 场景触发的方法、装置、设备和存储介质
CN112061075A (zh) * 2020-09-07 2020-12-11 华人运通(上海)云计算科技有限公司 场景触发的方法、装置、设备和存储介质
WO2022053057A1 (zh) * 2020-09-14 2022-03-17 广州小鹏汽车科技有限公司 一种车载屏幕调节方法、装置、车辆和可读存储介质
CN114302191A (zh) * 2021-12-13 2022-04-08 亿咖通(湖北)技术有限公司 彩蛋显示方法、装置及电子设备
CN114327190A (zh) * 2020-09-24 2022-04-12 华人运通(上海)云计算科技有限公司 一种场景编辑装置

Also Published As

Publication number Publication date
CN117698618A (zh) 2024-03-15

Similar Documents

Publication Publication Date Title
JP7326476B2 (ja) スクリーンショット方法及び電子装置
WO2020156230A1 (zh) 一种电子设备在来电时呈现视频的方法和电子设备
CN110855826A (zh) 一种原子服务的呈现方法及装置
WO2023130921A1 (zh) 一种适配多设备的页面布局的方法及电子设备
EP4390690A1 (en) Notification processing method, chip, electronic device, and computer readable storage medium
WO2020155875A1 (zh) 电子设备的显示方法、图形用户界面及电子设备
US20240184749A1 (en) File management method, electronic device, and computer-readable storage medium
WO2024001940A1 (zh) 寻车的方法、装置和电子设备
WO2024051569A1 (zh) 一种情景展示方法及电子设备
EP4365722A1 (en) Method for displaying dock bar in launcher and electronic device
CN111324255B (zh) 一种基于双屏终端的应用处理方法及通信终端
CN113835802A (zh) 设备交互方法、系统、设备及计算机可读存储介质
CN115002336A (zh) 视频信息的生成方法、电子设备及介质
WO2022267786A1 (zh) 一种快捷图标展示方法与终端设备
WO2024140560A1 (zh) 一种控制方法及设备
WO2024036998A1 (zh) 显示方法、存储介质及电子设备
WO2024067169A1 (zh) 信息处理方法及电子设备
WO2023165413A1 (zh) 一种应用开发系统、方法及设备
WO2023051354A1 (zh) 一种分屏显示方法及电子设备
CN114816169B (zh) 桌面图标的显示方法、设备及存储介质
WO2024060968A1 (zh) 管理服务卡片的方法和电子设备
CN111381801B (zh) 一种基于双屏终端的音频播放方法及通信终端
CN113179362B (zh) 电子设备及其图像显示方法
US20240231560A9 (en) Shortcut icon display method and terminal device
WO2024139934A1 (zh) 应用程序多窗口展示方法和电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23862263

Country of ref document: EP

Kind code of ref document: A1