CN117698618A - Scene display method and electronic equipment - Google Patents

Scene display method and electronic equipment

Info

Publication number
CN117698618A
CN117698618A · CN202211080952.XA
Authority
CN
China
Prior art keywords
scene
vehicle
user
target
scenario
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211080952.XA
Other languages
Chinese (zh)
Inventor
王晋
郭浩
陈雨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202211080952.XA priority Critical patent/CN117698618A/en
Priority to PCT/CN2023/116118 priority patent/WO2024051569A1/en
Publication of CN117698618A publication Critical patent/CN117698618A/en
Pending legal-status Critical Current

Classifications

    • B60K35/22
    • B60K35/28
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/037Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Abstract

The application provides a scene display method and an electronic device. In the method, the electronic device obtains the current time, determines that the current time meets the trigger condition of a target scene, and executes the response action corresponding to the target scene according to the scene content of the target scene. The target scene is any one of at least one scene to be displayed that is stored by the electronic device. With this scheme, when determining that the current time meets the trigger condition of the target scene, the electronic device can execute the corresponding response action to display the target scene, which improves the interactivity between the electronic device and the user and thereby improves the user experience.

Description

Scene display method and electronic equipment
Technical Field
The application relates to the technical field of terminals, in particular to a scene display method and electronic equipment.
Background
The cockpit scene is a human-machine interaction scene in the vehicle cockpit. The main purpose of the intelligent cockpit is to integrate driving information and entertainment information and to use the data-processing capability of the vehicle-mounted device to provide users with an efficient and intuitive driving experience.
Intelligent cockpit technology mainly analyzes users' driving and entertainment requirements to improve the driving experience. While meeting users' basic riding requirements, how to provide a better interactive experience is also an important research and development direction for intelligent cockpit technology.
Disclosure of Invention
The application provides a scene display method and electronic equipment, which are used for improving interactivity between the electronic equipment and a user and improving user experience.
In a first aspect, the present application provides a scenario display method, which may be applied to an electronic device. The method comprises the following steps: the electronic equipment acquires the current time; the electronic equipment determines that the current time meets the triggering condition of a target scene, and executes a response action corresponding to the target scene according to the scene content of the target scene; the target scene is any one of at least one scene to be displayed, which is stored by the electronic equipment.
In the method, when the current time is determined to meet the triggering condition of the target scene, the electronic equipment can execute the response action corresponding to the target scene to display the target scene, so that the interactivity between the electronic equipment and the user is improved, and the user experience is further improved. When the method is applied to the intelligent cabin scene, the electronic equipment can be vehicle-mounted equipment, the vehicle-mounted equipment can display the target scene when determining that the current time meets the triggering condition corresponding to the target scene, so as to provide surprise for the user and improve the driving experience of the user.
In one possible design, the trigger condition of the target scene includes an effective time and an effective frequency; the determining that the current time meets the trigger condition of the target scene includes: determining that the current time is within the effective time range and that the device state of the electronic device is consistent with the device state corresponding to the effective frequency.
With this design, the electronic device can set an effective time and an effective frequency for the target scene and display the target scene according to the effective frequency within the effective time. The effective time may be a time associated with the target scene; for example, when the target scene is a holiday Easter egg, the effective time may be the day of the holiday, thereby providing the user with a surprise experience.
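The trigger-condition check described above (effective time plus effective frequency) can be sketched as follows; the function and parameter names, and the two example frequencies, are illustrative assumptions rather than details from the application:

```python
from datetime import datetime

def trigger_met(now, effective_start, effective_end, frequency, last_shown):
    """Return True if the target scene should be displayed at `now`."""
    # The current time must fall inside the effective time range.
    if not (effective_start <= now <= effective_end):
        return False
    # The device state must match the effective frequency.
    if frequency == "every_start":       # display on every device start-up
        return True
    if frequency == "once_per_day":      # display at most once per day
        return last_shown is None or last_shown.date() < now.date()
    return False

start = datetime(2024, 1, 1, 0, 0)
end = datetime(2024, 1, 1, 23, 59)
print(trigger_met(datetime(2024, 1, 1, 9, 0), start, end, "once_per_day", None))  # True
```

For a holiday Easter egg, the effective range would span the holiday itself, and the frequency would keep the egg from replaying on every glance at the screen.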
In one possible design, after executing the response action corresponding to the target scene, the method further includes: in response to a first operation of the user, or upon determining that the display duration of the target scene exceeds a set duration, ending the display of the target scene.
Optionally, the first operation may be a single-click, double-click, slide, or long-press operation on the display screen; when the electronic device is a vehicle-mounted device, the first operation may also be an operation of shifting into a forward gear.
With this design, when the electronic device displays the target scene, the user can trigger the first operation to make the electronic device end the display of the target scene. Alternatively, the electronic device can preset a set duration for the target scene and end the display when it determines that the display duration exceeds the set duration, so that the display of the target scene can be ended flexibly.
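The two ways of ending the display (a user operation or an exceeded display duration) might be modeled as in the following sketch; the class and method names are invented for illustration:

```python
import time

class ScenarioDisplay:
    """Tracks whether a target scene is still being displayed."""

    def __init__(self, max_duration_s):
        self.max_duration_s = max_duration_s
        self.started_at = time.monotonic()
        self.active = True

    def on_user_operation(self):
        # A tap, double tap, slide, long press, or (in a vehicle)
        # shifting into a forward gear ends the display immediately.
        self.active = False

    def tick(self, now=None):
        # Called periodically: end the display once the preset
        # duration has been exceeded.
        now = time.monotonic() if now is None else now
        if now - self.started_at > self.max_duration_s:
            self.active = False
        return self.active
```

Either path simply clears the `active` flag, after which the system layer carrying the scene would be removed.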
In one possible design, the method further comprises: responsive to a second operation by the user, displaying a scenario interface, the scenario interface including a plurality of candidate scenarios therein; and responding to a third operation of a user, and taking at least one candidate scene corresponding to the third operation as the at least one scene to be displayed.
Optionally, the second operation is used to trigger the electronic device to display the contextual interface, for example, the second operation may be a click operation applied to a control corresponding to the contextual interface. The third operation is for selecting at least one candidate scene among the plurality of candidate scenes, for example, the third operation may be a click operation on an icon corresponding to the at least one candidate scene.
With this design, the user can select the scenes to be displayed, so that the electronic device stores the scenes the user wants to see. This brings the displayed scenes closer to the user's needs and further improves the user experience.
In one possible design, the electronic device is an in-vehicle device; the scene content of the target scene comprises multimedia content and/or setting parameters corresponding to facilities in the vehicle.
Through the design, the electronic equipment can be vehicle-mounted equipment, and the scene display method provided by the application can be applied to intelligent cabin scenes. The scene content of the target scene can comprise the multimedia content and/or the setting parameters corresponding to the facilities in the vehicle, so that immersive scene experience can be provided for the user when the vehicle-mounted equipment displays the target scene.
In one possible design, the types of the at least one scene to be displayed include a holiday Easter-egg type, a user information customization type, a vehicle information customization type, and a user-defined type.
With this design, the electronic device can store multiple types of scenes, meeting the requirement of displaying target scenes in various situations. For example, the electronic device can display a holiday Easter egg on a holiday and display a user-information-customized scene on a user's anniversary, improving the user experience.
In one possible design, when the type of the target scene is the user-defined type, before the determining that the current time meets the trigger condition of the target scene and executing the response action corresponding to the target scene according to the scene content of the target scene, the method further includes: in response to a fourth operation of the user, displaying a creation interface for a user-defined scene, the creation interface being used to create the target scene; and determining and storing the scene content and the trigger condition of the target scene according to user operations.
Optionally, the fourth operation is used to trigger the electronic device to create a user-defined scene, for example, the fourth operation may be a click operation on the newly created scene control.
Through the design, the electronic equipment can newly establish a user-defined type scene according to user operation, so that the user can newly establish the scene according to own requirements, the personalized requirements of the user are met, and the user experience is improved.
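A user-defined scene, once created, might be stored as a record containing its scene content and trigger condition; the following sketch uses invented field names to illustrate one possible representation:

```python
from dataclasses import dataclass, field

@dataclass
class Scene:
    """One stored scene to be displayed."""
    name: str
    scene_type: str        # e.g. "holiday", "user_info", "vehicle_info", "custom"
    content: dict = field(default_factory=dict)  # multimedia + in-car facility settings
    trigger: dict = field(default_factory=dict)  # effective time and frequency

# A user-defined scene assembled from the creation interface:
custom = Scene(
    name="Anniversary",
    scene_type="custom",
    content={"background": "photo.jpg", "music": "song.mp3", "ambience": "warm"},
    trigger={"date": "06-18", "frequency": "once_per_day"},
)
```

Holiday, user-information, and vehicle-information scenes would be preset records of the same shape, differing only in who fills in the content and trigger.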
In one possible design, if the current time meets the trigger conditions of a plurality of first scenes, before the executing the response action corresponding to the target scene according to the scene content of the target scene, the method further includes: according to the scene types of the plurality of first scenes and a preset priority order of scene types, taking the first scene whose scene type has the highest priority as the target scene; wherein the plurality of first scenes belong to the at least one scene to be displayed.
Through the design, if the current time meets the triggering conditions of a plurality of first scenes stored in the electronic equipment, the electronic equipment can select the first scene with the highest priority in the plurality of first scenes as a target scene according to the priority sequence of scene types, so that the electronic equipment is prevented from continuously displaying the plurality of scenes to influence the user experience.
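The priority rule above can be illustrated with the following sketch; the concrete priority order of scene types is an assumption, as the application leaves it to the implementation:

```python
# Higher priority first; this concrete ordering is an assumption.
PRIORITY = ["custom", "user_info", "vehicle_info", "holiday"]

def pick_target(triggered):
    """Among simultaneously triggered first scenes, return the one whose
    scene type ranks highest in the preset priority order."""
    return min(triggered, key=lambda s: PRIORITY.index(s["type"]))

scenes = [
    {"name": "New Year", "type": "holiday"},
    {"name": "Birthday", "type": "user_info"},
]
print(pick_target(scenes)["name"])  # Birthday
```

Only the selected scene is displayed, which is exactly what prevents several scenes from playing back to back.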
In one possible design, the response action corresponding to the target scene includes at least one of: displaying multimedia content on a display screen of the electronic device; playing music; adjusting the atmosphere lamp in the vehicle; adjusting a seat; turning the in-car fragrance on/off; adjusting the in-car air conditioner; opening/closing the sunroof; opening/closing the trunk; and turning the vehicle lights on/off.
Through the design, when the electronic equipment executes the response action corresponding to the target scene in the intelligent cabin scene, the in-car facilities can be adjusted, immersive scene experience is provided for the user, and the user experience is improved.
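Executing the response actions could be modeled as dispatching each action to a facility controller, as in this sketch; the controller interface and action names are invented for illustration and do not come from the application:

```python
def execute_actions(actions, controllers):
    """Dispatch each (action, parameter) pair of the target scene's
    response action to the matching facility controller."""
    results = []
    for action, params in actions:
        handler = controllers.get(action)
        if handler is not None:        # silently skip unknown facilities
            results.append(handler(params))
    return results

# Stand-ins for real facility controllers in the cockpit:
controllers = {
    "show_media": lambda p: f"displaying {p}",
    "play_music": lambda p: f"playing {p}",
    "ambient_light": lambda p: f"atmosphere lamp set to {p}",
}
out = execute_actions(
    [("show_media", "fireworks.mp4"), ("ambient_light", "red")], controllers)
print(out)  # ['displaying fireworks.mp4', 'atmosphere lamp set to red']
```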
In one possible design, the multimedia content includes at least one of: multimedia content preset by the operating system of the electronic device; multimedia content obtained by the electronic device from a server; multimedia content captured by the electronic device; and multimedia content received by the electronic device from other electronic devices.
By the design, the multimedia content can have multiple sources, thereby providing rich multimedia content for the electronic equipment.
In one possible design, when the scenario content of the target scenario includes multimedia content, the executing, according to the scenario content of the target scenario, a response action corresponding to the target scenario includes: and displaying the multimedia content on a display screen of the electronic device, wherein the multimedia content is positioned on a system layer, and the system layer is positioned above a background layer and an application layer.
By this design, the electronic device can provide a scenario presentation service at the system level without affecting the background display or application display of the electronic device.
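The layering described above (a system layer composited above the background and application layers) can be illustrated as follows; the layer names follow the text, while the compositing function itself is an invented sketch:

```python
# Drawing order, bottom layer first; names follow the text above.
LAYERS = ["background", "application", "system"]

def compose(surfaces):
    """Return surfaces sorted into drawing order, so the system layer
    (carrying the scene's multimedia content) is drawn last, on top."""
    return sorted(surfaces, key=lambda s: LAYERS.index(s["layer"]))

frame = compose([
    {"layer": "system", "content": "Easter-egg animation"},
    {"layer": "background", "content": "wallpaper"},
    {"layer": "application", "content": "navigation map"},
])
print([s["layer"] for s in frame])  # ['background', 'application', 'system']
```

Because the scene content lives on its own layer, removing it restores the background and application display untouched.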
In a second aspect, the present application provides an electronic device comprising a plurality of functional modules; the plurality of functional modules interact to implement the method described in the first aspect and embodiments thereof. The plurality of functional modules may be implemented based on software, hardware, or a combination of software and hardware, and the plurality of functional modules may be arbitrarily combined or divided based on the specific implementation.
In a third aspect, the present application provides an electronic device comprising at least one processor and at least one memory, the at least one memory storing computer program instructions that, when the electronic device operates, cause the at least one processor to perform the method described in the first aspect and embodiments thereof.
In a fourth aspect, the present application also provides a computer program product which, when run on a computer, causes the computer to perform the method of any of the above aspects and embodiments thereof.
In a fifth aspect, the present application further provides a computer readable storage medium having a computer program stored therein, which when executed by a computer, causes the computer to perform the method shown in any one of the above aspects and embodiments thereof.
In a sixth aspect, the present application further provides a chip, where the chip is configured to read a computer program stored in a memory, and perform the method shown in any one of the above aspects and embodiments thereof.
In a seventh aspect, the present application further provides a chip system, where the chip system includes a processor, and the processor is configured to support a computer device to implement the method shown in any one of the above aspects and embodiments thereof. In one possible design, the chip system further includes a memory for storing programs and data necessary for the computer device. The chip system may be formed of a chip or may include a chip and other discrete devices.
Drawings
Fig. 1 is a schematic diagram of displaying a home page on a display screen by a vehicle-mounted device according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 3 is a software structural block diagram of an electronic device according to an embodiment of the present application;
fig. 4 is a schematic diagram of a holiday Easter-egg scene according to an embodiment of the present application;
fig. 5 is a schematic diagram of a user information customization scene provided in an embodiment of the present application;
fig. 6 is a schematic diagram of a vehicle information customization scene according to an embodiment of the present application;
fig. 7 is a schematic diagram of a video generated from pictures according to an embodiment of the present application;
fig. 8 is a schematic diagram of a scenario interface provided in an embodiment of the present application;
fig. 9 is a schematic diagram of a background selection interface according to an embodiment of the present application;
fig. 10 is a schematic diagram of a text editing interface according to an embodiment of the present application;
fig. 11 is a schematic diagram of a music selection interface according to an embodiment of the present application;
fig. 12 is a schematic diagram of an atmosphere lamp setting interface according to an embodiment of the present application;
fig. 13 is a schematic diagram of an effective time setting interface provided by an embodiment of the present application;
fig. 14 is a schematic diagram of a preview interface of a user-defined scene according to an embodiment of the present application;
fig. 15 is a flowchart of a scenario display method provided in an embodiment of the present application;
fig. 16 is a schematic diagram of ending a display target scenario according to an embodiment of the present application;
fig. 17 is a flowchart of a scenario display method provided in an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present application clearer, the embodiments of the present application are described in further detail below with reference to the accompanying drawings. In the description of the embodiments of the present application, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more such features.
It should be understood that in the embodiments of the present application, "at least one" means one or more, and "a plurality of" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following items" or similar expressions refers to any combination of these items, including a single item or any combination of plural items. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
The cockpit scene is a human-machine interaction scene in the vehicle cockpit. The main purpose of the intelligent cockpit is to integrate driving information and entertainment information and to use the data-processing capability of the vehicle-mounted device to provide users with an efficient and intuitive driving experience.
Intelligent cockpit technology mainly analyzes users' driving and entertainment requirements to improve the driving experience. For example, the vehicle-mounted device may control the air conditioner, seats, and other facilities in the vehicle. The vehicle-mounted device may further include a display screen on which it can display content to assist the user in driving or to provide entertainment. Fig. 1 is a schematic diagram of a vehicle-mounted device displaying a home page on a display screen according to an embodiment of the present application. Referring to fig. 1, the vehicle-mounted device includes a display screen on which it may display a home page; for example, in fig. 1 the home page may include display contents such as vehicle settings, applications, navigation, the current time, and music, and may further include user information such as flight information. In the home page shown in fig. 1, the bottom of the interface contains controls corresponding to commonly used in-car facilities; the user can click the control corresponding to an in-car facility to make the vehicle-mounted device execute the corresponding response action. For example, the user clicks the air-conditioning control to trigger the vehicle-mounted device to turn the in-car air conditioner on or off.
Intelligent cockpit technology mainly analyzes users' driving and entertainment requirements to improve the driving experience. While meeting users' basic riding requirements, how to provide a better interactive experience is also an important research and development direction for intelligent cockpit technology.
Based on the above description, the present application provides a scene display method and an electronic device. In the method, the electronic device obtains the current time, determines that the current time meets the trigger condition corresponding to a target scene, and executes the response action of the target scene according to the scene content of the target scene. The target scene is any one of at least one scene to be displayed stored by the electronic device, and the response action of the target scene may include displaying multimedia content from the scene content of the target scene on the display screen.
According to the method, when the current time is determined to meet the trigger condition of the target scene, the electronic equipment can execute the response action corresponding to the target scene, so that the interactivity between the electronic equipment and the user is improved, and the user experience is improved. When the method is applied to the intelligent cabin scene, the electronic equipment can be vehicle-mounted equipment, the vehicle-mounted equipment can display the target scene when determining that the current time meets the triggering condition corresponding to the target scene, so as to provide surprise for the user and improve the driving experience of the user.
Embodiments of an electronic device, and embodiments for using such an electronic device, are described below. In the embodiments of the present application, the electronic device may be a vehicle-mounted device, or may be a tablet computer, a mobile phone, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a wearable device, or the like.
Fig. 2 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application. As shown in fig. 2, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a user identification module (subscriber identification module, SIM) card interface 195, and the like.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution. A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. The charge management module 140 is configured to receive a charge input from a charger. The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The display 194 is used to display a display interface of an application, such as a page of an application installed on the electronic device 100. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an operating system, software code of at least one application program, and the like. The storage data area may store data (e.g., captured images, recorded video, etc.) generated during use of the electronic device 100, and so forth. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as pictures and videos are stored in an external memory card.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The sensor module 180 may include a pressure sensor 180A, an acceleration sensor 180B, a touch sensor 180C, and the like, among others.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194.
The touch sensor 180C is also referred to as a "touch panel". The touch sensor 180C may be disposed on the display 194; together, the touch sensor 180C and the display 194 form a touch screen, also called a "touchscreen". The touch sensor 180C is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180C may also be disposed on the surface of the electronic device 100 at a location different from that of the display 194.
The keys 190 include a power key, volume keys, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100. The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration alerts as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization. The indicator 192 may be an indicator light and may be used to indicate the charging state, a change in charge, a message, a missed call, a notification, and the like. The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into contact with or separated from the electronic device 100 by being inserted into or removed from the SIM card interface 195.
It is to be understood that the components shown in fig. 2 do not constitute a specific limitation on the electronic device 100; the electronic device may include more or fewer components than illustrated, may combine certain components, may split certain components, or may have a different arrangement of components. Furthermore, the combination/connection relationships between the components in fig. 2 may also be modified.
Fig. 3 is a software structural block diagram of an electronic device according to an embodiment of the present application. As shown in fig. 3, the software structure of the electronic device may be a hierarchical architecture, for example, the software may be divided into several layers, each layer having a distinct role and division of work. The layers communicate with each other through a software interface. In some embodiments, the operating system is divided into four layers, from top to bottom, an application layer, an application framework layer (FWK), a runtime (run time) and a system library, and a kernel layer, respectively.
The application layer may include a series of application packages. As shown in fig. 3, the application layer may include a camera, settings, a skin module, a user interface (UI), third-party applications, and the like. The third-party applications may include a gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, and the like. In an embodiment of the present application, the application layer may include a target installation package of a target application that the electronic device requests to download from the server, where the function files and the layout files in the target installation package are adapted to the electronic device.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer may include some predefined functions. As shown in FIG. 3, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, and a notification manager.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is for providing communication functions of the electronic device. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar; it can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. A notification may also appear on the top system status bar in the form of a chart or scroll-bar text, such as a notification of an application running in the background, or appear on the screen as a dialog window. For example, text information is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, or an indicator light blinks.
The runtime includes a core library and a virtual machine. The runtime is responsible for the scheduling and management of the operating system.
The core library consists of two parts: one part is a function which needs to be called by java language, and the other part is a core library of an operating system. The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), two-dimensional graphics engine (e.g., SGL), image processing library, etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer comprises at least a display driver, a camera driver, an audio driver, and a sensor driver.
The hardware layer may include various sensors such as acceleration sensors, gyroscopic sensors, touch sensors, and the like.
It should be noted that the structures shown in fig. 2 and fig. 3 are only examples of the electronic device provided in the embodiments of the present application and do not constitute any limitation on it; in a specific implementation, the electronic device may have more or fewer devices or modules than those shown in fig. 2 or fig. 3.
The scenario displaying method provided in the embodiment of the present application is described below by taking an in-vehicle device as an example.
First, the scenario involved in the embodiments of the present application is described. In the embodiments of the present application, a scenario may also be referred to as an intelligent scenario. The in-vehicle device may store a plurality of scenarios, each of which may be a preset mode. Each scenario may include multimedia content, setting parameters corresponding to various facilities, and the like. The in-vehicle device may present a scenario, adjusting the in-vehicle facilities to provide a better ride experience for the user.
Optionally, the multimedia content displayable by the vehicle-mounted device may be a multimedia content preset by an operating system of the vehicle-mounted device, or may be a multimedia content downloaded to the vehicle-mounted device from a cloud server by the vehicle-mounted device, or may be a multimedia content transmitted to the vehicle-mounted device by a user by using other devices, or may be a multimedia content obtained by shooting by the vehicle-mounted device through an in-vehicle camera. That is, the present application does not limit the source of the multimedia content displayable by the in-vehicle apparatus. Optionally, when the multimedia content in the scenario content is the multimedia content obtained by the vehicle-mounted device downloading from the cloud server, the vehicle-mounted device may further request updating of the multimedia content corresponding to the scenario from the cloud server based on the set period.
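The periodic update described above can be sketched as a simple staleness check; this is an illustrative sketch only, and the function name, the sync timestamps, and the default period are assumptions rather than details from the patent.

```python
from datetime import datetime, timedelta

def needs_refresh(last_sync: datetime, now: datetime,
                  period: timedelta = timedelta(days=7)) -> bool:
    """Return True when the set update period has elapsed since the last
    successful sync, i.e. the in-vehicle device should re-request the
    scenario's multimedia content from the cloud server."""
    return now - last_sync >= period
```

A scheduler in the device could call such a check on boot or at fixed intervals and issue the download request only when it returns True.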
The in-vehicle apparatus may also store a trigger condition for each scenario, which may include an effective time, an effective frequency, and the like. The effective time may be the period between a start time and an end time of the scenario; for example, the effective time may be 0:00 to 24:00 on August 1, indicating that the scenario takes effect on August 1. The effective frequency may correspond to a device state; for example, the effective frequency may be each power-on, each power-down, or only once. An effective frequency of each power-on corresponds to the power-on state of the device; an effective frequency of each power-down corresponds to the power-down state; an effective frequency of only once imposes no requirement on the device state. For example, if the trigger condition of scenario A is that scenario A is displayed once on August 1, the in-vehicle device displays scenario A once when it determines that the current time falls on August 1 and the device state of the in-vehicle device is the on-line state.
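The trigger check described above (an effective time window combined with an effective frequency tied to the device state) might be sketched as follows; this is a minimal illustration, and names such as `TriggerCondition`, `Frequency`, and `should_trigger` are invented for the example, not taken from the patent.

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum

class Frequency(Enum):
    EACH_POWER_ON = "each_power_on"    # fires during every power-on in the window
    EACH_POWER_DOWN = "each_power_down"  # fires during every power-down in the window
    ONLY_ONCE = "only_once"            # no requirement on the device state

@dataclass
class TriggerCondition:
    start: datetime   # start of the effective time window
    end: datetime     # end of the effective time window
    frequency: Frequency

def should_trigger(cond: TriggerCondition, now: datetime,
                   device_state: str, already_shown: bool) -> bool:
    """Return True when the scenario's trigger condition is met."""
    if not (cond.start <= now <= cond.end):
        return False                   # outside the effective time window
    if cond.frequency is Frequency.ONLY_ONCE:
        return not already_shown       # show at most one time
    if cond.frequency is Frequency.EACH_POWER_ON:
        return device_state == "power_on"
    if cond.frequency is Frequency.EACH_POWER_DOWN:
        return device_state == "power_down"
    return False
```

With this sketch, the "August 1, once" example from the text corresponds to a window of 0:00-24:00 on August 1 with `Frequency.ONLY_ONCE`.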
When the in-vehicle device displays a target scenario, it can execute the response action corresponding to that scenario according to the scenario content, so as to adjust the facilities in the vehicle. For example, the target scenario may be a holiday Easter egg: when the in-vehicle device displays it, the multimedia content corresponding to the holiday may be shown on the display screen, and parameters such as the color temperature and brightness of the atmosphere lamp in the vehicle may be adjusted according to the setting parameters in the scenario content.
In some examples, the response actions for a scenario may include: displaying multimedia content on the display screen, playing music, adjusting the atmosphere lamps in the vehicle, adjusting the seats, turning the in-vehicle fragrance on/off, adjusting the in-vehicle air conditioner, opening/closing the sunroof, opening/closing the trunk, and turning the vehicle lamps on/off. The multimedia content that the in-vehicle device can display on the display screen may include text, pictures, animated pictures, videos, and the like.
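One way to represent a scenario's content, bundling the multimedia with the facility settings it applies, is sketched below; the field names and the `ScenarioContent` type are illustrative assumptions, and only a few of the facilities named in the text are modeled.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AmbientLight:
    color: str = "#FFD8A8"   # illustrative default color
    brightness: int = 60     # illustrative default, in percent

@dataclass
class ScenarioContent:
    """Bundle of multimedia and facility settings a scenario can apply."""
    media: Optional[str] = None          # text/picture/GIF/video to display
    music: Optional[str] = None          # background music to play
    ambient_light: Optional[AmbientLight] = None
    seat_position: Optional[int] = None  # target seat preset
    fragrance_on: bool = False

    def response_actions(self) -> List[str]:
        """List the response actions this scenario's content implies."""
        actions = []
        if self.media:
            actions.append("display_media")
        if self.music:
            actions.append("play_music")
        if self.ambient_light:
            actions.append("adjust_ambient_light")
        if self.seat_position is not None:
            actions.append("adjust_seat")
        if self.fragrance_on:
            actions.append("turn_on_fragrance")
        return actions
```

A scenario with only a picture and music would thus imply just the display and playback actions, leaving the other facilities untouched.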
Alternatively, the in-vehicle device may store at least one scenario to be presented, and the scenarios to be presented may be of a plurality of types. For example, the scenario types may include: holiday Easter egg, user information customization, vehicle information customization, user-defined, and the like. In some embodiments, the in-vehicle device may provide the user with a plurality of candidate scenarios, and the user may select from them at least one candidate scenario to be presented. For example, fig. 4 is a schematic diagram of a scenario interface provided in an embodiment of the present application. Referring to fig. 4, the scenario interface of the in-vehicle device may include a plurality of candidate scenarios that the device can present; the user may select in this interface at least one candidate scenario to be presented, the in-vehicle device may retain the candidate scenarios selected by the user as the scenarios to be presented, and the in-vehicle device may delete the candidate scenarios not selected by the user. For example, the in-vehicle device defaults to displaying the holiday Easter eggs of all holidays; the user can select the holiday Easter eggs the user wants to view in the interface shown in fig. 4, the in-vehicle device retains the selected ones, and displays each of them when the current time meets its trigger condition.
Each type of scenario is further described below:
in some examples, a holiday-Easter-egg type scenario may be generated by the in-vehicle device from a holiday or a solar term in the system calendar of the device. Fig. 5 is a schematic diagram of a holiday Easter egg scenario according to an embodiment of the present application. Referring to fig. 5, this type of scenario may include scenarios corresponding to the dates of traditional festivals, solar terms, international holidays, and the like. The in-vehicle device can present such a scenario on each holiday or solar term; for example, it can display the multimedia content corresponding to each holiday on the display screen on that day, providing the user with a surprise interactive experience.
In some examples, the user information customization type scenario may be a scenario generated by the in-vehicle device from the user information. For example, the in-vehicle device may acquire information such as the user's birthday and anniversaries from the user information and generate corresponding scenarios from it. Alternatively, the in-vehicle device may prompt the user to fill in user information, such as a birthday or wedding anniversary, when the user logs into the user account. Fig. 6 is a schematic diagram of a user information customization class scenario according to an embodiment of the present application. Referring to (a) of fig. 6, the scenario generated by the in-vehicle device from the user's birthday information may include displaying blessing text such as "happy birthday" on the display screen, together with dynamic multimedia content such as a birthday cake. Referring to (b) of fig. 6, the scenario generated from the user's wedding anniversary may include displaying the user's title and anniversary information on the display screen, and may also display dynamic multimedia content such as roses.
In some examples, the scenario of the vehicle information customization type may be a scenario generated by the in-vehicle device from the user's vehicle usage information. The trigger condition of this type of scenario may be a time trigger condition; for example, the in-vehicle device may record the duration for which the user has had the vehicle and display the scenario when that duration reaches a preset value. For example, the preset value may be a special number such as 7 days, 30 days, 99 days, or 100 days. In addition, the trigger condition of this type of scenario may further include a driving-range trigger condition; for example, the in-vehicle device may record the driving mileage of the vehicle and display the scenario when the mileage reaches a preset value. For example, fig. 7 is a schematic diagram of a scenario of the vehicle information type according to an embodiment of the present application. Referring to fig. 7, when the user has had the vehicle for 100 days, the in-vehicle device may display this information on the display screen; in fig. 7, the in-vehicle device displays content such as the accompanying duration, the accompanying mileage, and a line of poetry. Optionally, the interface shown in fig. 7 may further include a "look at wonderful moment" control; after the user clicks this control, the in-vehicle device may generate a video from information such as pictures in the gallery application of the in-vehicle device and the route points the vehicle has travelled, and display the video on the display screen. Fig. 8 is a schematic diagram of a video generated from pictures according to an embodiment of the present application; after the user clicks the "look at wonderful moment" control shown in fig. 7, the in-vehicle device may display the content shown in fig. 8 on the display screen.
In some examples, the user-defined type of scenario may be a scenario preset by the user in the in-vehicle device, and the user may customize its trigger condition and scenario content. Optionally, a plurality of system-preset scenario templates can be stored in the in-vehicle device; when creating a user-defined scenario, the user can edit a system-preset template and select scenario content to generate the scenario. Alternatively, the user can set the scenario content entirely by himself, and the electronic device can also take the content of the user-defined scenario as a template and upload it to the Internet to share with other users.
The manner in which the user-defined type scene is generated in the embodiment of the present application is further described below by way of an example.
Referring to the scenario interface shown in fig. 4 in the above embodiment, the user may click the "new scenario" control to trigger the in-vehicle device to create a new user-defined scenario; the following embodiments take the creation of a new "scenario A" as an example. The in-vehicle device can display a scenario content setting interface and an effective time setting interface on the display screen so that the user can set the scenario content and the trigger condition of the user-defined scenario.
Fig. 9 is a schematic diagram of a background selection interface according to an embodiment of the present application. Referring to fig. 9, the in-vehicle device may display a background selection interface, in which a user may select a background, for example, the user may select a picture preset in the system in the in-vehicle device as the background, or the user may also select a picture in a gallery application of the in-vehicle device as the background, and the user may also select a video stored in the in-vehicle device as multimedia content displayed on a display screen in a user-defined scenario.
Fig. 10 is a schematic diagram of a text editing interface according to an embodiment of the present application. Referring to fig. 10, after a user selects a background, the in-vehicle apparatus may display a text editing interface in which the user may input text, which may be displayed on a display screen when the user-defined scene is implemented.
Similarly, the in-vehicle apparatus may also display a music selection interface and an atmosphere lamp setting interface. For example, fig. 11 is a schematic diagram of a music selection interface according to an embodiment of the present application. Referring to fig. 11, a user may select music at a music selection interface, and the vehicle-mounted device may play the user-selected music when exhibiting a user-defined scenario. For example, fig. 12 is a schematic diagram of an atmosphere lamp setting interface according to an embodiment of the present application. Referring to fig. 12, the in-vehicle apparatus may further display an atmosphere lamp setting interface in which a user may set parameters such as brightness, color, etc. of the atmosphere lamp in the user-defined scene.
It should be noted that, referring to the text editing interface shown in fig. 10, optionally, the user may also set parameters of facilities such as fragrance and seats in the user-defined scenario, and set parameters of facilities such as a car light, a sunroof, a trunk, etc., and the specific display interface and the user adjustment manner may be referred to fig. 11 or fig. 12, which are not described here again.
In some examples, the in-vehicle device may display the effective time setting interface after the user has finished setting the parameters of each facility in the user-defined scenario. Fig. 13 is a schematic diagram of an effective time setting interface according to an embodiment of the present application. Referring to fig. 13, the user may select in this interface the effective time of the user-defined scenario, such as the period between the start time and the end time shown in fig. 13. The user may also select the effective frequency of the user-defined scenario, such as the period selected by the user in fig. 13. For example, if the user sets the effective time of the user-defined scenario to 8:00 to 10:00 on August 1 and the effective frequency to only once, the in-vehicle device displays the scenario once when it determines that the current time is within the effective period. For another example, if the user sets the same effective time but an effective frequency of each power-on, the in-vehicle device displays the scenario once at every power-on that falls within the effective period.
After finishing setting the parameters and the effective time of each facility in the user-defined scene, the user clicks a "confirm" control in the interface shown in fig. 13 to finish setting the user-defined scene. Fig. 14 is a schematic diagram of a preview interface of a user-defined scene according to an embodiment of the present application. Referring to fig. 14, the in-vehicle device may display a preview of the user-defined scene, and the user may click on the "preview" control in the interface shown in fig. 14, triggering the in-vehicle device to exhibit the user-defined scene once.
It should be noted that in fig. 11 to 13 the in-vehicle device displays the setting interfaces of the in-vehicle facilities before the effective time setting interface; this is merely an example and not a limitation, and the embodiments of the present application place no restriction on the order in which these interfaces are displayed.
The following describes the flow of the scenario display method provided in the embodiment of the present application, and fig. 15 is a flowchart of the scenario display method provided in the embodiment of the present application. Taking the vehicle-mounted device as an example in fig. 15, the method includes the steps of:
s1501: the vehicle-mounted equipment determines that the current time meets the triggering condition corresponding to the target scene.
Optionally, the target scenario is any one of at least one scenario stored in the in-vehicle apparatus.
In the embodiment of the present application, the in-vehicle device has three states: power-on, on-line, and power-down. Power-on is the startup process of the in-vehicle device when the vehicle is started; on-line is the running state of the in-vehicle device, during which the vehicle may be either moving or stopped; power-down is the shutdown process of the in-vehicle device.
When the in-vehicle device is in the power-on state, it can display a startup animation on the display screen and also determine whether the current time meets the trigger condition corresponding to the target scenario; if so, the target scenario can be presented. For example, assuming the current time is 8:00, the effective time of the target scenario is 8:00-10:00, and the effective frequency is only once, the in-vehicle device determines that the current time meets the trigger condition and can display the target scenario. For another example, assuming the current time is 8:00, the effective time is 8:00-10:00, and the effective frequency is each power-on, then since the device is currently powering on, it can display the target scenario.
When the in-vehicle device is in the on-line state, it can likewise determine whether the current time meets the trigger condition corresponding to the target scenario and, if so, display it. For example, when the current time is 8:00 and the vehicle is running, the device is on-line; if the effective time of the target scenario is 8:00-10:00 and the effective frequency is only once, the device determines that the current time has reached the start of the effective period, so the trigger condition is met and the target scenario can be displayed. For another example, when the vehicle is running and the device is on-line, if the effective time is 8:00-10:00, the effective frequency is each power-down, and the current time is 9:30 just before the device shuts down, the device determines that the trigger condition is met and can display the target scenario.
In one possible case of the embodiment of the present application, the in-vehicle device may store scenarios of multiple types, such as the holiday Easter egg, user information customization, vehicle information customization, and user-defined types introduced above. When determining that the current time meets the trigger conditions of multiple scenarios, the device may select one of them as the target scenario according to a preset priority; for example, the preset priority order may be: user-defined type, then user information customization type, then vehicle information customization type, then holiday Easter egg type.
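The priority-based selection among simultaneously triggered scenarios can be sketched as below; the type labels and the `pick_target` helper are illustrative names for this example, not terms from the patent.

```python
# Lower index = higher priority, matching the example order in the text:
# user-defined > user-information > vehicle-information > holiday Easter egg.
PRIORITY = ["user_defined", "user_info", "vehicle_info", "holiday_egg"]

def pick_target(candidates):
    """candidates: (scene_type, scenario) pairs whose trigger conditions are
    all met at the current time; return the highest-priority scenario."""
    return min(candidates, key=lambda c: PRIORITY.index(c[0]))[1]
```

For instance, if a birthday scenario and a holiday Easter egg both trigger at the same moment, the user-information scenario wins under this ordering.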
S1502: and the vehicle-mounted equipment executes the response action corresponding to the target scene.
In the embodiment of the application, when the vehicle-mounted device determines that the current time meets the trigger condition corresponding to the target scenario, the vehicle-mounted device can implement the target scenario, and specifically, the vehicle-mounted device can execute the response action corresponding to the target scenario. As introduced in the foregoing embodiment, the response action corresponding to the target scenario may include at least one of: the method comprises the steps of displaying multimedia content corresponding to a target scene on a display screen, playing music, adjusting atmosphere lamps in a vehicle, adjusting seats, opening/closing fragrance in the vehicle, adjusting air conditioners in the vehicle, opening/closing a skylight, opening/closing a trunk and opening/closing a vehicle lamp.
For example, the target scenario is a Mid-Autumn Festival holiday Easter egg whose scenario content comprises: a Mid-Autumn Festival greeting picture, background music, and setting parameters for the ambient lights. The triggering conditions corresponding to the Mid-Autumn Festival Easter egg comprise: the effective time is the entire day of the fifteenth day of the eighth lunar month (the Mid-Autumn Festival), and the effective frequency is "only once". After the vehicle-mounted device determines that the current time meets these triggering conditions, it displays the Mid-Autumn Festival greeting picture on the display screen, plays the background music, and adjusts the ambient lights according to their setting parameters.
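The Mid-Autumn example above can be sketched as a simple action dispatch. The action names, the handler interface, and the file names are illustrative assumptions, not an API defined by the patent.

```python
# Illustrative sketch: each response action in the scenario content is
# handed to a device-specific handler in order. All names are assumptions.
def execute_scenario(scenario, handlers):
    """Run every response action listed in the scenario content."""
    executed = []
    for action, params in scenario["actions"].items():
        handlers[action](params)   # e.g. drive the screen, speaker, lights
        executed.append(action)
    return executed

mid_autumn = {"actions": {
    "show_picture":  "mid_autumn_greeting.png",
    "play_music":    "background.mp3",
    "ambient_light": {"color": "warm", "brightness": 0.6},
}}
done = execute_scenario(mid_autumn, {
    "show_picture":  lambda p: None,
    "play_music":    lambda p: None,
    "ambient_light": lambda p: None,
})
```

Keeping the actions in a mapping lets one scenario carry any subset of the response actions listed above without changing the dispatch loop.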
In some possible application scenarios of the embodiment of the present application, when the vehicle-mounted device determines that the vehicle is in a driving state and its display screen is showing content that assists the user in driving, such as a navigation route, the vehicle-mounted device can display the multimedia content of the target scenario in a small window, which may be a floating window on the display screen. This ensures that displaying the target scenario does not interfere with the user's driving and improves the user experience.
Optionally, when the vehicle-mounted device in the embodiment of the present application displays the target scenario, if the target scenario includes multimedia content, the vehicle-mounted device may display that multimedia content at the system level; that is, the layer used to display the multimedia content may be located above both the background layer and the application layer.
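The layer ordering can be pictured as a z-value comparison, purely as a minimal sketch; the layer names and z-values below are illustrative assumptions.

```python
# Minimal sketch of the compositing order described above: the scenario's
# multimedia layer sits above the application and background layers.
layers = {"background": 0, "application": 1, "scenario_multimedia": 2}

def topmost(layers):
    """The layer with the highest z-value is composited on top."""
    return max(layers, key=layers.get)

top = topmost(layers)
```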
S1503: the vehicle-mounted device ends displaying the target scenario in response to a first operation of the user, or upon determining that the display duration of the target scenario exceeds a set duration.
Optionally, the first operation may be a click, double-click, slide, or long-press operation acting on the display screen; the first operation may also be an operation of shifting into a forward gear.
In an optional embodiment, while the vehicle-mounted device displays the target scenario, the user may trigger the first operation to end the display; for example, the user may tap the display screen of the vehicle-mounted device, whereupon the vehicle-mounted device ends displaying the target scenario.
In another optional embodiment, the vehicle-mounted device may take the set duration as the display duration of the target scenario, start timing when the display of the target scenario begins, and end the display when it determines that the elapsed display duration exceeds the set duration.
In some examples, the vehicle-mounted device may set different ending triggers for the target scenario according to the vehicle state. For example, when the vehicle is in a driving state, the vehicle-mounted device may end the display once the display duration exceeds the set duration, without requiring the user to end it manually. For another example, when the vehicle is stopped, the vehicle-mounted device may end the display only after the user triggers the first operation, which makes it easier for the user to view the scenario content. It can be understood that, in a specific implementation, the ending trigger of the target scenario can be flexibly adjusted for different application scenarios, which is not limited in the embodiment of the present application.
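The state-dependent ending trigger can be sketched as a single predicate. The function name, state labels, and the 10-second default are assumptions for illustration; the patent does not fix a concrete duration.

```python
# Hypothetical sketch of the ending trigger described above.
def should_end(vehicle_state, elapsed_s, first_operation_seen, set_duration_s=10.0):
    """Driving: end automatically once the set duration is exceeded.
    Stopped: end only when the user triggers the first operation."""
    if vehicle_state == "driving":
        return elapsed_s > set_duration_s
    return first_operation_seen
```

While driving, the scenario times out on its own so the driver never has to touch the screen; while stopped, it stays up until the user dismisses it.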
In an optional embodiment of the present application, after the vehicle-mounted device finishes displaying the target scenario, it may restore the in-vehicle facilities to their initial state, i.e., the state each in-vehicle facility was in before the target scenario was displayed. For example, fig. 16 is a schematic diagram of ending the display of a target scenario provided in an embodiment of the present application. Referring to fig. 16, while displaying the target scenario, the vehicle-mounted device shows the picture in (a) of fig. 16 on the display screen; after the display ends, it may show the home page in (b) of fig. 16.
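Restoring the initial state amounts to snapshotting the facility settings before the scenario and reinstating them afterwards. The sketch below is a hypothetical illustration; the facility names and the dict-based representation are assumptions.

```python
import copy

# Hypothetical sketch: snapshot the in-vehicle facility state, apply the
# scenario's settings, then restore the snapshot when the display ends.
def display_with_restore(facilities, scenario_settings, show):
    """Apply the scenario's settings, show it, then restore the snapshot."""
    snapshot = copy.deepcopy(facilities)     # initial state, captured first
    facilities.update(scenario_settings)
    show(facilities)                         # e.g. render picture, set lights
    facilities.clear()
    facilities.update(snapshot)              # back to the pre-scenario state

state = {"ambient_light": "off", "screen": "home"}
display_with_restore(state, {"ambient_light": "warm", "screen": "greeting"},
                     lambda f: None)
```

Taking a deep copy before applying any scenario setting is what guarantees the facilities return exactly to their pre-scenario state, regardless of which settings the scenario touched.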
Based on the above embodiments, the present application also provides a scenario display method, which may be performed by an electronic device, and the electronic device may have the structure shown in fig. 2 and/or fig. 3. Fig. 17 is a flowchart of a scenario display method provided in an embodiment of the present application. Referring to fig. 17, the method includes the steps of:
S1701: the electronic device obtains the current time.
S1702: the electronic device determines that the current time satisfies a trigger condition for the target scenario.
S1703: and the electronic equipment executes the response action corresponding to the target scene according to the scene content of the target scene.
The target scene is any one of at least one scene to be displayed, which is stored by the electronic equipment.
It should be noted that the implementation of the scenario display method shown in fig. 17 may refer to the foregoing embodiments of the present application, and repeated details are not described herein again.
Based on the above embodiments, the present application further provides an electronic device, including a plurality of functional modules; the functional modules interact to implement the methods described in the embodiments of the present application. The plurality of functional modules may be implemented based on software, hardware, or a combination of software and hardware, and the plurality of functional modules may be arbitrarily combined or divided based on the specific implementation.
Based on the above embodiments, the present application further provides an electronic device, which includes at least one processor and at least one memory, where the at least one memory stores computer program instructions, and when the electronic device is running, the at least one processor performs the methods described in the embodiments of the present application.
Based on the above embodiments, the present application also provides a computer program which, when run on a computer, causes the computer to perform the methods described in the embodiments of the present application.
Based on the above embodiments, the present application also provides a computer-readable storage medium having stored therein a computer program which, when executed by a computer, causes the computer to perform the methods described in the embodiments of the present application.
Based on the above embodiments, the present application further provides a chip, where the chip is configured to read a computer program stored in a memory, and implement the methods described in the embodiments of the present application.
Based on the above embodiments, the present application provides a chip system including a processor, where the processor is configured to support a computer device in implementing the methods described in the embodiments of the present application. In one possible design, the chip system further includes a memory for storing programs and data necessary for the computer device. The chip system may consist of a chip, or may include a chip and other discrete devices.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to cover such modifications and variations.

Claims (15)

1. A scenario presentation method, applied to an electronic device, the method comprising:
acquiring the current time;
determining that the current time meets the triggering condition of a target scene, and executing a response action corresponding to the target scene according to the scene content of the target scene;
the target scene is any one of at least one scene to be displayed, which is stored by the electronic equipment.
2. The method of claim 1, wherein the trigger condition of the target scenario comprises an effective time and an effective frequency; the determining that the current time meets the triggering condition of the target scene comprises the following steps:
determining that the current time is within the effective time range, and that the device state of the electronic device is consistent with the device state corresponding to the effective frequency.
3. The method of claim 1 or 2, wherein after said performing a responsive action corresponding to the target scenario, the method further comprises:
and responding to a first operation of a user or determining that the display time length of the target scene is longer than a set time length, and ending displaying the target scene.
4. A method according to any one of claims 1-3, wherein the method further comprises:
responsive to a second operation by the user, displaying a scenario interface, the scenario interface including a plurality of candidate scenarios therein;
and responding to a third operation of a user, and taking at least one candidate scene corresponding to the third operation as the at least one scene to be displayed.
5. The method of any one of claims 1-4, wherein the electronic device is an in-vehicle device; the scene content of the target scene comprises multimedia content and/or setting parameters corresponding to facilities in the vehicle.
6. The method of claim 5, wherein the type of the at least one scene to be displayed comprises a holiday Easter egg type, a user information customization type, a vehicle information customization type, and a user customization type.
7. The method of claim 6, wherein when the type of the target scenario is a user-defined type, before determining that the current time satisfies the trigger condition of the target scenario, executing the response action corresponding to the target scenario according to the scenario content of the target scenario, the method further comprises:
responding to a fourth operation of a user, and displaying a creation interface for a user-customized scene, wherein the creation interface is used for creating the target scene;
and determining and storing the scene content of the target scene and the triggering condition of the target scene according to the user operation.
8. The method according to claim 6 or 7, wherein if the current time satisfies the trigger conditions of the plurality of first scenes, before the response action corresponding to the target scene is performed according to the scene content of the target scene, the method further comprises:
according to the scene types of the plurality of first scenes and a preset priority order of scene types, taking the first scene whose scene type has the highest priority as the target scene; wherein the plurality of first scenes belong to the at least one scene to be displayed.
9. The method of any of claims 5-8, wherein the responsive action corresponding to the target scenario comprises at least one of:
displaying multimedia content on a display screen of the electronic device; playing music; adjusting an atmosphere lamp in the vehicle; adjusting a seat; opening/closing a fragrance in the vehicle; adjusting an air conditioner in the vehicle; opening/closing a sunroof; opening/closing a trunk; and turning a vehicle lamp on/off.
10. The method of any of claims 5-9, wherein the multimedia content comprises at least one of:
multimedia content preset by an operating system of the electronic device;
multimedia content acquired by the electronic device from a server;
multimedia content captured by the electronic device;
multimedia content received by the electronic device from another electronic device.
11. The method according to claim 9 or 10, wherein when the scene content of the target scene includes multimedia content, the performing the response action corresponding to the target scene according to the scene content of the target scene includes:
and displaying the multimedia content on a display screen of the electronic device, wherein the multimedia content is positioned on a system layer, and the system layer is positioned above a background layer and an application layer.
12. An electronic device comprising at least one processor coupled to at least one memory, the at least one processor configured to read a computer program stored by the at least one memory to perform the method of any of claims 1-11.
13. An electronic device comprising a plurality of functional modules; the plurality of functional modules interact to implement the method of any of claims 1-11.
14. A computer readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the method of any of claims 1-11.
15. A computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of any of claims 1-11.
Publication: CN117698618A, published 2024-03-15 (legal status: pending). PCT counterpart WO2024051569A1 (application PCT/CN2023/116118, filed 2023-08-31), published 2024-03-14.