CN115494996B - Interaction method, interaction equipment and vehicle - Google Patents

Interaction method, interaction equipment and vehicle Download PDF

Info

Publication number
CN115494996B
CN115494996B (application CN202211412155.7A)
Authority
CN
China
Prior art keywords
display interface
picture
mode
specified
picture under
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211412155.7A
Other languages
Chinese (zh)
Other versions
CN115494996A (en)
Inventor
张青禾
曹昀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jidu Technology Co Ltd
Original Assignee
Beijing Jidu Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jidu Technology Co Ltd
Priority to CN202211412155.7A
Publication of CN115494996A
Application granted
Publication of CN115494996B
Legal status: Active
Anticipated expiration


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 — Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05 — Geographic models

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Remote Sensing (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide an interaction method, an interaction device, and a vehicle, relating to the field of automotive technology. They are used to relieve the waiting anxiety a user experiences when entering a full-screen mode and to improve the user's service experience. In an embodiment of the present application, the method includes: in response to a received specified trigger event, controlling a display interface to switch from a frame at a first viewing angle to a frame at a second viewing angle; and, when the display interface has switched to the frame at the second viewing angle, advancing a target object on the display interface in a specified direction toward a target element representing a specified mode.

Description

Interaction method, interaction equipment and vehicle
Technical Field
The present application relates to the field of automotive technology, and in particular to an interaction method, an interaction device, and a vehicle.
Background
With the rapid development of intelligent in-vehicle systems, and in order to give users a better visual experience, the display screens of vehicle-mounted terminals have grown ever larger, and many applications on these terminals, such as video, karaoke, and game applications for user entertainment, typically support a full-screen mode. Currently, when a user opens an application and enters full-screen mode, they often have to wait several seconds or even minutes until the application's full-screen mode finishes loading.
Disclosure of Invention
Aspects of the present application provide an interaction method, an interaction device, and a vehicle, which are used to relieve the waiting anxiety a user experiences when entering a full-screen mode and to provide the user with a better service experience.
In a first aspect, an embodiment of the present application provides an interaction method, applied to an interaction device, the method including:
in response to a received specified trigger event, controlling a display interface to switch from a frame at a first viewing angle to a frame at a second viewing angle;
when the display interface has switched to the frame at the second viewing angle, advancing a target object on the display interface in a specified direction toward a target element representing a specified mode.
In a second aspect, embodiments of the present application further provide an interaction device that includes a control module and a display module, where:
the control module is configured to, in response to a received specified trigger event, send a control instruction to the display module, the control instruction being used to control a display interface of the display module to switch from a frame at a first viewing angle to a frame at a second viewing angle;
the display module is configured to, in response to the control instruction, control its display interface to switch from the frame at the first viewing angle to the frame at the second viewing angle;
when the display interface has switched to the frame at the second viewing angle, a target object on the display interface advances in a specified direction toward a target element representing a specified mode.
In a third aspect, embodiments of the present application further provide a vehicle that includes the interaction device of the second aspect.
In a fourth aspect, embodiments of the present application further provide a computer storage medium storing a computer program/instructions that, when executed, implement the steps of the method of the first aspect.
In a fifth aspect, embodiments of the present application further provide a computer device including a memory and a processor, where the memory is used to store a computer program, and the processor, coupled to the memory, is used to execute the computer program so as to perform the steps of the method of the first aspect.
As can be seen from the technical solutions provided above, the embodiments of the present application have at least the following technical effect:
according to the one or more embodiments, when a user enters the specified mode, the user sees the display interface change dynamically from the frame at the first viewing angle to the frame at the second viewing angle, and once the switch is complete, the target object on the display interface advances in the specified direction toward the target element representing the specified mode. This adds an immersive entry animation to the waiting period, effectively relieving the user's waiting anxiety when entering the specified mode and effectively improving the user's service experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and constitute a part of it, illustrate embodiments of the application and, together with the description, serve to explain the application; they are not to be construed as unduly limiting it. In the drawings:
FIG. 1 is a schematic flow chart of an interaction method according to an exemplary embodiment of the present application;
FIG. 2 is a schematic diagram of the frame shown when the display interface switches to the second viewing angle in an interaction method according to an exemplary embodiment of the present application;
FIG. 3 is a schematic diagram of the enter button of a specified mode displayed in a display interface being clicked in an interaction method according to an exemplary embodiment of the present application;
FIG. 4 is a schematic view of the rendering effect of a vehicle model in an interaction method according to an exemplary embodiment of the present application;
FIG. 5 is a schematic view of the lighting sequence of a vehicle's atmosphere lamps in an interaction method according to an exemplary embodiment of the present application;
FIG. 6 is a schematic structural diagram of an interaction device according to an exemplary embodiment of the present application;
FIG. 7 is a schematic illustration of a vehicle according to an exemplary embodiment of the present application;
FIG. 8 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
To clarify the purposes, technical solutions, and advantages of the present application, the technical solutions are described clearly and completely below with reference to specific embodiments of the present application and the corresponding drawings. The described embodiments are plainly only some, not all, of the embodiments of the present application. All other embodiments that a person of ordinary skill in the art can derive from the present disclosure without undue effort fall within its scope.
As described in the background, when a user triggers the full-screen mode of a multimedia application in an in-vehicle system, the user typically has to sit through several seconds or even minutes of blank screen, or watch one or more fixed videos, while waiting for the application to finish loading full-screen mode. Clearly, in this process, the user is prone to waiting anxiety.
In view of this, embodiments of the present application provide a solution whose basic idea is as follows: when a user enters a specified mode, the user sees the display interface change dynamically from a frame at a first viewing angle to a frame at a second viewing angle, and when the display interface has switched to the frame at the second viewing angle, a target object on the display interface advances in a specified direction toward a target element representing the specified mode. This adds an immersive entry animation to the waiting period, effectively relieving the user's waiting anxiety when entering the specified mode and effectively improving the user's service experience.
The following describes in detail the technical solutions provided by the embodiments of the present application with reference to the accompanying drawings.
FIG. 1 is a schematic flow chart of an interaction method according to an embodiment of the present application. The method may be applied to any interaction device that supports an interactive experience, such as a vehicle (a car, motorcycle, electric vehicle, etc.), intelligent exercise equipment (a treadmill, spinning bike, etc.), a wearable device, or a VR/AR/MR device. The method of FIG. 1 may include:
step 110, in response to the received specified trigger event, the display interface is controlled to switch from the picture under the first viewing angle to the picture under the second viewing angle.
The display picture in the display interface can comprise a background picture rendered based on the environment where the interactive equipment is located, and can also comprise a target object rendered based on the appearance of the interactive equipment and/or the interactive object of the interactive equipment.
Step 120, when the display interface is switched to the screen under the second viewing angle, the target object on the display interface proceeds to the specified direction to approach the target element for representing the specified mode.
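The per-frame advance of step 120 can be sketched as follows. This is an illustrative sketch only, not part of the claimed method: the 2-D vector type, the coordinates, and the step size are all assumptions made for demonstration.

```python
from dataclasses import dataclass


@dataclass
class Vec2:
    x: float
    y: float


def advance_toward(pos: Vec2, target: Vec2, step: float) -> Vec2:
    """Move `pos` a fixed step along the direction to `target`,
    clamping at the target so the object never overshoots it."""
    dx, dy = target.x - pos.x, target.y - pos.y
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= step:
        return Vec2(target.x, target.y)
    return Vec2(pos.x + dx / dist * step, pos.y + dy / dist * step)


# One animation frame at the second viewing angle: the target object
# (e.g. the rendered vehicle) steps toward the target element (e.g. a castle).
obj = advance_toward(Vec2(0.0, 0.0), Vec2(10.0, 0.0), step=1.5)
```

Calling `advance_toward` once per rendered frame produces the steady approach toward the target element described above.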
Taking the interaction device being a vehicle as an example, the specified mode may be the full-screen mode of a multimedia application in the in-vehicle system, or the viewing mode of some online video. Typically, the system needs several seconds to load an application's full-screen mode or an online video, and during this time the user may grow anxious from waiting. Taking the interaction device being a VR/AR device as an example, the specified mode may also be the entry mode of a virtual scene, for example a game scene. Because of the large amount of data and memory such a scene occupies, it may likewise need several seconds to load.
In view of this, when the user chooses to enter the specified mode, the display interface is controlled to switch from the frame at the first viewing angle to the frame at the second viewing angle, showing the user a dynamically changing picture, and once the switch is complete, the target object on the display interface advances in the specified direction toward the target element representing the specified mode. This adds an immersive entry animation to the waiting period, effectively relieving the user's waiting anxiety and improving the user's service experience.
For example, for step 110, the frame at the first viewing angle may be a front, rear, or side view of the interaction device, and likewise for the frame at the second viewing angle. In one embodiment, to give the user a camera-angle change and immerse them more deeply in the scene change, when the frame at the first viewing angle is a front or rear view of the interaction device, the frame at the second viewing angle may be a side view.
It should be understood that when the interaction device is one that can be fitted with a display screen, such as a vehicle, treadmill, spinning bike, motorcycle, or electric vehicle, the display interface is the viewable interface of the physical display screen. When the interaction device is a VR/AR device, the display interface is the interface the user sees after putting the device on.
For example, for step 120, a virtual building (e.g., a castle) may serve as the target element representing the specified mode. When the interaction device is a vehicle such as a car, motorcycle, or electric vehicle, the target element may be rendered from a landmark building at a destination the vehicle is due to reach within a preset future time period. When the interaction device is a VR/AR device, treadmill, or spinning bike, the target element may be a gate: as the target object keeps approaching the gate, the gate may open and the user enters the specified mode.
FIG. 2 is a schematic diagram of the frame shown when the display interface switches to the second viewing angle in an interaction method according to an exemplary embodiment of the present application. In FIG. 2, the frame at the second viewing angle may further include a landmark building at a destination the vehicle is due to reach within a preset future time period. The building may be rendered as shown in FIG. 2 according to its actual shape, and it should be understood that the rendered building's appearance and shape may correspond to the real ones.
Taking the interaction device being a vehicle as an example, before it enters the specified mode, the target object in the frame at the second viewing angle also changes in real time with the vehicle's position as the vehicle keeps driving. Compared with a fixed animation or a blank wait, this way of waiting to enter the specified mode can increase the user's interest in watching the animation, reduce waiting anxiety, and thereby improve the driving experience.
The frame at the first viewing angle also includes the target object;
the target object is rendered from the interaction device and/or the object using it.
The target object may be the interaction device itself, or an object interacting with the device. When the interaction device is a vehicle, the target object may be the device itself. When the interaction device is a treadmill, the target object may be the person running on it. When the interaction device is a VR/AR device, the target object may be the person wearing it. When the interaction device is a spinning bike, the target object may be the combination of the bike and its rider. When the interaction device is a motorcycle, the target object may be the combination of the motorcycle and its rider.
Further, optionally, when the target object enters the target element, or approaches it arbitrarily closely, the display interface may be controlled to show the interface corresponding to the specified mode, that is, to enter the specified mode. Specifically, the method provided by the embodiment of the present application further includes:
when the distance between the target object and the target element is less than or equal to a preset distance, controlling the display interface to show the interface corresponding to the specified mode.
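The preset-distance check above can be sketched as a simple predicate. The function name, the tuple representation of positions, and the threshold value are illustrative assumptions, not taken from the patent.

```python
def should_enter_specified_mode(obj_pos: tuple[float, float],
                                elem_pos: tuple[float, float],
                                preset_distance: float) -> bool:
    """Enter the specified mode once the target object is within
    `preset_distance` of the target element (Euclidean distance)."""
    dx = elem_pos[0] - obj_pos[0]
    dy = elem_pos[1] - obj_pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= preset_distance
```

The display controller would poll this predicate each frame and, on the first `True`, swap in the specified mode's interface.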
Further, optionally, the specified trigger event may be triggered based on the state of the interaction device, or by the user clicking the enter button of the specified mode. Specifically, controlling the display interface to switch from the frame at the first viewing angle to the frame at the second viewing angle in response to a received specified trigger event includes:
triggering the specified trigger event when the interaction device is detected to be in a specified state; or triggering the specified trigger event when a specified operation on the enter button of the specified mode displayed in the display interface is detected;
and, in response to the specified trigger event, controlling the display interface to switch from the frame at the first viewing angle to the frame at the second viewing angle.
Taking the specified mode being full-screen mode as an example, the specified state is the state in which the interaction device is loading full-screen mode; the specified trigger event may then be triggered upon detecting that the device is loading full-screen mode. Alternatively, the event may be triggered upon detecting an operation, such as a click, on the enter button of the specified mode displayed on the display interface.
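The two trigger paths just described converge on the same view switch, which can be sketched as an event handler. The enum members and the returned action string are illustrative names, not identifiers from the patent.

```python
from enum import Enum, auto


class Trigger(Enum):
    """The two ways the specified trigger event can arise, per the text."""
    DEVICE_IN_SPECIFIED_STATE = auto()  # e.g. device detected loading full-screen mode
    ENTER_BUTTON_OPERATED = auto()      # user performs the specified operation on the button


def on_event(event: object) -> str:
    """Both trigger paths lead to the same action: switch the display
    interface from the first to the second viewing angle."""
    if isinstance(event, Trigger):
        return "switch_to_second_viewing_angle"
    return "ignore"
```

Routing both sources through one handler keeps the entry animation identical regardless of how the mode was triggered.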
FIG. 3 is a schematic diagram of entering a specified mode in an interaction method according to an exemplary embodiment of the present application. In FIG. 3, while the vehicle has not entered the specified mode, a model image of the vehicle and imagery of its surroundings are displayed in real time; when the user clicks the enter button of the specified mode shown in the figure, entry into the specified mode is triggered.
Further, optionally, controlling the display interface to switch from the frame at the first viewing angle to the frame at the second viewing angle includes:
controlling the display interface to switch from a camera-movement sequence at the first viewing angle to a map frame with a three-dimensional rendering effect at the second viewing angle;
where the camera-movement sequence includes a three-dimensional map frame composed of the buildings and roads in the real environment.
Taking the interaction device being a vehicle as an example, the map frame with a three-dimensional rendering effect at the second viewing angle may specifically be obtained by rendering the vehicle's model data and imagery of its surroundings in a 3D game style. FIG. 4 is a schematic view of the rendering effect of a vehicle model in an interaction method according to an exemplary embodiment of the present application. The left side of FIG. 4 shows a schematic model of the vehicle, and the other side shows that model rendered from the vehicle's model data in a 3D game style. Plainly, after rendering, the vehicle model becomes a vehicle model in a game scene, which can spark the user's interest in watching.
Further, optionally, to create an immersive experience for the user entering the specified mode, the method provided by the embodiment of the present application further includes:
before the interface corresponding to the specified mode is displayed, playing the entry sound effect of the specified mode; or, within a preset time period after the interface corresponding to the specified mode is displayed, playing the background sound effect of the specified mode.
It should be appreciated that, to enhance the sense of ceremony in entering the specified mode, the entry sound effect may begin playing before the specified mode is entered, that is, while the display interface is being switched from the frame at the first viewing angle to the frame at the second viewing angle and the target object is advancing in the specified direction toward the target element representing the specified mode. Alternatively, the background sound effect may be played within a preset time period after the specified mode is entered.
The entry sound effect of the specified mode may be a customized 7.1.2 panoramic-sound effect; it carries a strong sense of technology and, combined with the 3D-game-style animation the user is watching, provides a better audiovisual feast.
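The two audio timings above can be sketched as a small selector. The function and effect names, and the use of plain timestamps, are illustrative assumptions for demonstration only.

```python
from typing import Optional


def select_sound(now: float, mode_shown_at: Optional[float],
                 preset_window: float) -> str:
    """Pick the audio cue for the current moment: the entry effect plays
    before the specified mode's interface is shown; the background effect
    plays within a preset window after it is shown."""
    if mode_shown_at is None:
        return "entry_effect"       # still in the approach animation
    if now - mode_shown_at <= preset_window:
        return "background_effect"  # within the preset period after entry
    return "no_scheduled_effect"
```

A playback loop would call this with the current clock time and the timestamp at which the specified mode's interface appeared (or `None` beforehand).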
Further, optionally, to further enhance the experience of entering the specified mode, the method provided by the embodiment of the present application further includes:
lighting an atmosphere lamp of the interaction device before the display interface is controlled to show the interface corresponding to the specified mode, or within a preset time period after it is shown.
It should be appreciated that, to enhance the sense of ceremony in entering the specified mode, the atmosphere lamp may begin to light up before the specified mode is entered, that is, while the display interface is being switched from the frame at the first viewing angle to the frame at the second viewing angle and the target object is advancing in the specified direction toward the target element representing the specified mode. Alternatively, the atmosphere lamp may be lit for a preset period after the specified mode is entered.
Further, optionally, to enhance the interest and sense of ceremony in entering the specified mode, the atmosphere lamps arranged in the vehicle may be lit gradually in a preset order. Specifically, the interaction device has a plurality of atmosphere lamps, and lighting the atmosphere lamps of the interaction device includes:
lighting the plurality of atmosphere lamps of the interaction device in a preset lighting order.
FIG. 5 is a schematic view of the lighting sequence of a vehicle's atmosphere lamps in an interaction method according to an exemplary embodiment of the present application. In FIG. 5, atmosphere lamps may be arranged in the left, right, and top areas surrounding the vehicle seats. To heighten the interest and sense of ceremony of the animation shown to the user, the lamps in the middle area of the seats may be lit first, and then the lamps in the left and right areas surrounding the middle may be lit gradually until all the lamps around the seats are on. After the vehicle enters the specified mode, the atmosphere lamps may remain on or be turned off.
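The center-outward lighting order just described can be sketched as follows, treating the lamps as a row indexed from one end. The function name and the index-based model are illustrative assumptions; the patent does not specify a data layout.

```python
def lighting_order(n_lamps: int, center: int) -> list[int]:
    """Light the lamp nearest the seats' middle first, then alternate
    outward left and right until every lamp index appears once."""
    order, offset = [center], 1
    while len(order) < n_lamps:
        for idx in (center - offset, center + offset):
            if 0 <= idx < n_lamps:
                order.append(idx)
        offset += 1
    return order
```

A lamp controller could walk this list with a fixed delay between entries to produce the gradual, symmetric spread from the middle of the cabin.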
According to the interaction method provided by the one or more embodiments above, when a user enters a specified mode, the user sees the display interface change dynamically from a frame at a first viewing angle to a frame at a second viewing angle, and once the switch is complete, the target object on the display interface advances in the specified direction toward the target element representing the specified mode. This adds an immersive entry animation to the waiting period, effectively relieving the user's waiting anxiety when entering the specified mode and effectively improving the user's service experience.
FIG. 6 is a schematic structural diagram of an interaction device 600 according to an embodiment of the present application. The interaction device of FIG. 6 may include a control module 610 and a display module 620, where:
the control module 610 is configured to, in response to a received specified trigger event, send a control instruction to the display module, the control instruction being used to control a display interface of the display module to switch from a frame at a first viewing angle to a frame at a second viewing angle;
the display module 620 is configured to, in response to the control instruction, control its display interface to switch from the frame at the first viewing angle to the frame at the second viewing angle;
when the display interface has switched to the frame at the second viewing angle, a target object on the display interface advances in a specified direction toward a target element representing a specified mode.
Further optionally, the control module 610 is further configured to:
when the distance between the target object and the target element is less than or equal to a preset distance, control the display interface to show the interface corresponding to the specified mode.
Further optionally, the control module 610 is further configured to:
trigger the specified trigger event when the interaction device is detected to be in a specified state; or trigger the specified trigger event when a specified operation on the enter button of the specified mode displayed in the display interface is detected;
and, in response to the specified trigger event, control the display interface to switch from the frame at the first viewing angle to the frame at the second viewing angle.
Further optionally, the control module 610 is further configured to:
control the display interface to switch from a camera-movement sequence at the first viewing angle to a map frame with a three-dimensional rendering effect at the second viewing angle;
where the camera-movement sequence includes a three-dimensional map frame composed of the buildings and roads in the real environment.
Further, optionally, the frame at the first viewing angle includes the target object;
the target object is rendered from the interaction device or the object using it.
Further optionally, the apparatus further comprises an audio module for:
before the interface corresponding to the specified mode is displayed, play the entry sound effect of the specified mode; or, within a preset time period after the interface corresponding to the specified mode is displayed, play the background sound effect of the specified mode.
Further optionally, the apparatus further comprises a light emitting module for:
light an atmosphere lamp of the interaction device before the display interface is controlled to show the interface corresponding to the specified mode, or within a preset time period after it is shown.
Further optionally, the light-emitting module is further configured to:
light the plurality of atmosphere lamps of the interaction device in a preset lighting order.
According to the interaction device provided by the one or more embodiments above, when a user enters a specified mode, the user sees the display interface change dynamically from a frame at a first viewing angle to a frame at a second viewing angle, and once the switch is complete, the target object on the display interface advances in the specified direction toward the target element representing the specified mode. This adds an immersive entry animation to the waiting period, effectively relieving the user's waiting anxiety when entering the specified mode and effectively improving the user's service experience.
Fig. 7 is a schematic structural view of a vehicle according to an embodiment of the present application. The vehicle of fig. 7 may include the interactive apparatus 600 shown in fig. 6.
The specific implementation of the vehicle shown in fig. 7 has been described in detail in connection with embodiments of the interactive apparatus and method, and will not be described in detail herein.
It should be noted that some of the above embodiments and the flows described in the drawings include a plurality of operations that appear in a specific order, but it should be clearly understood that these operations may be performed out of the order in which they appear herein, or in parallel. Sequence numbers of operations, such as 110 and 120, merely distinguish the operations from one another; the numbers themselves do not represent any order of execution. In addition, the flows may include more or fewer operations, and these operations may be performed sequentially or in parallel. The descriptions "first" and "second" herein distinguish different messages, devices, modules, and the like; they do not represent a sequence, and do not require that the "first" and the "second" be of different types.
Fig. 8 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application. Referring to fig. 8, the electronic device includes: a memory 81 and a processor 82.
Memory 81 is used to store computer programs and may be configured to store various other data to support operations on the computing platform. Examples of such data include instructions for any application or method operating on a computing platform, contact data, phonebook data, messages, pictures, videos, and the like.
The memory 81 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
A processor 82 is coupled to the memory 81 and executes the computer program in the memory 81 to: in response to a received specified trigger event, control the display interface to switch from a picture at a first viewing angle to a picture at a second viewing angle; when the display interface switches to the picture at the second viewing angle, the target object on the display interface advances in a specified direction to approach a target element representing a specified mode.
Further optionally, the processor 82 is specifically configured to:
when the distance between the target object and the target element is less than or equal to a preset distance, controlling the display interface to display the display interface corresponding to the specified mode.
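The advance-toward-target behavior and the distance threshold can be sketched roughly as follows. This is a minimal, non-limiting model: the 2-D position representation, the `PRESET_DISTANCE` value, the per-frame speed, and all function names are hypothetical, not taken from the embodiments:

```python
import math

PRESET_DISTANCE = 0.5  # hypothetical threshold, in scene units


def distance(a, b):
    """Euclidean distance between two 2-D scene positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])


def step_toward(obj_pos, target_pos, speed):
    """Advance the target object one frame in the specified direction
    (toward the target element)."""
    d = distance(obj_pos, target_pos)
    if d <= speed:
        return target_pos
    fx = (target_pos[0] - obj_pos[0]) / d
    fy = (target_pos[1] - obj_pos[1]) / d
    return (obj_pos[0] + fx * speed, obj_pos[1] + fy * speed)


def run_entry_animation(obj_pos, target_pos, speed=0.2):
    """Animate until the object is within the preset distance of the target
    element, then report that the specified-mode interface is displayed."""
    frames = 0
    while distance(obj_pos, target_pos) > PRESET_DISTANCE:
        obj_pos = step_toward(obj_pos, target_pos, speed)
        frames += 1
    return frames, "specified-mode interface displayed"
```

In this sketch the animation loop ends exactly when the distance condition of the embodiment is met, which is when the display interface corresponding to the specified mode would be shown.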
Further optionally, the processor 82 is configured to:
triggering the specified trigger event when the interactive device is detected to be in a specified state, or when a specified operation on a specified-mode entry button displayed in the display interface is detected;
and in response to the specified trigger event, controlling the display interface to switch from the picture at the first viewing angle to the picture at the second viewing angle.
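The two trigger conditions (device state, or an operation on the entry button) can be sketched as a simple predicate. The state name `"parked"`, the operation name `"tap"`, and the return labels are hypothetical placeholders for whatever the specified state and specified operation are in a concrete product:

```python
from dataclasses import dataclass
from typing import Optional

SPECIFIED_STATE = "parked"      # hypothetical device state that triggers the mode
SPECIFIED_OPERATION = "tap"     # hypothetical operation on the mode-entry button


@dataclass
class DeviceStatus:
    state: str
    button_operation: Optional[str] = None


def specified_trigger_event(status: DeviceStatus) -> bool:
    """Trigger when the device is in the specified state, or when the
    specified operation on the mode-entry button is detected."""
    if status.state == SPECIFIED_STATE:
        return True
    return status.button_operation == SPECIFIED_OPERATION


def on_status_update(status: DeviceStatus) -> str:
    """Switch the display from the first to the second viewing angle on trigger."""
    if specified_trigger_event(status):
        return "second-viewing-angle picture"
    return "first-viewing-angle picture"
```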
Further optionally, the processor 82 is specifically configured to:
controlling the display interface to switch from a mirror operation process picture at the first viewing angle to a map picture with a three-dimensional rendering effect at the second viewing angle;
wherein the mirror operation process picture comprises a three-dimensional map picture composed of buildings and roads in a real environment.
Further optionally, the target object is rendered based on the interaction device or a usage object of the interaction device.
Further optionally, the processor 82 is specifically configured to:
lighting an atmosphere lamp of the interactive device before the display interface corresponding to the specified mode is displayed, or within a preset time period after the display interface corresponding to the specified mode is displayed.
Further optionally, the processor 82 is specifically configured to:
lighting a plurality of atmosphere lamps of the interactive device according to a preset atmosphere lamp lighting sequence.
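The preset lamp-lighting sequence can be sketched as a timed loop over the lamps. The lamp identifiers, the order, and the inter-lamp delay are hypothetical; the list append stands in for whatever hardware call actually switches a lamp on:

```python
import time

# Hypothetical preset order, e.g. sweeping around the cabin
LAMP_SEQUENCE = ["left_front", "left_rear", "right_rear", "right_front"]
STEP_DELAY_S = 0.01  # interval between lamps; a real device would likely use a longer delay


def light_lamps_in_sequence(sequence=LAMP_SEQUENCE, delay=STEP_DELAY_S):
    """Light each atmosphere lamp in the preset order, returning the lit order."""
    lit = []
    for lamp in sequence:
        lit.append(lamp)   # stand-in for the hardware call that turns this lamp on
        time.sleep(delay)  # pacing between successive lamps
    return lit
```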
Further, as shown in fig. 8, the electronic device further includes: a communication component 83, a display 84, a power supply component 85, an audio component 86, and other components. Only some components are schematically shown in fig. 8, which does not mean that the electronic device includes only the components shown in fig. 8. In addition, the components within the dashed box in fig. 8 are optional rather than mandatory, depending on the product form of the electronic device. The display may include a screen, which may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or swipe action but also the duration and pressure associated with the touch or swipe operation. The display may also be an electroluminescent (EL) element, a micro-display of similar structure, a laser-scanning display that projects directly onto the retina, or a similar device, or an augmented reality (AR), virtual reality (VR), or mixed reality (MR) display device.
The electronic device in this embodiment may be implemented as a terminal device such as a desktop computer, a notebook computer, a smartphone, or an IoT device, or as a server device such as a conventional server, a cloud server, or a server array. If the electronic device of this embodiment is implemented as a terminal device such as a desktop computer, a notebook computer, or a smartphone, it may include the components within the dashed box in fig. 8; if it is implemented as a server device such as a conventional server, a cloud server, or a server array, the components within the dashed box in fig. 8 may be omitted.
Accordingly, embodiments of the present application also provide a computer storage medium including a computer program/instructions which, when executed, can implement each step of the above method embodiments that is executable by an electronic device.
The computer program/instructions may be stored in a computer-readable storage medium or in the cloud.
The communication component is configured to facilitate wired or wireless communication between the device in which the communication component is located and other devices. The device in which the communication component is located can access a wireless network based on a communication standard, such as WiFi, or a 2G, 3G, 4G/LTE, or 5G mobile communication network, or a combination thereof. In one exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component further includes a near field communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The display includes a screen, which may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or sliding action, but also the duration and pressure associated with the touch or sliding operation.
The power supply component provides power for various components of equipment where the power supply component is located. The power components may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the devices in which the power components are located.
The audio component described above may be configured to output and/or input an audio signal. For example, the audio component includes a Microphone (MIC) configured to receive external audio signals when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a speech recognition mode. The received audio signal may be further stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. which are within the spirit and principles of the present application are intended to be included within the scope of the claims of the present application.

Claims (10)

1. An interaction method, wherein the method is applied to an interaction device, the method comprising:
in response to a received specified trigger event, controlling a display interface to switch from a picture at a first viewing angle to a picture at a second viewing angle, wherein the specified trigger event is used to trigger entry into a specified mode, the specified mode being a full-screen mode of a multimedia application or a video-watching mode of an online video; the picture at the first viewing angle comprises a background picture rendered based on the environment in which the interactive device is located and a target object rendered based on the interactive device or a usage object of the interactive device; the picture at the first viewing angle is a picture from a front or rear viewing angle of the interactive device, and the picture at the second viewing angle is a picture from a side viewing angle of the interactive device;
when the display interface switches to the picture at the second viewing angle, the target object on the display interface advances in a specified direction to approach a target element representing the specified mode; the picture at the second viewing angle comprises the target object and the target element;
and when the distance between the target object and the target element is less than or equal to a preset distance, controlling the display interface to display a display interface corresponding to the specified mode, and entering the specified mode.
2. The method of claim 1, wherein controlling the display interface to switch from a view at a first perspective to a view at a second perspective in response to receiving a specified trigger event comprises:
triggering the specified trigger event when the interactive device is detected to be in a specified state, or when a specified operation on a specified-mode entry button displayed in the display interface is detected;
and in response to the specified trigger event, controlling the display interface to switch from the picture at the first viewing angle to the picture at the second viewing angle.
3. The method of any of claims 1 to 2, wherein controlling the display interface to switch from a view at a first viewing angle to a view at a second viewing angle comprises:
controlling the display interface to switch from a mirror operation process picture at the first viewing angle to a map picture with a three-dimensional rendering effect at the second viewing angle;
wherein the mirror operation process picture comprises a three-dimensional map picture composed of buildings and roads in a real environment.
4. The method of claim 1, wherein the method further comprises:
before the display interface corresponding to the specified mode is displayed, playing an entry sound effect of the specified mode; or, within a preset time period after the display interface corresponding to the specified mode is displayed, playing a background sound effect of the specified mode.
5. The method of claim 1 or 4, wherein the method further comprises:
lighting an atmosphere lamp of the interactive device before the display interface corresponding to the specified mode is displayed, or within a preset time period after the display interface corresponding to the specified mode is displayed.
6. The method of claim 5, wherein the interactive device has a plurality of atmosphere lamps, and wherein lighting the atmosphere lamp of the interactive device comprises:
lighting the plurality of atmosphere lamps of the interactive device according to a preset atmosphere lamp lighting sequence.
7. An interactive device, the device comprising a control module and a display module, wherein:
the control module is configured to send a control instruction to the display module in response to a received specified trigger event, wherein the control instruction is used to control a display interface of the display module to switch from a picture at a first viewing angle to a picture at a second viewing angle, and the specified trigger event is used to trigger entry into a specified mode, the specified mode being a full-screen mode of a multimedia application or a video-watching mode of an online video; the picture at the first viewing angle comprises a background picture rendered based on the environment in which the interactive device is located and a target object rendered based on the interactive device or a usage object of the interactive device; the picture at the first viewing angle is a picture from a front or rear viewing angle of the interactive device, and the picture at the second viewing angle is a picture from a side viewing angle of the interactive device;
the display module is configured to control, in response to the control instruction, its display interface to switch from the picture at the first viewing angle to the picture at the second viewing angle;
when the distance between the target object and the target element is less than or equal to a preset distance, the display interface is controlled to display a display interface corresponding to the specified mode, and the specified mode is entered;
when the display interface switches to the picture at the second viewing angle, the target object on the display interface advances in a specified direction to approach the target element representing the specified mode; the picture at the second viewing angle comprises the target object and the target element.
8. A vehicle comprising the interactive apparatus of claim 7.
9. A computer storage medium comprising a computer program/instructions which, when executed by a processor, implement the steps in the method of any one of claims 1-6.
10. A computer device, comprising: a memory and a processor; wherein the memory is configured to store a computer program, and the processor is coupled to the memory and configured to execute the computer program to perform the steps in the method of any one of claims 1-6.
CN202211412155.7A 2022-11-11 2022-11-11 Interaction method, interaction equipment and vehicle Active CN115494996B (en)


Publications (2)

Publication Number Publication Date
CN115494996A CN115494996A (en) 2022-12-20
CN115494996B true CN115494996B (en) 2023-07-18

Family

ID=85115621





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant