CN115494996A - Interaction method, interaction equipment and vehicle - Google Patents

Interaction method, interaction equipment and vehicle

Info

Publication number
CN115494996A
Authority
CN
China
Prior art keywords
display interface
picture
specified
visual angle
mode
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211412155.7A
Other languages
Chinese (zh)
Other versions
CN115494996B (en)
Inventor
张青禾
曹昀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jidu Technology Co Ltd
Original Assignee
Beijing Jidu Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jidu Technology Co Ltd filed Critical Beijing Jidu Technology Co Ltd
Priority to CN202211412155.7A priority Critical patent/CN115494996B/en
Publication of CN115494996A publication Critical patent/CN115494996A/en
Application granted granted Critical
Publication of CN115494996B publication Critical patent/CN115494996B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05: Geographic models

Abstract

The embodiment of the application provides an interaction method, an interaction device and a vehicle, relates to the technical field of automobiles, and is used to relieve the waiting anxiety a user experiences when entering a full-screen mode and to improve the user's service experience. In an embodiment of the present application, the method includes: in response to a received specified trigger event, controlling a display interface to switch from a view at a first viewing angle to a view at a second viewing angle; after the display interface switches to the view at the second viewing angle, a target object on the display interface advances in a specified direction toward a target element representing the specified mode.

Description

Interaction method, interaction equipment and vehicle
Technical Field
The application relates to the technical field of automobiles, and in particular to an interaction method, an interaction device and a vehicle.
Background
With the rapid development of intelligent vehicle-mounted systems, display screens of vehicle-mounted terminals have grown ever larger in order to give users a better visual experience, and many applications in the vehicle-mounted terminal intended for user entertainment, such as video, karaoke and game applications, generally support a full-screen mode. Currently, when a user opens an application in full-screen mode, the user usually has to wait several seconds or even several minutes until the application finishes loading the full-screen mode.
Disclosure of Invention
Aspects of the application provide an interaction method, an interaction device and a vehicle, which are used to relieve the anxiety a user feels while waiting to enter a full-screen mode and to provide the user with a better service experience.
In a first aspect, an embodiment of the present application provides an interaction method, where the method is applied to an interaction device, and the method includes:
in response to a received specified trigger event, controlling a display interface to switch from a view at a first viewing angle to a view at a second viewing angle;
after the display interface switches to the view at the second viewing angle, a target object on the display interface advances in a specified direction toward a target element representing the specified mode.
In a second aspect, an embodiment of the present application further provides an interactive device, where the device includes a control module and a display module, where:
the control module is configured to send a control instruction to the display module in response to a received specified trigger event, where the control instruction is used to control a display interface of the display module to switch from a view at a first viewing angle to a view at a second viewing angle;
the display module is configured to, in response to the control instruction, control the display interface of the display module to switch from the view at the first viewing angle to the view at the second viewing angle;
after the display interface switches to the view at the second viewing angle, a target object on the display interface advances in a specified direction toward a target element representing the specified mode.
In a third aspect, embodiments of the present application further provide a vehicle, where the vehicle includes the interaction device described in the second aspect.
In a fourth aspect, embodiments of the present application further provide a computer program/instructions which, when executed, implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application further provides a computer device, including a memory and a processor, where the memory is configured to store a computer program, and the processor, coupled to the memory, is configured to execute the computer program to perform the steps of the method according to the first aspect.
As can be seen from the technical solutions provided in the embodiments of the present application, the solutions in the embodiments of the present application at least have the following technical effects:
according to one or more embodiments provided by the application, when a user enters a designated mode, a dynamic change picture of a display interface switched from a picture under a first visual angle to a picture under a second visual angle can be seen, and when the display interface is switched to the picture under the second visual angle, a target object on the display interface moves towards a designated direction to be close to a target element for representing the designated mode, so that immersive picture entry experience is added to a waiting process of the user for entering the designated mode, waiting anxiety of the user for entering the designated mode is effectively relieved, and service experience of the user is effectively improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic implementation flow diagram of an interaction method according to an exemplary embodiment of the present application;
fig. 2 is a schematic diagram of the view after the display interface has switched to the second viewing angle in an interaction method according to an exemplary embodiment of the present application;
fig. 3 is a schematic diagram of an enter button of a specified mode, displayed in a display interface, being clicked in an interaction method according to an exemplary embodiment of the present application;
FIG. 4 is a schematic diagram illustrating rendering effects of a vehicle model in an interaction method according to an exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of the lighting order of the atmosphere lamps of a vehicle in an interaction method according to an exemplary embodiment of the present application;
fig. 6 is a schematic structural diagram of an interaction device according to an exemplary embodiment of the present application;
FIG. 7 is a schematic illustration of a vehicle according to an exemplary embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
As described in the background, when a user triggers the full-screen mode of a multimedia application of an in-vehicle system, the user typically faces a blank interval of several seconds or even several minutes, or watches one or more fixed videos, while waiting for the application to finish loading the full-screen mode. Obviously, the user is prone to waiting anxiety during this process.
In view of the above, the embodiments of the present application provide a solution whose basic idea is as follows: when a user enters a specified mode, the user sees a dynamically changing view as the display interface switches from a view at a first viewing angle to a view at a second viewing angle; once the display interface has switched to the view at the second viewing angle, a target object on the display interface advances in a specified direction toward a target element representing the specified mode. This adds an immersive entry sequence to the user's wait when entering the specified mode, effectively relieving the user's waiting anxiety and improving the user's service experience.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic implementation flow diagram of an interaction method provided by an embodiment of the present application. The method is applicable to any interaction device that supports an interactive experience, such as a vehicle (e.g., a car, a motorcycle, an electric vehicle), an intelligent fitness device (e.g., a treadmill, a spinning bike), a wearable device, or a VR/AR/MR device. The method of fig. 1 may include:
Step 110: in response to a received specified trigger event, control the display interface to switch from a view at a first viewing angle to a view at a second viewing angle.
The view shown on the display interface may include a background rendered based on the environment in which the interaction device is located, and may also include a target object rendered based on the appearance of the interaction device and/or of a user of the interaction device.
Step 120: after the display interface switches to the view at the second viewing angle, the target object on the display interface advances in a specified direction toward a target element representing the specified mode.
Taking the interaction device being a vehicle as an example, the specified mode may be the full-screen mode of a multimedia application in the in-vehicle system, or the viewing mode of an online video. Typically, the system requires several seconds of loading time when loading the full-screen mode of a multimedia application or when loading an online video, during which the user may become anxious from waiting. Taking the interaction device being a VR/AR device as an example, the specified mode may also be the entry mode of a virtual scene, for example a game scene. Loading such a scene may likewise take several seconds because the scene data is large and memory-intensive.
For this situation, in the embodiment of the application, when a user chooses to enter the specified mode, the display interface is controlled to switch from the view at the first viewing angle to the view at the second viewing angle, and the user is shown a dynamically changing view that gradually transitions from the first viewing angle to the second viewing angle.
For example, for step 110, the view at the first viewing angle may be a front, rear or side view of the interaction device, and likewise for the view at the second viewing angle. As an embodiment, to give the user a change of camera angle so that the scene change feels more immersive, when the view at the first viewing angle is a front or rear view of the interaction device, the view at the second viewing angle may be a side view of the interaction device.
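As an illustrative sketch only (the enum and function names are assumptions, not identifiers from the patent), the viewing-angle choice described above could look like this:

```python
from enum import Enum

class ViewAngle(Enum):
    FRONT = "front"
    REAR = "rear"
    SIDE = "side"

def pick_second_view(first: ViewAngle) -> ViewAngle:
    """Choose the second viewing angle so the transition reads as a camera
    move: a front or rear view cuts to a side view, giving the user a
    clear change of camera angle."""
    if first in (ViewAngle.FRONT, ViewAngle.REAR):
        return ViewAngle.SIDE
    # From a side view, a front view also provides a change of camera angle.
    return ViewAngle.FRONT
```

The key design point is simply that the second angle differs from the first, so the transition between them can be animated as a continuous camera move.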
It should be understood that when the interaction device is a vehicle, treadmill, spinning bike, motorcycle, electric vehicle or the like that can be equipped with a display screen, the display interface is the interface presented on the physical display screen. When the interaction device is a VR/AR device, the display interface is the interface the user sees after putting on the VR/AR device.
For example, for step 120, the target element representing the specified mode may be a virtual building (e.g., a castle). When the interaction device is a vehicle such as a car, a motorcycle or an electric vehicle, the target element may be rendered from a landmark building at the destination the vehicle is expected to reach within a preset future time period. When the interaction device is a VR/AR device, a treadmill or a spinning bike, the target element may be a gate; after the target object approaches the gate, the gate opens and the specified mode is entered.
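A minimal sketch of the device-to-target-element mapping just described; the string labels are illustrative assumptions, not values from the patent:

```python
def pick_target_element(device_type: str) -> str:
    """Map the interaction device type to the target element that
    represents the specified mode on the display interface."""
    # Vehicles approach a rendered landmark building at their destination.
    if device_type in ("car", "motorcycle", "electric_vehicle"):
        return "landmark_building"
    # VR/AR devices and fitness equipment approach a gate that opens.
    return "gate"
```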
Fig. 2 is a schematic diagram of the view after the display interface has switched to the second viewing angle in an interaction method according to an exemplary embodiment of the present application. In fig. 2, the view at the second viewing angle may further include a landmark building at the destination the vehicle is expected to reach within a preset future time period, rendered according to the building's actual appearance as shown in fig. 2.
Taking the interaction device being a vehicle as an example, before the specified mode is entered, the target object in the view at the second viewing angle changes in real time with the vehicle's position as the vehicle keeps moving. Compared with waiting through a fixed animation or a blank screen, this entry-waiting mode makes the animation more interesting to watch, reduces the user's waiting anxiety, and thereby improves the user's driving experience.
The view at the first viewing angle also includes the target object;
the target object is rendered based on the interaction device and/or a user of the interaction device.
The target object may be the interaction device itself, or an object interacting with the interaction device. When the interaction device is a vehicle, the target object may be the interaction device itself. When the interaction device is a treadmill, the target object may be the user running on the treadmill. When the interaction device is a VR/AR device, the target object may be the user wearing the VR/AR device. When the interaction device is a spinning bike, the target object may be the combination of the spinning bike and the user on it. When the interaction device is a motorcycle, the target object may be the combination of the motorcycle and the user riding it.
Further, optionally, when the target object enters the target element, or comes arbitrarily close to it, the display interface may be controlled to display the interface corresponding to the specified mode, that is, to enter the specified mode. Specifically, the method provided by the embodiment of the present application further includes:
when the distance between the target object and the target element is smaller than or equal to a preset distance, controlling the display interface to display the interface corresponding to the specified mode.
Further, optionally, the specified trigger event may be triggered based on the state of the interaction device, or by the user operating an enter button of the specified mode. Specifically, controlling the display interface to switch from the view at the first viewing angle to the view at the second viewing angle in response to a received specified trigger event includes:
triggering the specified trigger event when the interaction device is detected to be in a specified state, or when a specified operation on an enter button of the specified mode displayed in the display interface is detected;
in response to the specified trigger event, controlling the display interface to switch from the view at the first viewing angle to the view at the second viewing angle.
Taking the specified mode being the full-screen mode as an example, the specified state is the state in which the interaction device is loading the full-screen mode, so the specified trigger event may be triggered upon detecting that the interaction device is loading the full-screen mode. Alternatively, the specified trigger event may be triggered when an operation such as a click on the enter button of the specified mode displayed on the display interface is detected.
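The two trigger sources above can be sketched as a single predicate (the state label and flag name are illustrative assumptions):

```python
def specified_trigger_fired(device_state: str,
                            enter_button_operated: bool) -> bool:
    """A specified trigger event fires either when the device is detected
    to be in the specified state (here: loading full-screen mode) or when
    the user operates the enter button of the specified mode."""
    return device_state == "loading_full_screen" or enter_button_operated
```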
Fig. 3 is a schematic diagram of entering a specified mode in an interaction method according to an exemplary embodiment of the present application. In fig. 3, while the vehicle has not yet entered the specified mode, a model image of the vehicle and an image of its surroundings are displayed in real time; when the user clicks the enter button of the specified mode shown in the figure, entry into the specified mode is triggered.
Further, optionally, controlling the display interface to switch from the view at the first viewing angle to the view at the second viewing angle includes:
controlling the display interface to switch from a camera-movement sequence at the first viewing angle to a map view with a three-dimensional rendering effect at the second viewing angle;
where the camera-movement sequence includes a three-dimensional map view composed of buildings and roads in the real environment.
Taking the interaction device being a vehicle as an example, the map view with the three-dimensional rendering effect at the second viewing angle may be obtained by rendering the vehicle's model data and the image data around the vehicle in a 3D game style. Fig. 4 is a schematic diagram of the rendering effect of a vehicle model in an interaction method according to an exemplary embodiment of the present application. The left side of fig. 4 shows the vehicle's model, and the right side shows the model rendered from the vehicle's model data in the 3D game style. After rendering, the vehicle's model becomes a vehicle model in a game scene, which better arouses the user's interest in watching.
Further, optionally, in order to create an immersive experience of entering the specified mode for the user, the method provided in the embodiment of the present application further includes:
playing the entry sound effect of the specified mode before displaying the interface corresponding to the specified mode; or playing the background sound effect of the specified mode within a preset time period after the interface corresponding to the specified mode is displayed.
It should be understood that, to strengthen the sense of entering the specified mode, the entry sound effect may start playing before the specified mode is entered, that is, while the display interface is being switched from the view at the first viewing angle to the view at the second viewing angle and the target object on the display interface is advancing in the specified direction toward the target element representing the specified mode. Alternatively, the background sound effect of the specified mode is played within a preset time period after the specified mode is entered.
The entry sound effect of the specified mode may be a custom 7.1.2-channel panoramic-sound entry effect; it has a strong sense of technology and, combined with the 3D-game-style animation the user is watching, provides a better audiovisual experience.
Further, optionally, in order to further improve the experience and interest of entering the specified mode, the method provided in the embodiment of the present application further includes:
lighting an atmosphere lamp of the interaction device before the display interface displays the interface corresponding to the specified mode, or within a preset time period afterwards.
It should be understood that, to strengthen the sense of entering the specified mode, the atmosphere lamp may start lighting before the specified mode is entered, that is, while the display interface is being switched from the view at the first viewing angle to the view at the second viewing angle and the target object on the display interface is advancing in the specified direction toward the target element representing the specified mode. Alternatively, the atmosphere lamp is lit within a preset time period after the specified mode is entered.
Further, optionally, in order to enhance the interest and sense of ceremony of entering the specified mode, the atmosphere lamps arranged in the vehicle may be lit gradually in a preset lighting order. Specifically, the interaction device has a plurality of atmosphere lamps, and lighting the atmosphere lamps of the interaction device includes:
lighting the plurality of atmosphere lamps of the interaction device in a preset atmosphere-lamp lighting order.
Fig. 5 is a schematic diagram of the lighting order of the atmosphere lamps of a vehicle in an interaction method according to an exemplary embodiment of the present application. In fig. 5, the atmosphere lamps may be arranged in the left, right and top areas surrounding the vehicle seats. To enhance the interest and sense of ceremony of the animation displayed to the user, the atmosphere lamps in the middle area of the vehicle seats may be lit first, and then the atmosphere lamps to the left and right of the middle area are lit progressively until all the atmosphere lamps surrounding the vehicle seats are on. After the vehicle enters the specified mode, the atmosphere lamps may remain steadily on or be turned off.
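The middle-out lighting order described for fig. 5 can be sketched as follows, assuming the lamps are indexed left to right around the seats (a minimal illustration, not the patent's implementation):

```python
def middle_out_lighting_order(lamp_count: int) -> list[int]:
    """Return lamp indices in the order they should be lit: the lamp(s)
    nearest the middle of the seat area first, then neighbours
    progressively further out to the left and right."""
    mid = (lamp_count - 1) / 2
    # Sort by distance from the middle; ties broken by index (left first).
    return sorted(range(lamp_count), key=lambda i: (abs(i - mid), i))
```

For five lamps this yields the centre lamp, then its immediate neighbours, then the outermost pair, matching the spread-outward effect described above.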
According to the interaction method provided by one or more embodiments, when a user enters a specified mode, the user sees a dynamically changing view as the display interface switches from a view at a first viewing angle to a view at a second viewing angle; once the display interface has switched to the view at the second viewing angle, a target object on the display interface advances in a specified direction toward a target element representing the specified mode. An immersive entry sequence is thereby added to the user's wait when entering the specified mode, which effectively relieves the user's waiting anxiety and improves the user's service experience.
Fig. 6 is a schematic structural diagram of an interaction device 600 according to an embodiment of the present application. The interaction device of fig. 6 may include a control module 610 and a display module 620, wherein:
the control module 610 is configured to send a control instruction to the display module in response to a received specified trigger event, where the control instruction is used to control a display interface of the display module to switch from a view at a first viewing angle to a view at a second viewing angle;
the display module 620 is configured to, in response to the control instruction, control the display interface of the display module to switch from the view at the first viewing angle to the view at the second viewing angle;
after the display interface switches to the view at the second viewing angle, a target object on the display interface advances in a specified direction toward a target element representing the specified mode.
Further, optionally, the control module 610 is further configured to:
and when the distance between the target object and the target element is smaller than or equal to a preset distance, controlling the display interface to display the display interface corresponding to the specified mode.
Further, optionally, the control module 610 is further configured to:
trigger the specified trigger event when the interaction device is detected to be in a specified state, or when a specified operation on an enter button of the specified mode displayed in the display interface is detected;
in response to the specified trigger event, control the display interface to switch from the view at the first viewing angle to the view at the second viewing angle.
Further, optionally, the control module 610 is further configured to:
control the display interface to switch from a camera-movement sequence at the first viewing angle to a map view with a three-dimensional rendering effect at the second viewing angle;
where the camera-movement sequence includes a three-dimensional map view composed of buildings and roads in the real environment.
Further, optionally, the view at the first viewing angle includes the target object;
the target object is rendered based on the interaction device and/or a user of the interaction device.
Further, optionally, the apparatus further comprises an audio module configured to:
play the entry sound effect of the specified mode before displaying the interface corresponding to the specified mode; or play the background sound effect of the specified mode within a preset time period after the interface corresponding to the specified mode is displayed.
Further, optionally, the apparatus further comprises a light emitting module for:
and before the display interface is controlled to display the display interface corresponding to the specified mode or within a preset time period after the display interface is controlled to display the display interface corresponding to the specified mode, lighting an atmosphere lamp of the interaction equipment.
Further, optionally, the light emitting module is further configured to:
and lighting the plurality of atmosphere lamps of the interaction equipment according to a preset atmosphere lamp lighting sequence.
With the interaction device provided by one or more embodiments of the application, when a user enters a specified mode, the user sees a dynamically changing view as the display interface switches from a view at a first viewing angle to a view at a second viewing angle; once the display interface has switched to the view at the second viewing angle, a target object on the display interface advances in a specified direction toward a target element representing the specified mode. An immersive entry sequence is thereby added to the user's wait when entering the specified mode, which effectively relieves the user's waiting anxiety and improves the user's service experience.
FIG. 7 is a schematic structural diagram of a vehicle according to an embodiment of the present application. The vehicle of fig. 7 may include the interaction device 600 shown in fig. 6.
The specific implementation of the vehicle shown in fig. 7 has been described in detail in the embodiments of the interaction device and method, and will not be elaborated upon here.
It should be noted that in some of the flows described in the above embodiments and the drawings, a plurality of operations are included in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel, and the sequence numbers of the operations, such as 110, 120, etc., are merely used for distinguishing different operations, and the sequence numbers do not represent any execution order per se. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel. It should be noted that, the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor limit the types of "first" and "second" to be different.
Fig. 8 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application. Referring to fig. 8, the electronic device includes: a memory 81 and a processor 82.
Memory 81 is used to store computer programs and may be configured to store other various data to support operations on the computing platform. Examples of such data include instructions for any application or method operating on the computing platform, contact data, phonebook data, messages, pictures, videos, and so forth.
The memory 81 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A processor 82 is coupled to the memory 81 and configured to execute the computer program in the memory 81 to: in response to a received specified trigger event, control the display interface to switch from a picture under a first visual angle to a picture under a second visual angle; when the display interface is switched to the picture under the second visual angle, a target object on the display interface advances in a specified direction to approach a target element for representing a specified mode.
Further optionally, the processor 82 is specifically configured to:
when the distance between the target object and the target element is less than or equal to a preset distance, control the display interface to display the display interface corresponding to the specified mode.
Further optionally, the processor 82 is configured to:
trigger the specified trigger event when it is detected that the interaction equipment is in a specified state; or trigger the specified trigger event when a specified operation on an entry button of the specified mode displayed in the display interface is detected;
and, in response to the specified trigger event, control the display interface to switch from the picture under the first visual angle to the picture under the second visual angle.
Further optionally, the processor 82 is specifically configured to:
control the display interface to switch from a picture of a camera-movement process under the first visual angle to a map picture with a three-dimensional rendering effect under the second visual angle;
wherein the picture of the camera-movement process comprises a three-dimensional map picture composed of buildings and roads in a real environment.
Further optionally, the target object is rendered based on the interactive device or a usage object of the interactive device.
Further optionally, the processor 82 is specifically configured to:
light an atmosphere lamp of the interaction equipment before the display interface is controlled to display the display interface corresponding to the specified mode, or within a preset time period after the display interface is controlled to display the display interface corresponding to the specified mode.
Further optionally, the processor 82 is specifically configured to:
light the plurality of atmosphere lamps of the interaction equipment according to a preset atmosphere lamp lighting sequence.
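The sequential atmosphere-lamp lighting can likewise be sketched as a small helper. The function name and the callback-style lamp controller below are hypothetical, not from the patent; a real implementation would drive the vehicle's lamp hardware and likely schedule the steps asynchronously rather than sleeping.

```python
# Illustrative sketch: light multiple atmosphere lamps in a preset sequence.
# `lamps` is the preset ordering of lamp identifiers; `light_fn` is a
# hypothetical callback that turns a single lamp on.
import time


def light_lamps_in_sequence(lamps, light_fn, interval_s=0.0):
    """Light each lamp in the given preset order, optionally pausing
    between lamps to produce a visible sweep effect."""
    lit = []
    for lamp_id in lamps:
        light_fn(lamp_id)   # turn this lamp on
        lit.append(lamp_id)
        if interval_s:
            time.sleep(interval_s)
    return lit
```

Calling it with, say, `["front_left", "front_right", "rear"]` lights the lamps one by one in exactly that preset order.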
Further, as shown in fig. 8, the electronic device further includes: a communication component 83, a display 84, a power supply component 85, an audio component 86, and the like. Only some components are schematically shown in fig. 8, which does not mean that the electronic device includes only the components shown in fig. 8. In addition, the components within the dashed-line frame in fig. 8 are optional rather than mandatory, and may be determined according to the product form of the electronic device. The display may include a screen, and the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. The display may also be an Electroluminescent (EL) element, a microdisplay with a similar structure, a laser-scanned display that projects directly onto the retina or the like, or an Augmented Reality (AR), Virtual Reality (VR), or Mixed Reality (MR) display device.
The electronic device of this embodiment may be implemented as a terminal device such as a desktop computer, a notebook computer, a smart phone, or an IOT device, or may be a server device such as a conventional server, a cloud server, or a server array. If the electronic device of this embodiment is implemented as a terminal device such as a desktop computer, a notebook computer, a smart phone, etc., the electronic device may include components within a dashed line frame in fig. 8; if the electronic device of this embodiment is implemented as a server device such as a conventional server, a cloud server, or a server array, the components in the dashed box in fig. 8 may not be included.
Accordingly, the present application further provides a computer program product, which includes computer programs/instructions that, when executed, implement the steps performable by the electronic device in the foregoing method embodiments.
Wherein the computer program/instructions may be stored in a computer readable storage medium or in the cloud.
The communication component is configured to facilitate wired or wireless communication between the device in which the communication component is located and other devices. The device in which the communication component is located can access a wireless network based on a communication standard, such as WiFi, a 2G, 3G, 4G/LTE, or 5G mobile communication network, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The display includes a screen, which may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The power supply assembly provides power for various components of the equipment where the power supply assembly is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
The audio component may be configured to output and/or input an audio signal. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, a computer-readable medium does not include a transitory computer-readable medium, such as a modulated data signal or a carrier wave.
It should also be noted that the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises that element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (12)

1. An interaction method, applied to an interaction device, the method comprising:
in response to a received specified trigger event, controlling a display interface to switch from a picture under a first visual angle to a picture under a second visual angle;
wherein, when the display interface is switched to the picture under the second visual angle, a target object on the display interface advances in a specified direction to approach a target element for representing a specified mode.
2. The method of claim 1, wherein the method further comprises:
when the distance between the target object and the target element is less than or equal to a preset distance, controlling the display interface to display a display interface corresponding to the specified mode.
3. The method of claim 1, wherein, in response to the received specified trigger event, controlling the display interface to switch from the picture under the first visual angle to the picture under the second visual angle comprises:
triggering the specified trigger event when it is detected that the interaction equipment is in a specified state; or triggering the specified trigger event when a specified operation on an entry button of the specified mode displayed in the display interface is detected;
and, in response to the specified trigger event, controlling the display interface to switch from the picture under the first visual angle to the picture under the second visual angle.
4. The method of any one of claims 1 to 3, wherein controlling the display interface to switch from the picture under the first visual angle to the picture under the second visual angle comprises:
controlling the display interface to switch from a picture of a camera-movement process under the first visual angle to a map picture with a three-dimensional rendering effect under the second visual angle;
wherein the picture of the camera-movement process comprises a three-dimensional map picture composed of buildings and roads in a real environment.
5. The method according to any one of claims 1 to 3, wherein the target object is contained in the picture under the first visual angle;
and the target object is rendered based on the interaction device or a use object of the interaction device.
6. The method of claim 2, wherein the method further comprises:
before displaying a display interface corresponding to the specified mode, playing an entry sound effect of the specified mode; or playing the background sound effect of the specified mode within a preset time period after the display interface corresponding to the specified mode is displayed.
7. The method of claim 2 or 6, further comprising:
lighting an atmosphere lamp of the interaction equipment before the display interface is controlled to display the display interface corresponding to the specified mode, or within a preset time period after the display interface is controlled to display the display interface corresponding to the specified mode.
8. The method of claim 7, wherein there are a plurality of ambiance lights of the interactive device, and wherein illuminating the ambiance lights of the interactive device comprises:
and lighting the plurality of atmosphere lamps of the interaction equipment according to a preset atmosphere lamp lighting sequence.
9. An interactive device, characterized in that the device comprises a control module and a display module, wherein:
the control module is used for responding to a received specified trigger event and sending a control instruction to the display module, wherein the control instruction is used for controlling a display interface of the display module to be switched from a picture under a first visual angle to a picture under a second visual angle;
the display module is used for responding to the control instruction and controlling a display interface of the display module to be switched from the picture under the first visual angle to the picture under the second visual angle;
wherein, when the display interface is switched to the picture under the second visual angle, a target object on the display interface advances in a specified direction to approach a target element for representing a specified mode.
10. A vehicle, characterized in that the vehicle comprises an interaction device according to claim 9.
11. A computer program product comprising computer programs/instructions, characterized in that, when the computer programs/instructions are executed by a processor, the steps in the method according to any one of claims 1-8 are implemented.
12. A computer device, comprising: a memory and a processor; wherein the memory is used for storing a computer program; the processor is coupled to the memory for executing the computer program for performing the steps in the method of any of claims 1-8.
CN202211412155.7A 2022-11-11 2022-11-11 Interaction method, interaction equipment and vehicle Active CN115494996B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211412155.7A CN115494996B (en) 2022-11-11 2022-11-11 Interaction method, interaction equipment and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211412155.7A CN115494996B (en) 2022-11-11 2022-11-11 Interaction method, interaction equipment and vehicle

Publications (2)

Publication Number Publication Date
CN115494996A true CN115494996A (en) 2022-12-20
CN115494996B CN115494996B (en) 2023-07-18

Family

ID=85115621

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211412155.7A Active CN115494996B (en) 2022-11-11 2022-11-11 Interaction method, interaction equipment and vehicle

Country Status (1)

Country Link
CN (1) CN115494996B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111589146A (en) * 2020-04-27 2020-08-28 腾讯科技(深圳)有限公司 Prop operation method, device, equipment and storage medium based on virtual environment
CN111591210A (en) * 2020-04-22 2020-08-28 深圳市点嘀互联网络有限公司 System and method for intelligent interaction between embedded equipment and vehicle atmosphere lamp
CN112843716A (en) * 2021-03-17 2021-05-28 网易(杭州)网络有限公司 Virtual object prompting and viewing method and device, computer equipment and storage medium
CN113946259A (en) * 2021-09-18 2022-01-18 北京城市网邻信息技术有限公司 Vehicle information processing method and device, electronic equipment and readable medium
WO2022222597A1 (en) * 2021-04-19 2022-10-27 网易(杭州)网络有限公司 Game process control method and apparatus, electronic device, and storage medium


Also Published As

Publication number Publication date
CN115494996B (en) 2023-07-18

Similar Documents

Publication Publication Date Title
CN111970533B (en) Interaction method and device for live broadcast room and electronic equipment
CN109920065B (en) Information display method, device, equipment and storage medium
TW202007142A (en) Video file generation method, device, and storage medium
CN109191549A (en) Show the method and device of animation
CN111327916B (en) Live broadcast management method, device and equipment based on geographic object and storage medium
CN113923499B (en) Display control method, device, equipment and storage medium
CN113395566B (en) Video playing method and device, electronic equipment and computer readable storage medium
CN112261481B (en) Interactive video creating method, device and equipment and readable storage medium
CN111966275A (en) Program trial method, system, device, equipment and medium
CN110377200B (en) Shared data generation method and device and storage medium
CN112527222A (en) Information processing method and electronic equipment
CN111596830A (en) Message reminding method and device
US20230306694A1 (en) Ranking list information display method and apparatus, and electronic device and storage medium
CN113407291A (en) Content item display method, device, terminal and computer readable storage medium
KR101413794B1 (en) Edutainment contents mobile terminal using augmented reality and method for controlling thereof
CN112527174A (en) Information processing method and electronic equipment
CN114116053A (en) Resource display method and device, computer equipment and medium
CN110968362B (en) Application running method, device and storage medium
CN112783316A (en) Augmented reality-based control method and apparatus, electronic device, and storage medium
CN111382355A (en) Live broadcast management method, device and equipment based on geographic object and storage medium
CN115494996B (en) Interaction method, interaction equipment and vehicle
CN114371898B (en) Information display method, equipment, device and storage medium
WO2019228969A1 (en) Displaying a virtual dynamic light effect
CN116016817A (en) Video editing method, device, electronic equipment and storage medium
US11546414B2 (en) Method and apparatus for controlling devices to present content and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant