CN117742555A - Control interaction method, device, equipment and medium - Google Patents

Control interaction method, device, equipment and medium

Info

Publication number
CN117742555A
CN117742555A (application number CN202211124875.3A)
Authority
CN
China
Prior art keywords
control
virtual space
display state
hand model
interactive interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211124875.3A
Other languages
Chinese (zh)
Inventor
孟凡超
冀利悦
李笑林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202211124875.3A priority Critical patent/CN117742555A/en
Publication of CN117742555A publication Critical patent/CN117742555A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a control interaction method, device, equipment and medium. The method comprises the following steps: in response to a call instruction of a virtual space, displaying an interactive interface in the virtual space, wherein the interactive interface includes at least one control; in response to a pressing operation on any control, controlling a hand model displayed in the virtual space to press the control; and adjusting the display state of the control, wherein the display state at least comprises the spatial position of the control. With the method and device, the interaction effects between controls and the interactive interface can be enriched, and the user experience is improved.

Description

Control interaction method, device, equipment and medium
Technical Field
The embodiment of the application relates to the technical field of man-machine interaction, in particular to a control interaction method, device, equipment and medium.
Background
With the continuous development of electronic devices such as Extended Reality (XR) devices, controls with different functions (such as icons) are arranged on the interactive interface that an electronic device displays to a user, so that the user can interact with the interactive interface through these controls. At present, however, the interaction effect produced when a control is used to interact with the interactive interface is monotonous, resulting in a poor user experience.
Disclosure of Invention
The embodiments of the application provide a control interaction method, device, equipment and medium, which can enrich the interaction effects between a control and an interactive interface and improve the user experience.
In a first aspect, an embodiment of the present application provides a control interaction method, including:
in response to a call instruction of a virtual space, displaying a hand model and an interactive interface in the virtual space, wherein the interactive interface comprises at least one control;
in response to a pressing operation on any one of the controls, controlling a hand model displayed in the virtual space to press the control;
and adjusting the display state of the control, wherein the display state at least comprises the spatial position of the control.
In a second aspect, an embodiment of the present application provides a control interaction device, including:
a display module, configured to display a hand model and an interactive interface in the virtual space in response to a call instruction of the virtual space, wherein the interactive interface comprises at least one control;
a response module, configured to control the hand model displayed in the virtual space to press the control in response to a pressing operation on any one of the controls;
and an adjustment module, configured to adjust the display state of the control, wherein the display state at least comprises the spatial position of the control.
In a third aspect, an embodiment of the present application provides an electronic device, including:
the control interaction device comprises a processor and a memory, wherein the memory is used for storing a computer program, and the processor is used for calling and running the computer program stored in the memory to execute the control interaction method in the embodiment of the first aspect or various implementation manners of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program, where the computer program causes a computer to execute a control interaction method as described in the embodiments of the first aspect or implementations thereof.
In a fifth aspect, embodiments of the present application provide a computer program product comprising program instructions which, when run on an electronic device, cause the electronic device to perform a control interaction method as described in the embodiments of the first aspect or implementations thereof.
The technical scheme disclosed by the embodiment of the application has at least the following beneficial effects:
An interactive interface comprising at least one control is displayed in the virtual space in response to a call instruction of the virtual space; in response to a pressing operation on any control, a hand model displayed in the virtual space is controlled to press the control, and the display state of the control is adjusted. Because the display state of a pressed control is adjusted, the user can intuitively see whether a control on the interactive interface has been pressed successfully, the interaction effects between the control and the interactive interface are enriched, and the user experience is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; a person skilled in the art may obtain other drawings from these drawings without inventive effort.
Fig. 1 is a schematic flow chart of a control interaction method provided in an embodiment of the present application;
FIG. 2a is a schematic diagram of an interactive interface provided in an embodiment of the present application;
FIG. 2b is a schematic diagram of a left hand model provided in an embodiment of the present application;
FIG. 2c is a schematic diagram of a right hand model provided in an embodiment of the present application;
FIG. 2d is a schematic diagram of adjusting a pressed control display state provided by an embodiment of the present application;
FIG. 3 is a flowchart of another control interaction method according to an embodiment of the present disclosure;
FIG. 4 is a flowchart of another control interaction method according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of adjusting a display state of a touched control according to an embodiment of the present application;
FIG. 6 is a schematic block diagram of a control interaction device provided by an embodiment of the present application;
FIG. 7 is a schematic block diagram of an electronic device provided by an embodiment of the present application;
fig. 8 is a schematic block diagram of an electronic device provided in an embodiment of the present application as an HMD.
Detailed Description
The following describes the embodiments of the present application clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. Based on the embodiments of the present application, all other embodiments obtained by a person of ordinary skill in the art without inventive effort fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description, claims, and drawings of the present application are used to distinguish similar objects and are not necessarily used to describe a particular sequence or chronological order. It should be understood that the data so used may be interchanged where appropriate, so that the embodiments of the present application described herein can be implemented in orders other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, product, or device that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, product, or device.
The method and device are suitable for man-machine interaction scenarios. With the continuous development of electronic devices such as Extended Reality (XR) devices, controls with different functions (such as icons) can be arranged on the interactive interface that the electronic device displays to a user, so that the user can interact with the interactive interface through these controls. At present, however, the interaction effect produced when a control is used to interact with the interactive interface is monotonous, resulting in a poor user experience. Therefore, the present application designs a control interaction method that enriches the interaction effects between a control and the interactive interface and improves the user experience.
In order to facilitate understanding of embodiments of the present application, before describing various embodiments of the present application, some concepts related to all embodiments of the present application are first appropriately explained, specifically as follows:
1) Virtual Reality (VR) is a technology for creating and experiencing a virtual world. It computes and generates a virtual environment, which is a multi-source, fused, interactive three-dimensional dynamic view combined with simulation of entity behavior (the virtual reality mentioned herein includes at least visual perception, and may also include auditory perception, tactile perception, motion perception, and even taste and olfactory perception). VR immerses the user in the simulated virtual reality environment and supports applications in various virtual environments such as maps, games, videos, education, medical treatment, simulation, collaborative training, sales, assisted manufacturing, maintenance, and repair.
2) A virtual reality device (VR device) may be provided in the form of glasses, a head mounted display (Head Mount Display, abbreviated as HMD), or a contact lens for realizing visual perception and other forms of perception, but the form of the virtual reality device is not limited thereto, and may be further miniaturized or enlarged according to actual needs.
Optionally, the virtual reality device described in the embodiments of the present application may include, but is not limited to, the following types:
2.1) Computer-side virtual reality (PCVR) devices, which use the PC to perform the computation and data output related to the virtual reality functions; the externally connected PCVR device uses the data output by the PC to realize the virtual reality effect.
2.2) Mobile virtual reality devices, which support mounting a mobile terminal (e.g., a smartphone) in various ways (e.g., a head-mounted display provided with a dedicated card slot). Connected by wire or wirelessly, the mobile terminal performs the calculations related to the virtual reality functions and outputs data to the mobile virtual reality device, for example to view virtual reality video through an app on the mobile terminal.
2.3) Integrated (all-in-one) virtual reality devices, which have a processor for performing the computation related to the virtual reality functions, and therefore have independent virtual reality input and output capabilities; they do not need to be connected to a PC or a mobile terminal and offer a high degree of freedom of use.
3) Augmented Reality (AR): a technique for calculating, in real time during image acquisition by a camera, the camera's pose parameters in the real world (also called the three-dimensional or physical world), and adding virtual elements to the images acquired by the camera according to these pose parameters. Virtual elements include, but are not limited to: images, videos, and three-dimensional models. The goal of AR technology is to superimpose the virtual world on the real world on the screen for interaction.
4) Mixed Reality (MR): a simulated setting that integrates computer-created sensory input (e.g., virtual objects) with sensory input from a physical setting or a representation thereof. In some MR settings, the computer-created sensory input may adapt to changes in the sensory input from the physical setting. In addition, some electronic systems for presenting MR settings may monitor orientation and/or position relative to the physical setting, so that virtual objects can interact with real objects (i.e., physical elements from the physical setting or representations thereof). For example, the system may monitor movement so that a virtual plant appears stationary relative to a physical building.
5) Extended Reality (XR) refers to all real-and-virtual combined environments and human-machine interactions generated by computer technology and wearable devices, and includes multiple forms such as virtual reality (VR), augmented reality (AR), and mixed reality (MR).
6) A virtual scene is a virtual scene that an application program displays (or provides) when running on an electronic device. The virtual scene may be a simulation environment for the real world, a semi-simulation and semi-fictional virtual scene, or a pure fictional virtual scene. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene or a three-dimensional virtual scene, and the dimension of the virtual scene is not limited in the embodiment of the present application. For example, a virtual scene may include sky, land, sea, etc., the land may include environmental elements of a desert, city, etc., and a user may control a virtual object to move in the virtual scene.
7) A virtual object is an object that interacts in a virtual scene, and is controlled by a user or a robot program (e.g., an artificial intelligence-based robot program) to be able to rest, move, and perform various actions in the virtual scene, such as various characters in a game.
Having introduced some concepts related to the embodiments of the present application, a specific description of a control interaction method provided by the embodiments of the present application is provided below with reference to the accompanying drawings.
Fig. 1 is a flow chart of a control interaction method provided in an embodiment of the present application. The control interaction method is applicable to man-machine interaction scenarios and may be executed by a control interaction apparatus. The control interaction apparatus may be implemented in hardware and/or software and may be integrated in an electronic device.
In the embodiments of the application, the electronic device may be any hardware device capable of providing a virtual space to a user. For example, the electronic device may be an XR device or another device. The XR device may be a VR device, an AR device, an MR device, or the like, which is not specifically limited herein. The present application mainly takes an XR device as the electronic device for illustration.
As shown in fig. 1, the method may include the steps of:
s101, responding to a calling instruction of the virtual space, and displaying an interactive interface in the virtual space, wherein the interactive interface comprises at least one control.
The virtual space refers to a virtual-real combination environment provided by the XR device to the user.
The interactive interface is, in particular, a user interface. The user interface is provided with controls with different functions, such as a search control, a next-page control, or an insert control, so that interaction between the user and the user interface can be realized through these controls.
Specifically, the user may send a virtual space call instruction to the XR device by pressing, clicking, or voice. When the call instruction sent by the user is detected, the XR device calls up the virtual space and displays an interactive interface in the virtual space. At least one interaction control is arranged on the interactive interface, so that the user can interact with the interactive interface by triggering any control.
At least one interactive interface is displayed in the virtual space, and each interactive interface comprises at least one control, as exemplarily shown in fig. 2a, where X is the effective pressing distance when a control is pressed. This effective pressing distance can be flexibly set according to actual application requirements and is not specifically limited here.
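As a minimal illustrative sketch (not part of the patent text), a control on such an interactive interface could be represented by a small data structure that records its spatial position, display size, and the effective pressing distance X; all names and concrete values below are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Control:
    """Hypothetical representation of one control on the interactive interface."""
    name: str
    position: tuple[float, float, float]    # spatial position in the virtual space (the "first position")
    size: tuple[float, float]               # display width/height of the control
    effective_press_distance: float = 0.02  # X: press depth that counts as a successful press (assumed metres)
    pressed: bool = False                   # display state: normal or pressed
    highlighted: bool = False               # display state: whether the touch special effect is applied

@dataclass
class InteractiveInterface:
    """Hypothetical interactive interface holding at least one control."""
    controls: list[Control] = field(default_factory=list)

# Example: a control region with four controls, as in the figures.
interface = InteractiveInterface(controls=[
    Control(name=f"control {i}", position=(0.1 * i, 0.0, 1.0), size=(0.08, 0.08))
    for i in range(1, 5)
])
```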
Sending the virtual space call instruction by pressing, clicking, or the like can be implemented in the following ways:
In the first mode, the user controls a cursor or focus, by means of a peripheral device such as a handle or hand controller, to click a preset call area on the XR device display screen.
The preset call area may be any area of the display screen, such as the center area or the upper-left corner area, which is not specifically limited herein.
In the second mode, the user presses a call key on the peripheral device with a real hand.
The call key may be any key or key combination on the peripheral device, for example key 1, or key 1 + key 2, which is not specifically limited herein.
In the third mode, the user presses a call button on the XR device with a real hand.
The call button may be any physical button on the XR device, such as a start button, and the like, which is not particularly limited herein.
The above manner of sending the virtual space call instruction is merely exemplary, and is not a specific limitation of the present application.
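The following sketch shows one plausible way to funnel these three modes into a single handler that calls up the virtual space; the event names and the show_interface callback are illustrative assumptions, not part of the patent.

```python
# Hypothetical input events that can carry the virtual-space call instruction.
CALL_EVENTS = {
    "cursor_click_call_area",  # mode one: cursor/focus clicks the preset call area
    "peripheral_call_key",     # mode two: call key (e.g. key 1, or key 1 + key 2) on the peripheral device
    "device_call_button",      # mode three: physical call button on the XR device
}

def handle_input_event(event: str, show_interface) -> bool:
    """Call up the virtual space and display the interactive interface when a call event arrives."""
    if event in CALL_EVENTS:
        show_interface()  # display the interactive interface (and, optionally, the hand model)
        return True
    return False

# Usage example:
handle_input_event("peripheral_call_key", show_interface=lambda: print("interactive interface displayed"))
```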
In some implementations, a hand model may also be displayed in the called-up virtual space, so that the user can interact with the interactive interface by controlling the hand model to trigger any control on the interactive interface.
The hand model is a virtual hand model constructed according to the real hand of the user.
The hand model is created in advance and arranged in a virtual space.
Illustratively, in an embodiment of the present application, the hand models displayed in the virtual space include a left hand model and a right hand model. Wherein the left hand model is shown in fig. 2 b; a right hand model as shown in fig. 2 c.
S102, in response to a pressing operation on any control, controlling the hand model displayed in the virtual space to press the control.
After the interactive interface and the hand model are displayed in the virtual space, the user can, according to the interaction requirement, use the peripheral device to control the hand model to move to the area where any control is located, and trigger the confirmation button to send a pressing operation instruction for that control to the XR device. The XR device then controls the hand model to press the control according to the pressing instruction.
Alternatively, the user may control the hand model to move to the area where any control is located and perform the pressing operation by executing a preset gesture, which is not specifically limited in this application. The preset gesture may be any gesture capable of controlling the hand model to perform different actions, such as a sliding gesture to control movement of the hand model, or a clicking gesture to control the hand model to press, which is not limited herein.
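The sketch below illustrates one plausible way to decide which control the hand model is pressing, by testing whether the fingertip of the hand model lies within a control's area; it builds on the hypothetical Control structure from the earlier sketch, and the planar hit test and tolerance value are assumptions.

```python
def control_under_hand(fingertip: tuple, controls: list, tolerance: float = 0.01):
    """Return the control whose area the hand model's fingertip is currently over, or None."""
    for control in controls:
        dx = fingertip[0] - control.position[0]
        dy = fingertip[1] - control.position[1]
        # Simple planar hit test: within half the control's width/height plus a small tolerance.
        if abs(dx) <= control.size[0] / 2 + tolerance and abs(dy) <= control.size[1] / 2 + tolerance:
            return control
    return None

# Usage example with the interface built in the earlier sketch:
target = control_under_hand(fingertip=(0.2, 0.0, 1.0), controls=interface.controls)
print(target.name if target else "no control under the hand model")
```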
S103, adjusting the display state of the control, wherein the display state at least comprises the spatial position of the control.
The spatial position refers to the display position information of the control on the interactive interface.
Specifically, when the control is pressed into the pressed state, the display state of the control can be adjusted. The adjusted display state shows that the control is currently in the pressed state, so that the user knows from the adjusted display state that the control has been pressed successfully.
In the embodiment of the present application, when adjusting the display state of the control, the following manner may be included:
mode one
The display size of the control is reduced according to a preset reduction ratio.
The preset reduction ratio can be flexibly set according to the size of the interactive interface, and is not limited herein.
Mode two
The spatial position of the control is adjusted from a first position to a second position.
The first position is the spatial position of the control in the normal state, and the second position is the spatial position of the control in the pressed state. The first position and the second position differ by one pressing distance.
That is, the control is moved from its spatial position in the normal state to its spatial position in the pressed state, which achieves the effect of simulating the pressing of a control in real space.
It should be noted that the two adjustment modes described above may be implemented separately or in combination, and are not specifically limited herein. Of course, besides the above adjustment modes, the display state of the control may also be adjusted in other ways, as long as the pressed state of the control is reflected.
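The two adjustment modes could be sketched roughly as follows, applied to the hypothetical Control structure from the earlier sketch; the reduction ratio and press depth are arbitrary example values, not values prescribed by the patent.

```python
def adjust_display_state(control, reduction_ratio: float = 0.9, press_depth: float = 0.01) -> None:
    """Switch the control's display state to pressed: shrink it and/or move it to the second position."""
    # Mode one: reduce the display size according to a preset reduction ratio.
    control.size = (control.size[0] * reduction_ratio, control.size[1] * reduction_ratio)
    # Mode two: move the control from the first position to the second position,
    # offset along the pressing direction by one pressing distance.
    x, y, z = control.position
    control.position = (x, y, z + press_depth)
    control.pressed = True
```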
Illustratively, as shown in FIG. 2d, assume that the interactive interface includes a control region and a content display region, and that the control region includes 4 controls, control 1, control 2, control 3, and control 4, respectively. Then when the user controls the hand model to press the control 2, the spatial position of the control 2 is adjusted from the first position to the second position.
In some optional implementations, if the user controls the hand model to press any control in the interactive interface through the peripheral device, the application adjusts the display state of the control and simultaneously controls the peripheral device to output the first vibration feedback to the user.
The first vibration feedback is a first-level vibration feedback, specifically strong vibration feedback, so that the user can be alerted by the strong vibration that the control has been pressed successfully. In this way, the pressing of a control on the interactive interface is perceived both visually and tactilely.
According to the control interaction method provided by the embodiments of the application, an interactive interface comprising at least one control is displayed in the virtual space in response to a call instruction of the virtual space; in response to a pressing operation on any control, the hand model displayed in the virtual space is controlled to press the control, and the display state of the control is adjusted. Because the display state of a pressed control is adjusted, the user can intuitively see whether a control on the interactive interface has been pressed successfully, the interaction effects between the control and the interactive interface are enriched, and the user experience is improved.
As an alternative implementation, a control switches from the normal state to the pressed state only when it is pressed and the pressed distance reaches the effective pressing distance. Therefore, after the hand model displayed in the virtual space presses the control, it is necessary to determine whether the pressing distance of the control reaches the effective pressing distance. The process of determining whether the pressing distance of the control reaches the effective pressing distance is described below with reference to fig. 3.
As shown in fig. 3, the method may include the steps of:
s201, responding to a calling instruction of the virtual space, and displaying an interactive interface in the virtual space, wherein the interactive interface comprises at least one control.
S202, in response to a pressing operation on any control, the hand model displayed in the virtual space is controlled to press the control.
S203, detecting whether the pressing distance of the control is greater than a preset distance; if so, executing S204, otherwise executing S205.
S204, when the pressing distance of the control is detected to be larger than the preset distance, the display state of the control is adjusted, and the display state at least comprises the spatial position of the control.
S205, when the pressing distance of the control is detected to be smaller than or equal to the preset distance, the display state of the control is not adjusted.
The preset distance can be flexibly set according to practical application requirements, such as 2 centimeters (cm).
Specifically, when the user controls the hand model to press any control on the interactive interface, the application obtains the pressing distance of the control and compares it with a preset distance. If the obtained pressing distance is smaller than or equal to the preset distance, the control has not been pressed successfully, that is, the control is still in the normal state, and its display state is not adjusted. If the obtained pressing distance is greater than the preset distance, the control has been pressed successfully, that is, the control is currently in the pressed state, and its display state can be adjusted. The adjusted display state shows that the control is currently in the pressed state, so that the user knows from it that the control has been pressed successfully.
For the implementation of adjusting the display state of the control, reference may be made to step S103 in the foregoing embodiment, which is not described in detail here.
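The flow of fig. 3 could be sketched as follows, combining the distance check with the display-state adjustment from the earlier sketch and the first (strong) vibration feedback; the 2 cm threshold is the example value given above, and the vibrate callback is a hypothetical stand-in for the peripheral device's haptics interface.

```python
PRESET_DISTANCE = 0.02  # preset distance, e.g. 2 cm as in the example above

def on_press(control, press_distance: float, vibrate=None) -> bool:
    """Adjust the control's display state only if the pressing distance exceeds the preset distance (S203-S205)."""
    if press_distance <= PRESET_DISTANCE:
        return False                  # not pressed successfully: keep the normal display state
    adjust_display_state(control)     # pressed successfully: adjust spatial position and/or size
    if vibrate is not None:
        vibrate("strong")             # first vibration feedback output by the peripheral device
    return True
```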
According to the control interaction method provided by the embodiments of the application, an interactive interface comprising at least one control is displayed in the virtual space in response to a call instruction of the virtual space; in response to a pressing operation on any control, the hand model displayed in the virtual space is controlled to press the control, and the display state of the control is adjusted. Because the display state of a pressed control is adjusted, the user can intuitively see whether a control on the interactive interface has been pressed successfully, the interaction effects between the control and the interactive interface are enriched, and the user experience is improved. In addition, determining whether the pressing distance of the control is greater than the preset distance improves the accuracy of adjusting the display state of the control and avoids erroneous adjustment.
In another alternative implementation, because the size of controls on the interactive interface is limited, the display area of some controls may be relatively small. When a user interacts with a control with a small display area, the user may easily press an adjacent control by mistake, resulting in low interaction efficiency. Therefore, before the hand model presses any control on the interactive interface, the hand model may first be controlled to touch the control to be pressed, and the display state of that control is adjusted to remind the user whether to continue pressing it, so as to reduce accidental presses. The specific procedure of performing a touch operation on a control before controlling the hand model to press it is described below with reference to fig. 4.
As shown in fig. 4, the method may include the steps of:
s301, responding to a calling instruction of the virtual space, and displaying an interactive interface in the virtual space, wherein the interactive interface comprises at least one control.
S302, in response to a touch operation on any control, controlling the hand model displayed in the virtual space to touch the control.
Specifically, the user can, according to the interaction requirement, use the peripheral device to control the hand model to move to any control and touch it. Alternatively, the user may control the hand model to move to any control and touch it by executing a preset gesture, which is not specifically limited in the present application. The preset gesture may be any gesture capable of controlling the hand model to perform different actions, such as a sliding gesture to control movement of the hand model, which is not limited herein.
S303, when contact between the hand model and the control is detected, adjusting the display state of the control.
Specifically, when the XR device detects that the hand model is located in the area where any control on the interactive interface is located, the hand model is considered to have touched the control. The display state of the control is then adjusted to highlight the control, thereby reminding the user to decide whether to continue with the pressing operation, so as to reduce accidental presses.
As an optional implementation, when contact between the hand model and the control is detected, adjusting the display state of the control includes: performing special effect processing on the control.
The special effect processing includes highlighting and/or magnification.
Illustratively, as shown in FIG. 5, assume that the interactive interface includes a control region and a content display region, and that the control region includes 4 controls, control 1, control 2, control 3, and control 4, respectively. Then control 2 is highlighted when the user-controlled hand model touches control 2.
In some optional implementations, if the user controls the hand model to touch any control in the interactive interface through the peripheral device, when detecting that the hand model is in contact with the control, the application adjusts the display state of the control and simultaneously controls the peripheral device to output second vibration feedback to the user.
The second vibration feedback is a second-level vibration feedback, specifically light vibration feedback, so that the user can be reminded, based on the light vibration, to decide whether to continue pressing the control. In this way, the touching of a control on the interactive interface is perceived both visually and tactilely.
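The touch stage of fig. 4 could be sketched as follows: when the hand model contacts a control, the control receives a special effect (highlighting and/or magnification) and the peripheral device outputs the second, light vibration feedback; as before, the names, the magnification ratio, and the vibrate callback are assumptions for illustration only.

```python
def on_touch(control, magnify_ratio: float = 1.1, vibrate=None) -> None:
    """Apply the touch special effect to a control and emit light vibration feedback."""
    control.highlighted = True  # special effect processing: highlighting
    control.size = (control.size[0] * magnify_ratio,
                    control.size[1] * magnify_ratio)  # and/or magnification
    if vibrate is not None:
        vibrate("light")        # second vibration feedback: light, prompting whether to continue pressing
```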
S304, in response to a pressing operation on any control, the hand model displayed in the virtual space is controlled to press the control.
S305, adjusting the display state of the control, wherein the display state at least comprises the spatial position of the control.
According to the control interaction method provided by the embodiments of the application, an interactive interface comprising at least one control is displayed in the virtual space in response to a call instruction of the virtual space; in response to a pressing operation on any control, the hand model displayed in the virtual space is controlled to press the control, and the display state of the control is adjusted. Because the display state of a pressed control is adjusted, the user can intuitively see whether a control on the interactive interface has been pressed successfully, the interaction effects between the control and the interactive interface are enriched, and the user experience is improved. In addition, controlling the hand model to touch the control to be pressed and adjusting the display state of that control reminds the user whether to continue pressing it, which reduces accidental presses and improves interaction efficiency.
A control interaction device according to an embodiment of the present application is described below with reference to fig. 6. Fig. 6 is a schematic block diagram of a control interaction device provided in an embodiment of the present application.
As shown in fig. 6, the control interaction device 400 includes: a display module 410, a response module 420, and an adjustment module 430.
Wherein, the display module 410 is configured to respond to a call instruction of a virtual space, and display an interactive interface in the virtual space, where the interactive interface includes at least one control;
a response module 420, configured to control a hand model displayed in a virtual space to press the control in response to a pressing operation on any one of the controls;
and the adjustment module 430 is configured to adjust a display state of the control, where the display state includes at least a spatial position of the control.
In an optional implementation of the embodiments of the present application, the adjustment module 430 is specifically configured to:
reduce the display size of the control;
and/or
adjust the spatial position of the control from a first position to a second position.
In an optional implementation of the embodiments of the present application, the apparatus 400 further includes:
an output module, configured to control the peripheral device to output the first vibration feedback.
In an optional implementation of the embodiments of the present application, the apparatus 400 further includes:
a distance detection module, configured to detect whether the pressing distance of the control is greater than a preset distance.
In an optional implementation of the embodiments of the present application,
the response module 420 is further configured to control the hand model displayed in the virtual space to touch the control in response to a touch operation on any one of the controls;
the adjustment module 430 is further configured to adjust the display state of the control when contact between the hand model and the control is detected.
In an optional implementation of the embodiments of the present application, the adjustment module 430 is further configured to:
perform special effect processing on the control.
In an optional implementation of the embodiments of the present application,
the output module is further configured to control the peripheral device to output the second vibration feedback.
According to the control interaction device provided by the embodiments of the application, an interactive interface comprising at least one control is displayed in the virtual space in response to a call instruction of the virtual space; in response to a pressing operation on any control, the hand model displayed in the virtual space is controlled to press the control, and the display state of the control is adjusted. Because the display state of a pressed control is adjusted, the user can intuitively see whether a control on the interactive interface has been pressed successfully, the interaction effects between the control and the interactive interface are enriched, and the user experience is improved.
It should be understood that apparatus embodiments and the foregoing method embodiments may correspond to each other, and similar descriptions may refer to the method embodiments. To avoid repetition, no further description is provided here. Specifically, the apparatus 400 shown in fig. 6 may perform the method embodiment corresponding to fig. 1, and the foregoing and other operations and/or functions of each module in the apparatus 400 are respectively for implementing the corresponding flow in each method in fig. 1, and are not further described herein for brevity.
The apparatus 400 of the embodiments of the present application is described above in terms of functional modules in connection with the accompanying drawings. It should be understood that the functional module may be implemented in hardware, or may be implemented by instructions in software, or may be implemented by a combination of hardware and software modules. Specifically, each step of the method embodiment of the first aspect in the embodiments of the present application may be implemented by an integrated logic circuit of hardware in a processor and/or an instruction in software, and the steps of the method of the first aspect disclosed in connection with the embodiments of the present application may be directly implemented as an execution of a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. Alternatively, the software modules may be located in a well-established storage medium in the art such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, registers, and the like. The storage medium is located in a memory, and the processor reads information in the memory, and in combination with hardware, performs the steps in the method embodiment of the first aspect.
Fig. 7 is a schematic block diagram of an electronic device provided in an embodiment of the present application. As shown in fig. 7, the electronic device 500 may include:
a memory 510 and a processor 520, the memory 510 being for storing a computer program and for transmitting the program code to the processor 520. In other words, the processor 520 may call and run a computer program from the memory 510 to implement the control interaction method in the embodiments of the present application.
For example, the processor 520 may be configured to execute the control interaction method embodiments described above in accordance with instructions in the computer program.
In some embodiments of the present application, the processor 520 may include, but is not limited to:
a general purpose processor, digital signal processor (Digital Signal Processor, DSP), application specific integrated circuit (Application Specific Integrated Circuit, ASIC), field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like.
In some embodiments of the present application, the memory 510 includes, but is not limited to:
volatile memory and/or nonvolatile memory. The nonvolatile Memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable EPROM (EEPROM), or a flash Memory. The volatile memory may be random access memory (Random Access Memory, RAM) which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double Data Rate SDRAM (Double Data Rate SDRAM), enhanced SDRAM (ESDRAM), synchronous Link DRAM (SLDRAM), and Direct memory bus RAM (DR RAM).
In some embodiments of the present application, the computer program may be partitioned into one or more modules that are stored in the memory 510 and executed by the processor 520 to perform the control interaction methods provided herein. The one or more modules may be a series of computer program instruction segments capable of performing the specified functions, which are used to describe the execution of the computer program in the electronic device.
As shown in fig. 7, the electronic device 500 may further include:
a transceiver 530, the transceiver 530 being connectable to the processor 520 or the memory 510.
The processor 520 may control the transceiver 530 to communicate with other devices, and in particular, may send information or data to other devices or receive information or data sent by other devices. The transceiver 530 may include a transmitter and a receiver. The transceiver 530 may further include antennas, the number of which may be one or more.
It will be appreciated that the various components in the electronic device are connected by a bus system that includes, in addition to a data bus, a power bus, a control bus, and a status signal bus.
In an embodiment of the present application, when the electronic device is an HMD, the embodiment of the present application provides a schematic block diagram of the HMD, as shown in fig. 8.
As shown in fig. 8, the main functional modules of the HMD600 may include, but are not limited to, the following: the detection module 610, the feedback module 620, the sensor 630, the control module 640, the modeling module 650.
The detection module 610 is configured to detect the user's operation commands using various sensors and apply them to the virtual environment, for example continuously updating the images displayed on the display screen to follow the user's line of sight, so as to realize interaction between the user and the virtual scene.
The feedback module 620 is configured to receive data from the sensors and provide real-time feedback to the user. For example, the feedback module 620 may generate a feedback instruction based on the user operation data and output the feedback instruction.
The sensor 630 is configured, on the one hand, to accept operation commands from the user and apply them to the virtual environment, and on the other hand, to provide the results generated by the operation to the user in the form of various kinds of feedback.
The control module 640 is configured to control sensors and various input/output devices, including obtaining user data such as motion, voice, etc., and outputting sensory data such as images, vibrations, temperature, sounds, etc., to affect the user, virtual environment, and the real world. For example, the control module 640 may obtain user gestures, voice, and the like.
The modeling module 650 is configured to construct a three-dimensional model of the virtual environment, and may also include various feedback mechanisms of sound, touch, etc. in the three-dimensional model.
It should be appreciated that the various functional modules in the HMD600 are connected by a bus system that includes, in addition to a data bus, a power bus, a control bus, a status signal bus, and the like.
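Purely as an illustrative sketch (not the actual HMD 600 implementation), the functional modules above might be grouped roughly as follows; all class and attribute names are assumptions.

```python
class HMD:
    """Hypothetical grouping of the HMD's main functional modules described above."""

    def __init__(self, detection, feedback, sensor, control, modeling):
        self.detection = detection  # detects user operation commands via sensors and applies them to the virtual environment
        self.feedback = feedback    # turns sensor data into real-time feedback instructions for the user
        self.sensor = sensor        # accepts operation commands and returns operation results as feedback
        self.control = control      # drives sensors and I/O devices (gestures, voice in; images, vibration, sound out)
        self.modeling = modeling    # builds the three-dimensional model of the virtual environment
```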
The present application also provides a computer storage medium having stored thereon a computer program which, when executed by a computer, enables the computer to perform the method of the above-described method embodiments.
Embodiments of the present application also provide a computer program product comprising program instructions which, when run on an electronic device, cause the electronic device to perform the method of the method embodiments described above.
When the embodiments are implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a digital video disc (DVD)), a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or modules, which may be in electrical, mechanical, or other forms.
The modules illustrated as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. For example, functional modules in the embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. A control interaction method, comprising:
responsive to a call instruction of a virtual space, displaying an interactive interface in the virtual space, wherein the interactive interface comprises at least one control;
in response to a pressing operation on any one of the controls, controlling a hand model displayed in the virtual space to press the control;
and adjusting the display state of the control, wherein the display state at least comprises the spatial position of the control.
2. The method of claim 1, wherein adjusting the display state of the control comprises:
reducing the display size of the control;
and/or,
adjusting the spatial position of the control from a first position to a second position.
3. The method as recited in claim 1, further comprising:
and controlling the peripheral device to output the first vibration feedback.
4. The method of claim 1, wherein prior to adjusting the display state of the control, further comprising:
and detecting whether the pressing distance of the control is larger than a preset distance.
5. The method according to any one of claims 1 to 4, further comprising, after displaying the hand model and the interactive interface in the virtual space in response to the call instruction of the virtual space:
in response to a touch operation on any one of the controls, controlling the hand model displayed in the virtual space to touch the control;
and when contact between the hand model and the control is detected, adjusting the display state of the control.
6. The method of claim 5, wherein adjusting the display state of the control when contact of the hand model with the control is detected comprises:
and carrying out special effect processing on the control.
7. The method as recited in claim 5, further comprising:
and controlling the peripheral device to output the second vibration feedback.
8. A control interaction device, comprising:
a display module, configured to display an interactive interface in the virtual space in response to a call instruction of the virtual space, wherein the interactive interface comprises at least one control;
a response module, configured to control a hand model displayed in the virtual space to press the control in response to a pressing operation on any one of the controls;
and an adjustment module, configured to adjust the display state of the control, wherein the display state at least comprises the spatial position of the control.
9. An electronic device, comprising:
a processor and a memory for storing a computer program, the processor for invoking and running the computer program stored in the memory to perform the control interaction method of any of claims 1 to 7.
10. A computer-readable storage medium storing a computer program for causing a computer to execute the control interaction method according to any one of claims 1 to 7.
11. A computer program product comprising program instructions which, when run on an electronic device, cause the electronic device to perform the control interaction method of any of claims 1 to 7.
CN202211124875.3A 2022-09-15 2022-09-15 Control interaction method, device, equipment and medium Pending CN117742555A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211124875.3A CN117742555A (en) 2022-09-15 2022-09-15 Control interaction method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211124875.3A CN117742555A (en) 2022-09-15 2022-09-15 Control interaction method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN117742555A true CN117742555A (en) 2024-03-22

Family

ID=90276323

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211124875.3A Pending CN117742555A (en) 2022-09-15 2022-09-15 Control interaction method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN117742555A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination