CN115407906A - User interface interaction method, device, medium and equipment

Info

Publication number
CN115407906A
Authority
CN
China
Prior art keywords
user interface
dimensional
dimensional user
interaction
images
Prior art date
Legal status
Pending
Application number
CN202211013833.2A
Other languages
Chinese (zh)
Inventor
仲余
Current Assignee
Hubei Xingji Shidai Technology Co Ltd
Original Assignee
Hubei Xingji Shidai Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hubei Xingji Shidai Technology Co Ltd filed Critical Hubei Xingji Shidai Technology Co Ltd
Priority to CN202211013833.2A
Publication of CN115407906A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2215/00 Indexing scheme for image rendering
    • G06T2215/06 Curved planar reformation of 3D line structures

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a user interface interaction method, device, medium, and equipment. The method comprises the following steps: mapping a two-dimensional user interface into a three-dimensional user interface; and when the three-dimensional user interface receives an interaction event, responding to the interaction event and calling a corresponding function module of the two-dimensional user interface. The user interface interaction method can reduce the project cost of a three-dimensional user interface and improve development efficiency.

Description

User interface interaction method, device, medium and equipment
Technical Field
The present application relates to the field of user interfaces, and in particular, to an interaction method, an interaction apparatus, a medium, and a device for a user interface.
Background
2D UI (User Interface) display has been the prevailing display scheme since the appearance of display screens. It is a technology for presenting various 2D image elements, such as buttons, selection boxes, and input boxes, on a two-dimensional plane, and is mainly used to display information and accept user input. 2D UI technology has matured over many years of development and is backed by abundant products and resources. However, with the continuous development and popularization of 3D technology, 2D UIs struggle to cope with the growing number of 3D applications, such as VR (Virtual Reality) and AR (Augmented Reality) scenes. For example, in a head-mounted AR scenario, a common approach is to map motion data from an external device, such as a handle or a head-mounted sensor, onto a cursor in a 2D UI, and then access the conventional 2D UI in the AR scene with computer-desktop-like operations; this approach is inefficient.
In addition, technologies that serve the growing number of 3D applications by providing the display interface directly as a 3D UI have also appeared on the market. A 3D UI can give the user a display interface with a spatial concept and can implement information prompting and input reception, so it adapts well to scenes such as VR and AR. However, a 3D UI suffers from excessively high production cost and a long production cycle.
Disclosure of Invention
In view of the above-mentioned shortcomings of the prior art, it is an object of the present application to provide an efficient and less costly three-dimensional user interface interaction scheme with a shorter production cycle.
To achieve the above and other related objects, a first aspect of the present application provides a method for user interface interaction, including: mapping the two-dimensional user interface into a three-dimensional user interface; and when the three-dimensional user interface receives an interactive event, responding to the interactive event and calling a corresponding function module of the two-dimensional user interface.
In an embodiment of the first aspect, mapping the two-dimensional user interface into the three-dimensional user interface comprises: acquiring a plurality of two-dimensional bounding boxes of the two-dimensional user interface, wherein the two-dimensional bounding boxes comprise pixel continuous areas corresponding to the two-dimensional user interface; and generating the three-dimensional user interface according to the pixel points of the two-dimensional bounding boxes and the pixel continuous area.
In an embodiment of the first aspect, obtaining the plurality of two-dimensional bounding boxes of the two-dimensional user interface includes: acquiring a plurality of planar bounding regions of the two-dimensional user interface and converting them into a plurality of grayscale images; and processing the plurality of grayscale images to obtain the plurality of two-dimensional bounding boxes.
In an embodiment of the first aspect, the two-dimensional bounding box is a rectangular bounding box.
In an embodiment of the first aspect, the processing includes: performing binarization and median filtering on the grayscale images to obtain a plurality of first images; performing edge detection on the plurality of first images to remove edge points and obtain a plurality of second images; filling contour gaps in the plurality of second images using a closing operation and eliminating scattered objects according to a gray threshold to obtain a plurality of third images; and obtaining the plurality of two-dimensional bounding boxes based on pixel continuous areas in the plurality of third images.
In an embodiment of the first aspect, obtaining the plurality of two-dimensional bounding boxes of the two-dimensional user interface includes: generating the plurality of two-dimensional bounding boxes according to a received bounding box generation instruction.
In an embodiment of the first aspect, generating the three-dimensional user interface includes: mapping two-dimensional pixel points in the two-dimensional bounding boxes into voxel points in the three-dimensional bounding box, wherein the two-dimensional pixel points correspond to the voxel points mapped in the three-dimensional bounding box; and generating the three-dimensional user interface according to the voxel points of the three-dimensional bounding box.
In an embodiment of the first aspect, the method for interacting with the user interface further includes: and detecting whether an interaction event exists by utilizing the three-dimensional bounding box.
In an embodiment of the first aspect, the method for interacting with the user interface further includes: and when the two-dimensional user interface is updated, synchronously updating the three-dimensional user interface, or when the three-dimensional user interface is updated, synchronously updating the two-dimensional user interface.
A second aspect of the present application provides an interaction apparatus for a user interface, comprising: the user interface mapping module is used for mapping the two-dimensional user interface into a three-dimensional user interface; and the interactive event response module is used for responding to the interactive event and calling the corresponding function module of the two-dimensional user interface when the three-dimensional user interface receives the interactive event.
A third aspect of the present application provides a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the user interface interaction method of any one of the embodiments of the first aspect of the present application.
A fourth aspect of the present application provides an electronic device, comprising: a memory storing a computer program; and a processor communicatively connected to the memory, which executes the user interface interaction method of any one of the embodiments of the first aspect of the application when the computer program is invoked.
As described above, the method, apparatus, medium, and device for user interface interaction provided in one or more embodiments of the present application have the following advantages:
the interaction method of the user interface binds the two-dimensional user interface to the three-dimensional user interface, so the function modules of the two-dimensional user interface can be reused. Compared with existing schemes in which the interaction functions of a three-dimensional user interface must be implemented separately, this can improve the code reuse rate, save resources, and unify the two-dimensional user interface and the three-dimensional user interface.
The user interface interaction method can reduce the project cost of a three-dimensional user interface, improve development efficiency, solve the problem of the long and complex production cycle of three-dimensional user interfaces, and effectively leverage the advantages of a traditional two-dimensional application or interface by quickly converting it into the corresponding three-dimensional user interface.
Drawings
Fig. 1A shows a schematic structural diagram of a virtual reality system.
Fig. 1B is a flowchart illustrating an interaction method of a user interface according to an embodiment of the present disclosure.
Fig. 2A is a flowchart illustrating a mapping of a two-dimensional user interface to a three-dimensional user interface according to an embodiment of the present disclosure.
Fig. 2B is a diagram illustrating an example of a rectangular enclosure in an embodiment of the present application.
Fig. 2C is a diagram showing an example of a three-dimensional curved surface and its control points in the embodiment of the present application.
Fig. 3 is a flowchart illustrating obtaining a two-dimensional bounding box according to an embodiment of the present disclosure.
Fig. 4 is a detailed flowchart of step S32 in the embodiment of the present application.
Fig. 5 is a flowchart illustrating a process of generating a three-dimensional user interface according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of an interaction device of a user interface in an embodiment of the present application.
Fig. 7 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Description of the element reference numerals
110. Server side
120. Device side
600. Interaction device of user interface
610. User interface mapping module
620. Interactive event response module
630. User interface synchronization module
700. Electronic device
710. Memory
720. Processor
730. Display
S11 to S12 steps
S111 to S112 steps
S31 to S32
S321 to S323
S51 to S53
Detailed Description
The following description of the embodiments of the present application is provided by way of specific examples, and other advantages and effects of the present application will be readily apparent to those skilled in the art from the disclosure herein. The present application is capable of other and different embodiments and its several details are capable of modifications and/or changes in various respects, all without departing from the spirit of the present application. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present application, and the drawings only show the components related to the present application rather than the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed arbitrarily, and the layout of the components may be more complicated. Moreover, in this document, relational terms such as "first," "second," and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
Virtual reality technology is being accepted by more and more people, allowing users to have realistic experiences in a virtual world. Fig. 1A is a schematic diagram of a conventional virtual reality system. As shown in fig. 1A, the virtual reality system includes a server side 110 and a device side 120 communicatively connected to the server side 110, where the device side 120 may be a head-mounted device such as AR/VR glasses. The communication connection between the device side 120 and the server side 110 may be wired (for example, through a USB interface) or wireless (for example, a Bluetooth link, radio frequency identification, short-range wireless communication, etc.). The server side 110 may provide a variety of services to the device side 120 through this connection, so that a user can enjoy three-dimensional stereoscopic experiences such as viewing, shopping, games, and remote teaching through the device side 120.
The server 110 and the device 120 may be located in the same virtual reality or augmented reality device, such as a VR all-in-one machine. The server side 110 and the device side 120 can also be two independent electronic devices that can communicate with each other. For example, the device side 120 may be VR/AR glasses, VR/AR helmets, etc., and the server side 110 may be a server, a distributed server cluster composed of a plurality of servers, a mobile phone, a tablet computer, a laptop computer, a desktop computer, etc.
In the above embodiment, the server side 110 may send a two-dimensional user interface to the device side 120, and the device side 120 maps the received two-dimensional user interface to a three-dimensional user interface. The three-dimensional user interface is displayed on the device side 120 so that its wearer can view it and interact through it. When the three-dimensional user interface displayed on the device side 120 receives an interaction event from a wearer or a non-wearer (the event may be detected, for example, through ray detection or bounding box collision), the device side 120 responds to the interaction event and determines the corresponding function module of the two-dimensional user interface.
In the related art, the device side 120 may implement three-dimensional interaction of scenes such as VR or AR by using two schemes. The first scheme maps the motion data to a cursor in a two-dimensional user interface through external equipment, and then accesses the traditional two-dimensional user interface into a three-dimensional scene by adopting the operation similar to a computer desktop, and the scheme has low efficiency. The second solution is to directly develop a three-dimensional user interface suitable for a three-dimensional scene, and such a solution has the problems of too high manufacturing cost and too long manufacturing period.
At least in view of the above problems, the present application provides an interaction method for a user interface. The method reuses the function modules of the two-dimensional user interface by binding the two-dimensional user interface to the three-dimensional user interface, which can improve the code reuse rate, save resources, and unify interaction across the two-dimensional and three-dimensional user interfaces.
The interaction method of the user interface will be described by way of exemplary embodiments with reference to the accompanying drawings.
Fig. 1B is a flowchart illustrating an interaction method of a user interface according to an embodiment of the present disclosure. As shown in fig. 1B, the interaction method in this embodiment includes the following steps S11 and S12.
And S11, mapping the two-dimensional user interface into a three-dimensional user interface. The user interface is a medium for interaction and information exchange between the system and the user, and is used for realizing conversion between the internal form of the information and the acceptable form of the user.
And S12, when the three-dimensional user interface receives the interaction event, responding to the interaction event and calling a corresponding function module of the two-dimensional user interface. Specifically, when the three-dimensional user interface receives a certain interaction event, the interaction event is transmitted to the two-dimensional user interface, so that the two-dimensional user interface responds to the interaction event and calls the corresponding function module. In some embodiments, each time an interaction event is received by the three-dimensional user interface, the interaction event is passed to the two-dimensional user interface and the corresponding function module is invoked. In still other embodiments, the three-dimensional user interface may pass an interaction event to the two-dimensional user interface and invoke the corresponding function module only when a particular interaction event is received.
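As an illustration of this forwarding pattern, a minimal sketch in Python follows. The class and method names (InteractionEvent, TwoDUserInterface, ThreeDUserInterface, dispatch, and so on) are hypothetical and are not part of the disclosed implementation; the sketch only shows the idea of binding a three-dimensional user interface to a two-dimensional one so that the existing two-dimensional function modules are reused.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple


@dataclass
class InteractionEvent:
    kind: str                              # e.g. "click", "select"
    position: Tuple[float, float, float]   # 3D hit position (unused in this sketch)


class TwoDUserInterface:
    """Holds the existing 2D function modules, keyed by event kind."""

    def __init__(self) -> None:
        self.function_modules: Dict[str, Callable[[InteractionEvent], None]] = {}

    def register(self, kind: str, handler: Callable[[InteractionEvent], None]) -> None:
        self.function_modules[kind] = handler

    def dispatch(self, event: InteractionEvent) -> None:
        # Respond to the event by calling the corresponding 2D function module.
        handler = self.function_modules.get(event.kind)
        if handler is not None:
            handler(event)


class ThreeDUserInterface:
    """Bound to a source 2D interface; forwards interaction events to it."""

    def __init__(self, ui_2d: TwoDUserInterface) -> None:
        self.ui_2d = ui_2d

    def on_interaction_event(self, event: InteractionEvent) -> None:
        # Step S12: pass the 3D interaction event to the bound 2D interface.
        self.ui_2d.dispatch(event)


# Usage sketch: the existing "play" module of a 2D video interface is reused
# when the corresponding control is hit in the 3D interface.
ui_2d = TwoDUserInterface()
ui_2d.register("click_play", lambda e: print("2D play module invoked"))
ui_3d = ThreeDUserInterface(ui_2d)
ui_3d.on_interaction_event(InteractionEvent("click_play", (0.0, 1.2, -3.0)))
```

Binding the two interfaces through a single dispatch point is what allows the mature two-dimensional function modules to be reused without writing a separate three-dimensional interaction scheme.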
As can be seen from the above description, the interaction method provided in this embodiment reuses the function modules of the two-dimensional user interface by binding the two-dimensional user interface to the three-dimensional user interface. Compared with existing schemes in which the three-dimensional user interface interaction functions must be implemented separately, this improves the code reuse rate, saves resources, and unifies the two-dimensional and three-dimensional user interfaces.
It should be noted that, in step S12, the existing function module of the two-dimensional user interface may be called, and at this time, the interaction of the three-dimensional user interface may be realized without additionally developing a new function module, which is beneficial to reducing the development cost and shortening the development time, but the application is not limited thereto.
Referring to fig. 2A, in an embodiment of the present application, mapping the two-dimensional user interface to the three-dimensional user interface includes the following steps S111 and S112.
S111, acquiring a plurality of two-dimensional surrounding frames of the two-dimensional user interface, wherein the two-dimensional surrounding frames comprise pixel continuous areas corresponding to the two-dimensional user interface. Each two-dimensional bounding box may contain a region of interest in a two-dimensional user interface, e.g., for a video playback user interface, one two-dimensional bounding box may contain control components therein and another two-dimensional bounding box may contain video therein.
In this embodiment, the two-dimensional bounding box may be, for example, rectangular, but the application is not limited thereto. Fig. 2B is a diagram illustrating an example of two-dimensional bounding boxes in the present embodiment, which includes 9 rectangular bounding boxes.
And S112, generating a three-dimensional user interface according to the pixel points of the two-dimensional bounding boxes and the pixel continuous area. Specifically, a corresponding three-dimensional curved surface may be generated from each two-dimensional bounding box, and the pixel points in the two-dimensional bounding box are then mapped onto the three-dimensional curved surface to generate the three-dimensional user interface. For example, fig. 2C shows an example of a three-dimensional curved surface generated from the two-dimensional bounding boxes of fig. 2B. As shown in fig. 2C, the vertices of the rectangular bounding boxes in fig. 2B may be mapped to control points of a Bezier surface, so as to obtain the three-dimensional curved surface.
It should be noted that, in other embodiments, the number of two-dimensional bounding boxes may be one. At this time, the one two-dimensional bounding box may contain all the regions in the entire two-dimensional user interface.
Referring to fig. 3, a flowchart of acquiring a plurality of two-dimensional bounding boxes of a two-dimensional user interface according to an embodiment of the present application is shown. As shown in fig. 3, acquiring a plurality of two-dimensional bounding boxes of a two-dimensional user interface in the present embodiment includes the following steps S31 and S32.
And S31, acquiring a plurality of planar bounding regions of the two-dimensional user interface, and converting the plurality of planar bounding regions into a plurality of grayscale images.
And S32, processing the grayscale images to obtain a plurality of two-dimensional bounding boxes. Optionally, the two-dimensional bounding box may be a rectangular bounding box.
Fig. 4 is a flowchart illustrating processing a grayscale image to obtain a two-dimensional bounding box according to an embodiment of the present disclosure. As shown in fig. 4, the process of processing the grayscale image to obtain the two-dimensional bounding box in the present embodiment includes the following steps S321 to S324.
And S321, performing binarization and median filtering on the plurality of grayscale images to obtain a plurality of first images. Binarization sets the gray value of each pixel in the grayscale image to one of two values, for example 0 or 255, so that the whole image presents an obvious visual effect containing only black and white. Median filtering is a nonlinear smoothing technique that sets the gray value of a pixel to the median of the gray values of all pixels in a neighborhood window around that pixel.
S322, performing edge detection on the plurality of first images to remove edge points and obtain a plurality of second images. Edge detection finds points in the grayscale image where the gray value changes markedly and treats them as edge points; it removes irrelevant information from the image while retaining its most important structural attributes. Optionally, after the edge points of the grayscale image are obtained, the non-horizontal and non-vertical edge points may be eliminated in step S322 using a matrix operation, so as to improve the accuracy of the resulting two-dimensional bounding boxes.
And S323, filling contour gaps in the plurality of second images using a closing operation, and eliminating scattered objects according to a gray threshold to obtain a plurality of third images. The closing operation dilates the grayscale image and then erodes the dilated image, thereby filling contour gaps. In addition, in step S323, residual scattered objects in the image can be eliminated according to a gray threshold; for example, pixels in the grayscale image whose gray values are below the gray threshold may be removed.
S324, a plurality of two-dimensional bounding boxes are obtained based on the pixel continuous regions in the plurality of third images. Optionally, for a certain pixel continuous region, if an abscissa of a leftmost pixel in the pixel continuous region is x1, an abscissa of a rightmost pixel in the pixel continuous region is x2, an ordinate of a topmost pixel is y1, and an ordinate of a bottommost pixel is y2, a range of a two-dimensional bounding box corresponding to the pixel continuous region is determined by four pixels (x 1, y 1), (x 2, y 1), (x 1, y 2), and (x 2, y 2).
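A sketch of one way steps S321 to S324 could be realized with OpenCV is given below. The concrete operator choices (Otsu binarization, a 5 × 5 median filter, Canny edge detection, a rectangular closing kernel, connected-component analysis) and the use of an area threshold to discard scattered objects are illustrative assumptions; the description above only prescribes binarization, median filtering, edge detection, a closing operation, and elimination of scattered objects by a gray threshold.

```python
import cv2
import numpy as np


def extract_2d_bounding_boxes(gray: np.ndarray, min_area: int = 100):
    """Sketch of steps S321-S324: one grayscale image -> list of (x, y, w, h) boxes."""
    # S321: binarization (Otsu) followed by median filtering -> first image.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    first = cv2.medianBlur(binary, 5)

    # S322: edge detection -> second image (Canny chosen here for illustration).
    second = cv2.Canny(first, 50, 150)

    # S323: closing (dilation then erosion) to fill contour gaps -> third image.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
    third = cv2.morphologyEx(second, cv2.MORPH_CLOSE, kernel)

    # S324: contiguous pixel regions -> axis-aligned rectangular bounding boxes.
    # Scattered objects are discarded here by an area threshold (an assumption;
    # the description above uses a gray threshold for this purpose).
    count, _, stats, _ = cv2.connectedComponentsWithStats(third, connectivity=8)
    boxes = []
    for i in range(1, count):  # label 0 is the background
        x, y, w, h, area = stats[i]
        if area >= min_area:
            boxes.append((int(x), int(y), int(w), int(h)))
    return boxes
```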
It should be noted that steps S321 to S324 above only show one possible way of obtaining the two-dimensional bounding boxes, and the present application is not limited thereto. For example, in some embodiments the two-dimensional bounding boxes may be obtained through an artificial intelligence algorithm, and in other embodiments the two-dimensional bounding boxes may be generated or modified according to a bounding box generation instruction input by a user, for example by adding padding around a box.
FIG. 5 illustrates a method for generating a three-dimensional user interface in an embodiment of the present application. As shown in fig. 5, the method of generating a three-dimensional user interface in the present embodiment includes the following steps S51 and S52.
And S51, mapping the two-dimensional pixel points in the two-dimensional bounding boxes into voxel points in the three-dimensional bounding box, wherein the two-dimensional pixel points correspond to the voxel points mapped in the three-dimensional bounding box. For example, the pixel points of the two-dimensional user interface may be uniformly mapped onto a sphere 3 meters away from the user or the device, but the application is not limited thereto.
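A minimal sketch of such a spherical mapping is shown below. The field-of-view values and the axis convention are assumptions made for illustration, since the description only states that the pixels are mapped uniformly onto a sphere a fixed distance (for example 3 meters) from the viewer.

```python
import numpy as np


def map_pixel_to_sphere(u: float, v: float, width: int, height: int,
                        radius: float = 3.0,
                        h_fov: float = np.radians(90.0),
                        v_fov: float = np.radians(60.0)) -> np.ndarray:
    """Map the 2D UI pixel (u, v) to a voxel point on a sphere around the viewer."""
    yaw = (u / width - 0.5) * h_fov      # horizontal angle within the assumed FOV
    pitch = (0.5 - v / height) * v_fov   # vertical angle, positive is up
    x = radius * np.cos(pitch) * np.sin(yaw)    # right
    y = radius * np.sin(pitch)                  # up
    z = -radius * np.cos(pitch) * np.cos(yaw)   # forward (viewer looks along -z)
    return np.array([x, y, z])
```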
Alternatively, in step S51, the two-dimensional bounding box may be converted into an m × n-order Bezier surface, computed as shown in the following formula 1:

$$S(u, v) = \sum_{i=0}^{m} \sum_{j=0}^{n} B_i^m(u)\, B_j^n(v)\, P_{i,j} \qquad \text{(formula 1)}$$

wherein B is a Bernstein polynomial, P is a Bezier surface control point, and u and v are the control parameters in the two parametric dimensions.
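The following sketch evaluates formula 1 directly. The helper names and the bilinear example control grid are illustrative assumptions; in the scheme described above, the control points would come from the vertices of a rectangular two-dimensional bounding box placed in space (compare fig. 2B and fig. 2C).

```python
from math import comb

import numpy as np


def bernstein(n: int, i: int, t: float) -> float:
    """Bernstein polynomial B_i^n(t)."""
    return comb(n, i) * (t ** i) * ((1.0 - t) ** (n - i))


def bezier_surface_point(P: np.ndarray, u: float, v: float) -> np.ndarray:
    """Evaluate formula 1: S(u, v) = sum_i sum_j B_i^m(u) B_j^n(v) P[i, j].

    P is an (m+1) x (n+1) x 3 array of control points; u and v are in [0, 1].
    """
    m, n = P.shape[0] - 1, P.shape[1] - 1
    point = np.zeros(3)
    for i in range(m + 1):
        for j in range(n + 1):
            point += bernstein(m, i, u) * bernstein(n, j, v) * P[i, j]
    return point


# Example: a 1 x 1 order (bilinear) patch whose four control points are the
# corners of a rectangular bounding box placed on the plane z = 0.
P = np.array([[[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]],
              [[0.0, 1.0, 0.0], [1.0, 1.0, 0.0]]])
print(bezier_surface_point(P, 0.5, 0.5))  # centre of the patch: [0.5 0.5 0. ]
```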
And S52, generating a three-dimensional user interface according to the voxel points of the three-dimensional bounding box. Specifically, after the pixel points in the two-dimensional user interface are mapped to the voxel points in the three-dimensional bounding box, operations such as interpolation and/or fitting can be performed according to the voxel points to generate the three-dimensional user interface.
The method for interacting the user interface provided in an embodiment of the present application may further include: the presence of an interaction event is detected using a three-dimensional bounding box. Specifically, after the pixel points in the two-dimensional user interface are mapped to the voxel points of the three-dimensional bounding box, the bounding box of the three-dimensional user interface can be obtained according to the coordinate values of the voxel points, and the coordinates and the size of the center point of the bounding box can be obtained. The manner of obtaining the bounding box of the three-dimensional user interface is similar to the manner of obtaining the two-dimensional bounding box of the two-dimensional user interface, and is not described herein in detail.
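A minimal sketch of deriving the axis-aligned bounding box, together with its center point and size, from the mapped voxel coordinates could look as follows (the function name and return convention are assumptions):

```python
import numpy as np


def compute_aabb(voxel_points: np.ndarray):
    """From an (N, 3) array of voxel coordinates, return the axis-aligned
    bounding box as (box_min, box_max), plus its center point and size."""
    box_min = voxel_points.min(axis=0)
    box_max = voxel_points.max(axis=0)
    center = (box_min + box_max) / 2.0
    size = box_max - box_min
    return box_min, box_max, center, size
```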
Optionally, in this embodiment, the presence or absence of an interaction event may be detected using ray detection or bounding box collision. For example, in the ray detection method, when a ray is cast from the user's VR glasses or finger, its origin and direction are detected and combined with the center coordinates and size of the bounding box to determine whether, and where, the ray passes through the bounding box, from which the presence and specific type of the interaction event can be determined.
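For the ray detection mentioned above, one common concrete choice is the slab intersection test against the axis-aligned bounding box obtained in the previous sketch. The sketch below is only one possible implementation; the description does not prescribe a particular intersection algorithm.

```python
import numpy as np


def ray_hits_aabb(origin, direction, box_min, box_max):
    """Slab test: does the ray origin + t * direction (t >= 0) pass through the
    axis-aligned box [box_min, box_max]? Returns the hit point or None."""
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    # Avoid division by zero for rays parallel to a slab.
    direction = np.where(np.abs(direction) < 1e-12, 1e-12, direction)

    t1 = (np.asarray(box_min, dtype=float) - origin) / direction
    t2 = (np.asarray(box_max, dtype=float) - origin) / direction
    t_near = np.minimum(t1, t2).max()
    t_far = np.maximum(t1, t2).min()

    if t_near > t_far or t_far < 0:
        return None                       # the ray misses the box entirely
    t_hit = t_near if t_near >= 0 else t_far
    return origin + t_hit * direction     # where the ray enters (or exits) the box
```

If the returned hit point is not None, its position within the bounding box can then be used to decide the specific type of the interaction event, as described above.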
The method for interacting the user interface provided in an embodiment of the present application may further include: and synchronously updating the three-dimensional user interface when the two-dimensional user interface is updated, or synchronously updating the two-dimensional user interface when the three-dimensional user interface is updated.
The application also provides an interaction device of the user interface. Fig. 6 is a schematic structural diagram of an interaction device 600 of a user interface according to an embodiment of the present application. As shown in fig. 6, the interaction means 600 comprises a user interface mapping module 610 and an interaction event response module 620. The user interface mapping module 610 is configured to map a two-dimensional user interface into a three-dimensional user interface. The interaction event response module 620 is configured to, when the three-dimensional user interface receives an interaction event, respond to the interaction event and invoke a corresponding function module of the two-dimensional user interface.
Optionally, the user interface mapping module 610 mapping the two-dimensional user interface into the three-dimensional user interface includes: acquiring a plurality of two-dimensional bounding boxes of the two-dimensional user interface, wherein the two-dimensional bounding boxes comprise pixel continuous areas corresponding to the two-dimensional user interface; and generating the three-dimensional user interface according to the pixel points of the two-dimensional bounding boxes and the pixel continuous area.
Optionally, the user interface mapping module 610 obtaining a plurality of two-dimensional bounding boxes of the two-dimensional user interface includes: acquiring a plurality of planar bounding regions of the two-dimensional user interface and converting them into a plurality of grayscale images; and processing the plurality of grayscale images to obtain the plurality of two-dimensional bounding boxes.
Optionally, the two-dimensional bounding box is a rectangular bounding box.
Optionally, the processing of the grayscale images by the user interface mapping module 610 includes: performing binarization and median filtering on the grayscale images to obtain a plurality of first images; performing edge detection on the plurality of first images to remove edge points and obtain a plurality of second images; filling contour gaps in the plurality of second images using a closing operation and eliminating scattered objects according to a gray threshold to obtain a plurality of third images; and obtaining the plurality of two-dimensional bounding boxes based on pixel continuous areas in the plurality of third images.
Optionally, the user interface mapping module 610 obtaining a plurality of two-dimensional bounding boxes of the two-dimensional user interface includes: generating the plurality of two-dimensional bounding boxes according to a received bounding box generation instruction.
Optionally, the user interface mapping module 610 generating the three-dimensional user interface includes: mapping two-dimensional pixel points in the two-dimensional bounding boxes into voxel points in the three-dimensional bounding box, wherein the two-dimensional pixel points correspond to the voxel points mapped in the three-dimensional bounding box; and generating the three-dimensional user interface according to the voxel points of the three-dimensional bounding box.
Optionally, the interactivity event response module 620 is further configured to detect whether an interactivity event is present using the three-dimensional bounding box.
Optionally, the user interface interaction device 600 may further include a user interface synchronization module 630. The user interface synchronization module 630 is configured to update the three-dimensional user interface synchronously when the two-dimensional user interface is updated, or update the two-dimensional user interface synchronously when the three-dimensional user interface is updated.
It should be noted that the above division of the modules of the interaction apparatus 600 is only a division of logical functions, and the actual implementation may be wholly or partially integrated into one physical entity, or may be physically separated. And these modules can be realized in the form of software called by processing element; or may be implemented entirely in hardware; and part of the modules can be realized in the form of calling software by the processing element, and part of the modules can be realized in the form of hardware. For example, the interactivity event response module may be a separately established processing element, or may be implemented by being integrated in a chip of the system, or may be stored in a memory of the system in the form of program code, and a processing element of the system calls and executes the functions of the interactivity event response module. Other modules are implemented similarly. In addition, all or part of the modules can be integrated together or can be independently realized. The processing element described herein may be an integrated circuit having signal processing capabilities. In implementation, each step of the above method or each module above may be implemented by an integrated logic circuit of hardware in a processor element or an instruction in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs). For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of calling program code. For another example, these modules may be integrated together and implemented in the form of a System-On-a-Chip (SOC).
The present application also provides a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the method for user interface interaction described in any of the embodiments of the present application.
Any combination of one or more storage media may be employed as a computer-readable storage medium in embodiments of the present application. The computer readable storage medium may be, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer-readable storage medium include: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a RAM, a ROM, an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Furthermore, in embodiments of the present application, computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The application also provides an electronic device. Please refer to fig. 7, which is a schematic structural diagram of an electronic device 700 according to an embodiment of the present application. As shown in fig. 7, the electronic device 700 of the present application includes a memory 710 and a processor 720. The memory 710 stores a computer program. The processor 720 is communicatively connected to the memory 710, and when invoked, performs the user interface interaction methods described in any of the above embodiments.
Optionally, the electronic device 700 may further include a display 730, the display 730 being communicatively coupled to the memory 710 and the processor 720 for displaying a GUI (Graphical User Interface) interaction Interface associated with the interaction method of the User Interface.
Illustratively, the processor 720 may be a general-purpose processor, such as a Central Processing Unit (CPU) or a Network Processor (NP). It may also be a Digital Signal Processor (DSP), an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The protection scope of the interaction method of the user interface described in this application is not limited to the execution sequence of the steps listed in this embodiment, and all the solutions implemented by adding, subtracting, and replacing steps in the prior art according to the principles of this application are included in the protection scope of this application.
The present application further provides an interaction device of a three-dimensional user interface, where the interaction device of the user interface can implement the interaction method of the user interface described in the present application, but the implementation device of the interaction method of the user interface described in the present application includes but is not limited to the structure of the interaction device of the user interface described in this embodiment, and all structural modifications and substitutions in the prior art made according to the principles of the present application are included in the protection scope of the present application.
In summary, the application provides an interaction method for a user interface that generates a three-dimensional curved surface by converting a rectangular bounding box into a curved surface, which helps to speed up the conversion from two dimensions to three dimensions and suits different scenarios. For example, on display platforms and processing devices with low configurations or insufficient computing resources, the whole two-dimensional interface can be converted directly into a spherical surface, a hemispherical surface, or another curved surface surrounding it by revolving the rectangular bounding box into the spherical surface, so as to reduce the amount of computation. However, the present application is not limited thereto; for example, on devices with higher configurations, a more accurate bounding box may be obtained using techniques such as edge contour extraction or artificial-intelligence target recognition, so as to generate a more accurate three-dimensional curved surface.
In addition, the user interface interaction method provided by the application not only converts a two-dimensional user interface into a three-dimensional user interface, but also synchronizes and unifies the two at the interaction level. In this way, a three-dimensional user interface can be generated quickly from a mature two-dimensional user interface, two-dimensional and three-dimensional interaction events are handled synchronously, and three-dimensional interaction interfaces can be generated quickly in batches. Moreover, because the method can use the existing function modules of the two-dimensional user interface, no separate interaction scheme needs to be built for the three-dimensional user interface, which speeds up application and feature development and reduces the cost of converting a two-dimensional product into an AR or VR product.
Therefore, the application effectively overcomes various defects in the prior art and has high industrial utilization value.
The above embodiments are merely illustrative of the principles and utilities of the present application and are not intended to limit the present application. Any person skilled in the art can modify or change the above-described embodiments without departing from the spirit and scope of the present application. Accordingly, it is intended that all equivalent modifications or changes which may be made by those skilled in the art without departing from the spirit and technical spirit of the present disclosure be covered by the claims of the present application.

Claims (12)

1. An interactive method of a user interface, comprising:
mapping the two-dimensional user interface into a three-dimensional user interface;
and when the three-dimensional user interface receives an interaction event, responding to the interaction event and calling a corresponding function module of the two-dimensional user interface.
2. The method of claim 1, wherein mapping a two-dimensional user interface to a three-dimensional user interface comprises:
acquiring a plurality of two-dimensional bounding boxes of the two-dimensional user interface, wherein the two-dimensional bounding boxes comprise pixel continuous areas corresponding to the two-dimensional user interface;
and generating the three-dimensional user interface according to the pixel points of the two-dimensional bounding boxes and the pixel continuous area.
3. The method of claim 2, wherein obtaining the plurality of two-dimensional bounding boxes of the two-dimensional user interface comprises:
acquiring a plurality of planar bounding regions of the two-dimensional user interface, and converting the plurality of planar bounding regions into a plurality of grayscale images;
and processing the grayscale images to obtain a plurality of two-dimensional bounding boxes.
4. The method of claim 3, wherein the two-dimensional bounding box is a rectangular bounding box.
5. The method of claim 3, wherein the processing comprises:
performing binarization and median filtering on the grayscale images to obtain a plurality of first images;
performing edge detection on the plurality of first images to remove edge points and obtain a plurality of second images;
filling contour gaps in the plurality of second images using a closing operation, and eliminating scattered objects according to a gray threshold to obtain a plurality of third images;
and obtaining a plurality of two-dimensional bounding boxes based on pixel continuous areas in the plurality of third images.
6. The method of claim 2, wherein obtaining the plurality of two-dimensional bounding boxes of the two-dimensional user interface comprises: generating the plurality of two-dimensional bounding boxes according to a received bounding box generation instruction.
7. The method of claim 2, wherein generating the three-dimensional user interface comprises:
mapping two-dimensional pixel points in the two-dimensional bounding boxes into voxel points in the three-dimensional bounding box, wherein the two-dimensional pixel points correspond to the voxel points mapped in the three-dimensional bounding box;
and generating the three-dimensional user interface according to the voxel points of the three-dimensional bounding box.
8. The method of claim 7, further comprising: and detecting whether an interaction event exists by utilizing the three-dimensional bounding box.
9. The method of claim 1, further comprising:
and when the two-dimensional user interface is updated, synchronously updating the three-dimensional user interface, or when the three-dimensional user interface is updated, synchronously updating the two-dimensional user interface.
10. An interactive device of a user interface, comprising:
the user interface mapping module is used for mapping the two-dimensional user interface into a three-dimensional user interface;
and the interactive event response module is used for responding to the interactive event and calling the corresponding function module of the two-dimensional user interface when the three-dimensional user interface receives the interactive event.
11. A computer-readable storage medium having stored thereon a computer program, characterized in that: the computer program, when executed by a processor, implements the interaction method of the user interface of any one of claims 1 to 9.
12. An electronic device, characterized in that the electronic device comprises:
a memory storing a computer program;
a processor, communicatively coupled to the memory, that executes the interactive method of the user interface of any of claims 1 to 9 when the computer program is invoked.
CN202211013833.2A 2022-08-23 2022-08-23 User interface interaction method, device, medium and equipment Pending CN115407906A

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211013833.2A CN115407906A (en) 2022-08-23 2022-08-23 User interface interaction method, device, medium and equipment


Publications (1)

Publication Number Publication Date
CN115407906A 2022-11-29

Family

ID=84161146

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211013833.2A Pending CN115407906A (en) 2022-08-23 2022-08-23 User interface interaction method, device, medium and equipment

Country Status (1)

Country Link
CN (1) CN115407906A


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. B1337, Chuanggu Initiation Zone, Taizihu Cultural Digital Creative Industry Park, No. 18 Shenlong Avenue, Wuhan Economic and Technological Development Zone, Wuhan, Hubei Province, 430058

Applicant after: Hubei Xingji Meizu Technology Co.,Ltd.

Address before: No. B1337, Chuanggu Start-up Zone, Taizi Lake Cultural Digital Creative Industry Park, No. 18 Shenlong Avenue, Wuhan Economic and Technological Development Zone, Wuhan City, Hubei Province 430000

Applicant before: Hubei Xingji times Technology Co.,Ltd.
