CN115357112A - VR remote education device based on 5G technology - Google Patents


Info

Publication number
CN115357112A
Authority
CN
China
Prior art keywords
module
touch
mobile phone
processor
memory
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202110883267.XA
Other languages
Chinese (zh)
Inventor
高洁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shaanxi Xueqian Normal University
Original Assignee
Shaanxi Xueqian Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Shaanxi Xueqian Normal University
Priority to CN202110883267.XA
Publication of CN115357112A
Legal status: Withdrawn

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06T3/08
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00: Electrically-operated educational appliances
    • G09B5/06: Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • G09B5/065: Combinations of audio and video presentations, e.g. videotapes, videodiscs, television systems
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106: Processing image signals
    • H04N13/172: Processing image signals, the image signals comprising non-image signal components, e.g. headers or format information
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30: Image reproducers
    • H04N13/332: Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30: Services specially adapted for particular environments, situations or purposes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048: Indexing scheme relating to G06F3/048
    • G06F2203/04802: 3D-info-object: information is displayed on the internal or external surface of a three dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user

Abstract

The invention discloses a VR remote education device based on 5G technology, which comprises an electronic device and virtual reality glasses. The electronic device receives, in VR mode, a notification message that is designated for display by a first user interface in the normal mode. The electronic device is configured to generate a second interface based on the first interface and to display the second interface, so that the notification message can be displayed in VR mode. The electronic device is configured to receive a first user input corresponding to a first position on the second interface, and to determine the second position on the first interface corresponding to that first position, so that the response to the first user input can interact with the notification. The device is provided with a teacher end and a student end, so that students and teachers can interact more vividly through 5G communication between the two ends.

Description

VR remote education device based on 5G technology
Technical Field
The invention relates to the fields of remote education and VR, and in particular to a VR remote education device based on 5G technology.
Background
Remote online education is the general term for the practical activities of a specific educational organization that comprehensively applies the technologies of a given social period to collect, design, develop, and use various educational resources, to construct an educational environment, and to provide educational services to students on the basis of those technologies, resources, and environment. It further organizes collective communication activities for students (conducted in the traditional face-to-face manner or by modern electronic means) for the purposes of teaching and socialization, in order to help and promote students' remote learning.
5G is the latest generation of cellular mobile communication technology and the successor to the 4G, 3G, and 2G systems. Its performance targets are high data rates, reduced latency, energy savings, reduced cost, increased system capacity, and large-scale device connectivity. The first phase of the 5G specifications accommodated early commercial deployment, and the second phase was completed in April 2020. VR remote education has also begun to apply 5G technology.
The network courses of existing remote education devices lack emotional exchange between students and teachers during teaching: knowledge is transmitted only through network video, and most remote education devices cannot provide students with relatively lifelike visual, auditory, and tactile sensations. In addition, teachers find it difficult to interact with students during remote teaching. Both shortcomings affect the quality of teaching.
Disclosure of Invention
In order to overcome the defects and shortcomings of the prior art, the invention provides a VR remote education device based on 5G technology.
In an aspect of the present invention, a VR remote education apparatus based on 5G technology is provided. The terminal includes a mobile phone 100, and the VR device includes VR glasses 200. The terminal may include at least one memory, a touch screen, and a processor. The memory may be configured to store software programs, and the processor performs the various functions of the terminal by running the software programs stored in the memory. The touch screen may be configured to display information input by the user, information provided to the user, and the various menus of the terminal, and may also accept user input. The state of the mobile phone 100 is set to VR mode, and the mobile phone 100 is placed in the VR glasses 200.
In the present invention, an external device (e.g., a Bluetooth handle) is used to operate the 2D application on the mobile phone 100 and project the 2D application to the 3D interface of the VR glasses. The mobile phone 100 includes a radio frequency (RF) circuit 110, a memory 120, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a wireless fidelity (5G) module 170, a processor 180, and a power supply 190.
In the present invention, the RF circuit 110 receives and transmits information, or receives and transmits signals, during a call. It receives downlink information of a base station and transmits the downlink information to the processor 180, and the server processes the downlink information. Further, the RF circuit 110 may also transmit uplink data to a base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 110 may also communicate with a network and/or another device via wireless communication. Any communication standard or protocol may be used for the wireless communication, including but not limited to global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), long term evolution (LTE), email, and short message service (SMS).
In the present invention, the memory 120 may be configured to store software programs and modules. The memory 120 includes, but is not limited to, an operating system, a communication module, a contact/motion module, a graphics module, a text output module, application programs, a content capture module, a camera module, a surface finger module, a buffer module, a touch module, a Bluetooth module, and the like. In addition, the memory 120 may mainly include a program storage area, which may store the operating system and the application programs required for at least one function (e.g., an audio or image playback function), and a data storage area, which may store data created through use of the mobile phone 100. The memory 120 may further include high-speed random access memory, and may also include non-volatile memory such as at least one magnetic disk storage device or flash memory device, or another volatile solid-state storage device.
In the present invention, the contact/motion module in the memory 120 is configured to: detect contact of an object or finger with the touch screen 140 or a click wheel; capture the speed (direction and magnitude) and acceleration (change in magnitude or direction) of the contact; and determine the type of the contact event. For example, the contact/motion module further includes multiple types of contact event detection modules, such as a finger-down or finger-up detection module, a finger-dragging module, and a finger-tap module. Sometimes a gesture, such as a finger pinch or drop, is used together with an element on the UI interface to implement an operation.
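The patent specifies what the contact/motion module captures (speed as direction and magnitude, acceleration as their change) but not how. A minimal Python sketch of one way to derive these quantities from timestamped touch samples follows; the function name and the (t, x, y) sampling format are assumptions for illustration, not part of the patent.

```python
import math

def touch_kinematics(samples):
    """Estimate speed, direction, and acceleration of a touch contact.

    samples: list of (t, x, y) tuples ordered by time t in seconds,
    as a touch detection device might report them (assumed format).
    """
    # Finite-difference velocities between consecutive samples.
    velocities = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        velocities.append((t1, (x1 - x0) / dt, (y1 - y0) / dt))

    # Acceleration as the change in speed between consecutive velocities.
    events = []
    for (ta, vxa, vya), (tb, vxb, vyb) in zip(velocities, velocities[1:]):
        speed_a, speed_b = math.hypot(vxa, vya), math.hypot(vxb, vyb)
        events.append({
            "time": tb,
            "speed": speed_b,                      # magnitude
            "direction": math.atan2(vyb, vxb),     # radians
            "acceleration": (speed_b - speed_a) / (tb - ta),
        })
    return events

# A horizontal drag that speeds up: speed rises, acceleration is positive.
print(touch_kinematics([(0.00, 0, 0), (0.05, 2, 0), (0.10, 6, 0), (0.15, 14, 0)]))
```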
In the present invention, the Bluetooth module is configured to connect to an external device, which operates the mobile phone through the Bluetooth module; the external device includes a device that can remotely control the mobile phone, such as a Bluetooth handle. The graphics module is configured to render and display graphics on a touch screen or other display. The graphics include web pages, icons, digital images, videos, and animations, and the applications may include a contacts application, a phone application, a video conferencing application, an email client, an instant messaging application, a personal sports application, a camera application, an image management application, a video player, a music player, a calendar application, plug-ins (a weather, stock, calculator, clock, or dictionary plug-in), custom plug-ins, a search application, a note-taking application, a map application, an online video application, and so forth.
In the present invention, the input unit 130 may be configured to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile phone 100. The input unit 130 may include a touch panel 131 and other input devices 132. The touch panel 131, also referred to as a touch screen, may collect touch operations performed by the user on or near it (for example, operations performed with any suitable object or accessory such as a finger or a stylus) and drive the corresponding connection device according to a preset program. In addition, the touch panel 131 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller. The touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and transmits the coordinates to the processor 180; it may also receive commands sent by the processor 180 and execute them. In particular, the touch panel 131 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 131, the input unit 130 may include other input devices 132, which may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys or an on/off key), a trackball, a mouse, a joystick, and a Bluetooth handle.
In the present invention, the display unit 140 may be configured to display information input by or provided to the user and the various menus of the mobile phone 100. The display unit 140 may include a display panel 141, which may be configured in the form of a liquid crystal display (LCD), an organic light emitting diode (OLED), or the like. In addition, the touch panel 131 may cover the display panel 141. After detecting a touch operation on or near it, the touch panel 131 sends the touch operation to the processor 180 to determine the type of the touch event; the processor 180 then provides a corresponding visual output on the display panel 141 based on that type. In fig. 2, the touch panel 131 and the display panel 141 are used as two independent components to implement the input and output functions of the mobile phone 100. However, in some embodiments, the touch panel 131 and the display panel 141 may be integrated to implement the input and output functions of the mobile phone 100.
In the present invention, the mobile phone 100 may further include at least one sensor 150, such as a light sensor, a motion sensor, or another sensor. In particular, the light sensor may include an ambient light sensor and a proximity sensor. The ambient light sensor may adjust the brightness of the display panel 141 based on the brightness of the ambient light, and the proximity sensor may turn off the display panel 141 and/or the backlight when the mobile phone 100 is moved to the ear. As one kind of motion sensor, an accelerometer can detect acceleration values in all directions (typically three axes) and can detect the value and direction of gravity when the mobile phone is stationary; it can be applied to recognizing the posture of the mobile phone (for example, switching between landscape and portrait modes, related games, or magnetometer posture calibration), to vibration recognition related functions (for example, a pedometer or knock recognition), and the like. Other sensors, such as a gyroscope, barometer, hygrometer, thermometer, or infrared sensor, may be further configured in the mobile phone 100.
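As an illustration of the landscape/portrait switching mentioned above, the sketch below classifies a stationary phone's posture from the gravity vector reported by the accelerometer. The axis convention and threshold are assumptions chosen for the example.

```python
def classify_orientation(ax, ay, az, threshold=7.0):
    """Classify phone posture from one accelerometer reading in m/s^2.

    When the phone is stationary, the accelerometer measures gravity
    (about 9.8 m/s^2) along whichever axis points down, which is enough
    to distinguish portrait from landscape.
    """
    if ay > threshold:
        return "portrait"
    if ay < -threshold:
        return "portrait-upside-down"
    if ax > threshold:
        return "landscape-left"
    if ax < -threshold:
        return "landscape-right"
    return "flat-or-moving"  # gravity mostly on z, or the device is in motion

print(classify_orientation(0.2, 9.7, 0.4))    # portrait
print(classify_orientation(-9.6, 0.1, 0.8))   # landscape-right
```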
In the present invention, the audio circuit 160, the speaker 161, and the microphone 162 may provide an audio interface between the user and the mobile phone 100. The audio circuit 160 may transmit an electric signal, converted from the received audio data, to the speaker 161, and the speaker 161 converts the electric signal into a sound signal for output. Conversely, the microphone 162 converts collected sound signals into electrical signals, and the audio circuit 160 receives these and converts them into audio data, then either outputs the audio data to the RF circuit 110 for transmission to, for example, another mobile phone, or outputs the audio data to the memory 120 for further processing.
In the present invention, the processor 180 is the control center of the mobile phone 100. It connects all parts of the entire mobile phone through various interfaces and lines, and implements the various functions of the mobile phone 100 and processes data by running or executing the software programs and/or modules stored in the memory 120 and calling the data stored in the memory 120, thereby performing overall monitoring of the mobile phone. Optionally, the processor 180 may include one or more processing units, and an application processor and a modem processor may be integrated into the processor 180. The application processor mainly handles the operating system, user interface, application programs, and the like, while the modem processor mainly handles wireless communications. It will be appreciated that the modem processor may alternatively not be integrated into the processor 180.
In the present invention, the mobile phone 100 further includes a power supply 190 (e.g., a battery) for supplying power to the components. The power supply may be logically connected to the processor 180 through a power management system, thereby implementing functions such as charge management, discharge management, and power consumption management.
Drawings
FIG. 1 is a schematic block diagram of a terminal and a device according to an embodiment of the present invention
Fig. 2 is a schematic configuration diagram of a terminal according to an embodiment of the present invention
FIG. 3 is a schematic flow chart of a method for displaying a 2D application in a VR device in accordance with embodiments of the invention
FIG. 4 is a schematic flow chart diagram of the mapping principle according to an embodiment of the present invention
FIGS. 5 (a) and 5 (b) are schematic block diagrams of mapping principles according to embodiments of the present invention
FIG. 6 is a schematic configuration diagram of another terminal according to an embodiment of the present invention
FIG. 7 is a diagram illustrating an implementation effect of displaying a 2D application in a VR device according to an embodiment of the invention
Detailed Description
It should be noted that the embodiments and features of the embodiments can be combined with each other without conflict, and the present application will be further described in detail with reference to the drawings and specific embodiments.
Fig. 1 is a schematic configuration diagram of a terminal and a device according to an embodiment of the present invention, showing a VR remote education apparatus based on 5G technology. As shown in fig. 1, the application scenario includes a terminal and a VR device: the terminal includes a mobile phone 100, and the VR device includes VR glasses 200. In particular, the terminal may include at least one memory, a touch screen, and a processor. The memory may be configured to store software programs; the processor performs the various functions of the terminal by running the software programs stored in the memory; and the touch screen may be configured to display information input by the user, information provided to the user, and the various menus of the terminal, and may also accept user input. For convenience of description, a detailed description is provided below using an example in which the terminal is a mobile phone. The state of the mobile phone 100 is set to VR mode, and the mobile phone 100 is placed in the VR glasses 200. An external device (e.g., a Bluetooth handle) is used to operate the 2D application on the mobile phone 100 and project the 2D application to the 3D interface of the VR glasses. Specifically, the display method provided in this embodiment of the present invention is performed in VR mode. Fig. 2 is a schematic configuration diagram of a terminal, namely the mobile phone, according to an embodiment of the present invention. As shown in the block diagram of part of the structure of the mobile phone 100 in fig. 2, the mobile phone 100 includes a radio frequency (RF) circuit 110, a memory 120, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a wireless fidelity (5G) module 170, a processor 180, and a power supply 190. Those skilled in the art will appreciate that the mobile phone structure shown in fig. 2 does not constitute a limitation on the mobile phone; the mobile phone may include more or fewer components than those shown, a combination of some components, or components arranged differently. Each component of the mobile phone 100 is described in detail below:
the RF circuit 110 may be configured to receive and transmit information, or receive and transmit signals, during a call. The RF circuit 110 receives downlink information of the base station and transmits it to the processor 180; the server processes the downlink information. In addition, the RF circuit 110 may also transmit uplink data to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. The RF circuit 110 may also communicate with a network and/or another device via wireless communication. Any communication standard or protocol may be used, including but not limited to global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), long term evolution (LTE), email, and short message service (SMS).
The memory 120 may be configured to store software programs and modules. In general, the memory 120 includes, but is not limited to, an operating system, a communication module, a contact/motion module, a graphics module, a text output module, application programs, a content capture module, a camera module, a surface finger module, a buffer module, a touch module, a Bluetooth module, and the like. In addition, the memory 120 may mainly include a program storage area and a data storage area. The program storage area may store the operating system and the application programs required for at least one function (e.g., an audio or image playback function). The data storage area may store data (e.g., audio data and a phonebook) created through use of the mobile phone 100. The memory 120 may further include high-speed random access memory, and may also include non-volatile memory such as at least one magnetic disk storage device or flash memory device, or another volatile solid-state storage device. Additionally, the contact/motion module included in the memory 120 is configured to: detect contact of an object or finger with the touch screen 140 or a click wheel; capture the speed (direction and magnitude) and acceleration (change in magnitude or direction) of the contact; and determine the type of the contact event. For example, the contact/motion module further includes multiple types of contact event detection modules, such as a finger-down or finger-up detection module, a finger-dragging module, and a finger-tap module. Sometimes a gesture, such as a finger pinch or drop, is used together with an element on the UI interface to implement an operation.
The bluetooth module is configured to connect to an external device. The external device performs an operation on the mobile phone by using the bluetooth module, and the external device includes a device that can remotely control the mobile phone, such as a bluetooth handle.
The graphics module is configured to render and display graphics on a touch screen or other display. Graphics include web pages, icons, digital images, video, and animations.
The applications may include a contacts application, a phone application, a video conferencing application, an email client, an instant messaging application, a personal sports application, a camera application, an image management application, a video player, a music player, a calendar application, a plug-in (weather plug-in, stock plug-in, calculator plug-in, clock plug-in, or dictionary plug-in), a custom plug-in, a search application, a notes application, a maps application, an online video application, and so forth.
The input unit 130 may be configured to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile phone 100. The input unit 130 may include a touch panel 131 and other input devices 132. The touch panel 131, also referred to as a touch screen, may collect touch operations performed by the user on or near it (for example, operations performed with any suitable object or accessory such as a finger or a stylus) and drive the corresponding connection device according to a preset program. In addition, the touch panel 131 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller. The touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and transmits the coordinates to the processor 180; it may also receive commands sent by the processor 180 and execute them. Specifically, the touch panel 131 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 131, the input unit 130 may include other input devices 132, which may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys or an on/off key), a trackball, a mouse, a joystick, and a Bluetooth handle.
The display unit 140 may be configured to display information input by or provided to the user and the various menus of the mobile phone 100. The display unit 140 may include a display panel 141, which may be configured in the form of a liquid crystal display (LCD), an organic light emitting diode (OLED), or the like. In addition, the touch panel 131 may cover the display panel 141. After detecting a touch operation on or near it, the touch panel 131 sends the touch operation to the processor 180 to determine the type of the touch event; the processor 180 then provides a corresponding visual output on the display panel 141 based on that type. In fig. 2, the touch panel 131 and the display panel 141 are used as two independent components to implement the input and output functions of the mobile phone 100. However, in some embodiments, the touch panel 131 and the display panel 141 may be integrated to implement the input and output functions of the mobile phone 100.
The mobile phone 100 may also include at least one sensor 150, such as a light sensor, a motion sensor, or another sensor. In particular, the light sensor may include an ambient light sensor and a proximity sensor. The ambient light sensor may adjust the brightness of the display panel 141 based on the brightness of the ambient light, and the proximity sensor may turn off the display panel 141 and/or the backlight when the mobile phone 100 is moved to the ear. As one kind of motion sensor, an accelerometer can detect acceleration values in all directions (usually three axes) and can detect the value and direction of gravity when the mobile phone is stationary; it can be applied to recognizing the posture of the mobile phone (for example, switching between landscape and portrait modes, related games, or magnetometer posture calibration), to vibration recognition related functions (for example, a pedometer or knock recognition), and the like. Other sensors, such as a gyroscope, barometer, hygrometer, thermometer, or infrared sensor, may be further configured in the mobile phone 100; details are not described here.
The audio circuit 160, the speaker 161, and the microphone 162 may provide an audio interface between the user and the mobile phone 100. The audio circuit 160 may transmit an electric signal, converted from the received audio data, to the speaker 161, and the speaker 161 converts the electric signal into a sound signal for output. Conversely, the microphone 162 converts collected sound signals into electrical signals, and the audio circuit 160 receives these and converts them into audio data, then either outputs the audio data to the RF circuit 110 for transmission to, for example, another mobile phone, or outputs the audio data to the memory 120 for further processing.
5G is a wireless transmission technology covering both short and long distances. Using the 5G module 170, the mobile phone 100 can help the user receive and send e-mail, browse web pages, access streaming media, and so on. The 5G module 170 provides the user with wireless broadband access to the Internet. Although fig. 2 shows the 5G module 170, it is understood that the 5G module 170 is not an essential part of the mobile phone 100 and may of course be omitted as needed without changing the spirit of the present invention.
The processor 180 executes the software programs and modules stored in the memory 120 to implement the various functional applications of the mobile phone 100 and to process data. The processor 180 is the control center of the mobile phone 100: it connects all parts of the entire mobile phone through various interfaces and lines, and performs overall monitoring of the mobile phone by running or executing the software programs and/or modules stored in the memory 120 and calling the data stored in the memory 120. Optionally, the processor 180 may include one or more processing units, and an application processor and a modem processor may be integrated into it. The application processor mainly handles the operating system, user interface, application programs, and the like; the modem processor mainly handles wireless communications. It is understood that the modem processor may alternatively not be integrated into the processor 180.
The mobile phone 100 also includes a power supply 190 (e.g., a battery) that provides power to the components. The power supply may be logically connected to the processor 180 through a power management system, thereby implementing functions such as charge management, discharge management, and power consumption management. Although not shown, the mobile phone 100 may further include a camera and the like; details are not described here.
The following embodiments provide detailed descriptions with reference to fig. 3 and fig. 7. Fig. 3 is a schematic flow diagram of a method for displaying a 2D application in a VR device according to an embodiment of the invention, and fig. 7 is a diagram of the implementation effect of displaying a 2D application in a VR device according to an embodiment of the present invention. The user may first set a terminal (e.g., the mobile phone 100 shown in fig. 1) to start VR mode, and then place the mobile phone, in VR mode, into the VR glasses. The step of starting VR mode may include: selecting "start VR mode" on the phone before the VR glasses are connected to it; having the phone start VR mode by detecting the connection signal when the VR glasses are connected; or, once the VR glasses are connected to the phone, inputting a signal through an input unit of the VR glasses and selecting "start VR mode". As shown in fig. 3, the method for displaying a 2D application in a VR device specifically includes the following steps.
S301. In VR mode, the terminal determines a first notification interface of the normal mode, where the first notification interface includes a notification message, and the notification message is from a 2D application.
Specifically, the terminal determines the notification message in VR mode, and determines the first notification interface of the normal mode. It should be noted that, in this embodiment of the present invention, a 2D application may be understood as an application in the normal mode.
In VR mode, the graphics system of the terminal and the various devices providing VR technology (e.g., VR glasses) may be used together to provide an interactive three-dimensional environment for the user. In normal mode, the terminal's graphics system may be used to provide image information to the user and to interact with the user. That is, when the terminal is in normal mode, the user may interact without using VR glasses; when the terminal is in virtual reality mode, the user sees virtual three-dimensional image information through the virtual reality glasses and can interact by means of it. The normal mode may also be referred to as non-VR mode.
In a possible embodiment, the surface finger module in the processor 180 shown in fig. 2 receives the notification message, and the surface finger module determines whether the notification message is from a 2D application.
If the notification message is from a 3D application, the surface finger module sends it to the buffer module in the processor 180 shown in fig. 2, and the buffer module stores the notification message and projects it directly onto the 3D interface of the VR glasses 200 shown in fig. 1. At this time, a 3D model of each application is stored in the system of the mobile phone, and the 3D model of at least one application is displayed in the current VR mode.
If the notification message is from the 2D application, S302 is performed.
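The dispatch just described (3D notifications buffered and projected directly, 2D notifications diverted to the mapping of S302) can be summarized in a short sketch. All names and data structures below are hypothetical stand-ins, not interfaces defined by the patent:

```python
def map_2d_to_3d(interface_2d):
    # Placeholder for step S302: attach the 2D interface to a 3D model region.
    return {"model": "cube", "texture": interface_2d}

def route_notification(notification, buffer_store, projected):
    """Sketch of the surface finger module's routing decision (S301)."""
    if notification["source"] == "3D":
        buffer_store.append(notification["interface"])   # buffer module stores it
        projected.append(notification["interface"])      # projected directly in 3D
    else:
        projected.append(map_2d_to_3d(notification["interface"]))  # S302 path

buffer_store, projected = [], []
route_notification({"source": "2D", "interface": "gallery screenshot"},
                   buffer_store, projected)
print(projected)  # [{'model': 'cube', 'texture': 'gallery screenshot'}]
```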
S302. The terminal determines a second notification interface based on the first notification interface in VR mode, and displays the second notification interface in VR mode.
Specifically, the terminal maps the first notification interface to a 3D model to determine the second notification interface, and the terminal displays the second notification interface in virtual reality mode.
In a possible embodiment, the surface finger module receives the notification message from the 2D application and determines a 2D notification interface on which the notification message for the 2D application is located, wherein the 2D notification interface includes the notification message from the 2D application.
For example, if the 2D application is a gallery and the notification message is a screen capture instruction, the following steps are performed: the surface finger module receives a screen capture instruction from the gallery application and determines a screenshot to capture based on the screen capture instruction.
In a possible embodiment, the terminal maps the 2D notification interface to a 3D model in VR mode.
In particular, the surface finger module maps the 2D notification interface to a 3D model in VR mode. In the VR mode, the terminal maps the first notification interface to some or all of the regions of the 3D model. The mapping in this step may be understood as mapping the 2D notification interface to the 3D model in VR mode.
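To make the correspondence concrete, the sketch below maps one pixel of the 2D notification interface onto a rectangular region of a 3D model, which is one simple way the described mapping could be realized. The face geometry, resolution, and names are illustrative assumptions:

```python
import numpy as np

def map_interface_point_to_model(u_px, v_px, width, height,
                                 face_origin, face_u, face_v):
    """Map pixel (u_px, v_px) of a width x height 2D interface onto a
    rectangular face of the 3D model spanned by face_u and face_v."""
    s, t = u_px / width, v_px / height  # normalize pixel to [0, 1]
    return (np.asarray(face_origin)
            + s * np.asarray(face_u)
            + t * np.asarray(face_v))

# Map the center of a 1080x1920 interface onto the front face of a unit cube.
p = map_interface_point_to_model(540, 960, 1080, 1920,
                                 face_origin=(0.0, 0.0, 1.0),
                                 face_u=(1.0, 0.0, 0.0),
                                 face_v=(0.0, 1.0, 0.0))
print(p)  # [0.5 0.5 1. ]
```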
The surface finger module obtains 3D data in a 3D model corresponding to the 2D notification interface.
The surface finger module sends the corresponding 3D data to a buffer module in the processor shown in fig. 2, and the buffer module stores the 3D data and directly projects the 3D data onto the 3D interface of the VR glasses 200 shown in fig. 1.
The surface finger module maps the screenshot to a 3D model in VR mode; in other words, it establishes a correspondence between the coordinates of the 2D screenshot and the coordinates of the 3D model. In VR mode, the terminal maps the first notification interface to some or all regions of the 3D model.
The surface finger module sends the 3D data corresponding to the screenshot to the buffer module in the processor shown in fig. 2, and the buffer module stores the 3D data and directly projects it onto the 3D interface of the VR glasses 200 shown in fig. 1.
S303. The terminal determines a user operation in VR mode, where the touch point of the user operation is located on the second notification interface, and the position information of the touch point on the second notification interface is the first position information.
Specifically, the terminal determines an operation instruction of the terminal by the user in the VR mode, and the terminal determines whether the 2D application should respond to the operation instruction based on the position of the touch point operated by the user. Additionally, the VR glasses may also determine user operations.
If the touch operation of the user is on the interface of the 3D application, the touch module in the processor 180 shown in fig. 2 responds to the operation instruction, and the 3D application performs a corresponding operation according to the operation instruction.
If the touch point operated by the user is located on the second notification interface, S304 is performed.
S304. The terminal determines second position information based on the first position information in VR mode, so that the 2D application responds to the user operation based on the second position information, where the second position information is position information on the first notification interface.
Specifically, the first position information is a three-dimensional coordinate of the user operation point, and the second position information is a 2D coordinate on the first notification interface corresponding to the 3D coordinate. In a possible embodiment, the terminal maps the position indicated by the 3D coordinates of the touch point to the position indicated by the 2D coordinates, and the terminal obtains the position indicated by the current 2D coordinates. And the terminal simulates an operation instruction for touching the two-dimensional application program according to the position indicated by the two-dimensional coordinates. The touch module in the processor 180 shown in fig. 2 responds to the operation instruction, and the 2D application performs a corresponding operation according to the operation instruction. The mapping in this process can be understood as the reverse process of S302. That is, in the VR mode, a three-dimensional model of the touch point is mapped onto the two-dimensional interface.
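A matching sketch of this reverse mapping: given the 3D coordinates of the touch point on a rectangular model face, recover the 2D pixel on the first notification interface so the touch can be simulated for the 2D application. The planar-face geometry mirrors the earlier sketch and remains an assumption for illustration:

```python
import numpy as np

def map_touch_to_interface(p3d, face_origin, face_u, face_v, width, height):
    """Invert the interface-to-model mapping for step S304."""
    basis = np.column_stack([face_u, face_v])        # 3x2 basis of the face
    offset = np.asarray(p3d, dtype=float) - np.asarray(face_origin, dtype=float)
    (s, t), *_ = np.linalg.lstsq(basis, offset, rcond=None)
    return s * width, t * height                     # back to pixel coordinates

x, y = map_touch_to_interface((0.25, 0.5, 1.0),
                              face_origin=(0.0, 0.0, 1.0),
                              face_u=(1.0, 0.0, 0.0),
                              face_v=(0.0, 1.0, 0.0),
                              width=1080, height=1920)
print(round(x), round(y))  # 270 960
```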
The terminal projects an interface of the 2D application onto a 3D screen of VR glasses. The specific effect view may be as shown in fig. 5 (a) and 5 (b).
Fig. 4 is a schematic illustration of the mapping principle according to an embodiment of the invention. For the mapping in the above description, the mapping principle is described in detail with reference to fig. 2, 5 (a), and 5 (b). Fig. 5 (a) and 5 (b) are schematic structural diagrams of a mapping principle according to an embodiment of the present invention. Mapping a two-dimensional notification interface to a three-dimensional interface in VR mode includes the following steps.
S401. The terminal obtains the coordinates of the 2D notification interface, where the 2D notification interface includes the notification message from the 2D application.
In particular, the surface finger module in the processor 180 shown in fig. 2 obtains the coordinates of the 2D notification interface. As shown in fig. 5 (a), 501 is an interface of a 2D application, and cube 502 is a 3D model corresponding to a 2D application display interface.
S402. The terminal obtains a three-dimensional model contained in the system.
Specifically, the terminal determines a 3D model of the 2D notification interface in the system based on the coordinates of the 2D notification interface, as shown by the cube 502 in fig. 5 (a) and 5 (b).
S403. The terminal maps the coordinates of the two-dimensional notification interface to the three-dimensional model.
Specifically, the coordinate values of the 2D notification interface in S401 may be mapped to the 3D model. This process can be illustrated in fig. 5 (b). In the VR mode, the terminal maps the first notification interface to some or all of the regions of the 3D model.
For example, the mapping can also be understood as follows: the three-dimensional coordinates (coordinates on the 3D model) and the two-dimensional coordinates (coordinates of the 2D notification interface) may form a 3D-2D coordinate pair. The process of calculating the 2D coordinates corresponding to a three-dimensional coordinate P is then shown by equations (1), (2), and (3). In equation (1), K is the intrinsic projection matrix, R is a 3 × 3 orthogonal rotation matrix, t is a 3 × 1 translation vector, and s is the scale factor of the homogeneous coordinates. Since the coordinates of the virtual object in the virtual reality device are known, the values of R and t in equation (1) are known:

$$ s\,\tilde{p} = K \,[\, R \mid t \,]\, \tilde{P} \tag{1} $$

The orthogonal rotation matrix R in equation (1) can be represented by three 3 × 1 column vectors, as shown in equation (2):

$$ R = [\, r_1 \;\; r_2 \;\; r_3 \,] \tag{2} $$

In this case, the homogeneous coordinate of the point p in the two-dimensional image, onto which a three-dimensional coordinate P(X, Y, 0) on the three-dimensional plane is projected, can be represented by equation (3):

$$ s\,\tilde{p} = K \,[\, r_1 \;\; r_2 \;\; t \,] \begin{bmatrix} X \\ Y \\ 1 \end{bmatrix} \tag{3} $$

At this time, the transformation H between an arbitrary point in an image outside the display screen of the smart device (e.g., the mobile phone) and an arbitrary point in the two-dimensional image displayed on the head-mounted device (e.g., the VR glasses) can be represented by the 3 × 3 matrix of equation (4):

$$ H = [\, K r_1 \;\; K r_2 \;\; K t \,] = K \,[\, r_1 \;\; r_2 \;\; t \,] \tag{4} $$
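As a concrete check of equations (1) to (4), the snippet below builds H = K[r1 r2 t] from an assumed intrinsic matrix K, projects a plane point to pixel coordinates, and then recovers it with the inverse transformation discussed next. All numeric values are illustrative:

```python
import numpy as np

K = np.array([[800.0,   0.0, 540.0],   # assumed intrinsic matrix:
              [  0.0, 800.0, 960.0],   # focal lengths and principal point
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # orthogonal rotation, R = [r1 r2 r3]
t = np.array([0.0, 0.0, 2.0])          # translation: plane 2 units ahead

# Equation (4): H = [K*r1  K*r2  K*t] maps plane points (X, Y, 1) to pixels.
H = K @ np.column_stack([R[:, 0], R[:, 1], t])

P_plane = np.array([0.5, -0.25, 1.0])  # plane point (X, Y, 0), written (X, Y, 1)
p = H @ P_plane
p /= p[2]                              # homogeneous -> pixel coordinates
print(p[:2])                           # [740. 860.]

# Inverse transformation: recover the plane point from the pixel (cf. H^-1).
P_rec = np.linalg.inv(H) @ p
print(P_rec / P_rec[2])                # [ 0.5  -0.25  1.  ]
```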
Similarly, the inverse transformation H⁻¹ may be calculated to obtain the three-dimensional coordinate P corresponding to an arbitrary point p on the two-dimensional plane.
Fig. 6 is a schematic configuration diagram of another terminal according to an embodiment of the present invention. As shown in fig. 6, the terminal includes a transceiver 601, a processor 602, and a display.
The transceiver 601 is configured to determine a notification message in the VR mode, wherein the notification message is from a 2D application.
The processor 602 is configured to determine a 2D notification interface, where the 2D notification interface comprises the notification message. The processor 602 is further configured to determine a second notification interface based on the first notification interface in VR mode.
The display is configured to display a second notification interface in the VR mode.
The processor 602 is further configured to determine a user operation in the VR mode, wherein a touch point of the user operation is located on the second notification interface and the position information of the touch point on the second notification interface is the first position information. The processor is further configured to determine, in the VR mode, second location information based on the first location information such that the 2D application responds to the user operation based on the second location information, wherein the second location information is location information on the first notification interface.
The transceiver 601 is further configured to: a user operational interface is projected onto a 3D interface of the VR glasses and a notification message is determined in the VR mode, wherein the notification message is from a 2D application.
The processor 602 is further configured to: determine a first notification interface in the normal mode, where the first notification interface includes the notification message, and map the first notification interface to a 3D model to determine the second notification interface.
The processor is further configured to map the first notification interface to some or all of the regions of the 3D model in the VR mode.
The processor 602 is further configured to: map the 3D coordinates of the touch point operated by the user to 2D coordinates on the first notification interface, and respond to the user operation based on the position indicated by those 2D coordinates. The first position information is the 3D coordinates of the touch point, and the second position information is the 2D coordinates on the first notification interface.
The processor 602 maps the first position information to the 3D model in VR mode to obtain 3D data, and projects the 3D data into the VR glasses. Through this processing, a 2D application on a terminal supporting VR technology can be displayed in VR mode.
The processor is further configured to determine 2D coordinates on the first notification interface by mapping the 3D coordinates in the VR mode.
Fig. 7 is a diagram of the implementation effect of displaying a 2D application in a VR device according to an embodiment of the present invention. As shown in fig. 7, the user performs a touch operation on the 2D application interface by using an external device (e.g., a Bluetooth handle or a VR handle), and the terminal maps the 3D coordinates of the position of the touch point in the 3D model to coordinates on the 2D notification interface. The terminal determines the 2D coordinates on the 2D notification interface and, based on them, simulates the touch operation on the 2D application corresponding to the 2D notification interface, so that the 2D application responds to the user's touch according to the mapping of the 2D application to the 3D display. The terminal can project the whole operation process onto the screen of the VR glasses, and the VR glasses provide the 3D effect for the user to watch.
All or a portion of the embodiments described above may be implemented using software, hardware, firmware, or any combination thereof. When implemented in software, all or part of the embodiments described above may be implemented in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present invention are generated in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer-readable storage medium or may be transferred from a computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, fiber optic, or Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, or microwave). A computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device (e.g., a server or a data center) that incorporates one or more available media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a Digital Versatile Disk (DVD)), a semiconductor medium (e.g., a Solid State Disk (SSD)), or the like.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that various equivalent changes, modifications, substitutions and alterations can be made herein without departing from the principles and spirit of the invention, the scope of which is defined by the appended claims and their equivalents.

Claims (10)

1. A VR remote education apparatus based on 5G technology, comprising a mobile phone 100 and VR devices including VR glasses 200, wherein a terminal may include at least one memory, a touch screen, and a processor; the memory may be configured to store software programs; the processor performs the various functions of the terminal by running the software programs stored in the memory; the touch screen may be configured to display information input by the user, information provided to the user, and the various menus of the terminal, and may also accept user input; a state of the mobile phone 100 is set to VR mode; and the mobile phone 100 is placed in the VR glasses 200.
2. The VR remote education apparatus based on 5G technology according to claim 1, wherein an external device (e.g., a Bluetooth handle) is used to operate the 2D application on the mobile phone 100 and project the 2D application to the 3D interface of the VR glasses, and the mobile phone 100 includes a radio frequency (RF) circuit 110, a memory 120, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a wireless fidelity (5G) module 170, a processor 180, and a power supply 190.
3. The VR remote education apparatus based on 5G technology according to claim 2, wherein the RF circuit 110 is configured to receive and transmit information, or receive and transmit signals, during a call. The RF circuit 110 receives downlink information of a base station and transmits it to the processor 180; the server processes the downlink information. Further, the RF circuit 110 may also transmit uplink data to a base station, and typically includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 110 may also communicate with a network and/or another device via wireless communication. Any communication standard or protocol may be used, including but not limited to global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), long term evolution (LTE), email, and short message service (SMS).
4. The VR remote education apparatus based on 5G technology according to claim 2, wherein the memory 120 may be configured to store software programs and modules; the memory 120 includes, but is not limited to, an operating system, a communication module, a contact/motion module, a graphics module, a text output module, application programs, a content capture module, a camera module, a surface finger module, a buffer module, a touch module, a Bluetooth module, and the like. In addition, the memory 120 may mainly include a program storage area, which may store the operating system and the application programs required for at least one function (e.g., an audio or image playback function), and a data storage area, which may store data created through use of the mobile phone 100. The memory 120 may further include high-speed random access memory, and may also include non-volatile memory such as at least one magnetic disk storage device or flash memory device, or another volatile solid-state storage device.
5. The VR remote education apparatus based on 5G technology according to claim 2, wherein the contact/motion module in the memory 120 is configured to: detect contact of an object or finger with the touch screen 140 or a click wheel; capture the speed (direction and magnitude) and acceleration (change in magnitude or direction) of the contact; and determine the type of the contact event. For example, the contact/motion module further includes multiple types of contact event detection modules, such as a finger-down or finger-up detection module, a finger-dragging module, and a finger-tap module. Sometimes a gesture, such as a finger pinch or drop, is used together with an element on the UI interface to implement an operation.
6. The VR remote education apparatus based on 5G technology according to claim 2, characterized in that the Bluetooth module is configured to connect to an external device, which operates the mobile phone through the Bluetooth module; the external device includes a device that can remotely control the mobile phone, such as a Bluetooth handle. The graphics module is configured to render and display graphics on a touch screen or other display. The graphics include web pages, icons, digital images, videos, and animations, and the applications may include a contacts application, a phone application, a video conferencing application, an email client, an instant messaging application, a personal sports application, a camera application, an image management application, a video player, a music player, a calendar application, plug-ins (a weather, stock, calculator, clock, or dictionary plug-in), custom plug-ins, a search application, a note-taking application, a map application, an online video application, and so forth.
7. The VR remote education device based on 5G technology according to claim 2, wherein the input unit 130 may be configured to receive input numeric or character information and generate key signal input related to user settings and function control of the mobile phone 100. The input unit 130 may include a touch panel 131 and another input device 132. The touch panel 131, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 131 (for example, an operation performed by the user on or near the touch panel 131 with any suitable object or accessory, such as a finger or a stylus), and drive a corresponding connection device according to a preset program. The touch panel 131 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects a signal generated by the touch operation, and transmits the signal to the touch controller. The touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and transmits the coordinates to the processor 180; it can also receive and execute commands sent by the processor 180. The touch panel 131 may be implemented in various types, such as a resistive type, a capacitive type, an infrared type, or a surface acoustic wave type. In addition to the touch panel 131, the input unit 130 may include another input device 132, which may include, but is not limited to, one or more of a physical keyboard, function keys (for example, a volume control key or an on/off key), a trackball, a mouse, a joystick, and a Bluetooth handle.
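The two-part pipeline in this claim (the touch detection device produces a raw signal; the touch controller converts it into touch point coordinates and forwards them to the processor 180) can be sketched as follows. The raw ADC reading format, resolution, and screen dimensions are assumptions.

    // Sketch of the touch-detection-device -> touch-controller -> processor pipeline.
    data class RawTouch(val rowAdc: Int, val colAdc: Int)   // raw sensing values (assumed format)
    data class TouchPoint(val x: Int, val y: Int)           // screen coordinates

    class TouchController(private val processor: (TouchPoint) -> Unit) {
        // Convert raw readings into touch point coordinates, then forward them.
        fun onRawSignal(raw: RawTouch) {
            val point = TouchPoint(
                x = raw.colAdc * SCREEN_W / ADC_MAX,
                y = raw.rowAdc * SCREEN_H / ADC_MAX,
            )
            processor(point)
        }

        companion object {
            const val ADC_MAX = 4096    // assumed controller resolution
            const val SCREEN_W = 1080   // assumed panel width in pixels
            const val SCREEN_H = 2340   // assumed panel height in pixels
        }
    }

    fun main() {
        val controller = TouchController { p -> println("touch point at (${p.x}, ${p.y})") }
        controller.onRawSignal(RawTouch(rowAdc = 2048, colAdc = 1024))  // prints (270, 1170)
    }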
8. The VR remote education device based on 5G technology according to claim 2, wherein the display unit 140 may be configured to display information input by the user or provided to the user, as well as the various menus of the mobile phone 100. The display unit 140 may include a display panel 141, and the display panel 141 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. In addition, the touch panel 131 may cover the display panel 141. After detecting a touch operation on or near the touch panel 131, the touch panel 131 sends the touch operation to the processor 180 to determine the type of the touch event, and the processor 180 then provides a corresponding visual output on the display panel 141 based on the type of the touch event. In FIG. 1, the touch panel 131 and the display panel 141 act as two independent components to implement the input and output functions of the mobile phone 100; in some embodiments, however, the touch panel 131 and the display panel 141 may be integrated to implement the input and output functions of the mobile phone 100.
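A short sketch of the dispatch step this claim describes: the processor 180 maps the determined touch event type to a visual output on the display panel 141. The event types and responses are illustrative assumptions.

    // Hypothetical touch-event-to-visual-output dispatch, not the patent's own logic.
    enum class TouchEventType { TAP, LONG_PRESS, SWIPE }

    fun visualOutputFor(event: TouchEventType): String = when (event) {
        TouchEventType.TAP        -> "highlight the tapped menu item"
        TouchEventType.LONG_PRESS -> "open a context menu"
        TouchEventType.SWIPE      -> "scroll the current view"
    }

    fun main() {
        TouchEventType.values().forEach { println("${it}: ${visualOutputFor(it)}") }
    }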
9. The VR remote education device based on 5G technology according to claim 2, wherein the mobile phone 100 further includes at least one sensor 150, such as a light sensor, a motion sensor, or another sensor. Specifically, the light sensor may include an ambient light sensor and a proximity sensor. The ambient light sensor may adjust the luminance of the display panel 141 based on the brightness of the ambient light, and the proximity sensor may turn off the display panel 141 and/or the backlight when the mobile phone 100 is moved to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (typically on three axes), and can detect the magnitude and direction of gravity when the mobile phone is stationary. It can be applied to applications that recognize the posture of the mobile phone (for example, switching between landscape and portrait modes, related games, or magnetometer posture calibration) and to vibration-recognition-related functions (for example, a pedometer or knock detection). Another sensor, such as a gyroscope, a barometer, a hygrometer, a thermometer, or an infrared sensor, may further be configured in the mobile phone 100.
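The light sensor behavior in this claim reduces to two small functions: one maps ambient brightness to panel luminance, and one turns the panel off when the proximity reading indicates the phone is at the ear. The thresholds and the linear brightness curve below are assumptions, not values from the patent.

    // Plain-Kotlin sketch of the ambient-light and proximity behaviour in claim 9.
    fun backlightLevel(ambientLux: Float): Int =
        (ambientLux / 1000f).coerceIn(0.05f, 1f)   // assumed linear curve, clamped
            .let { (it * 255).toInt() }            // scale to an 8-bit backlight level

    fun panelShouldBeOn(proximityCm: Float): Boolean = proximityCm > 5f  // near the ear -> off

    fun main() {
        println(backlightLevel(50f))      // dim room -> low backlight (12)
        println(backlightLevel(20000f))   // direct sunlight -> full backlight (255)
        println(panelShouldBeOn(1f))      // held to the ear -> false
    }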
10. The VR remote education device based on 5G technology according to claim 2, wherein the audio circuit 160, a speaker 161, and a microphone 162 provide an audio interface between the user and the mobile phone 100. The audio circuit 160 may transmit an electrical signal converted from received audio data to the speaker 161, and the speaker 161 converts the electrical signal into a sound signal for output. Conversely, the microphone 162 converts a collected sound signal into an electrical signal; the audio circuit 160 receives the electrical signal, converts it into audio data, and then outputs the audio data to the RF circuit 110 for transmission to, for example, another mobile phone, or outputs the audio data to the memory 120 for further processing (a sketch of this digitization step appears after this claim).
the processor 180 is a control center of the mobile phone 100, connects all parts of the entire mobile phone through various interfaces and lines, and implements various functions of the mobile phone 100, and processes data by running or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120 so as to perform overall monitoring of the mobile phone, and optionally, the processor 180 may include one or more processing units, and an application processor and a modem processor may be integrated into the processor 180, the application processor mainly processing an operating system, a user interface, an application program, and the like. The modem processor handles primarily wireless communications. It is understood that the modem processor may not be integrated into the processor 180.
The power supply may be logically connected to the processor 180 through a power management system, so that functions such as charge management, discharge management, and power consumption management are implemented by the power management system.
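Returning to the microphone path of claim 10: the collected sound signal is digitized into audio data that can be handed to the RF circuit 110 or stored in the memory 120. A sketch of that step, assuming a 16-bit little-endian PCM encoding (a common convention that the claim itself does not specify):

    // Convert normalized samples in [-1, 1] into 16-bit little-endian PCM audio data.
    fun toPcm16(samples: FloatArray): ByteArray {
        val out = ByteArray(samples.size * 2)
        samples.forEachIndexed { i, s ->
            val v = (s.coerceIn(-1f, 1f) * Short.MAX_VALUE).toInt()
            out[2 * i] = (v and 0xFF).toByte()              // low byte first (little-endian)
            out[2 * i + 1] = ((v shr 8) and 0xFF).toByte()  // then the high byte
        }
        return out
    }

    fun main() {
        val audioData = toPcm16(floatArrayOf(0f, 0.5f, -0.5f))
        println("encoded ${audioData.size} bytes of audio data")  // 6 bytes
    }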
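The power management functions named above (charge management, discharge management, and power consumption management) can likewise be sketched. The states and the CPU budget rule are assumptions for illustration only.

    // Hypothetical power management system, echoing the functions listed in the claim.
    enum class PowerState { CHARGING, DISCHARGING, FULL }

    class PowerManagementSystem(var batteryPercent: Int, var pluggedIn: Boolean) {
        // Charge/discharge management: report the current state.
        fun state(): PowerState = when {
            pluggedIn && batteryPercent >= 100 -> PowerState.FULL
            pluggedIn                          -> PowerState.CHARGING
            else                               -> PowerState.DISCHARGING
        }

        // Power consumption management: shed load as the battery drains.
        fun cpuBudget(): Double = if (batteryPercent < 20) 0.5 else 1.0
    }

    fun main() {
        val pms = PowerManagementSystem(batteryPercent = 15, pluggedIn = false)
        println("${pms.state()} with CPU budget ${pms.cpuBudget()}")  // DISCHARGING with 0.5
    }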

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110883267.XA CN115357112A (en) 2022-03-07 2022-03-07 Vr remote education device based on 5G technology

Publications (1)

Publication Number Publication Date
CN115357112A true CN115357112A (en) 2022-11-18

Family

ID=84030869

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110883267.XA Withdrawn CN115357112A (en) 2022-03-07 2022-03-07 Vr remote education device based on 5G technology

Country Status (1)

Country Link
CN (1) CN115357112A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20221118