WO2023059087A1 - Augmented reality interaction method and apparatus - Google Patents

Augmented reality interaction method and apparatus

Info

Publication number
WO2023059087A1
WO2023059087A1 (PCT/KR2022/015033)
Authority
WO
WIPO (PCT)
Prior art keywords
display area
spherical
display
interaction
coordinates
Prior art date
Application number
PCT/KR2022/015033
Other languages
French (fr)
Inventor
Xiaofu Liang
Lei Gao
Yongchao WU
Original Assignee
Samsung Electronics Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2023059087A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/0209Systems with very large relative bandwidth, i.e. larger than 10 %, e.g. baseband, pulse, carrier-free, ultrawideband
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06Systems determining position data of a target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/415Identification of targets based on measurements of movement associated with the target
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • the present application relates to computer application technology, and more particularly, to an augmented reality (AR) interaction method and apparatus.
  • An implementation scheme is currently proposed for interaction in an AR environment.
  • a camera is used to capture a user operation gesture, and a corresponding operation is performed on a target object in the AR environment according to the captured operation gesture.
  • This scheme does not require an external device such as a keyboard or mouse, thus avoiding the complicated operation that arises when a user of an AR headset has to rely on such an external device for input.
  • the inventor found that the above scheme relies on camera equipment to capture the user operation gesture; however, the recognition accuracy of the camera equipment is not high, and the target operation object in the AR environment is easily occluded by other display contents, so the user's operation cannot be accurately recognized from the pictures captured by the camera equipment, and the recognition accuracy for continuous actions is poor.
  • an augmented reality (AR) interaction method includes monitoring a user body movement of a user based on a UWB radar sensor. In certain embodiments, a coverage area of the UWB radar sensor overlaps with an AR display area. In certain embodiments, the AR interaction method includes identifying an interaction corresponding to the user body movement and body positioning coordinates based on the monitoring result. In certain embodiments, the AR interaction method includes identifying a target operation object of the interaction included in the AR display area based on the body positioning coordinates.
  • an augmented reality interaction apparatus is provided, which includes: a memory storing one or more instructions; and at least one processor executing the one or more instructions.
  • the at least one processor is configured to monitor a specified body movement of a user based on a UWB radar sensor.
  • a coverage area of the UWB radar sensor overlaps with an AR display area.
  • the at least one processor is configured to identify an interaction corresponding to the user body movement and body positioning coordinates based on the monitoring result.
  • the at least one processor is configured to identify a target operation object of the interaction included in the AR display area based on the body positioning coordinates.
  • Embodiments of the application further provide a computer-readable storage medium storing instructions that, when executed by at least one processor, cause the at least one processor to perform the steps of the above augmented reality interaction method.
  • Fig. 1 is a flowchart of a method according to certain embodiments of the present application.
  • Fig. 2 is a diagram of a spherical coordinate system according to certain embodiments of the present application.
  • Fig. 3 is a diagram of a view frustum according to certain embodiments of the present application.
  • Fig. 4 is a cross-sectional view of a view frustum after being cut into layers according to certain embodiments of the present application;
  • Fig. 5 is a diagram of a finger dragging action according to certain embodiments of the present application.
  • Fig. 6 is a diagram of a hand gesture set according to certain embodiments of the present application.
  • Fig. 7 is a diagram of scenario 1 according to certain embodiments of the present application.
  • Fig. 8 is a diagram of the moving process of an object in scenario 1 according to certain embodiments of the present application.
  • Fig. 9 is a diagram of scenario 2 according to certain embodiments of the present application.
  • Fig. 10 is a diagram of the rotation process of a trolley in scenario 2 according to certain embodiments of the present application.
  • Fig. 11 is a diagram of a copy/paste process of an object in an AR environment according to certain embodiments of the present application.
  • Fig. 12 is a structural diagram of an apparatus according to certain embodiments of the present application.
  • 'augmented reality environment' may mean a virtual space in which virtual objects exist.
  • a specific position in space to which the augmented reality environment belongs may have coordinates.
  • the coordinates may be coordinates of a polar coordinate system, a cylindrical coordinate system, or a spherical coordinate system, but are not limited thereto.
  • Information existing in the augmented reality environment may be displayed overlapping the real environment through augmented reality.
  • 'augmented display area' may mean a specific space of an augmented reality environment displayed through augmented reality.
  • Fig. 1 is a flowchart according to certain embodiments of the present application. As shown in Fig. 1, an augmented reality interaction method realized by the embodiment mainly includes:
  • Step 101, a user body movement is monitored based on a UWB radar sensor in an AR environment; wherein a coverage area of the UWB radar sensor overlaps with an AR display area.
  • the user body movement needs to be monitored based on the UWB radar sensor in the AR environment, so that the positioning advantages of the UWB radar sensor may be fully utilized to capture the user body movement quickly and accurately in real time.
  • by making the coverage area of the UWB radar sensor overlap with the AR display area, the body positioning coordinates obtained by monitoring the body movement can be readily associated with a specific object in the AR display area, so that the user may directly indicate the specific object to be operated in the AR environment through the body movement, which allows the user's operation instructions to be recognized quickly and accurately while keeping operation convenient for the user.
  • the UWB radar may transmit an emission waveform to the coverage area.
  • the transmitted waveform may be reflected by the user body in the coverage area and be received by the UWB radar.
  • Information on the user body existing in the coverage area may be identified based on an emission waveform and a reflected waveform transmitted/received through the UWB radar.
  • the information on the user body may include information on whether the user body is present in the coverage area, the user body shape, and the like.
  • a UWB radar may periodically transmit and receive an emission waveform and a reflected waveform.
  • Information on how and where the user body moves in the coverage area may be identified based on the continuously identified information on the user body.
  • the user body movement (including single-point static movements and continuous movements) may be accurately located and recognized using the UWB radar technology based on the transmitted and received waveforms.
  • the monitoring of the body movement of the user may be realized by using the existing technology, which will not be described in detail here.
  • an emission point of the UWB radar sensor may be adjusted to be a vertex of a view frustum of the AR display area, so that the UWB radar (transmitting + receiving) coverage area overlaps with the AR display area.
  • an origin of a spherical coordinate system (as shown in Fig. 2) used for UWB radar positioning is set as the vertex of the view frustum (as shown in Fig. 3) of the AR display area, so that a position of each display content in the AR display area may be identified directly by using the spherical coordinate system for UWB radar positioning.
  • a coordinate system for positioning the content in the AR display area is consistent with the coordinate system for UWB radar positioning, so that the positioning coordinates of the body movement are consistent with the coordinates of a target operation object in the AR display area, and therefore, the target operation object in the AR display area may be quickly positioned directly based on the positioning coordinates of the body movement.
  • the position of the display object in the AR display area may be identified by using the spherical coordinate system.
  • the AR display area may be cut by spherical surfaces with different spherical radii with the coordinate system origin as the spherical center in the spherical coordinate system to obtain a corresponding number of sub-display areas, so that each sub-display area will be defined by spherical surfaces with different spherical radii.
  • the position of the display object in the AR display area may be accurately identified by means of a spherical surface with the maximum spherical radius corresponding to the sub-display areas in the AR display area, that is, a position of the display object in the AR display area in the spherical coordinate system is a position of the display object on a spherical surface corresponding to a sub-display area to which the display object belongs.
  • the position of each display object in the AR environment in the spherical coordinate system may be specifically represented by a spherical radius R and coordinates (x, y, z).
  • R is the sphere radius of the spherical surface corresponding to the sub-display area to which the display object belongs
  • the coordinates (x, y, z) are coordinates of the display object on the spherical surface corresponding to the sub-display area to which the display object belongs.
  • the spherical surface corresponding to each sub-display area may be a spherical surface corresponding to the maximum sphere radius or the minimum sphere radius corresponding to the sub-display area.
  • Fig. 4 is a cross-sectional view of a view frustum from the origin of the visual field to the far plane of the visual field.
  • two points (P and Q) in the AR display area are located on the section plane: P is located on a spherical surface with the spherical radius Rp, and Q is located within the sub-display area bounded by the spherical surfaces with radii Rp and Rq, so that the coordinates of P and Q on their respective spherical surfaces may be marked as P (Px, Py, Pz) and Q (Qx, Qy, Qz) respectively. Accordingly, the positions of P and Q may be marked as Fp (Rp, Px, Py, Pz) and Fq (Rq, Qx, Qy, Qz) respectively.
  • the user body movement may be a hand gesture, but is not limited thereto, for example, it may also be a foot movement, etc.
  • different body movements may be defined according to a transmission waveform and a reflection waveform.
  • for a finger dragging action as shown in Fig. 5, the finger clicks P, that is, Fp (Rp, Xp, Yp, Zp), and then the finger is dragged to Q, that is, Fq (Rq, Xq, Yq, Zq), and this is defined as dragging.
  • the position information of the point may be accurately located based on an echo signal of a single point.
  • a user body movement set may be set according to actual needs. Taking hand gestures as an example, a hand gesture set as shown in Fig. 6 may be set, but the present application is not limited thereto.
  • the spherical radius may be obtained based on a preset spherical radius interval, that is, a group of spherical radii with the same interval may be obtained, so that the distances between adjacent sub-display areas obtained after cutting the AR display area are the same.
  • the spherical radius interval may be set by those skilled in the art according to actual needs.
  • Step 102, identifying an interaction corresponding to the user body movement and body positioning coordinates based on the monitoring result.
  • information about the user body obtained through monitoring may be compared with preset user body information corresponding to each of a plurality of interactions.
  • an interaction whose preset user body information has a similarity to the acquired user body information equal to or greater than a threshold value may be identified as the interaction corresponding to the user body movement.
  • when there are a plurality of pieces of preset user body information having a similarity equal to or greater than the threshold value, the interaction corresponding to the preset user body information having the highest similarity may be identified.
  • body position coordinates may be identified differently depending on the identified interaction.
  • for each of the plurality of interactions, the position to be identified by the body position coordinates may be set.
  • the interaction 'pinch out' may be identified by the body position coordinates of the thumb and index finger tips of the hand.
  • Step 103, identifying a target operation object of the interaction included in the AR display area based on the body positioning coordinates.
  • the target operation object in the AR display area needs to be quickly and accurately determined based on the body positioning coordinates by means of the relevance between the body positioning coordinates and the display content in the AR environment.
  • the following method may be adopted to determine the target operation object: taking a display object located at the body positioning coordinates in the spherical coordinate system as the target operation object.
  • the above technical solution realizes the interaction between users and AR environments by introducing the UWB radar technology and making the UWB radar coverage area overlap the AR display area. Further, by marking different layers of the AR display area, the accuracy of user operation recognition may be improved, and the operation cost is small.
  • the implementation of the above-mentioned method embodiment in several specific application contexts will be described in detail below.
  • the method may further comprise rendering an AR display area on which the identified interaction is performed with respect to the identified at least one target operation object.
  • an identified interaction may be performed on an identified target object. For example, when the identified interaction is 'click' and a specific object is identified as the target operation object in the AR display area, the 'click' may be performed on the identified target operation object.
  • the AR environment changes as a result of the identified interactions. Based on the changed AR environment, an AR display area in which AR interaction is performed may be rendered.
  • Scenario 1: Move a virtual object in the AR environment
  • Fig. 7 is a diagram of scenario 1.
  • Fig. 8 is a diagram of the process of moving an object in scenario 1. As shown in Fig. 8, the finger clicks the virtual object, drags the virtual object to a selected area, and then released, so that the selected object is re-rendered at a new position. In Fig. 7, a sofa (virtual object) is clicked to be moved, to obtain a more reasonable home layout.
  • Scenario 2: Rotate a virtual object in the AR environment
  • Fig. 9 is a diagram of scenario 2.
  • Fig. 10 is a diagram of the process of rotating a trolley in scenario 2. As shown in Fig. 10, the finger clicks the virtual object, rotates the virtual object by a certain angle (for example, rotates by 180° clockwise), and then released, so that the selected object is re-rendered at the current position, but the rendered angle is rotated by 180°. In Fig. 9, by selecting and rotating the trolley toy (virtual object), more information of other surfaces can be seen.
  • Scenario 3: Copy/Paste a virtual object in the AR environment
  • Fig. 11 is a diagram of a copy/paste process of an object in the AR environment. As shown in Fig. 11, both hands click the virtual object, the left hand is kept still, and the right hand drags the virtual object and then releases it, so that a copy of the selected virtual object is rendered at the position where the right hand released it; the two virtual objects are exactly the same but at different positions.
  • Fig. 12 is a block diagram illustrating a configuration of an augmented reality interaction apparatus according to an embodiment of the disclosure.
  • an augmented reality interaction apparatus 2000 may include a memory 2100 and a processor 2200.
  • the memory 2100 may store instructions, a data structure, and program code that are readable by the processor 2200.
  • operations performed by the processor 2200 may be implemented by executing instructions or code of a program stored in the memory 2100.
  • the memory 2100 may include a flash memory-type memory, a hard disk-type memory, a multimedia card micro-type memory, a card-type memory (e.g., an SD memory, an XD memory, etc.), a non-volatile memory including at least one of read-only memory (ROM), electrically erasable programmable ROM (EEPROM), programmable ROM (PROM), a magnetic memory, a magnetic disk, or an optical disk, and a volatile memory such as random-access memory (RAM) or static RAM (SRAM).
  • the memory 2100 may store one or more instructions and/or a program that enable the electronic device 2000 to execute the augmented reality interaction method.
  • the memory 2100 may store a monitoring module 2110 and a rendering module 2120.
  • the processor 2200 may control overall operations of the electronic device 2000.
  • the processor 2200 may execute one or more instructions of the program stored in the memory 2100 to control the overall operations of the electronic device 2000 to execute the augmented reality interaction method.
  • the processor 2200 may include, but is not limited to, at least one of a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), an application processor, a neural processing unit, or a dedicated artificial intelligence processor designed in a hardware structure specialized for processing an artificial intelligence model.
  • the processor 2200 may be implemented to comprise one or more central processing units or one or more field programmable gate arrays, wherein the field programmable gate arrays integrate one or more central processing unit cores.
  • the central processing unit or the central processing unit core may be implemented as a CPU or MCU.
  • the processor 2200 may execute the monitoring module to monitor a user body movement based on a UWB radar sensor. In certain embodiments, a coverage area of the UWB radar sensor overlaps with an AR display area.
  • the processor 2200 may execute the rendering module to identify an interaction corresponding to the user body movement and body positioning coordinates based on the monitoring result.
  • the rendering module identifies a target operation object of the interaction included in the AR display area based on the body positioning coordinates.
  • an emission point of the UWB radar sensor is a vertex of a view frustum of the AR display area; an origin of a spherical coordinate system for UWB radar positioning is the vertex of the view frustum of the AR display area.
  • the processor 2200 may execute the rendering module to take a display object located at the body positioning coordinates in the spherical coordinate system as the target operation object.
  • the AR display area of the AR environment is segmented into a corresponding number of sub-display areas by spherical surfaces with different spherical radii in the spherical coordinate system, and a position of a display object in the AR display area in the spherical coordinate system is a position of the display object on a spherical surface corresponding to a sub-display area to which the display object belongs.
  • the user body movement comprises a hand gesture.
  • the spherical radius is obtained based on a preset spherical radius interval.
  • the position of the display object in the spherical coordinate system may be specifically represented by a spherical radius R and coordinates (x, y, z).
  • R is the sphere radius of the spherical surface corresponding to the sub-display area to which the display object belongs
  • the coordinates (x, y, z) are coordinates of the display object on the spherical surface corresponding to the sub-display area to which the display object belongs.
  • the processor 2200 may execute the rendering module to render an AR display area on which the identified interaction is performed with respect to the identified at least one target operation object.
  • an embodiment of the application provides augmented reality interaction electronic equipment, which includes a processor 2200 and a memory 2100, wherein the memory 2100 stores an application program executable by the processor 2200 to cause the processor 2200 to execute the augmented reality interaction method described above.
  • a system or device may be provided with a storage medium on which software program codes for realizing the functions of any one of the above embodiments are stored, and a computer (or CPU or MPU) of the system or device may read out and execute the program codes stored in the storage medium.
  • part or all of the actual operations can be completed by an operating system operated on the computer based on instructions of the program codes.
  • the program codes read from the storage medium can also be written into a memory 2100 arranged in an expansion board inserted into the computer or into a memory 2100 arranged in an expansion unit connected with the computer, and then part or all of the actual operations can be executed by a CPU installed on the expansion board or expansion unit based on the instructions of the program codes, thereby realizing the functions of any one of the above embodiments of the augmented reality interaction method.
  • an augmented reality (AR) interaction method may be provided.
  • an augmented reality interaction method includes monitoring a user body movement based on a UWB radar sensor. In certain embodiments, a coverage area of the UWB radar sensor overlaps with an AR display area. In certain embodiments, the AR interaction method includes identifying an interaction corresponding to the user body movement and body positioning coordinates based on the monitoring result. In certain embodiments, the AR interaction method includes identifying a target operation object of the interaction included in the AR display area based on the body positioning coordinates.
  • an emission point of the UWB radar sensor is a vertex of a view frustum of the AR display area; an origin of a spherical coordinate system for UWB radar positioning is the vertex of the view frustum of the AR display area.
  • identifying the target operation object of the interaction comprises taking a display object located at the body positioning coordinates in the spherical coordinate system as the target operation object.
  • the AR display area is segmented into a corresponding number of sub-display areas by spherical surfaces with different spherical radii in the spherical coordinate system; and a position of a display object in the AR display area in the spherical coordinate system is a position of the display object on a spherical surface corresponding to a sub-display area to which the display object belongs.
  • the position of the display object in the spherical coordinate system is jointly represented by a spherical radius R and coordinates (x, y, z); wherein R is the sphere radius of the spherical surface corresponding to the sub-display area to which the display object belongs, and the coordinates (x, y, z) are coordinates of the display object on the spherical surface corresponding to the sub-display area to which the display object belongs.
  • the spherical radius is obtained based on a preset spherical radius interval.
  • the user body movement comprises a hand gesture.
  • the AR interaction method includes rendering an AR display area on which the identified interaction is performed with respect to the identified at least one target operation object.
  • the augmented reality interaction scheme proposed by the embodiments of the present application utilizes the UWB radar sensor to monitor the user body movement in the AR environment and makes the coverage area of the UWB radar sensor overlap the AR display area, so that when a user body movement is monitored, the target operation object in the corresponding AR display area may be quickly and accurately determined according to the corresponding body positioning coordinates, and user operation instructions may be recognized quickly and accurately. Therefore, the embodiments of the present application may improve the accuracy of user operation recognition in the AR environment while ensuring operation convenience for users.
  • An embodiment of the application provides a computer program product, which includes a computer program/instructions that, when executed by a processor 2200, implement the steps of the augmented reality interaction method described above.
  • Hardware modules in various embodiments can be implemented by mechanical or electronic means.
  • a hardware module may comprise specially designed permanent circuits or logic devices (such as special-purpose processors, like FPGA or ASIC) for performing specific operations.
  • the hardware module may also comprise programmable logic devices or circuits (such as general purpose processors or other programmable processors) temporarily configured by software for performing specific operations.
  • whether to use a specially designed permanent circuit or a temporarily configured circuit can be decided in consideration of cost and time.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Electromagnetism (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present application discloses an augmented reality interaction method and apparatus. The method includes: monitoring a user body movement based on a UWB radar sensor, wherein a coverage area of the UWB radar sensor overlaps with an AR display area; identifying an interaction corresponding to the user body movement and body positioning coordinates based on the monitoring result; and identifying a target operation object of the interaction included in the AR display area based on the body positioning coordinates.

Description

AUGMENTED REALITY INTERACTION METHOD AND APPARATUS
The present application relates to computer application technology, and more particularly, to an augmented reality (AR) interaction method and apparatus.
An implementation scheme is currently proposed for interaction in an AR environment. In this scheme, a camera is used to capture a user operation gesture, and a corresponding operation is performed on a target object in the AR environment according to the captured operation gesture. This scheme does not require an external device such as a keyboard or mouse, thus avoiding the complicated operation that arises when a user of an AR headset has to rely on such an external device for input.
In the process of realizing the present application, the inventor found that the above scheme relies on camera equipment to capture the user operation gesture; however, the recognition accuracy of the camera equipment is not high, and the target operation object in the AR environment is easily occluded by other display contents, so the user's operation cannot be accurately recognized from the pictures captured by the camera equipment, and the recognition accuracy for continuous actions is poor.
In order to achieve the above purpose, the embodiments of the application adopt the following technical solution:
According to an aspect of the disclosure, an augmented reality (AR) interaction method includes monitoring a user body movement of a user based on a UWB radar sensor. In certain embodiments, a coverage area of the UWB radar sensor overlaps with an AR display area. In certain embodiments, the AR interaction method includes identifying an interaction corresponding to the user body movement and body positioning coordinates based on the monitoring result. In certain embodiments, the AR interaction method includes identifying a target operation object of the interaction included in the AR display area based on the body positioning coordinates.
According to an aspect of the disclosure, an augmented reality interaction apparatus is provided, which includes: a memory storing one or more instructions; and at least one processor executing the one or more instructions. In certain embodiments, the at least one processor is configured to monitor a specified body movement of a user based on a UWB radar sensor. In certain embodiments, a coverage area of the UWB radar sensor overlaps with an AR display area. In certain embodiments, the at least one processor is configured to identify an interaction corresponding to the user body movement and body positioning coordinates based on the monitoring result. In certain embodiments, the at least one processor is configured to identify a target operation object of the interaction included in the AR display area based on the body positioning coordinates.
According to an aspect of the disclosure, embodiments of the application further provide a computer-readable storage medium storing instructions that, when executed by at least one processor, cause the at least one processor to perform the steps of the above augmented reality interaction method.
Fig. 1 is a flowchart of a method according to certain embodiments of the present application;
Fig. 2 is a diagram of a spherical coordinate system according to certain embodiments of the present application;
Fig. 3 is a diagram of a view frustum according to certain embodiments of the present application;
Fig. 4 is a cross-sectional view of a view frustum after being cut into layers according to certain embodiments of the present application;
Fig. 5 is a diagram of a finger dragging action according to certain embodiments of the present application;
Fig. 6 is a diagram of a hand gesture set according to certain embodiments of the present application;
Fig. 7 is a diagram of scenario 1 according to certain embodiments of the present application;
Fig. 8 is a diagram of the moving process of an object in scenario 1 according to certain embodiments of the present application;
Fig. 9 is a diagram of scenario 2 according to certain embodiments of the present application;
Fig. 10 is a diagram of the rotation process of a trolley in scenario 2 according to certain embodiments of the present application;
Fig. 11 is a diagram of a copy/paste process of an object in an AR environment according to certain embodiments of the present application; and
Fig. 12 is a structural diagram of an apparatus according to certain embodiments of the present application.
The terms used herein will be briefly described, and then the disclosure will be described in detail. Throughout the disclosure, the expression "at least one of a, b or c" indicates only a, only b, only c, both a and b, both a and c, both b and c, all of a, b, and c, or variations thereof.
Although the terms used in the disclosure are selected from among common terms that are currently widely used in consideration of their functions in the disclosure, the terms may be different according to an intention of one of ordinary skill in the art, a precedent, or the advent of new technology. Also, in particular cases, the terms are discretionally selected by the applicant of the disclosure, in which case, the meaning of those terms will be described in detail in the corresponding part of the detailed description. Therefore, the terms used herein are not merely designations of the terms, but the terms are defined based on the meaning of the terms and content throughout the disclosure.
The singular expression may also include the plural meaning as long as it is not inconsistent with the context. All the terms used herein, including technical and scientific terms, may have the same meanings as those generally understood by those of skill in the art. In addition, although the terms such as 'first' or 'second' may be used in the present specification to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another element.
Throughout the specification, when a part "includes" a component, it means that the part may additionally include other components rather than excluding other components as long as there is no particular opposing recitation. Also, the terms described in the specification, such as "...er (or)", "... unit", "... module", etc., denote a unit that performs at least one function or operation, which may be implemented as hardware or software or a combination thereof.
Hereinafter, embodiments of the disclosure will be described in detail with reference to the accompanying drawings to allow those of skill in the art to easily carry out the embodiments of the disclosure. The disclosure may, however, be embodied in many different forms and should not be construed as being limited to the embodiments of the disclosure set forth herein. In order to clearly describe the disclosure, portions that are not relevant to the description of the disclosure are omitted, and similar reference numerals are assigned to similar elements throughout the present specification.
In the disclosure, 'augmented reality environment' may mean a virtual space in which virtual objects exist. A specific position in space to which the augmented reality environment belongs may have coordinates. The coordinates may be coordinates of a polar coordinate system, a cylindrical coordinate system, or a spherical coordinate system, but are not limited thereto. Information existing in the augmented reality environment may be displayed overlapping the real environment through augmented reality.
In the disclosure, 'augmented display area' may mean a specific space of an augmented reality environment displayed through augmented reality.
In order to make the object, technical solution and advantages of the present application clearer, the present application will be further described in detail with reference to the drawings and specific embodiments.
Fig. 1 is a flowchart according to certain embodiments of the present application. As shown in Fig. 1, an augmented reality interaction method realized by the embodiment mainly includes:
Step 101, a user body movement is monitored based on a UWB radar sensor in an AR environment; wherein a coverage area of the UWB radar sensor overlaps with an AR display area.
In this step, the user body movement needs to be monitored based on the UWB radar sensor in the AR environment, so that the positioning advantages of the UWB radar sensor may be fully utilized to capture the user body movement quickly and accurately in real time. In addition, by making the coverage area of the UWB radar sensor overlap with the AR display area, the body positioning coordinates obtained by monitoring the body movement can be readily associated with a specific object in the AR display area, so that the user may directly indicate the specific object to be operated in the AR environment through the body movement, which allows the user's operation instructions to be recognized quickly and accurately while keeping operation convenient for the user.
In certain embodiments, the UWB radar may transmit an emission waveform to the coverage area. The transmitted waveform may be reflected by the user body in the coverage area and be received by the UWB radar. Information on the user body existing in the coverage area may be identified based on an emission waveform and a reflected waveform transmitted/received through the UWB radar.
In certain embodiments, the information on the user body may include information on whether the user body is present in the coverage area, the user body shape, and the like.
In certain embodiments, a UWB radar may periodically transmit and receive an emission waveform and a reflected waveform. Information on how and where the user body moves in the coverage area may be identified based on the continuously identified information on the user body.
The user body movement (including single-point static movements and continuous movements) may be accurately located and recognized using the UWB radar technology based on the transmitted and received waveforms. In this step, the monitoring of the body movement of the user may be realized by using the existing technology, which will not be described in detail here.
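By way of non-limiting illustration only, the following Python sketch shows one simple way a reflecting body could be ranged from the round-trip delay of a UWB echo and its motion inferred from periodically repeated frames; the frame layout, sampling rate, thresholds, and function names are assumptions for illustration and are not the signal processing actually used by the embodiments.

# Illustrative sketch only: estimate target range from UWB echo round-trip
# delay and detect motion across periodically repeated frames.
SPEED_OF_LIGHT = 3.0e8  # m/s

def echo_delay(frame, sample_rate_hz, threshold=0.5):
    """Return the round-trip delay (s) of the first echo above threshold, or None."""
    for i, amplitude in enumerate(frame):
        if abs(amplitude) >= threshold:
            return i / sample_rate_hz
    return None

def target_range(frame, sample_rate_hz):
    """Convert the echo round-trip delay into a one-way range in metres."""
    delay = echo_delay(frame, sample_rate_hz)
    if delay is None:
        return None  # no body detected in the coverage area
    return SPEED_OF_LIGHT * delay / 2.0

def detect_motion(frames, sample_rate_hz, min_shift_m=0.02):
    """Report range changes between consecutive frames larger than min_shift_m."""
    ranges = [target_range(f, sample_rate_hz) for f in frames]
    moves = []
    for prev, cur in zip(ranges, ranges[1:]):
        if prev is not None and cur is not None and abs(cur - prev) >= min_shift_m:
            moves.append(cur - prev)
    return ranges, moves

if __name__ == "__main__":
    fs = 10e9  # assumed 10 GS/s sampling
    # Two synthetic frames whose first strong echo moves closer to the sensor.
    frame_a = [0.0] * 40 + [1.0] + [0.0] * 10
    frame_b = [0.0] * 36 + [1.0] + [0.0] * 14
    print(detect_motion([frame_a, frame_b], fs))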
In certain embodiments, considering that a visual range of the user from an origin of a visual field forms a view frustum, to reduce the overhead of interactive calculation, an emission point of the UWB radar sensor may be adjusted to be a vertex of a view frustum of the AR display area, so that the UWB radar (transmitting + receiving) coverage area overlaps with the AR display area. In addition, an origin of a spherical coordinate system (as shown in Fig. 2) used for UWB radar positioning is set as the vertex of the view frustum (as shown in Fig. 3) of the AR display area, so that a position of each display content in the AR display area may be identified directly by using the spherical coordinate system for UWB radar positioning. In this way, a coordinate system for positioning the content in the AR display area is consistent with the coordinate system for UWB radar positioning, so that the positioning coordinates of the body movement are consistent with the coordinates of a target operation object in the AR display area, and therefore, the target operation object in the AR display area may be quickly positioned directly based on the positioning coordinates of the body movement.
In certain embodiments, in order to facilitate the identification of the position of the target operation object in the AR display area, the position of the display object in the AR display area may be identified by using the spherical coordinate system. Specifically, the AR display area may be cut by spherical surfaces with different spherical radii with the coordinate system origin as the spherical center in the spherical coordinate system to obtain a corresponding number of sub-display areas, so that each sub-display area will be defined by spherical surfaces with different spherical radii. The position of the display object in the AR display area may be accurately identified by means of a spherical surface with the maximum spherical radius corresponding to the sub-display areas in the AR display area, that is, a position of the display object in the AR display area in the spherical coordinate system is a position of the display object on a spherical surface corresponding to a sub-display area to which the display object belongs.
In certain embodiments, the position of each display object in the AR environment in the spherical coordinate system may be specifically represented by a spherical radius R and coordinates (x, y, z).
R is the sphere radius of the spherical surface corresponding to the sub-display area to which the display object belongs, and the coordinates (x, y, z) are coordinates of the display object on the spherical surface corresponding to the sub-display area to which the display object belongs.
Here, the spherical surface corresponding to each sub-display area may be a spherical surface corresponding to the maximum sphere radius or the minimum sphere radius corresponding to the sub-display area.
Fig. 4 is a cross-sectional view of a view frustum from the origin of the visual field to the far plane of the visual field. As shown in the figure, two points (P and Q) in the AR display area are located on the section plane: P is located on a spherical surface with the spherical radius Rp, and Q is located within the sub-display area bounded by the spherical surfaces with radii Rp and Rq, so that the coordinates of P and Q on their respective spherical surfaces may be marked as P (Px, Py, Pz) and Q (Qx, Qy, Qz) respectively. Accordingly, the positions of P and Q may be marked as Fp (Rp, Px, Py, Pz) and Fq (Rq, Qx, Qy, Qz) respectively.
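As a hedged illustration of this layering (not a definitive implementation), the Python sketch below places the coordinate origin at the view frustum vertex, cuts the radial extent into shells at a preset spherical radius interval, and labels a point by the outer radius R of the shell it falls in together with its coordinates, reproducing the F(R, x, y, z) form used for P and Q above; the interval, the far-plane distance, the sample points, and the use of each shell's maximum radius are assumed values.

import math

# Hedged sketch: mark a display object's position as (R, x, y, z), where R is
# the radius of the spherical surface bounding its sub-display area.
def shell_radius(point, radius_interval, far_plane):
    """Return the outer radius R of the sub-display area containing the point."""
    x, y, z = point
    r = math.sqrt(x * x + y * y + z * z)  # distance from the frustum vertex (origin)
    if r > far_plane:
        raise ValueError("point lies beyond the far plane of the view frustum")
    # Sub-display areas are bounded by spheres at multiples of the interval.
    return math.ceil(r / radius_interval) * radius_interval

def position_label(point, radius_interval, far_plane):
    """Return the (R, x, y, z) position of a display object."""
    R = shell_radius(point, radius_interval, far_plane)
    x, y, z = point
    return (R, x, y, z)

if __name__ == "__main__":
    interval, far = 0.5, 10.0      # assumed 0.5 m shells up to a 10 m far plane
    P = (0.3, 0.2, 1.1)            # sample points standing in for P and Q in Fig. 4
    Q = (0.6, -0.1, 2.4)
    print("Fp =", position_label(P, interval, far))
    print("Fq =", position_label(Q, interval, far))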
In certain embodiments, the user body movement may be a hand gesture, but is not limited thereto, for example, it may also be a foot movement, etc.
For the user body movement, different body movements may be defined according to a transmission waveform and a reflection waveform. For example, for a finger dragging action, as shown in Fig. 5, the finger clicks P, that is, Fp (Rp, Xp, Yp, Zp), and then the finger is dragged to Q, that is, Fq (Rq, Xq, Yq, Zq), and this is defined as dragging. It should be noted that when the finger clicks a certain position, the position information of the point may be accurately located based on an echo signal of a single point.
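As a minimal, hedged sketch of how a dragging action of this kind might be derived from radar-located fingertip positions (the dwell test, thresholds, and two-sample trajectory model are assumptions for illustration, not the recognition logic of this application):

import math

# Hedged sketch: classify a short fingertip trajectory, located in the shared
# coordinate system, as a "drag" from a clicked point Fp to a released point Fq.
def distance(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

def classify_drag(trajectory, click_dwell=3, min_drag_m=0.05):
    """trajectory: list of (x, y, z) fingertip positions ordered in time.

    Returns ("drag", Fp, Fq) when the finger first dwells at a point (a click)
    and then moves at least min_drag_m away; otherwise ("none", None, None).
    """
    if len(trajectory) < click_dwell + 1:
        return ("none", None, None)
    start = trajectory[0]
    # A click: the first few samples stay close to the starting point.
    dwelling = all(distance(start, p) < 0.01 for p in trajectory[:click_dwell])
    end = trajectory[-1]
    if dwelling and distance(start, end) >= min_drag_m:
        return ("drag", start, end)
    return ("none", None, None)

if __name__ == "__main__":
    path = [(0.30, 0.20, 1.10)] * 3 + [(0.35, 0.20, 1.10), (0.42, 0.21, 1.12)]
    print(classify_drag(path))   # expected: a drag from the clicked point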
In practical application, a user body movement set may be set according to actual needs. Taking hand gestures as an example, a hand gesture set as shown in Fig. 6 may be set, but the present application is not limited thereto.
In certain embodiments, the spherical radius may be obtained based on a preset spherical radius interval, that is, a group of spherical radii with the same interval may be obtained, so that the distances between adjacent sub-display areas obtained after cutting the AR display area are the same.
The smaller the spherical radius interval, the smaller the cutting granularity of the AR display area, and the more accurate the position identification of environment contents by the sub-display areas. In practical application, the spherical radius interval may be set by those skilled in the art according to actual needs.
Step 102, identifying an interaction corresponding to the user body movement and body positioning coordinates based on the monitoring result.
In this step, when the user body movement within the coverage area is monitored, it may be identified whether the monitored user body movement is a movement for interaction.
In certain embodiments, information about the user body obtained through monitoring may be compared with preset user body information corresponding to each of a plurality of interactions.
In certain embodiments, an interaction whose preset user body information has a similarity to the acquired user body information equal to or greater than a threshold value may be identified as the interaction corresponding to the user body movement. When there are a plurality of pieces of preset user body information having a similarity equal to or greater than the threshold value, the interaction corresponding to the preset user body information having the highest similarity may be identified.
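A hedged sketch of this comparison is given below, using cosine similarity over an assumed fixed-length feature vector; the feature encoding, the 0.8 threshold, and the template values are illustrative assumptions rather than the preset user body information of the embodiments.

import math

# Hedged sketch: compare monitored body information (as a feature vector)
# against preset templates and keep the best match that reaches the threshold.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def identify_interaction(observed, templates, threshold=0.8):
    """Return the interaction whose preset template is most similar to the
    observed feature vector, provided the similarity reaches the threshold."""
    best_name, best_score = None, 0.0
    for name, template in templates.items():
        score = cosine_similarity(observed, template)
        if score >= threshold and score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

if __name__ == "__main__":
    presets = {                      # assumed templates for illustration only
        "click":     [1.0, 0.0, 0.0, 0.1],
        "drag":      [0.9, 0.8, 0.1, 0.0],
        "pinch_out": [0.1, 0.2, 1.0, 0.9],
    }
    observed = [0.85, 0.75, 0.15, 0.05]
    print(identify_interaction(observed, presets))   # expected: ("drag", ...)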
In certain embodiments, body position coordinates may be identified differently depending on the identified interaction. For each of the plurality of interactions, the position to be identified by the body position coordinates may be set. For example, the interaction 'pinch out' may be identified by the body position coordinates of the thumb and index fingertips of the hand.
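The short sketch below illustrates, with assumed key-point names, how each interaction could declare which tracked points supply its body positioning coordinates (for instance, the thumb tip and index fingertip for 'pinch out'); the mapping and names are hypothetical.

# Hedged sketch: each interaction declares which tracked key points supply
# its body positioning coordinates. The key-point names are assumptions.
POSITIONING_POINTS = {
    "click":     ["index_tip"],
    "drag":      ["index_tip"],
    "pinch_out": ["thumb_tip", "index_tip"],
}

def positioning_coordinates(interaction, tracked_points):
    """Pick the coordinates that position the given interaction."""
    return [tracked_points[name] for name in POSITIONING_POINTS[interaction]]

if __name__ == "__main__":
    tracked = {"thumb_tip": (0.28, 0.18, 1.05), "index_tip": (0.31, 0.22, 1.12)}
    print(positioning_coordinates("pinch_out", tracked))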
Step 103, identifying a target operation object of the interaction included in the AR display area based on the body positioning coordinates.
In this step, if the currently monitored body movement is an operation directed at a specific object, the target operation object in the AR display area needs to be quickly and accurately determined based on the body positioning coordinates by means of the relevance between the body positioning coordinates and the display content in the AR environment.
In certain embodiments, the following method may be adopted to determine the target operation object:
taking an object located at the body positioning coordinates in the spherical coordinate system as the target operation object.
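As a hedged sketch of this look-up (the flat list of object records, the tolerance value, and the same-shell test are assumptions), the target operation object can be taken as the display object whose (R, x, y, z) position lies closest to the body positioning coordinates:

import math

# Hedged sketch: pick as the target operation object the display object whose
# (R, x, y, z) position is nearest to the body positioning coordinates.
def find_target_object(body_coords, display_objects, tolerance=0.2):
    """body_coords: (R, x, y, z) of the monitored body movement.
    display_objects: list of dicts with 'name' and 'position' = (R, x, y, z)."""
    bR, bx, by, bz = body_coords
    best, best_dist = None, tolerance
    for obj in display_objects:
        oR, ox, oy, oz = obj["position"]
        if abs(oR - bR) > 1e-9:          # only objects in the same sub-display area
            continue
        d = math.sqrt((bx - ox) ** 2 + (by - oy) ** 2 + (bz - oz) ** 2)
        if d <= best_dist:
            best, best_dist = obj, d
    return best

if __name__ == "__main__":
    objects = [
        {"name": "sofa",    "position": (1.5, 0.30, 0.20, 1.10)},
        {"name": "trolley", "position": (2.5, 0.60, -0.10, 2.40)},
    ]
    print(find_target_object((1.5, 0.31, 0.22, 1.12), objects)["name"])  # sofa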
It can be seen from the above embodiment that the above technical solution realizes the interaction between users and AR environments by introducing the UWB radar technology and making the UWB radar coverage area overlap the AR display area. Further, by marking different layers of the AR display area, the accuracy of user operation recognition may be improved, and the operation cost is small. The implementation of the above-mentioned method embodiment in several specific application contexts will be described in detail below.
In certain embodiments, the method may further comprise rendering an AR display area on which the identified interaction is performed with respect to the identified at least one target operation object.
In certain embodiments, an identified interaction may be performed on an identified target object. For example, when the identified interaction is 'click' and a specific object is identified as the target operation object in the AR display area, the 'click' may be performed on the identified target operation object.
In certain embodiments, the AR environment changes as a result of the identified interactions. Based on the changed AR environment, an AR display area in which AR interaction is performed may be rendered.
Scenario 1: Move a virtual object in the AR environment
Fig. 7 is a diagram of scenario 1. Fig. 8 is a diagram of the process of moving an object in scenario 1. As shown in Fig. 8, the finger clicks the virtual object, drags it to a selected area, and then releases it, so that the selected object is re-rendered at the new position. In Fig. 7, a sofa (a virtual object) is clicked and moved to obtain a more reasonable home layout.
Scenario 2: Rotate a virtual object in the AR environment
Fig. 9 is a diagram of scenario 2. Fig. 10 is a diagram of the process of rotating a trolley in scenario 2. As shown in Fig. 10, the finger clicks the virtual object, rotates it by a certain angle (for example, by 180° clockwise), and then releases it, so that the selected object is re-rendered at the current position but rotated by 180°. In Fig. 9, by selecting and rotating the trolley toy (a virtual object), more of its other surfaces can be seen.
Scenario 3: Copy/Paste a virtual object in the AR environment
Fig. 11 is a diagram of a copy/paste process of an object in the AR environment. As shown in Fig. 11, both hands click the virtual object, the left hand is kept still, and the right hand drags the virtual object and then releases it, so that a copy of the selected virtual object is rendered at the position where the right hand released it; the two virtual objects are exactly the same but at different positions.
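The hedged Python sketch below mirrors the three scenarios on a simple object record (a position tuple and a yaw angle); the record layout and field names are assumptions for illustration, and in the embodiments each operation would be followed by re-rendering the affected AR display area.

import copy

# Hedged sketch of the three scenario operations on a simple object record.
def move_object(obj, new_position):
    """Scenario 1: re-position the object at the released position."""
    obj["position"] = new_position
    return obj

def rotate_object(obj, angle_deg):
    """Scenario 2: rotate the object in place, e.g. 180 degrees clockwise."""
    obj["yaw_deg"] = (obj["yaw_deg"] + angle_deg) % 360
    return obj

def copy_object(obj, paste_position):
    """Scenario 3: render an identical copy at the position where the hand released."""
    duplicate = copy.deepcopy(obj)
    duplicate["position"] = paste_position
    return duplicate

if __name__ == "__main__":
    sofa = {"name": "sofa", "position": (1.5, 0.3, 0.2, 1.1), "yaw_deg": 0}
    move_object(sofa, (2.0, 0.8, 0.2, 1.6))
    rotate_object(sofa, 180)
    sofa_copy = copy_object(sofa, (2.5, 1.2, 0.2, 2.0))
    print(sofa, sofa_copy, sep="\n")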
Fig. 12 is a block diagram illustrating a configuration of an augmented reality interaction apparatus according to an embodiment of the disclosure.
Referring to FIG. 12, an augmented reality interaction apparatus 2000 according to an embodiment of the disclosure may include a memory 2100 and a processor 2200.
The memory 2100 may store instructions, a data structure, and program code that are readable by the processor 2200. In the embodiments of the disclosure, operations performed by the processor 2200 may be implemented by executing instructions or code of a program stored in the memory 2100.
The memory 2100 may include a flash memory-type memory, a hard disk-type memory, a multimedia card micro-type memory, a card-type memory (e.g., an SD memory, an XD memory, etc.), a non-volatile memory including at least one of read-only memory (ROM), electrically erasable programmable ROM (EEPROM), programmable ROM (PROM), a magnetic memory, a magnetic disk, or an optical disk, and a volatile memory such as random-access memory (RAM) or static RAM (SRAM).
The memory 2100 according to an embodiment of the disclosure may store one or more instructions and/or a program that enable the electronic device 2000 to execute the augmented reality interaction method. For example, the memory 2100 may store a monitoring module 2110 and a rendering module 2120.
The processor 2200 may control overall operations of the electronic device 2000. For example, the processor 2200 may execute one or more instructions of the program stored in the memory 2100 to control the overall operations of the electronic device 2000 to execute the augmented reality interaction method.
For example, the processor 2200 may include, but is not limited to, at least one of a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), an application processor, a neural processing unit, or a dedicated artificial intelligence processor designed in a hardware structure specialized for processing an artificial intelligence model.
The processor 2200 may be implemented to comprise one or more central processing units or one or more field programmable gate arrays, wherein the field programmable gate arrays integrate one or more central processing unit cores. Particularly, the central processing unit or the central processing unit core may be implemented as a CPU or MCU.
The processor 2200 may execute the monitoring module to monitor a user body movement based on a UWB radar sensor. In certain embodiments, a coverage area of the UWB radar sensor overlaps with an AR display area.
The processor 2200 may execute the rendering module to identify an interaction corresponding to the user body movement and body positioning coordinates based on the monitoring result. In certain embodiments, the rendering module identifies a target operation object of the interaction included in the AR display area based on the body positioning coordinates.
In certain embodiments, an emission point of the UWB radar sensor is a vertex of a view frustum of the AR display area; an origin of a spherical coordinate system for UWB radar positioning is the vertex of the view frustum of the AR display area.
In certain embodiments, the processor 2200 may execute the rendering module to take a display object located at the body positioning coordinates in the spherical coordinate system as the target operation object.
In certain embodiments, the AR display area of the AR environment is segmented into a corresponding number of sub-display areas by spherical surfaces with different spherical radii in the spherical coordinate system, and a position of a display object in the AR display area in the spherical coordinate system is a position of the display object on a spherical surface corresponding to a sub-display area to which the display object belongs.
In certain embodiments, the user body movement comprises a hand gesture.
In certain embodiments, the spherical radius is obtained based on a preset spherical radius interval.
In certain embodiments, the position of the display object in the spherical coordinate system may be specifically represented by a spherical radius R and coordinates (x, y, z).
R is the sphere radius of the spherical surface corresponding to the sub-display area to which the display object belongs, and the coordinates (x, y, z) are coordinates of the display object on the spherical surface corresponding to the sub-display area to which the display object belongs.
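As a minimal sketch of this representation, and only under assumptions the embodiments do not fix (the value of the preset spherical radius interval, the rule for assigning an object lying between two spherical surfaces, and the projection of its coordinates onto the chosen surface), a position of the form (R, (x, y, z)) could be derived as follows:

```python
import math


def spherical_position(x: float, y: float, z: float,
                       radius_interval: float = 0.5):
    """Represent a display object's position as (R, (x, y, z)) in the spherical
    coordinate system whose origin is the view frustum vertex.

    The AR display area is segmented into sub-display areas by spherical
    surfaces spaced radius_interval apart (the preset spherical radius
    interval; 0.5 is an assumed value). R is the radius of the spherical
    surface of the sub-display area the object is assigned to.
    """
    distance = math.sqrt(x * x + y * y + z * z)
    # Assign the object to the nearest enclosing spherical surface
    # (an assumption; the embodiments do not state the assignment rule).
    n = max(1, math.ceil(distance / radius_interval))
    R = n * radius_interval
    # Project the coordinates onto that surface so (x, y, z) lie on the sphere
    # of radius R, matching the stated representation (also an assumption).
    scale = R / distance if distance > 0 else 0.0
    return R, (x * scale, y * scale, z * scale)
```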
In certain embodiments, the processor 2200 may execute the rendering module to render an AR display area on which the identified interaction is performed with respect to the identified at least one target operation object.
Based on the above embodiment of the augmented reality interaction method, an embodiment of the application provides augmented reality interaction electronic equipment, which includes a processor 2200 and a memory 2100, wherein the memory 2100 stores an application program executable by the processor 2200 for causing the processor 2200 to execute the augmented reality interaction method as described above. Specifically, a system or device may be provided with a storage medium on which software program code for realizing the functions of any one of the above embodiments is stored, and a computer (or CPU or MPU) of the system or device may read out and execute the program code stored in the storage medium. In addition, part or all of the actual operations may be completed by an operating system running on the computer based on instructions of the program code. The program code read from the storage medium may also be written into a memory 2100 arranged in an expansion board inserted into the computer, or into a memory 2100 arranged in an expansion unit connected to the computer, and part or all of the actual operations may then be executed by a CPU installed on the expansion board or expansion unit based on the instructions of the program code, thereby realizing the functions of any one of the above embodiments of the augmented reality interaction method.
In certain embodiments, an augmented reality (AR) interaction method may be provided.
In certain embodiments, an augmented reality interaction method includes monitoring a user body movement based on a UWB radar sensor. In certain embodiments, a coverage area of the UWB radar sensor overlaps with an AR display area. In certain embodiments, the AR interaction method includes identifying an interaction corresponding to the user body movement and body positioning coordinates based on the monitoring result. In certain embodiments, the AR interaction method includes identifying a target operation object of the interaction included in the AR display area based on the body positioning coordinates.
In certain embodiments, an emission point of the UWB radar sensor is a vertex of a view frustum of the AR display area; an origin of a spherical coordinate system for UWB radar positioning is the vertex of the view frustum of the AR display area. In certain embodiments, identifying the target operation object of the interaction comprises taking a display object located at the body positioning coordinates in the spherical coordinate system as the target operation object.
In certain embodiments, the AR display area is segmented into a corresponding number of sub-display areas by spherical surfaces with different spherical radii in the spherical coordinate system; and a position of a display object in the AR display area in the spherical coordinate system is a position of the display object on a spherical surface corresponding to a sub-display area to which the display object belongs.
In certain embodiments, the position of the display object in the spherical coordinate system is jointly represented by a spherical radius R and coordinates (x, y, z), wherein R is the sphere radius of the spherical surface corresponding to the sub-display area to which the display object belongs, and the coordinates (x, y, z) are coordinates of the display object on the spherical surface corresponding to the sub-display area to which the display object belongs.
In certain embodiments, the spherical radius is obtained based on a preset spherical radius interval.
In certain embodiments, the user body movement comprises a hand gesture.
In certain embodiments, the AR interaction method includes rendering an AR display area on which the identified interaction is performed with respect to the identified at least one target operation object.
It can be seen from the above technical solution that the augmented reality interaction scheme proposed by the embodiments of the present application uses the UWB radar sensor to monitor the user's body movement in the AR environment and makes the coverage area of the UWB radar sensor overlap the AR display area. As a result, when a user body movement is monitored, the target operation object in the corresponding AR display area may be determined quickly and accurately from the corresponding body positioning coordinates, so that user operation instructions are recognized quickly and accurately. Therefore, the embodiments of the present application may improve the accuracy of user operation recognition in the AR environment while preserving operation convenience for users.
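Purely as an illustrative outline of how these steps might be wired together, a minimal event loop is sketched below. The radar, renderer, and gesture-classification interfaces (radar.read_frame, classify_gesture, renderer.apply) are assumed names, since the embodiments do not define concrete APIs, and select_target_object refers to the earlier sketch.

```python
def ar_interaction_loop(radar, display_objects, renderer):
    """Minimal AR interaction loop (assumed interfaces, for illustration only).

    radar.read_frame() is assumed to return None or a frame exposing
    .positioning_coordinates (body positioning coordinates in the spherical
    coordinate system rooted at the view frustum vertex) together with the
    raw movement data used for gesture classification.
    """
    while True:
        frame = radar.read_frame()               # monitor user body movement via the UWB radar
        if frame is None:
            continue
        interaction = classify_gesture(frame)    # e.g., tap, swipe, grab (illustrative helper)
        body_xyz = frame.positioning_coordinates # body positioning coordinates
        target = select_target_object(body_xyz, display_objects)
        if interaction is not None and target is not None:
            renderer.apply(interaction, target)  # render the AR display area after the interaction
```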
An embodiment of the application provides a computer program product which includes a computer program/instructions; when the computer program/instructions are executed by a processor 2200, the steps of the augmented reality interaction method as described above are realized.
It should be noted that not all steps and modules in the above-mentioned processes and structural diagrams are necessary, and some steps or modules can be omitted according to actual needs. The execution order of the steps is not fixed and can be adjusted as needed. The division of the modules is only to facilitate the description of different functions. In actual implementation, one module can be implemented as multiple modules, the functions of multiple modules can also be realized by one module, and the modules can be located in the same device or different devices.
Hardware modules in various embodiments can be implemented by mechanical or electronic means. For example, a hardware module may comprise specially designed permanent circuits or logic devices (such as special-purpose processors, like an FPGA or ASIC) for performing specific operations. The hardware module may also comprise programmable logic devices or circuits (such as general-purpose processors or other programmable processors) temporarily configured by software for performing specific operations. Whether to implement the hardware modules by mechanical means, a special permanent circuit, or a temporarily configured circuit (such as one configured by software) can be decided by taking cost and time into consideration.
Herein, "schematic" means "serving as an instance, example or explanation", and any diagram or embodiment described as "schematic" herein should not be interpreted as a more preferred or advantageous technical solution. For the sake of conciseness, only the parts related to the present application are schematically shown in each drawing, and they do not represent the actual structure of the product. In addition, in order to make the drawings simple and easy to understand, in some figures, only one of the components with the same structure or function is shown schematically, or only one of them is marked. Herein, "one" does not mean to limit the number of relevant parts of the present application to "only one", and "one" does not mean to exclude the situation that the number of relevant parts of the present application is "more than one". Herein, "upper", "lower", "front", "rear", "left", "right", "inne1r", "outer" and so on are only used to express the relative positional relationship among related parts, but not to limit the absolute positions of these related parts.
The above are only preferred embodiments of the present application and are not intended to limit the scope of protection of the present application. Any modification, equivalent substitution and improvement made within the spirit and principles of the present application shall be included in the scope of protection of the present application.

Claims (15)

  1. An augmented reality interaction method, comprising:
    monitoring a user body movement based on a UWB radar sensor; wherein a coverage area of the UWB radar sensor overlaps with an AR display area;
    identifying an interaction corresponding to the user body movement and body positioning coordinates based on the monitoring result; and
    identifying a target operation object of the interaction included in the AR display area based on the body positioning coordinates.
  2. The method according to claim 1, wherein:
    an emission point of the UWB radar sensor is a vertex of a view frustum of the AR display area, and an origin of a spherical coordinate system for UWB radar positioning is the vertex of the view frustum of the AR display area; and
    identifying the target operation object of the interaction comprises:
    taking a display object located at the body positioning coordinates in the spherical coordinate system as the target operation object.
  3. The method according to claim 1, wherein:
    the AR display area is segmented into a corresponding number of sub-display areas by spherical surfaces with different spherical radii in the spherical coordinate system; and a position of a display object in the AR display area in the spherical coordinate system is a position of the display object on a spherical surface corresponding to a sub-display area to which the display object belongs.
  4. The method according to claim 3, wherein the position of the display object in the spherical coordinate system is jointly represented by a spherical radius R and coordinates (x, y, z); wherein R is the sphere radius of the spherical surface corresponding to the sub-display area to which the display object belongs, and the coordinates (x, y, z) are coordinates of the display object on the spherical surface corresponding to the sub-display area to which the display object belongs.
  5. The method according to claim 3, wherein the spherical radius is obtained based on a preset spherical radius interval.
  6. The method according to claim 1, wherein the user body movement comprises a hand gesture.
  7. The method according to claim 1, further comprising rendering an AR display area on which the identified interaction is performed with respect to the identified at least one target operation object.
  8. An augmented reality interaction apparatus, comprising:
    a memory storing one or more instructions; and
    at least one processor executing the one or more instructions;
    wherein the at least one processor is configured to:
    monitor a user body movement based on a UWB radar sensor; wherein a coverage area of the UWB radar sensor overlaps with an AR display area;
    identify an interaction corresponding to the user body movement and body positioning coordinates based on the monitoring result; and
    identify a target operation object of the interaction included in the AR display area based on the body positioning coordinates.
  9. The apparatus according to claim 8, wherein
    an emission point of the UWB radar sensor is a vertex of a view frustum of the AR display area; an origin of a spherical coordinate system for UWB radar positioning is the vertex of the view frustum of the AR display area; and
    wherein the at least one processor is configured to:
    take a display object located at the body positioning coordinates in the spherical coordinate system as the target operation object.
  10. The apparatus according to claim 8, wherein
    the AR display area of the AR environment is segmented into a corresponding number of sub-display areas by spherical surfaces with different spherical radii in the spherical coordinate system; and a position of a display object in the AR display area in the spherical coordinate system is a position of the display object on a spherical surface corresponding to a sub-display area to which the display object belongs.
  11. The apparatus according to claim 10, wherein the position of the display object in the spherical coordinate system is jointly represented by a spherical radius R and coordinates (x, y, z); wherein R is the sphere radius of the spherical surface corresponding to the sub-display area to which the display object belongs, and the coordinates (x, y, z) are coordinates of the display object on the spherical surface corresponding to the sub-display area to which the display object belongs.
  12. The apparatus according to claim 10, wherein the spherical radius is obtained based on a preset spherical radius interval.
  13. The apparatus according to claim 8, wherein the user body movement comprises a hand gesture.
  14. The apparatus according to claim 8, wherein the at least one processor is configured to:
    render an AR display area on which the identified interaction is performed with respect to the identified at least one target operation object.
  15. A computer-readable storage medium, wherein the computer-readable storage medium stores instructions which, when executed by at least one processor, cause the at least one processor to perform the steps of the augmented reality interaction method according to any one of claims 1-7.
PCT/KR2022/015033 2021-10-08 2022-10-06 Augmented reality interaction method and apparatus WO2023059087A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111171422.1A CN113918015B (en) 2021-10-08 2021-10-08 Interaction method and device for augmented reality
CN202111171422.1 2021-10-08

Publications (1)

Publication Number Publication Date
WO2023059087A1 true WO2023059087A1 (en) 2023-04-13

Family

ID=79238238

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/015033 WO2023059087A1 (en) 2021-10-08 2022-10-06 Augmented reality interaction method and apparatus

Country Status (2)

Country Link
CN (1) CN113918015B (en)
WO (1) WO2023059087A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20240047186A (en) * 2022-10-04 2024-04-12 삼성전자주식회사 Augmented reality apparatus and operating method thereof

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7942744B2 (en) * 2004-08-19 2011-05-17 Igt Virtual input system
US9761049B2 (en) * 2014-03-28 2017-09-12 Intel Corporation Determination of mobile display position and orientation using micropower impulse radar
CN106055089A (en) * 2016-04-27 2016-10-26 深圳市前海万象智慧科技有限公司 Control system for gesture recognition based on man-machine interaction equipment and control method for same

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130234933A1 (en) * 2011-08-26 2013-09-12 Reincloud Corporation Coherent presentation of multiple reality and interaction models
US20150277569A1 (en) * 2014-03-28 2015-10-01 Mark E. Sprenger Radar-based gesture recognition
US20210094180A1 (en) * 2018-03-05 2021-04-01 The Regents Of The University Of Colorado, A Body Corporate Augmented Reality Coordination Of Human-Robot Interaction
US20190362557A1 (en) * 2018-05-22 2019-11-28 Magic Leap, Inc. Transmodal input fusion for a wearable system
US20200064996A1 (en) * 2018-08-24 2020-02-27 Google Llc Smartphone-Based Radar System Facilitating Ease and Accuracy of User Interactions with Displayed Objects in an Augmented-Reality Interface

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117368869A (en) * 2023-12-06 2024-01-09 航天宏图信息技术股份有限公司 Visualization method, device, equipment and medium for radar three-dimensional power range
CN117368869B (en) * 2023-12-06 2024-03-19 航天宏图信息技术股份有限公司 Visualization method, device, equipment and medium for radar three-dimensional power range

Also Published As

Publication number Publication date
CN113918015A (en) 2022-01-11
CN113918015B (en) 2024-04-19

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22878908

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE