CN114531582A - Augmented reality function control method and electronic equipment

Augmented reality function control method and electronic equipment

Info

Publication number
CN114531582A
Authority
CN
China
Prior art keywords
positioning
electronic device
tag
augmented reality
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011206535.6A
Other languages
Chinese (zh)
Other versions
CN114531582B (en)
Inventor
李世明
王帅
魏江波
唐建中
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202011206535.6A priority Critical patent/CN114531582B/en
Priority to PCT/CN2021/127781 priority patent/WO2022089625A1/en
Publication of CN114531582A publication Critical patent/CN114531582A/en
Application granted granted Critical
Publication of CN114531582B publication Critical patent/CN114531582B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/65Control of camera operation in relation to power supply
    • H04N23/651Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/021Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences

Abstract

An embodiment of the application provides an augmented reality function control method and an electronic device. The method includes: the electronic device detects a positioning tag within a preset detection range, where the preset detection range is in front of a camera of the electronic device and the positioning tag identifies the location of an augmented reality scene; when the electronic device does not detect a positioning tag within the preset detection range, the electronic device does not start, or turns off, the augmented reality function, where the augmented reality function is used to process data of the augmented reality scene and display an augmented reality picture; when the electronic device detects a first positioning tag within the preset detection range, the electronic device starts the augmented reality function, the first positioning tag being one of the positioning tags.

Description

Augmented reality function control method and electronic equipment
Technical Field
The present application relates to the field of communications technologies, and in particular, to a method for controlling an augmented reality function and an electronic device.
Background
With the development of society, augmented reality (AR) devices are used in many aspects of daily life. An AR device works by having a processor analyze the scene and then project or display virtual objects on the device, so that virtual images can be shown to the user while the user views the real-world scene. The user may also interact with the virtual images to achieve an augmented reality effect.
However, an AR device needs high-precision spatial sensors such as cameras to operate, and a high-performance processor must process a large amount of image data in real time, which makes the power consumption of the AR device large. Current AR devices are powered either over a wire or by a battery.
A wired power supply confines the AR device to a very small space and therefore greatly limits its usage scenarios. A battery supply usually gives a short battery life; to extend it as much as possible, the precision of the spatial sensing chip may be lowered, the sampling rate reduced, and so on, so that the virtual image seen by the user fits the real scene poorly in registration, field of view and other respects, the sense of immersion is weak, and the user experience is poor. Increasing the battery capacity to extend battery life makes the device heavier and less comfortable to wear. How to reduce the power consumption of an AR device and extend its battery life without degrading the user experience is therefore a technical problem that those skilled in the art are working to solve.
Disclosure of Invention
The embodiment of the application discloses an augmented reality function control method and electronic equipment, which can reduce power consumption.
A first aspect of the embodiments of the application discloses an augmented reality function control method. The electronic device detects a positioning tag within a preset detection range; the preset detection range is in front of a camera of the electronic device; the positioning tag identifies the location of an augmented reality scene. When the electronic device does not detect a positioning tag within the preset detection range, the electronic device does not start, or turns off, the augmented reality function, where the augmented reality function is used to process data of the augmented reality scene and display an augmented reality picture. When the electronic device detects a first positioning tag within the preset detection range, the electronic device starts the augmented reality function, the first positioning tag being one of the positioning tags.
In this method, whether the augmented reality function is started is determined by whether the electronic device detects a positioning tag within the preset detection range, so the high-power-consumption components run only when an augmented reality scene is actually nearby.
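Purely as an illustration of the decision logic of the first aspect (not part of the claims), the top-level behavior can be summarized in the following C sketch. The helper functions detect_tag_in_range(), start_ar_function() and stop_ar_function() are hypothetical names introduced here for readability and are not taken from the application.

    #include <stdbool.h>

    /* Hypothetical helpers: report whether a positioning tag is currently
     * detected within the preset detection range in front of the camera,
     * and start / stop the AR function (camera, ISP, GPU, display pipeline). */
    bool detect_tag_in_range(void);
    void start_ar_function(void);
    void stop_ar_function(void);

    /* Top-level control loop: the AR function runs only while a
     * positioning tag is detected within the preset detection range. */
    void ar_control_loop(void)
    {
        bool ar_running = false;
        for (;;) {
            bool tag_present = detect_tag_in_range();
            if (tag_present && !ar_running) {
                start_ar_function();   /* first positioning tag detected */
                ar_running = true;
            } else if (!tag_present && ar_running) {
                stop_ar_function();    /* no tag: do not start, or turn off */
                ar_running = false;
            }
        }
    }

The implementations below refine what starting and stopping mean (camera, ISP, GPU) and how the positioning sensor, coprocessor and controller cooperate to do it.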
In one possible implementation, that the electronic device does not start, or turns off, the augmented reality function includes: the electronic device does not start, or turns off, the camera, the image signal processor ISP and the graphics processor GPU.
In another possible implementation, that the electronic device starts the augmented reality function includes: the electronic device starts the camera, the image signal processor ISP and the graphics processor GPU.
In another possible implementation, that the electronic device detects the positioning tag within a preset detection range includes: a positioning sensor in the electronic device detects the positioning tag within the preset detection range.
In yet another possible implementation, the positioning sensor is an ultra-wideband (UWB) sensor, and the positioning tag is a UWB tag.
In another possible implementation, when the electronic device detects the first positioning tag within the preset detection range, the electronic device starts the augmented reality function, which specifically includes: when a positioning sensor in the electronic device detects the first positioning tag within the preset detection range, the positioning sensor sends a first indication signal to a coprocessor, where the first indication signal indicates that the positioning sensor has detected the first positioning tag within the preset detection range; the coprocessor receives the first indication signal from the positioning sensor and determines the value of a flag bit in the coprocessor; and a controller in the electronic device starts the camera, the image signal processor ISP and the graphics processor GPU according to the value of the flag bit in the coprocessor.
In another possible implementation, when the electronic device does not detect a positioning tag within the preset detection range, the electronic device does not start, or turns off, the augmented reality function, which specifically includes: when the positioning sensor in the electronic device does not detect a positioning tag within the preset detection range, the positioning sensor sends a second indication signal to the coprocessor, where the second indication signal indicates that the positioning sensor has not detected a positioning tag within the preset detection range; the coprocessor receives the second indication signal from the positioning sensor and determines the value of the flag bit in the coprocessor; and the controller in the electronic device turns off the camera, the image signal processor ISP and the graphics processor GPU according to the value of the flag bit in the coprocessor.
In another possible implementation, after the electronic device starts the augmented reality function upon detecting the first positioning tag within the preset detection range, the method further includes: the electronic device processes and displays the augmented reality scene at the location of the first positioning tag.
In another possible implementation, after the electronic device starts the augmented reality function upon detecting the first positioning tag within the preset detection range, and before the electronic device processes and displays the augmented reality scene at the location of the first positioning tag, the method further includes: when a positioning sensor in the electronic device detects a second positioning tag within the preset detection range, comparing a first distance and a second distance, where the first distance is the distance between the first positioning tag and the positioning sensor and the second distance is the distance between the second positioning tag and the positioning sensor; and the electronic device determines that the second distance is greater than the first distance.
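One possible reading of this implementation, sketched below for illustration only, is that when two positioning tags are both in range the electronic device compares their distances to the positioning sensor and keeps displaying the scene of the nearer tag. The names first_tag_distance_m, second_tag_distance_m and select_scene() are hypothetical.

    /* Hypothetical scene selector: given the measured distances from the
     * positioning sensor to the first and the second positioning tag,
     * return which tag's augmented reality scene to process and display.
     * The nearer tag wins, matching the case where the second distance
     * is determined to be greater than the first. */
    enum tag_choice { FIRST_TAG, SECOND_TAG };

    enum tag_choice select_scene(double first_tag_distance_m,
                                 double second_tag_distance_m)
    {
        if (second_tag_distance_m > first_tag_distance_m)
            return FIRST_TAG;   /* second tag is farther: keep the first scene */
        return SECOND_TAG;      /* second tag is nearer: switch scenes         */
    }

The alternative implementation described next instead always processes and displays the scene of the newly detected second positioning tag.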
In another possible implementation, after the electronic device starts the augmented reality function upon detecting the first positioning tag within the preset detection range, and before the electronic device processes and displays the augmented reality scene at the location of the first positioning tag, the method further includes: when a positioning sensor in the electronic device detects a second positioning tag within the preset detection range, the electronic device processes and displays the augmented reality scene at the location of the second positioning tag.
In a second aspect of the embodiments of the present application, an electronic device is disclosed, which includes a positioning sensor and a controller.
The positioning sensor is configured to detect a positioning tag within a preset detection range; the preset detection range is in front of the camera; the positioning tag identifies the location of an augmented reality scene.
The controller is configured to control the augmented reality function not to start, or to be turned off, when the positioning sensor does not detect a positioning tag within the preset detection range; the augmented reality function is used to process data of the augmented reality scene and display an augmented reality picture.
The controller is further configured to control the augmented reality function to start when the positioning sensor detects a first positioning tag within the preset detection range, where the first positioning tag is one of the positioning tags.
In a possible implementation, the controller is further configured to control the camera, the image signal processor ISP and the graphics processor GPU not to start, or to be turned off.
In yet another possible implementation, the controller is further configured to control the camera, the image signal processor ISP and the graphics processor GPU to start.
In yet another possible implementation, the positioning sensor is an ultra-wideband (UWB) sensor, and the positioning tag is a UWB tag.
In yet another possible implementation, the positioning sensor is further configured to send a first indication signal to the coprocessor when the positioning sensor in the electronic device detects a first positioning tag within the preset detection range, where the first indication signal indicates that the positioning sensor has detected the first positioning tag within the preset detection range; the coprocessor is configured to receive the first indication signal from the positioning sensor and determine the value of a flag bit in the coprocessor; and the controller is configured to start the camera, the image signal processor ISP and the graphics processor GPU according to the value of the flag bit in the coprocessor.
In yet another possible implementation, the positioning sensor is further configured to send a second indication signal to the coprocessor when the positioning sensor in the electronic device does not detect a positioning tag within the preset detection range, where the second indication signal indicates that the positioning sensor has not detected a positioning tag within the preset detection range; the coprocessor is further configured to receive the second indication signal from the positioning sensor and determine the value of the flag bit in the coprocessor; and the controller is further configured to turn off the camera, the image signal processor ISP and the graphics processor GPU according to the value of the flag bit in the coprocessor.
In yet another possible implementation, the electronic device further includes: a processor, configured to process and display the augmented reality scene at the location of the first positioning tag.
In yet another possible implementation, the electronic device further includes: a distance sensor, configured to compare a first distance and a second distance when the positioning sensor detects a second positioning tag within the preset detection range, where the first distance is the distance between the first positioning tag and the positioning sensor and the second distance is the distance between the second positioning tag and the positioning sensor; and the electronic device determines that the second distance is greater than the first distance.
In yet another possible implementation manner, the processor is further configured to, when the positioning sensor detects a second positioning tag within the preset detection range, process and display an augmented reality scene at a position where the second positioning tag is located.
For the technical effects of the second aspect or of its possible implementations, reference may be made to the description of the technical effects of the first aspect or of the corresponding implementations.
A third aspect of the embodiments of the present application discloses an electronic device, including: one or more processors, and a memory; the memory is coupled to the one or more processors for storing computer program code comprising computer instructions which are invoked by the one or more processors to implement the method described in the first aspect or a possible implementation manner of the first aspect.
A fourth aspect of embodiments of the present application discloses a computer storage medium, in which a computer program is stored, where the computer program includes program instructions, and the program instructions, when executed by a processor, are configured to implement the method described in the first aspect or the possible implementation manner of the first aspect.
A fifth aspect of embodiments of the present application discloses a computer program product, which, when run on an electronic device, causes the electronic device to implement the method described in the first aspect or in a possible implementation manner of the first aspect.
Drawings
The drawings used in the embodiments of the present application are described below.
FIG. 1 is a schematic diagram of a positioning sensor installation location provided by an embodiment of the present application;
FIG. 2 is a schematic view of another positioning sensor installation location provided by an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 4 is a schematic diagram of an electronic device moving from a position where an augmented reality scene is detected to a position where no augmented reality scene is detected according to an embodiment of the present application;
fig. 5 is a schematic diagram illustrating an electronic device detecting a location of an augmented reality scene according to an embodiment of the present application;
fig. 6 is a schematic diagram illustrating a position where an electronic device does not detect an augmented reality scene according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 8 is a schematic diagram of an electronic device moving from a position where a positioning tag is detected to a position where no positioning tag is detected according to an embodiment of the present application;
fig. 9 is a schematic diagram of an electronic device detecting a positioning tag according to an embodiment of the present application;
fig. 10 is a schematic diagram of an electronic device that does not detect a positioning tag according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 12 is a schematic view of an indoor museum provided in the embodiment of the present application;
fig. 13 is a schematic diagram of an exhibit with a UWB tag detected in a detection range according to an embodiment of the present application;
FIG. 14 is a schematic diagram of an image displayed on a display screen according to an embodiment of the present application;
fig. 15 is a schematic diagram of an exhibit in which a UWB tag is not detected in a detection range according to an embodiment of the present application;
FIG. 16 is a schematic diagram of an image displayed on a display screen according to an embodiment of the present application;
fig. 17 is a schematic diagram of an exhibit with a detected UWB tag to an exhibit without a detected UWB tag in a detection range according to an embodiment of the present application;
fig. 18 is a schematic diagram of an exhibit with a UWB tag detected to an exhibit with a UWB tag detected in a detection range provided by an embodiment of the present application;
FIG. 19 is a schematic diagram of an image displayed on a display screen according to an embodiment of the present application;
fig. 20 is a schematic diagram of an exhibit with no UWB tag detected to an exhibit with UWB tag detected in a detection range provided by an embodiment of the present application;
fig. 21 is a schematic diagram of an exhibit without a detected UWB tag in a detection range to an exhibit without a detected UWB tag according to an embodiment of the present application;
fig. 22 is a schematic diagram of 2 UWB-tagged exhibits detected within a detection range according to an embodiment of the present application;
FIG. 23 is a schematic diagram of an image displayed on a display screen according to an embodiment of the present application;
FIG. 24 is a schematic view of an image displayed on a display screen according to an embodiment of the present application;
fig. 25 is a flowchart illustrating an augmented reality function control method according to an embodiment of the present application.
Detailed Description
The embodiments of the present application will be described below with reference to the drawings. The terminology used in the description of the embodiments herein is for the purpose of describing particular embodiments herein only and is not intended to be limiting of the application.
Several concepts involved in the embodiments of the present application are presented below:
(1) Starting and turning off the augmented reality function:
in an embodiment of the present application, an electronic device is an AR apparatus having an AR function. The electronic device can process data in the AR scene and display an AR picture by starting the AR function.
It can be understood that the electronic device consumes more power when the AR function is started and less power when the AR function is not started or is turned off.
Illustratively, the coprocessor and the positioning sensor in the electronic device keep operating when the AR function in the electronic device is turned off.
A coprocessor: the auxiliary processor is used for relieving the processing burden of an Application Processor (AP) and helping the AP to perform partial work, and the power consumption of the auxiliary processor is generally smaller than that of the AP. The coprocessor may be a chip-level control unit formed by appropriately reducing the frequency and specification of a Central Processing Unit (CPU) and integrating related modules such as a counter, a memory, a display driver, and the like on a single chip. For example, one of the coprocessors may be a general-purpose smart sensing hub (sensorhub), the main function of which is to connect and process data from various sensors, the power consumption of the coprocessor chip of the sensorhub being only 1% -5% of the power consumption of the application processor chip.
When the AR function in the electronic device is activated, a controller in the electronic device controls a camera, an Image Signal Processor (ISP), or a Graphics Processing Unit (GPU) to operate.
Specifically, in some embodiments of the present application, how the positioning sensor interacts with the coprocessor and how the coprocessor interacts with the controller, and thus whether the camera, the ISP or the GPU works, can be determined in the following two ways:
the first mode is as follows:
when the positioning sensor detects the positioning tag, the positioning sensor sends a first indicating signal to the coprocessor, the first indicating signal is used for indicating the positioning sensor to detect the positioning tag, and correspondingly, after the coprocessor receives the first indicating signal, the value of the AR function start-stop flag bit is set to be 1; when the positioning sensor does not detect the positioning tag, the positioning sensor sends a second indicating signal to the coprocessor, the second indicating signal is used for indicating that the positioning sensor does not detect the positioning tag, and correspondingly, after receiving the second indicating signal, the coprocessor sets the value of the AR function start-stop flag bit to be 0.
The controller controls the camera, the ISP or the GPU to work by detecting that the value of an AR function start-stop zone bit in the coprocessor is 1; the controller controls the camera, the ISP or the GPU to stop working by detecting that the value of the AR function start-stop zone bit is 0.
The second mode is as follows:
when the position sensor detects the position tag from the undetected position tag to the detected position tag, or from the detected position tag to the undetected position tag, the position sensor sends a third indication signal to the coprocessor, the third indication signal is used to indicate that the value of the AR function start/stop flag bit changes, for example, when the positioning sensor does not detect the positioning tag at the 5 th second, the value of the AR function start/stop flag bit in the coprocessor is 0, when the positioning sensor detects the positioning label in the 10 th second, the value of the AR function start-stop flag bit is determined to be changed, then the positioning sensor sends a third indication signal to the coprocessor in the 10 th second, the third indication signal is used for indicating that the value of the AR function start-stop flag bit changes, and correspondingly, after receiving the third indication signal, the coprocessor changes the value of the AR function start-stop flag bit from 0 to 1.
The controller controls the camera, the ISP or the GPU to work by detecting that the value of an AR function start-stop zone bit in the coprocessor is 1; or the controller controls the camera, the ISP or the GPU to work by detecting that the value of the AR function start-stop zone bit in the coprocessor is changed from 0 to 1.
In the above description, the value of the AR function start/stop flag is 0, 1, that is, when the positioning sensor detects the positioning tag, the value of the AR function start/stop flag is 1; when the positioning sensor does not detect the positioning label, the value of the AR function start-stop zone bit is 0; of course, the value of the start-stop flag bit of the AR function may also be other different values that respectively represent starting or closing the AR function, and the embodiment of the present application is not limited.
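Purely as an illustration of the two modes above (not a required implementation), the coprocessor-side handling of the indication signals might look like the following C sketch. The names ar_flag, INDICATION_TAG_DETECTED and so on are hypothetical.

    #include <stdint.h>

    /* Hypothetical indication signals sent by the positioning sensor. */
    enum indication {
        INDICATION_TAG_DETECTED     = 1,  /* first indication signal  */
        INDICATION_TAG_NOT_DETECTED = 2,  /* second indication signal */
        INDICATION_FLAG_CHANGED     = 3,  /* third indication signal  */
    };

    /* AR function start-stop flag bit kept in the coprocessor:
     * 1 = start the AR function, 0 = do not start / turn it off. */
    static volatile uint8_t ar_flag = 0;

    /* Mode 1: the sensor reports its current detection state every time. */
    void coprocessor_on_indication_mode1(enum indication sig)
    {
        if (sig == INDICATION_TAG_DETECTED)
            ar_flag = 1;
        else if (sig == INDICATION_TAG_NOT_DETECTED)
            ar_flag = 0;
    }

    /* Mode 2: the sensor reports only when the detection state changes,
     * so the coprocessor simply toggles the flag bit. */
    void coprocessor_on_indication_mode2(enum indication sig)
    {
        if (sig == INDICATION_FLAG_CHANGED)
            ar_flag = ar_flag ? 0 : 1;
    }

In both modes the controller only needs to look at ar_flag, or at changes of ar_flag, to decide whether the camera, the ISP and the GPU should run.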
(2) Positioning sensor and positioning tag:
In the embodiments of the application, the positioning sensor is used to detect a positioning tag and determine the position of the positioning tag; the positioning tag is used to identify the location of an augmented reality scene.
Detection range of the positioning sensor:
the visual range of the user is determined because the visual range of the user is calculated and determined by a human visual model. Under a certain user visual range, the position sensor is positioned differently, and the detection range is enlarged in order to cover the user visual range. In one example, as shown in fig. 1, the visual range of both eyes of the user is determined to be 188 degrees and 2 meters according to a human eye visual model, and when the positioning sensor is installed at a position on the electronic device 500 between the two display screens, the detection range is 190 degrees and 2.2 meters because the detection range is greater than or equal to the detection range for the visual range. In still another example, as shown in fig. 2, the visual range of both eyes of the user is determined to be 188 degrees and 2 meters according to the human eye visual model, and when the positioning sensor is installed at a position in the middle of the right display screen on the electronic device 500, the detection range is 200 degrees and 2.5 meters because the detection range is greater than or equal to the detection range for the visual range. In still another example, the visual range of both eyes of the user is determined to be 188 degrees and 2 meters according to the human visual model, and when the positioning sensor is installed at a position in the middle of the left display screen on the electronic device 500, the detection range is 200 degrees and 2.5 meters because the detection range is greater than or equal to the visual range.
In the embodiment of the application, the detection direction of the positioning sensor is in front of the camera in the electronic equipment.
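As a rough illustration only (not part of the claims), deciding whether a positioning tag lies inside such a detection range reduces to an angle-and-distance test. The constants below reuse the 190-degree / 2.2-meter example from fig. 1, and tag_angle_deg / tag_distance_m are hypothetical measurements assumed to come from the positioning sensor.

    #include <stdbool.h>
    #include <math.h>

    /* Example detection range from fig. 1: 190 degrees and 2.2 meters,
     * centered on the forward direction of the camera. */
    #define DETECT_ANGLE_DEG  190.0
    #define DETECT_RANGE_M    2.2

    /* Return true if a tag at the given bearing (degrees, 0 = straight ahead
     * of the camera, signed left/right) and distance (meters) falls inside
     * the preset detection range. */
    bool tag_in_detection_range(double tag_angle_deg, double tag_distance_m)
    {
        return fabs(tag_angle_deg) <= DETECT_ANGLE_DEG / 2.0
            && tag_distance_m     <= DETECT_RANGE_M;
    }

A different mounting position would only change the two constants, as in the fig. 2 example (200 degrees and 2.5 meters).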
Optionally, in some embodiments of the present application, when the positioning sensor is an Ultra Wide Band (UWB) sensor, the positioning tag is a UWB tag; when the positioning sensor is a sensor with a Bluetooth positioning function, the positioning tag is a Bluetooth tag; when the positioning sensor is a sensor with Near Field Communication (NFC) function, the positioning tag is an NFC tag; when the positioning sensor is a sensor having a Radio Frequency Identification (RFID) function, the positioning tag is an RFID tag.
At present, the structure and working principle of the electronic device in the prior art are as follows. As shown in fig. 3, fig. 3 is a schematic structural diagram of an electronic device including a processor, a camera, a display screen, and an image signal processor (ISP) or a graphics processing unit (GPU), where the processor, the ISP and the GPU are not shown in fig. 3. The display screen may be a lens through which the user can see the image of the real world.
Taking a museum scene as an example, the museum contains exhibit 1 and exhibit 2; exhibit 1 carries position information of an augmented reality scene, and exhibit 2 does not. As shown in fig. 4, when the user moves from exhibit 1 to exhibit 2, that is, when the electronic device moves from a position where the augmented reality scene is detected to a position where it is not detected, the AR function in the electronic device stays in the started working state. As shown in fig. 5, when the user keeps looking at exhibit 1, the AR function in the electronic device is in the started working state and the augmented reality scene of exhibit 1 can be processed; as shown in fig. 6, when the user keeps looking at exhibit 2, there is no augmented reality scene on exhibit 2 and no augmented reality scene needs to be processed, yet the AR function in the electronic device is still in the started working state.
Analyzing the prior-art process from power-on of the electronic device to display of the augmented reality image shows that, at present, once the electronic device is powered on the AR function is always in the started state, so the processor, the camera, the display screen, the ISP, the GPU and so on have to keep working continuously, and the power consumption is high.
In order to solve the above problem, an embodiment of the present application provides an augmented reality function control method and an electronic device. Illustratively, as shown in fig. 7, the electronic device may include a processor, a camera, a positioning sensor, a display screen, and an ISP or GPU, where the processor, the ISP and the GPU are not shown in fig. 7.
Taking a museum scene as an example, the museum contains exhibit 1 and exhibit 2; exhibit 1 carries a positioning tag, and exhibit 2 does not. As shown in fig. 8, when the user moves from exhibit 1 to exhibit 2, that is, when the electronic device goes from detecting the positioning tag to not detecting it, whether the AR function in the electronic device is working is shown in fig. 9 and fig. 10. As shown in fig. 9, when the user keeps looking at exhibit 1, that is, the electronic device detects the positioning tag, the AR function in the electronic device is in the started working state; as shown in fig. 10, when the user keeps looking at exhibit 2, that is, the electronic device does not detect a positioning tag, the AR function in the electronic device is in the turned-off state.
Because the high-power-consumption devices do not work when no positioning tag is detected, adopting the augmented reality function control approach of the embodiments of the present application reduces power consumption.
In the embodiment of the present application, the electronic device may be a handheld electronic device, a head-mounted electronic device, or the like with AR function, and is not limited herein.
For example, a user may wear a head-mounted electronic device to achieve different effects such as Virtual Reality (VR), AR, Mixed Reality (MR), and the like. For example, the head-mounted electronic device may be glasses, goggles, or the like. When the head-mounted electronic equipment is installed on the head of a user, the eyes of the user can see the image presented by the display screen of the head-mounted electronic equipment.
Referring to fig. 11, fig. 11 is a schematic structural diagram of an electronic device 1100.
The electronic device 1100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, buttons 190, a camera 193, a display 194, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, an acceleration sensor 180E, a distance sensor 180F, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a positioning sensor 180N, and the like.
It is to be understood that the illustrated configuration of the embodiment of the invention does not constitute a specific limitation on the electronic device 1100. In other embodiments of the present application, electronic device 1100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an AP, a coprocessor, a modem processor, a GPU, an ISP, a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can be, among other things, a neural center and a command center of the electronic device 1100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. For example, a memory for storing instructions and data may be provided in the coprocessor. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or reused. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose-input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, a bus or Universal Serial Bus (USB) interface, and the like.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 1100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate over a CSI interface to implement the capture functionality of electronic device 1100. The processor 110 and the display screen 194 communicate via a DSI interface to implement the display functionality of the electronic device 1100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 1100, and may also be used to transmit data between the electronic device 1100 and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 1100. In other embodiments of the present application, the electronic device 1100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 1100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The electronic device 1100 implements display functions via the GPU, the display screen 194, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 1100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The number of display screens 194 in the electronic device 1100 may be two, corresponding respectively to the two eyes of the user 200.
The electronic device 1100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, electronic device 1100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The camera 193 may be mounted on the side of the electronic device 1100, and may also be mounted on the electronic device 1100 at a position between two display screens. The camera 193 is used to capture images and video in real time within the perspective of the user 200. The electronic device 1100 generates a virtual image from the captured real-time image and video and displays the virtual image through the display screen 194.
The processor 110 may determine the virtual image displayed on the display screen 194 from a still image or video image captured by the camera 193, in conjunction with data (e.g., brightness, sound, etc.) acquired by the sensor module 180, so as to superimpose the virtual image on the real-world object.
The digital signal processor is used to process digital signals; it can process digital image signals as well as other digital signals. For example, when the electronic device 1100 selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 1100 may support one or more video codecs. In this way, the electronic device 1100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 1100 can be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 1100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, a phonebook, etc.) created during use of the electronic device 1100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the electronic device 1100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 1100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset interface 170D, and the application processor, etc. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic device 1100 can listen to music through the speaker 170A or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic device 1100 answers a call or voice information, it can answer the voice by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mike" or "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a voice signal into the microphone 170C by speaking close to it. The electronic device 1100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 1100 may be provided with two microphones 170C, which, in addition to collecting sound signals, can also implement a noise reduction function. In other embodiments, the electronic device 1100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and the like.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
In some embodiments, the electronic device 1100 may include one or more keys 190 that may control the electronic device to provide a user with access to functions on the electronic device 1100. The keys 190 may be in the form of buttons, switches, dials, and touch or near touch sensing devices (e.g., touch sensors). Specifically, for example, the user 200 may open the display screen 194 of the electronic device 1100 by pressing a button. The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic device 1100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic device 1100.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 1100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 1100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 1100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but have different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 1100. In some embodiments, the angular velocity of the electronic device 1100 about three axes (i.e., x, y, and z axes) may be determined by the gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 1100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 1100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 1100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by the barometric pressure sensor 180C.
The acceleration sensor 180E can detect the magnitude of acceleration of the electronic device 1100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 1100 is stationary. The method can also be used for recognizing the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor 180F for measuring a distance. The electronic device 1100 may measure distance by infrared or laser. In some embodiments, taking a picture of a scene, the electronic device 1100 may utilize the range sensor 180F to range for fast focus.
The ambient light sensor 180L is used to sense the ambient light level. The electronic device 1100 may adaptively adjust the brightness of the display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 1100 may utilize the collected fingerprint characteristics to implement fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, electronic device 1100 implements a temperature processing strategy using the temperature detected by temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 1100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 1100 heats the battery 142 when the temperature is below another threshold to avoid an abnormal shutdown of the electronic device 1100 due to low temperatures. In other embodiments, the electronic device 1100 performs boosting of the output voltage of the battery 142 when the temperature is below a further threshold value to avoid abnormal shutdown due to low temperature.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194; the touch sensor 180K and the display screen 194 form a touch screen, also called a "touch-controlled screen". The touch sensor 180K is used to detect a touch operation applied on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on a surface of the electronic device 1100 at a position different from that of the display screen 194.
In some embodiments, the positioning sensor 180N is used to detect and read positioning tags, and has precise positioning and pointing capabilities.
The augmented reality function control method in the embodiments of the present application is described below with reference to the hardware structure of the electronic device 1100 and to specific scenarios:
Take as an example the case where the electronic device 1100 is a pair of AR glasses, the positioning sensor is a UWB sensor, the positioning tag is a UWB tag, the UWB sensor is located between the two display screens, and a user wears the AR glasses to view exhibits in an indoor museum. Most exhibits a user sees when visiting a museum are static; the rich human and historical information behind the exhibits is difficult for the user to understand and can only be imagined from the explanatory material. With the AR glasses, the user can interact with the exhibits to obtain more information, which makes browsing the museum more interesting and attractive. For example, as shown in fig. 12, which is a schematic diagram of an indoor museum, assume that the indoor museum contains 4 exhibits: exhibit 1, exhibit 2, exhibit 3, and exhibit 4. UWB tags are deployed on exhibit 1 and exhibit 3 (a UWB tag can be regarded as a chip with positioning and pointing functions), while no UWB tags are deployed on exhibit 2 and exhibit 4.
In the museum, the user 200 turns on the AR glasses by pressing the power key or in another way. After the AR glasses are powered on, the UWB sensor and the coprocessor in the AR glasses start working by default, while the camera, the display screen, the ISP, and the GPU do not start working. After startup, the value of the AR function start/stop flag bit in the coprocessor is 0 by default, and the controller in the AR glasses periodically checks whether the value of the AR function start/stop flag bit in the coprocessor has changed.
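The flag-bit mechanism above can be illustrated with a short sketch. This is a minimal illustration, assuming hypothetical class and method names (Coprocessor, Controller, on_uwb_report, poll_once) that are not part of the patent; it only shows how a coprocessor-held flag and a polling controller could gate the camera, display screen, ISP, and GPU.

```python
# Sketch of the flag-bit mechanism: the coprocessor holds an AR start/stop flag
# that the UWB sensor updates, and the controller polls the flag to gate the
# camera, display screen, ISP, and GPU. Names are illustrative assumptions.
class Coprocessor:
    def __init__(self):
        self.ar_flag = 0  # 0 = AR function off (default after startup), 1 = AR function on

    def on_uwb_report(self, tag_detected: bool):
        # Called when the UWB sensor reports its detection result.
        self.ar_flag = 1 if tag_detected else 0

class Controller:
    def __init__(self, coprocessor: Coprocessor):
        self.cop = coprocessor
        self.ar_components_on = False  # camera, display screen, ISP, GPU

    def poll_once(self):
        # Periodic check of the flag bit in the coprocessor.
        if self.cop.ar_flag == 1 and not self.ar_components_on:
            self.ar_components_on = True   # start camera, display screen, ISP, GPU
        elif self.cop.ar_flag == 0 and self.ar_components_on:
            self.ar_components_on = False  # put them into standby/sleep

cop = Coprocessor()
ctrl = Controller(cop)
cop.on_uwb_report(tag_detected=True)   # UWB sensor detects a tag and reports
ctrl.poll_once()
print(ctrl.ar_components_on)           # True: AR pipeline started
```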
The user 200 wears the AR glasses to move in the indoor museum, and the AR glasses also move along with the user 200.
In the first case: after the AR glasses are powered on, the camera, the display screen, the ISP, and the GPU have not started working, and the value of the AR function start/stop flag bit in the coprocessor is 0. The user 200 wears the AR glasses, and the UWB sensor detects a UWB tag. As shown in fig. 13, which is a schematic diagram of an exhibit whose UWB tag is detected within the detection range, suppose the user 200, wearing the AR glasses, gazes at exhibit 1 for a short time. After the UWB sensor in the AR glasses detects the UWB tag in exhibit 1, it reads the UWB tag information and reports to the coprocessor. Correspondingly, after receiving the report from the UWB sensor, the coprocessor changes the value of the AR function start/stop flag bit from 0 to 1. The controller detects that the value of the AR function start/stop flag bit in the coprocessor has changed from 0 to 1 and controls the camera, the display screen, the ISP, and the GPU to start working. The camera then scans exhibit 1; the processor analyzes the image and generates a virtual image, for example a drum; the ISP or the GPU attaches the virtual image, for example the drum, at the determined angle and tilt to the real image seen through the display screen, for example exhibit 1, and displays it on the display screen, as shown in fig. 14. Accordingly, the user can see, through the display screen, a dynamic image of the terracotta warriors beating the drum.
In the second case: after the AR glasses are powered on, the camera, the display screen, the ISP, and the GPU have not started working, and the value of the AR function start/stop flag bit in the coprocessor is 0. The user 200 wears the AR glasses, and the UWB sensor does not detect any UWB tag. As shown in fig. 15, which is a schematic diagram of an exhibit for which no UWB tag is detected within the detection range, for example, the user 200 wears the AR glasses and gazes at exhibit 2 for a short time; that is, the UWB sensor does not detect a UWB tag. The value of the AR function start/stop flag bit in the coprocessor remains 0, and the camera, the display screen, the ISP, and the GPU stay in a standby or sleep state. At this time, the UWB sensor and the coprocessor in the AR glasses keep working by default and the AR glasses operate in a low-power-consumption mode; the display screen is in an off state, and the user sees the image of the real world through the display screen, as shown in fig. 16.
In the third case: from detecting a UWB tag to not detecting a UWB tag, the camera, the display screen, the ISP, and the GPU change from the started working state to a standby or sleep state. When the UWB sensor detects the UWB tag, the value of the AR function start/stop flag bit is 1. As shown in fig. 17, during the movement of the user 200, a transition may occur from an exhibit whose UWB tag is detected within the detection range to an exhibit for which no UWB tag is detected. In an example, the user 200 wears the AR glasses to watch exhibit 1 and then watches exhibit 2, where a UWB tag is deployed on exhibit 1 and no UWB tag is deployed on exhibit 2. When the user 200 wears the AR glasses to watch exhibit 1, the UWB sensor in the AR glasses detects the UWB tag in exhibit 1, reads its information, and reports to the coprocessor; correspondingly, the coprocessor sets the value of the AR function start/stop flag bit to 1, and the controller detects that the value of the AR function start/stop flag bit in the coprocessor is 1 and controls the camera, the display screen, the ISP, and the GPU to start working. The camera then scans exhibit 1; the processor analyzes the image and generates a virtual image, for example a drum; the ISP or the GPU attaches the virtual image, for example the drum, at the determined angle and tilt to the real image of exhibit 1 seen through the display screen and displays it on the display screen, as shown in fig. 14, so the user can see, through the display screen, a dynamic image of the terracotta warriors beating the drum. Then the user 200 wears the AR glasses to watch exhibit 2, that is, the UWB sensor no longer detects a UWB tag; the value of the AR function start/stop flag bit in the coprocessor is changed from 1 to 0, and the controller, detecting that the value has changed from 1 to 0, controls the camera, the display screen, the ISP, and the GPU to change from the started working state to a standby or sleep state. At this time, the UWB sensor and the coprocessor in the AR glasses keep working by default, and the AR glasses operate in a low-power-consumption mode.
In the fourth case: from the UWB sensor detecting a UWB tag to again detecting a UWB tag, the camera, the display screen, the ISP, and the GPU remain in the started working state. When the UWB sensor detects a UWB tag, the value of the AR function start/stop flag bit is 1. As shown in fig. 18, during the movement of the user 200, a transition may occur from one exhibit whose UWB tag is detected within the detection range to another exhibit whose UWB tag is detected. In an example, the user 200 wears the AR glasses to watch exhibit 1 and then watches exhibit 3, where UWB tags are deployed on both exhibit 1 and exhibit 3. When the user 200 wears the AR glasses to watch exhibit 1, the UWB sensor in the AR glasses detects the UWB tag in exhibit 1, reads the UWB tag information, and reports to the coprocessor; correspondingly, the coprocessor sets the value of the AR function start/stop flag bit to 1, and the controller detects that the value of the AR function start/stop flag bit in the coprocessor is 1 and controls the camera, the display screen, the ISP, and the GPU to start working. The camera then scans exhibit 1; the processor analyzes the image and generates a virtual image, for example a drum; the ISP or the GPU attaches the virtual image at the determined angle and tilt to the real image of exhibit 1 seen through the display screen and displays it on the display screen, as shown in fig. 14, so the user can see, through the display screen, a dynamic image of the terracotta warriors beating the drum. Then, when the user 200 wears the AR glasses to watch exhibit 3, the UWB sensor in the AR glasses detects the UWB tag in exhibit 3, reads its tag information, and reports to the coprocessor; the value of the AR function start/stop flag bit remains 1, and the controller controls the camera, the display screen, the ISP, and the GPU to remain in the started working state. The camera then scans exhibit 3; the processor analyzes the image and generates a virtual image, for example a segment of text; the ISP or the GPU attaches the virtual image, for example the segment of text, to the real image of exhibit 3 seen through the display screen and displays it on the display screen, as shown in fig. 19.
In the fifth case: from the UWB sensor not detecting a UWB tag to detecting a UWB tag, the camera, the display screen, the ISP, and the GPU change from the standby or sleep state to the started working state. When the UWB sensor does not detect a UWB tag, the value of the AR function start/stop flag bit is 0. As shown in fig. 20, during the movement of the user 200, a transition may occur from an exhibit for which no UWB tag is detected within the detection range to an exhibit whose UWB tag is detected. In an example, the user 200 wears the AR glasses to watch exhibit 4 and then watches exhibit 3, where no UWB tag is deployed on exhibit 4 and a UWB tag is deployed on exhibit 3. When the user 200 wears the AR glasses to watch exhibit 4, the UWB sensor does not detect a UWB tag in exhibit 4, the value of the AR function start/stop flag bit is 0, and the camera, the display screen, the ISP, and the GPU are in a standby or sleep state. Then, when the user 200 wears the AR glasses to watch exhibit 3, the UWB sensor in the AR glasses detects the UWB tag in exhibit 3, reads the UWB tag information, and reports to the coprocessor; correspondingly, the coprocessor sets the value of the AR function start/stop flag bit to 1, and the controller detects that the value of the AR function start/stop flag bit in the coprocessor is 1 and controls the camera, the display screen, the ISP, and the GPU to start working. The camera then scans exhibit 3; the processor analyzes the image and generates a virtual image, for example a segment of text; the ISP or the GPU attaches the virtual image, for example the segment of text, at the determined angle and tilt to the real image of exhibit 3 seen through the display screen and displays it on the display screen, as shown in fig. 19.
In the sixth case: from the UWB sensor not detecting a UWB tag to still not detecting a UWB tag, the camera, the display screen, the ISP, and the GPU remain in the standby or sleep state. When the UWB sensor does not detect a UWB tag, the value of the AR function start/stop flag bit is 0. As shown in fig. 21, during the movement of the user 200, a transition may occur from one exhibit for which no UWB tag is detected within the detection range to another exhibit for which no UWB tag is detected. In an example, the user 200 wears the AR glasses to watch exhibit 2 and then watches exhibit 4, where no UWB tag is deployed on exhibit 2 or exhibit 4. When the user 200 wears the AR glasses to watch exhibit 2, the UWB sensor does not detect a UWB tag, the value of the AR function start/stop flag bit in the coprocessor is 0, and the camera, the display screen, the ISP, and the GPU are in a standby or sleep state. At this time, the UWB sensor and the coprocessor in the AR glasses keep working by default, the AR glasses operate in a low-power-consumption mode, the display screen is in an off state, and the user sees the image of the real world through the display screen, as shown in fig. 16. Then the user 200 wears the AR glasses to watch exhibit 4; the UWB sensor does not detect a UWB tag in exhibit 4, the value of the AR function start/stop flag bit remains 0, and the camera, the display screen, the ISP, and the GPU remain in a standby or sleep state.
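The four single-tag transitions above (the third through sixth cases) reduce to a simple rule: the AR pipeline state follows whether a UWB tag is currently detected. A minimal sketch, with illustrative state names rather than the patent's terminology:

```python
# Sketch of the four single-tag transitions (third through sixth cases): the AR
# pipeline state simply follows whether a UWB tag is currently detected.
def next_pipeline_state(prev_tag_detected: bool, tag_detected: bool) -> str:
    if tag_detected:
        # fourth case (detected -> detected) or fifth case (not detected -> detected)
        return "keep_running" if prev_tag_detected else "start_camera_display_isp_gpu"
    # third case (detected -> not detected) or sixth case (not detected -> not detected)
    return "enter_standby_or_sleep" if prev_tag_detected else "stay_in_standby_or_sleep"

print(next_pipeline_state(True, False))   # enter_standby_or_sleep (third case)
print(next_pipeline_state(False, True))   # start_camera_display_isp_gpu (fifth case)
```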
In the seventh case: the UWB sensor detects 2 UWB tags. As shown in fig. 22, which is a schematic diagram of exhibits whose 2 UWB tags are detected within the detection range, in an example, the user 200 stands in front of exhibit 1 and exhibit 3 wearing the AR glasses, and the UWB sensor scans 2 UWB tags, for example the UWB tag in exhibit 1 and the UWB tag in exhibit 3; the UWB tag in exhibit 1 is regarded as the first positioning tag and the UWB tag in exhibit 3 as the second positioning tag. During scanning, the UWB sensor reports to the coprocessor, and the coprocessor invokes the distance sensor. The distance sensor measures the distance between the UWB tag in exhibit 1 and the UWB sensor, that is, the first distance, and the distance between the UWB tag in exhibit 3 and the UWB sensor, that is, the second distance. Since the second distance is greater than the first distance, the exhibit relatively close to the user, namely exhibit 1, is determined and reported to the coprocessor, and the coprocessor informs the UWB sensor of this exhibit. The UWB sensor then scans the UWB tag in exhibit 1, reads its tag information, and reports to the coprocessor. Correspondingly, after receiving the report from the UWB sensor, the coprocessor sets the value of the AR function start/stop flag bit to 1; the controller detects that the value of the AR function start/stop flag bit in the coprocessor is 1 and controls the camera, the display screen, the ISP, and the GPU to start working. The camera scans exhibit 1; the processor analyzes the image and generates a virtual image, for example a drum; the ISP or the GPU attaches the virtual image, for example the drum, at the determined angle and tilt to the real image of exhibit 1 seen through the display screen and displays it on the display screen, as shown in fig. 23.
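A minimal sketch of the nearest-tag selection used in this seventh case is given below; the tag identifiers and distance values are illustrative assumptions, not data from the patent.

```python
# Sketch of the nearest-tag selection in the seventh case: when two tags are
# detected, the distance to each is measured and the closer exhibit is chosen
# as the AR target.
def select_nearest_tag(tag_distances: dict[str, float]) -> str:
    """Return the identifier of the tag closest to the UWB sensor."""
    return min(tag_distances, key=tag_distances.get)

distances = {"exhibit_1_tag": 1.2, "exhibit_3_tag": 3.5}  # meters (assumed values)
print(select_nearest_tag(distances))  # exhibit_1_tag: AR content is rendered for exhibit 1
```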
In the eighth case: the UWB sensor detects 2 UWB tags. As shown in fig. 22, which is a schematic diagram of exhibits whose 2 UWB tags are detected within the detection range, in an example, the user 200 stands in front of exhibit 1 and exhibit 3 wearing the AR glasses. The UWB sensor scans 2 UWB tags, for example the UWB tag in exhibit 1 and the UWB tag in exhibit 3; the UWB tag in exhibit 1 is regarded as the first positioning tag and the UWB tag in exhibit 3 as the second positioning tag. The UWB sensor scans both UWB tags simultaneously, reads the UWB tag information in exhibit 1 and the UWB tag information in exhibit 3, and reports to the coprocessor. Correspondingly, after receiving the report from the UWB sensor, the coprocessor sets the value of the AR function start/stop flag bit to 1; the controller detects that the value of the AR function start/stop flag bit in the coprocessor is 1 and controls the camera, the display screen, the ISP, and the GPU to start working. The camera then scans exhibit 1 and exhibit 3; the processor analyzes the images and generates virtual images, for example a drum and a segment of text; the ISP or the GPU attaches the virtual images at the determined angles and tilts to the real images seen through the display screen, for example attaches the drum to exhibit 1 and the segment of text to exhibit 3, and displays them on the display screen, as shown in fig. 24.
With reference to the hardware structure of the electronic device 1100, the following describes a method for controlling an augmented reality function in an embodiment of the present application in detail:
referring to fig. 25, fig. 25 is a diagram illustrating an augmented reality function control method according to an embodiment of the present application, where the method includes, but is not limited to, the following steps:
step S2501: the user wears the electronic equipment and turns on the electronic equipment.
Step S2502: the electronic device detects the positioning tag within a preset detection range.
Specifically, the preset detection range is greater than or equal to the user's visual range, and the user's visual range is determined by calculation according to a human-eye visual model. For details, reference may be made to the foregoing description of the detection range of the positioning sensor; details are not repeated here.
Specifically, the electronic device may detect the positioning tag within the preset detection range through the positioning sensor. When the positioning sensor is an ultra-wideband UWB sensor, the positioning tag is a UWB tag; of course, the positioning sensor and the positioning tag may also be other types of positioning sensors and positioning tags. For details, reference may be made to the foregoing description of the positioning sensor and the positioning tag, which is not repeated here.
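One hedged way to test whether a tag falls within a preset detection range that covers at least the user's visual range, as described above, is an angular check against the gaze direction plus a maximum distance; the half-angle, distance limit, and geometry below are illustrative assumptions, not values derived from the human-eye visual model used in this application.

```python
# Sketch: a tag counts as "within range" if it lies inside a cone around the
# gaze direction and within a maximum distance. All parameters are assumptions.
import math

def within_detection_range(tag_xyz, sensor_xyz, gaze_dir, half_angle_deg=60.0, max_dist_m=10.0):
    dx = [t - s for t, s in zip(tag_xyz, sensor_xyz)]
    dist = math.sqrt(sum(c * c for c in dx))
    if dist == 0 or dist > max_dist_m:
        return False
    # gaze_dir is assumed to be a unit vector pointing where the user looks
    cos_angle = sum(a * b for a, b in zip(dx, gaze_dir)) / dist
    return cos_angle >= math.cos(math.radians(half_angle_deg))

print(within_detection_range((2.0, 0.5, 0.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))  # True
```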
Step S2503: and when the electronic equipment does not detect the positioning tag within the preset detection range, the augmented reality function is not started or closed.
Specifically, not starting or turning off the augmented reality function means that the camera, the display screen, the ISP, and the GPU are in a standby or sleep state. That is, when the electronic device does not detect a positioning tag within the preset detection range, the camera, the display screen, the ISP, and the GPU are in a standby or sleep state.
Specifically, not starting or turning off the augmented reality function may be implemented through interaction between the positioning sensor and the coprocessor, and between the coprocessor and the controller. First mode: when the positioning sensor does not detect a positioning tag, the positioning sensor sends a second indication signal to the coprocessor, the second indication signal indicating that the positioning sensor has not detected a positioning tag; correspondingly, after receiving the second indication signal, the coprocessor sets the value of the AR function start/stop flag bit to 0, and the controller, on detecting that the value of the AR function start/stop flag bit is 0, controls the camera, the ISP, or the GPU to stop working. Second mode: the controller may periodically detect whether the value of the flag bit in the coprocessor has changed, and turn off the camera, the image signal processor ISP, and the graphics processor GPU according to the value of the flag bit. For details not described here, reference may be made to the foregoing description.
For the specific situations in which the electronic device does not detect a positioning tag and the augmented reality function is not started or is turned off, reference may be made to the second, third, fifth, and sixth cases described in the foregoing embodiments; details are not repeated here.
Step S2504: the electronic equipment detects a first positioning tag in a preset detection range and starts an augmented reality function.
Specifically, starting the augmented reality function means that the camera, the display screen, the ISP, and the GPU are in a started state. When the electronic device detects the first positioning tag within the preset detection range, the camera, the display screen, the ISP, and the GPU are in a started state, and the coprocessor and the positioning sensor are in a working state.
Specifically, starting the augmented reality function may be implemented through interaction between the positioning sensor and the coprocessor, and between the coprocessor and the controller. First mode: when the positioning sensor detects the positioning tag, the positioning sensor sends a first indication signal to the coprocessor, the first indication signal indicating that the positioning sensor has detected the positioning tag; correspondingly, after receiving the first indication signal, the coprocessor sets the value of the AR function start/stop flag bit to 1, and the controller, on detecting that the value of the AR function start/stop flag bit is 1, controls the camera, the ISP, or the GPU to work. Second mode: the controller may periodically detect whether the value of the flag bit in the coprocessor has changed, and start the camera, the ISP, and the GPU according to the value of the flag bit. For details not described here, reference may be made to the foregoing description.
Specifically, for detecting the first positioning tag within the preset detection range and starting the augmented reality function, reference may be made to the first and fourth cases described in the foregoing embodiments; details are not repeated here.
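The "first mode" signalling in steps S2503 and S2504 can be sketched as follows, complementing the polling sketch given earlier; the indication-signal names and the ARControl class are illustrative assumptions rather than the patent's interface.

```python
# Sketch of the signal-driven mode: the positioning sensor sends an indication
# signal, the coprocessor maps it to the flag-bit value, and the controller
# gates the AR components accordingly. Names are illustrative assumptions.
FIRST_INDICATION = "tag_detected"       # sent when a positioning tag is detected
SECOND_INDICATION = "no_tag_detected"   # sent when no positioning tag is detected

class ARControl:
    def __init__(self):
        self.flag = 0                    # AR function start/stop flag bit
        self.components_running = False  # camera, display screen, ISP, GPU

    def coprocessor_receive(self, indication: str):
        self.flag = 1 if indication == FIRST_INDICATION else 0

    def controller_check(self):
        self.components_running = (self.flag == 1)  # start or stop the AR components

ar = ARControl()
ar.coprocessor_receive(FIRST_INDICATION)
ar.controller_check()
print(ar.components_running)  # True: augmented reality function started
ar.coprocessor_receive(SECOND_INDICATION)
ar.controller_check()
print(ar.components_running)  # False: augmented reality function not started / turned off
```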
Step S2505: the electronic device detects the second positioning tag within a preset detection range.
In a possible implementation manner, when the electronic device detects the second positioning tag within the preset detection range, the magnitudes of the first distance and the second distance are determined, and the second distance is determined to be greater than the first distance. Specifically, the first distance is the distance between the first positioning tag and the positioning sensor, and the second distance is the distance between the second positioning tag and the positioning sensor. For details, reference may be made to the seventh case described above, which is not repeated here.
In a possible implementation manner, when the electronic device detects the second positioning tag within a preset detection range, the electronic device processes and displays an augmented reality scene at the position of the second positioning tag. Specifically, reference may be made to the above eighth case, which is not described herein again.
Step S2506: and the electronic equipment processes and displays the augmented reality scene of the position of the first positioning label.
For details, reference may be made to the first, third, fourth, fifth, and seventh cases described in the foregoing embodiments; details are not repeated here.
In this method, whether the electronic device detects a positioning tag within the preset detection range determines whether the augmented reality function is started.
The embodiment of the present application further provides a chip system. The chip system includes at least one processor, a memory, and an interface circuit, where the memory, the interface circuit, and the at least one processor are interconnected by lines, and the at least one memory stores a computer program; when the computer program is executed by the processor, the method flow shown in fig. 25 is implemented.
An embodiment of the present application further provides a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on an electronic device, the method flow shown in fig. 25 is implemented.
Embodiments of the present application also provide a computer program product, and when the computer program product runs on a processor, the method flow shown in fig. 25 is implemented.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the foregoing embodiments may be implemented by a computer program instructing relevant hardware; the program may be stored in a computer-readable storage medium, and when executed, may include the processes of the foregoing method embodiments. The aforementioned storage medium includes various media capable of storing computer program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.

Claims (22)

1. An augmented reality function control method, comprising:
the electronic device detects a positioning tag within a preset detection range; the preset detection range is in front of a camera in the electronic device; the positioning tag is used for identifying the position of an augmented reality scene;
when the electronic device does not detect the positioning tag within the preset detection range, the electronic device does not start, or turns off, the augmented reality function; wherein the augmented reality function is configured to process data of the augmented reality scene and display an augmented reality picture;
when the electronic device detects a first positioning tag within the preset detection range, the electronic device starts the augmented reality function, wherein the first positioning tag is a positioning tag.
2. The method of claim 1, wherein the electronic device not starting or turning off the augmented reality function comprises:
the electronic device does not start, or turns off, the camera, the image signal processor ISP, and the graphics processor GPU.
3. The method of claim 1, wherein the electronic device starting the augmented reality function comprises:
the electronic device starts the camera, the image signal processor ISP, and the graphics processor GPU.
4. The method according to any one of claims 1-3, wherein the electronic device detecting the positioning tag within the preset detection range comprises:
a positioning sensor in the electronic device detects the positioning tag within the preset detection range.
5. The method of claim 4,
the positioning sensor is an ultra-wideband UWB sensor, and the positioning tag is a UWB tag.
6. The method according to any one of claims 1 to 5, wherein when the electronic device detects the first positioning tag within the preset detection range, the electronic device starts the augmented reality function, specifically including:
when a positioning sensor in the electronic device detects the first positioning tag within the preset detection range, the positioning sensor sends a first indication signal to a coprocessor, where the first indication signal is used to indicate that the positioning sensor has detected the first positioning tag within the preset detection range;
the coprocessor receives a first indication signal from the positioning sensor and determines the value of a flag bit in the coprocessor;
and the controller in the electronic equipment starts the camera, the image signal processor ISP and the graphic processor GPU according to the value of the zone bit in the coprocessor.
7. The method according to any one of claims 1 to 5, wherein when the electronic device does not detect a positioning tag within the preset detection range, the electronic device does not start or turn off an augmented reality function, specifically including:
when a positioning sensor in the electronic device does not detect a positioning tag within the preset detection range, the positioning sensor sends a second indication signal to a coprocessor, where the second indication signal is used to indicate that the positioning sensor has not detected a positioning tag within the preset detection range;
the coprocessor receives a second indication signal from the positioning sensor and determines the value of a flag bit in the coprocessor;
and the controller in the electronic equipment closes the camera, the image signal processor ISP and the graphic processor GPU according to the value of the zone bit in the coprocessor.
8. The method according to any one of claims 1-7, wherein after the electronic device activates the augmented reality function when the electronic device detects the first positioning tag within the preset detection range, the method further comprises:
and the electronic equipment processes and displays the augmented reality scene of the position of the first positioning label.
9. The method according to claim 8, wherein after the electronic device starts the augmented reality function when the electronic device detects the first positioning tag within the preset detection range, and before the electronic device processes and displays an augmented reality scene where the first positioning tag is located, the method further comprises:
when a positioning sensor in the electronic device detects a second positioning tag within the preset detection range, determining the magnitudes of a first distance and a second distance, wherein the first distance is the distance between the first positioning tag and the positioning sensor, and the second distance is the distance between the second positioning tag and the positioning sensor;
the electronic device determines that the second distance is greater than the first distance.
10. The method according to claim 8, wherein after the electronic device starts the augmented reality function when the electronic device detects the first positioning tag within the preset detection range, and before the electronic device processes and displays an augmented reality scene where the first positioning tag is located, the method further comprises:
and when a positioning sensor in the electronic equipment detects a second positioning tag in the preset detection range, the electronic equipment processes and displays the augmented reality scene of the position of the second positioning tag.
11. An electronic device, comprising a positioning sensor and a controller,
the positioning sensor is used to detect a positioning tag within a preset detection range; the preset detection range is in front of the camera; the positioning tag is used for identifying the position of an augmented reality scene;
the controller is used to control the augmented reality function not to be started, or to be turned off, when the positioning sensor does not detect a positioning tag within the preset detection range; the augmented reality function is used to process data of the augmented reality scene and display an augmented reality picture;
the controller is further configured to control the augmented reality function to be started under the condition that the positioning sensor detects a first positioning tag within the preset detection range, where the first positioning tag is a positioning tag.
12. The apparatus of claim 11, wherein the controller is further configured to control the camera, the image signal processor ISP, and the graphics processor GPU not to be started, or to be turned off.
13. The apparatus of claim 11, wherein the controller is further configured to control activation of the camera, the image signal processor ISP, and the graphics processor GPU.
14. The apparatus of any one of claims 11-13, wherein the positioning sensor is an ultra-wideband UWB sensor and the positioning tag is a UWB tag.
15. The apparatus according to any one of claims 11-14,
the positioning sensor is further configured to send a first indication signal to the coprocessor when the positioning sensor in the electronic device detects a first positioning tag within the preset detection range, where the first indication signal is used to indicate that the positioning sensor detects the first positioning tag within the preset detection range;
the coprocessor is used for receiving a first indication signal from the positioning sensor and determining the value of a flag bit in the coprocessor;
the controller is configured to start the camera, the image signal processor ISP, and the graphics processor GPU according to the value of the flag bit in the coprocessor.
16. The apparatus according to any one of claims 11-14,
the positioning sensor is further configured to send a second indication signal to the coprocessor when the positioning sensor in the electronic device does not detect a positioning tag within the preset detection range, where the second indication signal is used to indicate that the positioning sensor does not detect a positioning tag within the preset detection range;
the coprocessor is also used for receiving a second indication signal from the positioning sensor and determining the value of a flag bit in the coprocessor;
and the controller is also used for closing the camera, the image signal processor ISP and the graphic processor GPU according to the value of the zone bit in the coprocessor.
17. The apparatus according to any one of claims 11-16, further comprising:
and the processor is used for processing and displaying the augmented reality scene at the position of the first positioning label.
18. The apparatus of claim 17, further comprising:
the distance sensor is used for determining the size of a first distance and a second distance under the condition that the positioning sensor detects a second positioning tag in the preset detection range, wherein the first distance is the distance between the first positioning tag and the positioning sensor, and the second distance is the distance between the second positioning tag and the positioning sensor; the electronic device determines that the second distance is greater than the first distance.
19. The apparatus of claim 17,
the processor is further configured to process and display an augmented reality scene at a position of a second positioning tag when the positioning sensor detects the second positioning tag within the preset detection range.
20. An electronic device, comprising: one or more processors and memory; the memory coupled with the one or more processors, the memory for storing computer program code, the computer program code comprising computer instructions, the one or more processors invoking the computer instructions to cause the electronic device to perform the method of any of claims 1-10.
21. A computer storage medium, characterized in that it stores a computer program comprising program instructions for implementing the method of any one of claims 1-10 when executed by a processor.
22. A computer program product which, when run on an electronic device, causes the electronic device to perform the method of any of claims 1-10.
CN202011206535.6A 2020-11-02 2020-11-02 Augmented reality function control method and electronic equipment Active CN114531582B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011206535.6A CN114531582B (en) 2020-11-02 2020-11-02 Augmented reality function control method and electronic equipment
PCT/CN2021/127781 WO2022089625A1 (en) 2020-11-02 2021-10-30 Augmented reality function control method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011206535.6A CN114531582B (en) 2020-11-02 2020-11-02 Augmented reality function control method and electronic equipment

Publications (2)

Publication Number Publication Date
CN114531582A true CN114531582A (en) 2022-05-24
CN114531582B CN114531582B (en) 2023-06-13

Family

ID=81381893

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011206535.6A Active CN114531582B (en) 2020-11-02 2020-11-02 Augmented reality function control method and electronic equipment

Country Status (2)

Country Link
CN (1) CN114531582B (en)
WO (1) WO2022089625A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11935093B1 (en) 2023-02-19 2024-03-19 Toyota Motor Engineering & Manufacturing North America, Inc. Dynamic vehicle tags

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120212484A1 (en) * 2010-02-28 2012-08-23 Osterhout Group, Inc. System and method for display content placement using distance and location information
WO2014169692A1 (en) * 2013-04-15 2014-10-23 Tencent Technology (Shenzhen) Company Limited Method,device and storage medium for implementing augmented reality
CN107767460A (en) * 2016-08-18 2018-03-06 深圳市劲嘉数媒科技有限公司 The methods of exhibiting and device of augmented reality
CN107967054A (en) * 2017-11-16 2018-04-27 中国人民解放军陆军装甲兵学院 The immersion three-dimensional electronic sand table that a kind of virtual reality is coupled with augmented reality
US20180239144A1 (en) * 2017-02-16 2018-08-23 Magic Leap, Inc. Systems and methods for augmented reality
CN109129507A (en) * 2018-09-10 2019-01-04 北京联合大学 A kind of medium intelligent introduction robot and explanation method and system
CN109614785A (en) * 2018-11-01 2019-04-12 Oppo广东移动通信有限公司 Using the management-control method of operation, device, storage medium and electronic equipment
CN111722710A (en) * 2020-06-02 2020-09-29 广东小天才科技有限公司 Method for starting augmented reality AR interactive learning mode and electronic equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10037465B2 (en) * 2016-03-18 2018-07-31 Disney Enterprises, Inc. Systems and methods for generating augmented reality environments
CN106126066A (en) * 2016-06-28 2016-11-16 广东欧珀移动通信有限公司 Control method, device and the mobile terminal of a kind of augmented reality function
CN106204743B (en) * 2016-06-28 2020-07-31 Oppo广东移动通信有限公司 Control method and device for augmented reality function and mobile terminal
CN106406520A (en) * 2016-08-30 2017-02-15 徐丽芳 Position identification method of virtual reality system
CN108962098A (en) * 2018-08-02 2018-12-07 合肥市徽马信息科技有限公司 A kind of guide system based on AR augmented reality

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120212484A1 (en) * 2010-02-28 2012-08-23 Osterhout Group, Inc. System and method for display content placement using distance and location information
WO2014169692A1 (en) * 2013-04-15 2014-10-23 Tencent Technology (Shenzhen) Company Limited Method,device and storage medium for implementing augmented reality
CN107767460A (en) * 2016-08-18 2018-03-06 深圳市劲嘉数媒科技有限公司 The methods of exhibiting and device of augmented reality
US20180239144A1 (en) * 2017-02-16 2018-08-23 Magic Leap, Inc. Systems and methods for augmented reality
CN107967054A (en) * 2017-11-16 2018-04-27 中国人民解放军陆军装甲兵学院 The immersion three-dimensional electronic sand table that a kind of virtual reality is coupled with augmented reality
CN109129507A (en) * 2018-09-10 2019-01-04 北京联合大学 A kind of medium intelligent introduction robot and explanation method and system
CN109614785A (en) * 2018-11-01 2019-04-12 Oppo广东移动通信有限公司 Using the management-control method of operation, device, storage medium and electronic equipment
CN111722710A (en) * 2020-06-02 2020-09-29 广东小天才科技有限公司 Method for starting augmented reality AR interactive learning mode and electronic equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
史晓琳 et al.: "Research and implementation of a mobile-phone augmented reality indoor guide", Computer Applications and Software *
孙效华 et al.: "Internet-of-Things data presentation and interaction based on augmented reality technology", Packaging Engineering *
张? et al.: "Design of eRFID tags for online sensing of electric power assets", Transactions of China Electrotechnical Society *

Also Published As

Publication number Publication date
CN114531582B (en) 2023-06-13
WO2022089625A1 (en) 2022-05-05

Similar Documents

Publication Publication Date Title
CN110045908B (en) Control method and electronic equipment
CN109582141B (en) Method for controlling display screen according to eyeball focus and head-mounted electronic equipment
CN110798568B (en) Display control method of electronic equipment with folding screen and electronic equipment
CN116070684B (en) Integrated chip and method for processing sensor data
CN111475077A (en) Display control method and electronic equipment
WO2022179376A1 (en) Gesture control method and apparatus, and electronic device and storage medium
CN110401768B (en) Method and device for adjusting working state of electronic equipment
CN114202000A (en) Service processing method and device
CN115589051B (en) Charging method and terminal equipment
CN112860428A (en) High-energy-efficiency display processing method and equipment
CN113395382A (en) Method for data interaction between devices and related devices
CN113728295A (en) Screen control method, device, equipment and storage medium
CN113448482A (en) Sliding response control method and device of touch screen and electronic equipment
WO2022089625A1 (en) Augmented reality function control method and electronic device
CN110058729B (en) Method and electronic device for adjusting sensitivity of touch detection
CN113496477A (en) Screen detection method and electronic equipment
CN111880661A (en) Gesture recognition method and device
CN110673694A (en) Application opening method and electronic equipment
CN115665632A (en) Audio circuit, related device and control method
WO2022017270A1 (en) Appearance analysis method, and electronic device
CN113380240B (en) Voice interaction method and electronic equipment
CN113325948B (en) Air-isolated gesture adjusting method and terminal
CN114221402A (en) Charging method and device of terminal equipment and terminal equipment
CN115393676A (en) Gesture control optimization method and device, terminal and storage medium
CN114661258A (en) Adaptive display method, electronic device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant