CN114531582B - Augmented reality function control method and electronic equipment - Google Patents

Augmented reality function control method and electronic equipment

Info

Publication number
CN114531582B
Authority
CN
China
Prior art keywords
positioning
augmented reality
electronic device
sensor
detection range
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011206535.6A
Other languages
Chinese (zh)
Other versions
CN114531582A (en)
Inventor
李世明
王帅
魏江波
唐建中
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202011206535.6A
Priority to PCT/CN2021/127781
Publication of CN114531582A
Application granted
Publication of CN114531582B
Legal status: Active

Classifications

    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G06T7/70 Determining position or orientation of objects or cameras
    • H04N13/344 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • H04N13/398 Synchronisation thereof; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/651 Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera
    • H04W4/02 Services making use of location information
    • H04W4/021 Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences

Abstract

Embodiments of this application provide an augmented reality function control method and an electronic device. The method includes: the electronic device detects positioning tags within a preset detection range, where the preset detection range is in front of a camera in the electronic device and a positioning tag is used to identify the location of an augmented reality scene; when the electronic device does not detect a positioning tag within the preset detection range, the electronic device does not start, or shuts down, the augmented reality function, where the augmented reality function is used to process data of an augmented reality scene and display an augmented reality picture; and when the electronic device detects a first positioning tag within the preset detection range, the electronic device starts the augmented reality function, the first positioning tag being one of the positioning tags.

Description

Augmented reality function control method and electronic equipment
Technical Field
This application relates to the field of communication technologies, and in particular, to an augmented reality function control method and an electronic device.
Background
With the continuous development of society, augmented reality (AR) devices are applied to many aspects of life. An AR device works by projecting or displaying virtual objects on the device after analysis by a processor; in this way, virtual images can be displayed for a user while the user views a real-world scene. The user may also interact with the virtual images to achieve an augmented reality effect.
However, an AR device operates high-precision spatial sensors such as cameras, and a high-performance processor processes a large amount of image data in real time, so the power consumption of the AR device is high. Current AR devices are powered mainly in two ways: wired power supply and battery power supply.
Wired power supply confines the AR device to a very small spatial range, greatly limiting its usage scenarios. Battery power supply usually gives a rather short battery life; to extend it as much as possible, the accuracy or sampling rate of the spatial sensing chip may be reduced, so the virtual images the user sees fit the real scene and field of view poorly, the sense of immersion is weak, and the user experience is poor. Increasing battery life by increasing battery capacity makes the device heavier and less comfortable to wear. How to reduce the power consumption of an AR device and extend its battery life without degrading the user experience is therefore a technical problem that persons skilled in the art are trying to solve.
Disclosure of Invention
Embodiments of this application disclose an augmented reality function control method and an electronic device, which can reduce power consumption.
A first aspect of the embodiments of this application discloses an augmented reality function control method. The electronic device detects positioning tags within a preset detection range, where the preset detection range is in front of a camera in the electronic device and a positioning tag is used to identify the location of an augmented reality scene. When the electronic device does not detect a positioning tag within the preset detection range, the electronic device does not start, or shuts down, the augmented reality function, where the augmented reality function is used to process data of an augmented reality scene and display an augmented reality picture. When the electronic device detects a first positioning tag within the preset detection range, the electronic device starts the augmented reality function, the first positioning tag being one of the positioning tags.
In this method, the electronic device decides whether to start the augmented reality function according to whether a positioning tag is detected within the preset detection range, so the power-hungry AR components run only where an augmented reality scene is present.
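The core of the method can be illustrated with a short sketch. This is a minimal, hypothetical Python example: detect_tags, start_ar, and stop_ar are assumed helper names and are not interfaces defined by this application.

```python
def control_ar_function(device):
    """Start or shut down the AR function based on positioning-tag detection (sketch).

    Assumed helpers (not defined in this application):
      device.detect_tags() -> positioning tags currently inside the preset
          detection range in front of the camera
      device.start_ar() / device.stop_ar() -> start or shut down the camera,
          the ISP and the GPU
    """
    tags = device.detect_tags()   # positioning tags within the preset detection range
    if tags:                      # a first positioning tag was detected
        device.start_ar()         # start the augmented reality function
    else:                         # no positioning tag detected
        device.stop_ar()          # do not start, or shut down, the AR function
```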
In one possible implementation, that the electronic device does not start or shuts down the augmented reality function includes: the electronic device does not start or shuts down the camera, the image signal processor (ISP), and the graphics processor (GPU).
In yet another possible implementation, that the electronic device starts the augmented reality function includes: the electronic device starts the camera, the ISP, and the GPU.
In another possible implementation, that the electronic device detects a positioning tag within the preset detection range includes: a positioning sensor in the electronic device detects the positioning tag within the preset detection range.
In yet another possible implementation, the positioning sensor is an ultra-wideband (UWB) sensor and the positioning tag is a UWB tag.
In still another possible implementation, that the electronic device starts the augmented reality function when it detects the first positioning tag within the preset detection range specifically includes: when the positioning sensor in the electronic device detects the first positioning tag within the preset detection range, the positioning sensor sends a first indication signal to a coprocessor, where the first indication signal indicates that the positioning sensor has detected the first positioning tag within the preset detection range; the coprocessor receives the first indication signal from the positioning sensor and determines the value of a flag bit in the coprocessor; and a controller in the electronic device starts the camera, the ISP, and the GPU according to the value of the flag bit in the coprocessor.
In another possible implementation, that the electronic device does not start or shuts down the augmented reality function when it does not detect a positioning tag within the preset detection range specifically includes: when the positioning sensor in the electronic device does not detect a positioning tag within the preset detection range, the positioning sensor sends a second indication signal to the coprocessor, where the second indication signal indicates that the positioning sensor has not detected a positioning tag within the preset detection range; the coprocessor receives the second indication signal from the positioning sensor and determines the value of the flag bit in the coprocessor; and the controller in the electronic device shuts down the camera, the ISP, and the GPU according to the value of the flag bit in the coprocessor.
In yet another possible implementation, after the electronic device starts the augmented reality function upon detecting the first positioning tag within the preset detection range, the method further includes: the electronic device processes and displays the augmented reality scene at the location of the first positioning tag.
In still another possible implementation, after the electronic device starts the augmented reality function upon detecting the first positioning tag within the preset detection range, and before the electronic device processes and displays the augmented reality scene at the location of the first positioning tag, the method further includes: when the positioning sensor in the electronic device detects a second positioning tag within the preset detection range, determining a first distance and a second distance, where the first distance is the distance between the first positioning tag and the positioning sensor and the second distance is the distance between the second positioning tag and the positioning sensor; and the electronic device determines that the second distance is greater than the first distance.
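A short sketch of this distance comparison follows; distance_to is an assumed callable standing in for the distance measurement (for example, by a distance sensor) and is not an interface defined by this application.

```python
def select_tag_to_render(first_tag, second_tag, distance_to):
    """Decide whose AR scene to process and display when two tags are in range (sketch)."""
    first_distance = distance_to(first_tag)    # first tag  -> positioning sensor
    second_distance = distance_to(second_tag)  # second tag -> positioning sensor
    # The first tag's scene is kept only if the second tag is farther away;
    # otherwise the second tag's scene is processed and displayed instead.
    return first_tag if second_distance > first_distance else second_tag
```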
In still another possible implementation, after the electronic device starts the augmented reality function upon detecting the first positioning tag within the preset detection range, and before the electronic device processes and displays the augmented reality scene at the location of the first positioning tag, the method further includes: when the positioning sensor in the electronic device detects the second positioning tag within the preset detection range, the electronic device processes and displays the augmented reality scene at the location of the second positioning tag.
A second aspect of the embodiments of this application discloses an electronic device that includes a positioning sensor and a controller.
The positioning sensor is configured to detect positioning tags within a preset detection range; the preset detection range is in front of the camera; a positioning tag is used to identify the location of an augmented reality scene;
the controller is configured to control not starting, or shutting down, the augmented reality function when the positioning sensor does not detect a positioning tag within the preset detection range; the augmented reality function is used to process data of an augmented reality scene and display an augmented reality picture;
the controller is further configured to control starting the augmented reality function when the positioning sensor detects a first positioning tag within the preset detection range, the first positioning tag being one of the positioning tags.
In a possible implementation, the controller is further configured to control not starting, or shutting down, the camera, the ISP, and the GPU.
In yet another possible implementation, the controller is further configured to control starting the camera, the ISP, and the GPU.
In yet another possible implementation, the positioning sensor is an ultra-wideband (UWB) sensor and the positioning tag is a UWB tag.
In yet another possible implementation, the positioning sensor is further configured to send a first indication signal to a coprocessor when it detects the first positioning tag within the preset detection range, where the first indication signal indicates that the positioning sensor has detected the first positioning tag within the preset detection range; the coprocessor is configured to receive the first indication signal from the positioning sensor and determine the value of a flag bit in the coprocessor; and the controller is configured to start the camera, the ISP, and the GPU according to the value of the flag bit in the coprocessor.
In a further possible implementation, the positioning sensor is further configured to send a second indication signal to the coprocessor when it does not detect a positioning tag within the preset detection range, where the second indication signal indicates that the positioning sensor has not detected a positioning tag within the preset detection range; the coprocessor is further configured to receive the second indication signal from the positioning sensor and determine the value of the flag bit in the coprocessor; and the controller is further configured to shut down the camera, the ISP, and the GPU according to the value of the flag bit in the coprocessor.
In yet another possible implementation, the electronic device further includes: a processor configured to process and display the augmented reality scene at the location of the first positioning tag.
In yet another possible implementation, the electronic device further includes: a distance sensor configured to determine a first distance and a second distance when the positioning sensor detects a second positioning tag within the preset detection range, where the first distance is the distance between the first positioning tag and the positioning sensor and the second distance is the distance between the second positioning tag and the positioning sensor; and the electronic device determines that the second distance is greater than the first distance.
In yet another possible implementation, the processor is further configured to process and display the augmented reality scene at the location of the second positioning tag if the positioning sensor detects the second positioning tag within the preset detection range.
For the technical effects of the second aspect or its possible implementations, reference may be made to the description of the technical effects of the first aspect or the corresponding implementations.
A third aspect of an embodiment of the present application discloses an electronic device, including: one or more processors, and memory; the memory is coupled with the one or more processors, the memory is used for storing computer program code, the computer program code comprises computer instructions, and the one or more processors call the computer instructions to realize the method described in the first aspect or the possible implementation manner of the first aspect.
A fourth aspect of the embodiments of the present application discloses a computer storage medium storing a computer program comprising program instructions for implementing the method described in the first aspect or a possible implementation manner of the first aspect when the program instructions are executed by a processor.
A fifth aspect of the embodiments of the present application discloses a computer program product which, when run on an electronic device, causes the electronic device to implement the method described in the first aspect or a possible implementation of the first aspect.
Drawings
The drawings used in the embodiments of the present application are described below.
FIG. 1 is a schematic view of a positioning sensor mounting location provided in an embodiment of the present application;
FIG. 2 is a schematic view of yet another positioning sensor mounting location provided by an embodiment of the present application;
FIG. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an electronic device moving from a location where an augmented reality scene is detected to a location where an augmented reality scene is not detected, provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of an electronic device detecting a location of an augmented reality scene according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an electronic device not detecting a location of an augmented reality scene according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 8 is a schematic diagram of an electronic device moving from detecting a positioning tag to not detecting a positioning tag according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an electronic device detecting a positioning tag according to an embodiment of the present application;
FIG. 10 is a schematic diagram of an electronic device not detecting a positioning tag according to an embodiment of the present application;
FIG. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 12 is a schematic view of an indoor museum provided in an embodiment of the present application;
FIG. 13 is a schematic diagram of an exhibit for which a UWB tag is detected within the detection range, provided by an embodiment of the present application;
FIG. 14 is a schematic view of an image displayed by a display screen according to an embodiment of the present application;
FIG. 15 is a schematic diagram of an exhibit for which no UWB tag is detected within the detection range, provided by an embodiment of the present application;
FIG. 16 is a schematic view of an image displayed by a display screen according to an embodiment of the present application;
FIG. 17 is a schematic diagram of moving from an exhibit with a detected UWB tag to an exhibit without a detected UWB tag within the detection range, provided by an embodiment of the present application;
FIG. 18 is a schematic diagram of moving from an exhibit with a detected UWB tag to another exhibit with a detected UWB tag within the detection range, provided by an embodiment of the present application;
FIG. 19 is a schematic view of an image displayed by a display screen according to an embodiment of the present application;
FIG. 20 is a schematic diagram of moving from an exhibit without a detected UWB tag to an exhibit with a detected UWB tag within the detection range, provided by an embodiment of the present application;
FIG. 21 is a schematic diagram of moving from an exhibit without a detected UWB tag to another exhibit without a detected UWB tag within the detection range, provided by an embodiment of the present application;
FIG. 22 is a schematic diagram of 2 exhibits with UWB tags detected within the detection range, provided by an embodiment of the present application;
FIG. 23 is a schematic view of an image displayed by a display screen according to an embodiment of the present application;
FIG. 24 is a schematic view of an image displayed by a display screen according to an embodiment of the present application;
FIG. 25 is a flowchart of an augmented reality function control method according to an embodiment of the present application.
Detailed Description
Embodiments of this application are described below with reference to the accompanying drawings. The terminology used in the description of the embodiments is for the purpose of describing particular embodiments only and is not intended to limit this application.
Several concepts involved in the embodiments of the present application are described below:
(1) Starting and shutting down the augmented reality function:
in the embodiments of this application, the electronic device is an AR device with an AR function. By starting the AR function, the electronic device can process data of an AR scene and display an AR picture.
It is understood that the electronic device consumes more power when the AR function is started and less power when the AR function is not started or is shut down.
For example, when the AR function in the electronic device is off, the coprocessor and the positioning sensor in the electronic device keep working.
Coprocessor: an auxiliary processor that relieves the processing burden of the application processor (AP) by taking over part of its work; the power consumption of the coprocessor is generally lower than that of the AP. The coprocessor may be a chip-level control unit formed by appropriately reducing the frequency and specification of a central processing unit (CPU) and integrating related modules such as a counter, a memory, and a display driver on a single chip. For example, one kind of coprocessor is a universal smart sensor hub (sensor hub), whose main function is to connect to and process data from the various sensors; the power consumption of a sensor hub coprocessor chip is only 1%-5% of that of an application processor chip.
When the AR function in the electronic device is started, a controller in the electronic device controls the camera, the image signal processor (ISP), and the graphics processor (GPU) to operate.
Specifically, in some embodiments of this application, the interaction among the positioning sensor, the coprocessor, and the controller that decides whether the camera, ISP, and GPU operate may be implemented in two ways:
the first way is:
when the positioning sensor detects a positioning tag, it sends a first indication signal to the coprocessor, where the first indication signal indicates that the positioning sensor has detected a positioning tag; correspondingly, after receiving the first indication signal, the coprocessor sets the value of the AR function start-stop flag bit to 1. When the positioning sensor does not detect a positioning tag, it sends a second indication signal to the coprocessor, where the second indication signal indicates that the positioning sensor has not detected a positioning tag; correspondingly, after receiving the second indication signal, the coprocessor sets the value of the AR function start-stop flag bit to 0.
When the controller detects that the value of the AR function start-stop flag bit in the coprocessor is 1, it controls the camera, ISP, and GPU to work; when the controller detects that the value of the flag bit is 0, it controls the camera, ISP, and GPU not to work.
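A minimal sketch of this first, level-triggered way, assuming the flag bit is a value in the coprocessor that the controller polls periodically; the Python names used here (Coprocessor, controller_poll, and the camera/isp/gpu objects) are illustrative and not defined by this application.

```python
class Coprocessor:
    """Holds the AR function start-stop flag bit (illustrative sketch)."""

    def __init__(self):
        self.ar_flag = 0  # 0: AR function not started / shut down, 1: AR function started

    def on_indication(self, tag_detected):
        # First indication signal (tag detected)  -> flag bit set to 1
        # Second indication signal (no tag found) -> flag bit set to 0
        self.ar_flag = 1 if tag_detected else 0


def controller_poll(coprocessor, camera, isp, gpu):
    """Controller reads the flag bit and drives the camera, ISP and GPU accordingly."""
    if coprocessor.ar_flag == 1:
        for component in (camera, isp, gpu):
            component.start()   # flag bit is 1: components work
    else:
        for component in (camera, isp, gpu):
            component.stop()    # flag bit is 0: components do not work
```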
The second way is:
when the positioning sensor goes from not detecting a positioning tag to detecting one, or from detecting a positioning tag to not detecting one, the positioning sensor sends a third indication signal to the coprocessor, where the third indication signal indicates that the value of the AR function start-stop flag bit needs to change. For example, if the positioning sensor detects no positioning tag at the 5th second, the value of the AR function start-stop flag bit in the coprocessor is 0; if the positioning sensor detects a positioning tag at the 10th second, the positioning sensor determines that the value of the flag bit has to change and, at the 10th second, sends the third indication signal to the coprocessor. Correspondingly, after receiving the third indication signal, the coprocessor changes the value of the AR function start-stop flag bit from 0 to 1.
The controller controls the camera, ISP, and GPU to work when it detects that the value of the AR function start-stop flag bit in the coprocessor is 1; or the controller controls the camera, ISP, and GPU to work when it detects that the value of the AR function start-stop flag bit in the coprocessor has changed from 0 to 1.
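For comparison, a sketch of the second, change-triggered way under the same assumptions; FlagCoprocessor and EdgeTriggeredSensor are illustrative names, and the sample calls simply replay the 5th-second/10th-second example above.

```python
class FlagCoprocessor:
    """Coprocessor holding the AR function start-stop flag bit (sketch)."""

    def __init__(self):
        self.ar_flag = 0  # defaults to 0 after power-on

    def toggle_ar_flag(self):
        # Third indication signal received: the flag bit value must change.
        self.ar_flag = 1 - self.ar_flag  # 0 -> 1 or 1 -> 0


class EdgeTriggeredSensor:
    """Positioning sensor that signals the coprocessor only when its detection state changes."""

    def __init__(self, coprocessor):
        self.coprocessor = coprocessor
        self.last_detected = False  # detection state in the previous sampling cycle

    def sample(self, tag_detected):
        if tag_detected != self.last_detected:
            self.coprocessor.toggle_ar_flag()  # send the third indication signal
        self.last_detected = tag_detected


coprocessor = FlagCoprocessor()
sensor = EdgeTriggeredSensor(coprocessor)
sensor.sample(False)        # 5th second: no tag detected, flag bit stays 0
sensor.sample(True)         # 10th second: tag detected, flag bit changes from 0 to 1
print(coprocessor.ar_flag)  # 1
```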
In the description above, the AR function start-stop flag bit takes the values 0 and 1: when the positioning sensor detects a positioning tag the value is 1, and when it does not the value is 0. Of course, other values may be used to indicate that the AR function is started or shut down, which is not limited in the embodiments of this application.
(2) Positioning sensor and positioning label:
in the embodiments of this application, the positioning sensor is used to detect positioning tags and can determine the position of a positioning tag; a positioning tag is used to identify the location of an augmented reality scene.
Detection range of the positioning sensor:
since the user visual range is determined by the human eye visual model calculation, the user visual range is determined. The position sensor is positioned differently under a certain visual range of the user, and the detection range thereof becomes larger in order to cover the visual range of the user. In one example, as shown in fig. 1, the visual range of both eyes of the user is determined to be 188 degrees, 2 meters according to the human eye visual model, and when the positioning sensor is installed at a position between two display screens on the electronic device 500, since the detection range is equal to or greater than the visual range, the detection range is 190 degrees, 2.2 meters. In still another example, as shown in fig. 2, the visual range of both eyes of the user is determined to be 188 degrees, 2 meters according to the human eye visual model, and when the positioning sensor is installed at a position in the middle of the right display screen on the electronic device 500, since the detection range is equal to or more than the visual range, the detection range is 200 degrees, 2.5 meters. In still another example, the visual range of both eyes of the user is determined to be 188 degrees, 2 meters according to the human eye visual model, and when the positioning sensor is installed at a position in the middle of the left display screen on the electronic device 500, since the detection range is equal to or greater than the visual range, the detection range is 200 degrees, 2.5 meters.
In the embodiments of this application, the detection direction of the positioning sensor is in front of the camera in the electronic device.
Alternatively, in some embodiments of the present application, when the positioning sensor is an Ultra Wide Band (UWB) sensor, the positioning tag is a UWB tag; when the positioning sensor is a sensor with a Bluetooth positioning function, the positioning tag is a Bluetooth tag; when the positioning sensor is a sensor with a near field communication (near field communication, NFC) function, the positioning tag is an NFC tag; when the positioning sensor is a sensor with radio frequency identification (radio frequency identification, RFID) functionality, the positioning tag is an RFID tag.
The structure and working principle of a prior-art electronic device are as follows: as shown in fig. 3, fig. 3 shows a schematic structural diagram of an electronic device, which includes a processor, a camera, a display screen, and an image signal processor (ISP) or a graphics processor (GPU), where the processor, ISP, and GPU are not shown in fig. 3. The display screen may be a lens through which the user can see the real-world image.
Taking a museum scene as an example, the museum includes exhibit 1 and exhibit 2, where exhibit 1 carries the position information of an augmented reality scene and exhibit 2 does not. As shown in fig. 4, when the user moves from exhibit 1 to exhibit 2, that is, when the electronic device moves from a position where an augmented reality scene is detected to a position where no augmented reality scene is detected, the AR function in the electronic device remains in the started, working state. As shown in fig. 5, when the user keeps viewing exhibit 1, the AR function in the electronic device is in the started, working state, so the augmented reality scene of exhibit 1 can be processed; as shown in fig. 6, when the user keeps viewing exhibit 2, there is no augmented reality scene on exhibit 2 and no augmented reality scene needs to be processed, yet the AR function in the electronic device is still in the started state.
Analyzing the prior-art process from power-on of the electronic device to display of the virtual image shows that, at present, the AR function stays started once the electronic device is powered on, and the processor, camera, display screen, ISP, GPU, and so on must keep working continuously, so the power consumption is high.
To solve the above problems, embodiments of this application provide an augmented reality function control method and an electronic device. By way of example, as shown in fig. 7, the electronic device may include a processor, a camera, a positioning sensor, a display screen, and an ISP or GPU, where the processor, ISP, and GPU are not shown in fig. 7.
Taking a museum scene as an example, the museum includes exhibit 1 and exhibit 2, where exhibit 1 carries a positioning tag and exhibit 2 does not. As shown in fig. 8, when the user moves from exhibit 1 to exhibit 2, that is, when the electronic device goes from detecting the positioning tag to not detecting it, whether the AR function in the electronic device is started is shown in fig. 9 and fig. 10. As shown in fig. 9, when the user keeps viewing exhibit 1, that is, when the electronic device detects the positioning tag, the AR function in the electronic device is in the started, working state; as shown in fig. 10, when the user keeps viewing exhibit 2, that is, when the electronic device does not detect a positioning tag, the AR function in the electronic device is in the off state.
When no positioning tag is detected, the high-power-consumption components do not work, so this augmented reality function control approach can reduce power consumption.
The electronic device in the embodiments of this application may be a handheld electronic device with an AR function, a head-mounted electronic device, or the like, which is not limited herein.
For example, a user may wear a head-mounted electronic device to achieve different effects such as virtual reality (VR), AR, and mixed reality (MR). For example, the head-mounted electronic device may be glasses, goggles, or the like. When the head-mounted electronic device is worn on the user's head, the user's eyes can see the image presented by the display screen of the head-mounted electronic device.
Referring to fig. 11, fig. 11 shows a schematic structural diagram of an electronic device 1100.
The electronic device 1100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earpiece interface 170D, a sensor module 180, keys 190, a camera 193, a display 194, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric sensor 180C, an acceleration sensor 180E, a distance sensor 180F, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a positioning sensor 180N, and the like.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 1100. In other embodiments of the present application, electronic device 1100 may include more or fewer components than shown, or certain components may be combined, certain components may be separated, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an AP, a coprocessor, a modem processor, a GPU, an ISP, a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural Network Processor (NPU), or the like. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 1100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. Such as a memory in the coprocessor for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, and the like through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to implement the touch function of the electronic device 1100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 1100. The processor 110 and the display screen 194 communicate via a DSI interface to implement the display functionality of the electronic device 1100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 1100, or may be used to transfer data between the electronic device 1100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present invention is only illustrative, and is not meant to limit the structure of the electronic device 1100. In other embodiments of the present application, the electronic device 1100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 1100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The electronic device 1100 implements display functionality through a GPU, a display screen 194, and an application processor, among others. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 1100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The number of display screens 194 in the electronic device 1100 may be two, corresponding to the two eyeballs of the user 200.
The electronic device 1100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the electronic device 1100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The camera 193 may be mounted on the side of the electronic device 1100 and may also be mounted on the electronic device 1100 in a position between two display screens. Camera 193 is used to capture images and video in real-time from the perspective of user 200. The electronic device 1100 generates virtual images from captured real-time images and videos and displays the virtual images through the display screen 194.
The processor 110 may determine the virtual image displayed on the display screen 194 based on the still image or video captured by the camera 193, in combination with data (e.g., brightness, sound, etc.) acquired by the sensor module 180, to superimpose the virtual image on the real-world object.
The digital signal processor is used to process digital signals, and can process other digital signals in addition to digital image signals. For example, when the electronic device 1100 selects a frequency, the digital signal processor is used to perform a Fourier transform on the frequency energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 1100 may support one or more video codecs. Thus, the electronic device 1100 may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 1100 may be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 1100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 1100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 1100 as well as data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 1100 may implement audio functions through the audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone interface 170D, and application processor, among others. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 1100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When the electronic device 1100 picks up a phone call or voice message, the voice can be picked up by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mic" or a "mike", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 170C to input a sound signal into the microphone 170C. The electronic device 1100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 1100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 1100 may also be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
In some embodiments, the electronic device 1100 may include one or more keys 190 that may control the electronic device to provide a user with access to functions on the electronic device 1100. The keys 190 may be in the form of buttons, switches, dials, and touch or near touch sensing devices (e.g., touch sensors). Specifically, for example, the user 200 may turn on the display 194 of the electronic device 1100 by pressing a button. The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 1100 may receive key inputs, generate key signal inputs related to user settings and function controls of the electronic device 1100.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of conductive material. The capacitance between the electrodes changes when a force is applied to the pressure sensor 180A, and the electronic device 1100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 1100 detects the intensity of the touch operation by using the pressure sensor 180A. The electronic device 1100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
The gyro sensor 180B may be used to determine the motion posture of the electronic device 1100. In some embodiments, the angular velocity of the electronic device 1100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 1100, calculates the distance that the lens module needs to compensate according to the angle, and lets the lens counteract the shake of the electronic device 1100 through reverse motion, thereby implementing image stabilization. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 1100 calculates altitude from barometric pressure values measured by the barometric pressure sensor 180C, aiding in positioning and navigation.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 1100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 1100 is stationary. The acceleration sensor may also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance. The electronic device 1100 may measure distance by infrared or laser. In some embodiments, the electronic device 1100 may range using the distance sensor 180F to achieve fast focus.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 1100 may adaptively adjust the brightness of the display 194 based on perceived ambient light levels. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 1100 may utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, take a photograph of the fingerprint, answer an incoming call, etc.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 1100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 1100 reduces the performance of a processor located near the temperature sensor 180J, in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 1100 heats the battery 142 to prevent an abnormal shutdown caused by the low temperature. In other embodiments, when the temperature is below still another threshold, the electronic device 1100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by the low temperature.
The touch sensor 180K is also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 1100 at a location different from that of the display screen 194.
In some embodiments, the positioning sensor 180N is used to detect and read positioning tags by virtue of its accurate positioning capability and pointing function.
The following describes the augmented reality function control method in the embodiment of the present application with reference to the above-mentioned hardware structure of the electronic device 1100 and specific scenarios:
take as an example the case where the electronic device 1100 is a pair of AR glasses, the positioning sensor is a UWB sensor located between the two display screens, the positioning tag is a UWB tag, and the user wears the AR glasses to view exhibits in an indoor museum. Most exhibits a user sees while visiting a museum are static; the rich historical information behind the exhibits is difficult for users to grasp, and they can only imagine it based on the written explanations. With the aid of AR glasses, a user can interact with the exhibits to learn more information, which makes browsing the museum more interesting and attractive. For example, as shown in fig. 12, fig. 12 shows a schematic diagram of an indoor museum. Assume the indoor museum contains 4 exhibits: exhibit 1, exhibit 2, exhibit 3, and exhibit 4. Exhibit 1 and exhibit 3 are each provided with a UWB tag, which can be regarded as a chip with a positioning and pointing function, while exhibit 2 and exhibit 4 are not provided with UWB tags.
In the museum, the user 200 turns on the AR glasses by pressing a power key or by other means. After the AR glasses are started, by default the UWB sensor and the coprocessor in the AR glasses start working, while the camera, the display screen, the ISP, and the GPU do not. After power-on, the default value of the AR function start-stop flag bit in the coprocessor is 0, and a controller in the AR glasses periodically checks whether the value of the AR function start-stop flag bit in the coprocessor has changed.
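The flag-bit polling described above can be sketched as follows; the flag name, polling helper, and stub start/stop functions are assumptions chosen for illustration and do not prescribe the actual implementation.

```c
#include <stdint.h>
#include <stdio.h>

/* AR function start-stop flag: written by the coprocessor, read by the controller.
 * Default value after power-on is 0 (AR pipeline off). */
static volatile uint8_t ar_flag = 0;

/* Stubs standing in for the real hardware control paths (assumed names). */
static void start_camera_isp_gpu_display(void) { puts("start camera/ISP/GPU/display"); }
static void stop_camera_isp_gpu_display(void)  { puts("camera/ISP/GPU/display to standby"); }

/* Controller side: one periodic check of whether the flag value has changed. */
static void controller_poll_once(uint8_t *last)
{
    uint8_t now = ar_flag;
    if (now != *last) {
        if (now == 1)
            start_camera_isp_gpu_display();
        else
            stop_camera_isp_gpu_display();
        *last = now;
    }
}

int main(void)
{
    uint8_t last = 0;
    controller_poll_once(&last);  /* no change: nothing happens          */
    ar_flag = 1;                  /* coprocessor: UWB tag detected       */
    controller_poll_once(&last);  /* starts the AR pipeline              */
    ar_flag = 0;                  /* coprocessor: tag no longer detected */
    controller_poll_once(&last);  /* AR pipeline back to standby         */
    return 0;
}
```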
The user 200 wears the AR glasses and moves within the indoor museum, and the AR glasses move along with the user 200.
First case: after the AR glasses are started, the camera, the display screen, the ISP, and the GPU are not working, and the value of the AR function start-stop flag bit in the coprocessor is 0. The user 200 wears the AR glasses and the UWB sensor detects a UWB tag. As shown in fig. 13, fig. 13 shows a schematic diagram of an exhibit whose UWB tag is detected within the detection range. For example, after the user 200, wearing the AR glasses, gazes at exhibit 1 for a short time, the UWB sensor in the AR glasses detects the UWB tag in exhibit 1, reads the UWB tag information, and reports to the coprocessor. Correspondingly, after receiving the report from the UWB sensor, the coprocessor changes the value of the AR function start-stop flag bit from 0 to 1; the controller, upon detecting that the flag bit value has changed from 0 to 1, controls the camera, the display screen, the ISP, and the GPU to start working. The camera then scans exhibit 1, the processor analyzes the images and generates a virtual image, such as a drum; the ISP or the GPU attaches the virtual image, such as the drum, at a determined angle and inclination to the real image seen through the display screen, such as exhibit 1, and displays it on the display screen. Correspondingly, the user can see through the display screen a dynamic image of a terracotta figure beating the drum, as shown in fig. 14.
Second case: after the AR glasses are started, the camera, the display screen, the ISP, and the GPU are not working, and the value of the AR function start-stop flag bit in the coprocessor is 0. The user 200 wears the AR glasses and the UWB sensor does not detect any UWB tag. As shown in fig. 15, fig. 15 shows a schematic diagram of an exhibit with no UWB tag detected within the detection range. For example, while the user 200 wears the AR glasses and keeps watching exhibit 2, the UWB sensor does not detect a UWB tag, so the value of the AR function start-stop flag bit in the coprocessor remains 0 and the camera, the display screen, the ISP, and the GPU stay in a standby or sleep state. At this time, by default, the UWB sensor and the coprocessor in the AR glasses keep working, and the AR glasses operate in a low-power-consumption mode; the display screen shows nothing, and the user sees the real-world image through the display screen, as shown in fig. 16.
Third case: the UWB sensor goes from detecting a UWB tag to not detecting one, and the camera, the display screen, the ISP, and the GPU go from a working state to a standby or sleep state. While the UWB sensor detects a UWB tag, the value of the AR function start-stop flag bit is 1. As shown in fig. 17, as the user 200 moves, the detection range goes from covering an exhibit with a UWB tag to covering an exhibit without one. In one example, the user 200 wears the AR glasses and watches exhibit 1, then watches exhibit 2, where exhibit 1 is provided with a UWB tag and exhibit 2 is not. When the user 200 watches exhibit 1, the UWB sensor in the AR glasses detects the UWB tag in exhibit 1, reads the UWB tag information, and reports to the coprocessor; correspondingly, the coprocessor sets the value of the AR function start-stop flag bit to 1, and the controller, having determined that the flag bit value is 1, controls the camera, the display screen, the ISP, and the GPU to start working. The camera then scans exhibit 1, the processor analyzes the images and generates a virtual image, such as a drum, and the ISP or the GPU attaches the virtual image at a determined angle and inclination to the real image seen through the display screen, such as exhibit 1, and displays it on the display screen; correspondingly, the user can see the dynamic image on the display screen, as shown in fig. 14. Then the user 200 wears the AR glasses and watches exhibit 2, i.e. the UWB sensor no longer detects a UWB tag; the value of the AR function start-stop flag bit in the coprocessor changes from 1 to 0, and the controller, upon detecting that the flag bit value has changed from 1 to 0, controls the camera, the display screen, the ISP, and the GPU to go from the working state into a standby or sleep state. At this time, by default, the UWB sensor and the coprocessor in the AR glasses keep working, and the AR glasses operate in a low-power-consumption mode.
Fourth case: the UWB sensor goes from detecting one UWB tag to detecting another, and the camera, the display screen, the ISP, and the GPU remain in the working state. While the UWB sensor detects a UWB tag, the value of the AR function start-stop flag bit is 1. As shown in fig. 18, as the user 200 moves, the detection range goes from one exhibit with a UWB tag to another exhibit with a UWB tag. In one example, the user 200 wears the AR glasses and watches exhibit 1, then watches exhibit 3, where both exhibit 1 and exhibit 3 are provided with UWB tags. When the user 200 watches exhibit 1, the UWB sensor in the AR glasses detects the UWB tag in exhibit 1, reads the UWB tag information, and reports to the coprocessor; correspondingly, the coprocessor sets the value of the AR function start-stop flag bit to 1, and the controller, having determined that the flag bit value is 1, controls the camera, the display screen, the ISP, and the GPU to start working. The camera then scans exhibit 1, the processor analyzes the images and generates a virtual image, such as a drum, and the ISP or the GPU attaches the virtual image at a determined angle and inclination to the real image seen through the display screen, such as exhibit 1, and displays it on the display screen; correspondingly, the user can see the dynamic drum on the display screen, as shown in fig. 14. Then, when the user 200 wears the AR glasses and watches exhibit 3, the UWB sensor in the AR glasses detects the UWB tag in exhibit 3, reads its UWB tag information, and reports to the coprocessor; the value of the AR function start-stop flag bit remains 1, and the controller keeps the camera, the display screen, the ISP, and the GPU in the working state. The camera then scans exhibit 3, the processor analyzes the images and generates a virtual image, such as a passage of text, and the ISP or the GPU attaches the virtual image, such as the text, at a determined angle and inclination to the real image seen through the display screen, such as exhibit 3, and displays it on the display screen, as shown in fig. 19.
Fifth case: the UWB sensor goes from not detecting a UWB tag to detecting one, and the camera, the display screen, the ISP, and the GPU go from a standby or sleep state to a working state. While the UWB sensor does not detect a UWB tag, the value of the AR function start-stop flag bit is 0. As shown in fig. 20, as the user 200 moves, the detection range goes from an exhibit without a UWB tag to an exhibit with one. In one example, the user 200 wears the AR glasses and watches exhibit 4, then watches exhibit 3, where exhibit 4 is not provided with a UWB tag and exhibit 3 is. When the user 200 watches exhibit 4, the UWB sensor does not detect a UWB tag in exhibit 4, the value of the AR function start-stop flag bit is 0, and the camera, the display screen, the ISP, and the GPU are in a standby or sleep state. Then, when the user 200 wears the AR glasses and watches exhibit 3, the UWB sensor in the AR glasses detects the UWB tag in exhibit 3, reads its UWB tag information, and reports to the coprocessor; correspondingly, the coprocessor sets the value of the AR function start-stop flag bit to 1, and the controller, having determined that the flag bit value in the coprocessor is 1, controls the camera, the display screen, the ISP, and the GPU to start working. The camera then scans exhibit 3, the processor analyzes the images and generates a virtual image, such as a passage of text, and the ISP or the GPU attaches the virtual image, such as the text, at a determined angle to the real image seen through the display screen, such as exhibit 3, and displays it on the display screen, as shown in fig. 19.
Sixth case: the UWB sensor goes from not detecting a UWB tag to still not detecting one, and the camera, the display screen, the ISP, and the GPU remain in a standby or sleep state. While the UWB sensor does not detect a UWB tag, the value of the AR function start-stop flag bit is 0. As shown in fig. 21, as the user 200 moves, the detection range goes from one exhibit without a UWB tag to another exhibit without one. In one example, the user 200 wears the AR glasses and watches exhibit 2, then watches exhibit 4, where neither exhibit 2 nor exhibit 4 is provided with a UWB tag. While the user 200 watches exhibit 2, the UWB sensor does not detect a UWB tag, the value of the AR function start-stop flag bit in the coprocessor is 0, and the camera, the display screen, the ISP, and the GPU are in a standby or sleep state. At this time, the UWB sensor and the coprocessor in the AR glasses keep working, the AR glasses operate in a low-power-consumption mode, the display screen shows nothing, and the user sees the real-world image through the display screen, as shown in fig. 16. Then, when the user 200 wears the AR glasses and watches exhibit 4, the UWB sensor does not detect a UWB tag in exhibit 4, the value of the AR function start-stop flag bit remains 0, and the camera, the display screen, the ISP, and the GPU remain in a standby or sleep state.
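Taken together, the first through sixth cases reduce to a two-state machine whose only input is whether the UWB sensor currently detects a UWB tag. The sketch below condenses them; the state and function names are chosen for illustration and are not taken from the embodiment.

```c
#include <stdbool.h>
#include <stdio.h>

typedef enum { AR_IDLE, AR_ACTIVE } ar_state_t;   /* camera/ISP/GPU standby vs. running */

/* One transition step driven by the single input "UWB tag detected or not".
 * Covers the six single-tag cases: the two transitions and the two "stay" cases. */
static ar_state_t ar_step(ar_state_t s, bool tag_detected)
{
    if (tag_detected && s == AR_IDLE) {
        puts("tag appears  -> start camera/ISP/GPU, render overlay");
        return AR_ACTIVE;
    }
    if (!tag_detected && s == AR_ACTIVE) {
        puts("tag lost     -> camera/ISP/GPU to standby, low-power mode");
        return AR_IDLE;
    }
    if (tag_detected)
        puts("tag -> tag   -> stay active, overlay for the new exhibit");
    else
        puts("no tag       -> stay idle, see-through display only");
    return s;
}

int main(void)
{
    ar_state_t s = AR_IDLE;
    /* Example walk past exhibits 1, 2, 3, 3, 4, 2 (tags on exhibits 1 and 3 only). */
    bool walk[] = { true, false, true, true, false, false };
    for (int i = 0; i < 6; i++)
        s = ar_step(s, walk[i]);
    return 0;
}
```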
Seventh case: the UWB sensor detects 2 UWB tags. As shown in fig. 22, fig. 22 shows a schematic diagram of exhibits whose 2 UWB tags are detected within the detection range. In one example, the user 200 stands in front of exhibit 1 and exhibit 3 and wears the AR glasses. When the UWB sensor scans 2 UWB tags, for example the UWB tag in exhibit 1 and the UWB tag in exhibit 3, the UWB tag in exhibit 1 can be regarded as a first positioning tag and the UWB tag in exhibit 3 as a second positioning tag. The UWB sensor reports to the coprocessor, and the coprocessor invokes the distance sensor; the distance sensor then measures the distance between the UWB tag in exhibit 1 and the UWB sensor, i.e. the first distance, and the distance between the UWB tag in exhibit 3 and the UWB sensor, i.e. the second distance. Because the second distance is greater than the first distance, the exhibit closer to the user, such as exhibit 1, is determined and the coprocessor is notified; the coprocessor then notifies the UWB sensor of the closer exhibit, such as exhibit 1. The UWB sensor scans the UWB tag in exhibit 1, reads its UWB tag information, and reports to the coprocessor. Correspondingly, after receiving the report from the UWB sensor, the coprocessor sets the value of the AR function start-stop flag bit to 1, and the controller, having determined that the flag bit value in the coprocessor is 1, controls the camera, the display screen, the ISP, and the GPU to start working. The camera then scans exhibit 1, the processor analyzes the images and generates a virtual image, such as a drum, and the ISP or the GPU attaches the virtual image, such as the drum, at a determined angle and inclination to the real image seen through the display screen, such as exhibit 1, and displays it on the display screen, as shown in fig. 23.
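The nearest-exhibit selection in this seventh case amounts to a comparison of the first and second distances; the sketch below illustrates it, with the structure fields and units being assumptions made for the example.

```c
#include <stdio.h>

/* Two positioning tags in range: pick the exhibit closer to the user.
 * Field names and distance units are illustrative assumptions. */
typedef struct {
    int    exhibit_id;
    double distance_m;   /* measured between the UWB tag and the UWB sensor */
} tag_reading_t;

static int select_nearest_exhibit(tag_reading_t a, tag_reading_t b)
{
    return (a.distance_m <= b.distance_m) ? a.exhibit_id : b.exhibit_id;
}

int main(void)
{
    tag_reading_t first  = { 1, 0.8 };  /* exhibit 1, first distance  */
    tag_reading_t second = { 3, 2.5 };  /* exhibit 3, second distance */
    printf("render overlay for exhibit %d\n", select_nearest_exhibit(first, second));
    return 0;
}
```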
Eighth case: the UWB sensor detects 2 UWB tags. As shown in fig. 22, fig. 22 shows a schematic diagram of exhibits whose 2 UWB tags are detected within the detection range. In one example, the user 200 stands in front of exhibit 1 and exhibit 3 and wears the AR glasses. When the UWB sensor scans 2 UWB tags, for example the UWB tag in exhibit 1 and the UWB tag in exhibit 3, the UWB tag in exhibit 1 can be regarded as a first positioning tag and the UWB tag in exhibit 3 as a second positioning tag, and the UWB sensor reports to the coprocessor. The UWB sensor then scans the UWB tag in exhibit 1 and the UWB tag in exhibit 3 simultaneously, reads the UWB tag information of exhibit 1 and of exhibit 3, and reports the information to the coprocessor. Correspondingly, after receiving the report from the UWB sensor, the coprocessor sets the value of the AR function start-stop flag bit to 1, and the controller, having determined that the flag bit value in the coprocessor is 1, controls the camera, the display screen, the ISP, and the GPU to start working. The camera then scans exhibit 1 and exhibit 3, and the processor analyzes the images and generates virtual images, such as a drum and a passage of text; the ISP or the GPU attaches the virtual image, such as the drum, at a determined angle to the real image seen through the display screen, such as exhibit 1, attaches the virtual image, such as the text, to the real image seen through the display screen, such as exhibit 3, and displays them on the display screen, as shown in fig. 24.
With reference to the hardware structure of the electronic device 1100, an augmented reality function control method in the embodiment of the present application is specifically described below:
referring to fig. 25, fig. 25 is a diagram illustrating an augmented reality function control method according to an embodiment of the present application, including but not limited to the following steps:
step S2501: the user wears the electronic equipment and opens the electronic equipment.
Step S2502: the electronic equipment detects the positioning tag in a preset detection range.
Specifically, the preset detection range is greater than or equal to the user visual range, and the user visual range is calculated and determined according to the human eye visual model. Reference may be made specifically to the detection range of the positioning sensor, and details thereof are not described herein.
Specifically, the electronic device may detect the positioning tag within a preset detection range through the positioning sensor. When the positioning sensor is an ultra wideband UWB sensor, the positioning tag is a UWB tag; of course, the positioning sensor and the positioning tag may be other positioning sensors and positioning tags, and specific reference may be made to the descriptions of the positioning sensor and the positioning tag, which are not repeated herein.
Step S2503: when the electronic equipment does not detect a positioning tag within the preset detection range, the augmented reality function is not started, or is turned off.
Specifically, not starting or turning off the augmented reality function means that the camera, the display screen, the ISP, and the GPU are in a standby or sleep state. That is, when the electronic device does not detect a positioning tag within the preset detection range, the camera, the display screen, the ISP, and the GPU remain in a standby or sleep state.
Specifically, not starting or turning off the augmented reality function may be achieved through interaction between the positioning sensor and the coprocessor, and between the coprocessor and the controller. The first way: when the positioning sensor does not detect a positioning tag, the positioning sensor sends a second indication signal to the coprocessor, where the second indication signal indicates that the positioning sensor has not detected a positioning tag; correspondingly, after receiving the second indication signal, the coprocessor sets the value of the AR function start-stop flag bit to 0, and the controller, upon detecting that the value of the AR function start-stop flag bit is 0, controls the camera, the ISP, and the GPU not to work. The second way: the controller may periodically detect whether the value of the flag bit in the coprocessor has changed, and close the camera, the image signal processor ISP, and the graphics processor GPU according to the value of the flag bit. Reference may be made to the above description, which is not repeated here.
For the specific situations in which the electronic device does not detect a positioning tag and does not start, or turns off, the augmented reality function, reference may be made to the second, third, fifth, and sixth cases described in the above embodiments, which are not repeated here.
Step S2504: the electronic equipment detects a first positioning tag within the preset detection range and starts the augmented reality function.
Specifically, starting the augmented reality function means that the camera, the display screen, the ISP, and the GPU are started. When the electronic device detects the first positioning tag within the preset detection range, the camera, the display screen, the ISP, and the GPU are started, and the coprocessor and the positioning sensor are in a working state.
Specifically, starting the augmented reality function may be achieved through interaction between the positioning sensor and the coprocessor, and between the coprocessor and the controller. The first way: when the positioning sensor detects a positioning tag, the positioning sensor sends a first indication signal to the coprocessor, where the first indication signal indicates that the positioning sensor has detected a positioning tag; correspondingly, after receiving the first indication signal, the coprocessor sets the value of the AR function start-stop flag bit to 1, and the controller, upon detecting that the value of the AR function start-stop flag bit is 1, controls the camera, the ISP, and the GPU to work. The second way: the controller may periodically detect whether the value of the flag bit in the coprocessor has changed, and start the camera, the ISP, and the GPU according to the value of the flag bit. Reference may be made to the above description, which is not repeated here.
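Both indication signals can be pictured with the following sketch of the coprocessor side, which turns a first or second indication signal into the flag value later read by the controller; the enum values and handler name are illustrative assumptions.

```c
#include <stdint.h>
#include <stdio.h>

/* Indication signals sent by the positioning sensor to the coprocessor (assumed encoding). */
typedef enum {
    IND_TAG_DETECTED = 1,      /* "first indication signal"  */
    IND_TAG_NOT_DETECTED = 2   /* "second indication signal" */
} indication_t;

static volatile uint8_t ar_flag;   /* AR function start-stop flag bit in the coprocessor */

/* Coprocessor side: translate the indication signal into the flag value that the
 * controller later reads to start or stop the camera, ISP, and GPU. */
static void coprocessor_on_indication(indication_t ind)
{
    ar_flag = (ind == IND_TAG_DETECTED) ? 1 : 0;
}

int main(void)
{
    coprocessor_on_indication(IND_TAG_DETECTED);
    printf("flag after first indication:  %u\n", (unsigned)ar_flag);   /* 1 */
    coprocessor_on_indication(IND_TAG_NOT_DETECTED);
    printf("flag after second indication: %u\n", (unsigned)ar_flag);   /* 0 */
    return 0;
}
```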
For the situation in which the first positioning tag is detected within the preset detection range and the augmented reality function is started, reference may be made to the first and fourth cases described in the above embodiments, which are not repeated here.
Step S2505: the electronic device detects the second positioning tag within a preset detection range.
In one possible implementation, when the electronic device detects the second positioning tag within the preset detection range, it determines the first distance and the second distance, compares them, and determines that the second distance is greater than the first distance. Specifically, the first distance is the distance between the first positioning tag and the positioning sensor, and the second distance is the distance between the second positioning tag and the positioning sensor. Reference may be made to the seventh case described above, which is not repeated here.
In one possible implementation manner, when the electronic device detects the second positioning tag within the preset detection range, the electronic device processes and displays the augmented reality scene where the second positioning tag is located. Reference may be made specifically to the eighth case described above, and no further description is given here.
Step S2506: the electronic equipment processes the augmented reality scene at the position of the first positioning tag and displays it.
Reference may be made to the first, third, fourth, fifth, and seventh cases described in the above embodiments, which are not repeated here.
In this method, the electronic device determines whether to start the augmented reality function according to whether a positioning tag is detected within the preset detection range.
The embodiment of the present application further provides a chip system, which includes at least one processor, a memory, and an interface circuit, where the memory, the interface circuit, and the at least one processor are interconnected through lines, and a computer program is stored in the at least one memory; when the computer program is executed by the processor, the method flow shown in fig. 25 is implemented.
Embodiments of the present application also provide a computer-readable storage medium having a computer program stored therein, which when run on an electronic device, implements the method flow shown in fig. 25.
Embodiments of the present application also provide a computer program product, which when run on a processor, implements the method flow shown in fig. 25.
Those of ordinary skill in the art will appreciate that implementing all or part of the methods of the above embodiments may be accomplished by a computer program instructing relevant hardware. The computer program may be stored in a computer-readable storage medium, and when executed, may include the flows of the above method embodiments. The aforementioned storage medium includes various media capable of storing computer program code, such as a read-only memory ROM, a random access memory RAM, a magnetic disk, or an optical disk.

Claims (22)

1. An augmented reality function control method, comprising:
the electronic equipment detects the positioning label in a preset detection range; the preset detection range is in front of a camera in the electronic equipment; the positioning tag is used for identifying the position of the augmented reality scene;
when the electronic equipment does not detect the positioning tag in the preset detection range, the electronic equipment does not start or close the augmented reality function; the augmented reality function is used for processing the data of the augmented reality scene and displaying an augmented reality picture;
and when the electronic equipment detects the first positioning label in the preset detection range, the electronic equipment starts the augmented reality function.
2. The method of claim 1, wherein the electronic device does not activate or deactivate an augmented reality function, comprising:
the electronic device does not activate or deactivate the camera, the image signal processor ISP and the graphics processor GPU.
3. The method of claim 1, wherein the electronic device initiates the augmented reality function, comprising:
the electronic device starts the camera, the image signal processor ISP and the graphics processor GPU.
4. The method of claim 1, wherein the electronic device detects the location tag within a preset detection range, comprising:
and the positioning sensor in the electronic equipment detects the positioning label in the preset detection range.
5. The method of claim 4, wherein
the positioning sensor is an ultra-wideband UWB sensor, and the positioning tag is a UWB tag.
6. The method according to any one of claims 1-5, wherein the electronic device activates the augmented reality function when the electronic device detects a first positioning tag within the preset detection range, specifically comprising:
when a positioning sensor in the electronic equipment detects a first positioning label in the preset detection range, the positioning sensor sends a first indication signal to a coprocessor, and the first indication signal is used for indicating the positioning sensor to detect the first positioning label in the preset detection range;
The coprocessor receives a first indication signal from the positioning sensor and determines the value of a zone bit in the coprocessor;
and starting the camera, the image signal processor ISP and the graphic processor GPU by the controller in the electronic equipment according to the value of the zone bit in the coprocessor.
7. The method according to any one of claims 1-5, wherein the electronic device does not activate or deactivate an augmented reality function when the electronic device does not detect a positioning tag within the preset detection range, specifically comprising:
when the positioning sensor in the electronic equipment does not detect the positioning label in the preset detection range, the positioning sensor sends a second indication signal to the coprocessor, wherein the second indication signal is used for indicating that the positioning sensor does not detect the positioning label in the preset detection range;
the coprocessor receives a second indication signal from the positioning sensor and determines the value of a zone bit in the coprocessor;
and closing the camera, the image signal processor ISP and the graphic processor GPU by a controller in the electronic equipment according to the value of the zone bit in the coprocessor.
8. The method of any of claims 1-5, wherein the method further comprises, after the electronic device initiates the augmented reality function when the electronic device detects a first location tag within the preset detection range:
and the electronic equipment processes and displays the augmented reality scene of the position of the first positioning label.
9. The method of claim 8, wherein the electronic device processes and displays an augmented reality scene of the location of the first location tag, comprising:
when a positioning sensor in the electronic equipment detects a second positioning label in the preset detection range, determining the first distance and the second distance, wherein the first distance is the distance between the first positioning label and the positioning sensor, and the second distance is the distance between the second positioning label and the positioning sensor;
and when the electronic equipment determines that the second distance is larger than the first distance, the electronic equipment processes the augmented reality scene where the first positioning label is located and displays the augmented reality scene.
10. The method of claim 8, wherein when the electronic device detects a first positioning tag within the preset detection range, the electronic device processes and displays an augmented reality scene at a location of the first positioning tag after the electronic device starts the augmented reality function, the method further comprising:
When the positioning sensor in the electronic equipment detects the second positioning label in the preset detection range, the electronic equipment processes and displays the augmented reality scene of the position of the second positioning label.
11. An electronic device, characterized in that the electronic device comprises a positioning sensor and a controller,
the positioning sensor is used for detecting the positioning label in a preset detection range; the preset detection range is in front of the camera; the positioning tag is used for identifying the position of the augmented reality scene;
the controller is used for controlling not to start or close the augmented reality function under the condition that the positioning sensor does not detect the positioning label in the preset detection range; the augmented reality function is used for processing the data of the augmented reality scene and displaying an augmented reality picture;
the controller is further configured to control to start the augmented reality function when the positioning sensor detects the first positioning tag within the preset detection range.
12. The apparatus of claim 11, wherein the controller is further configured to control not to activate or deactivate the camera, the image signal processor ISP, and the graphics processor GPU.
13. The apparatus of claim 11, wherein the controller is further configured to control activation of the camera, image signal processor ISP, and graphics processor GPU.
14. The apparatus of claim 11, wherein the positioning sensor is an ultra wideband UWB sensor and the positioning tag is a UWB tag.
15. The apparatus according to any one of claims 11 to 14, wherein,
the positioning sensor is further configured to send a first indication signal to the coprocessor when the positioning sensor in the electronic device detects a first positioning tag within the preset detection range, where the first indication signal is used to indicate that the positioning sensor detects the first positioning tag within the preset detection range;
the coprocessor is used for receiving a first indication signal from the positioning sensor and determining the value of a zone bit in the coprocessor;
and the controller is used for starting the camera, the image signal processor ISP and the graphic processor GPU according to the value of the zone bit in the coprocessor.
16. The apparatus according to any one of claims 11 to 14, wherein,
The positioning sensor is further configured to send a second indication signal to the coprocessor when the positioning sensor in the electronic device does not detect a positioning tag within the preset detection range, where the second indication signal is used to indicate that the positioning sensor does not detect a positioning tag within the preset detection range;
the coprocessor is also used for receiving a second indication signal from the positioning sensor and determining the value of a marker bit in the coprocessor;
and the controller is also used for closing the camera, the image signal processor ISP and the graphic processor GPU according to the value of the zone bit in the coprocessor.
17. The apparatus according to any one of claims 11-14, characterized in that the apparatus further comprises:
and the processor is used for processing and displaying the augmented reality scene of the position of the first positioning label.
18. The apparatus of claim 17, wherein the apparatus further comprises:
the distance sensor is used for determining the first distance and the second distance when the positioning sensor detects the second positioning label in the preset detection range, wherein the first distance is the distance between the first positioning label and the positioning sensor, and the second distance is the distance between the second positioning label and the positioning sensor;
And the processor is used for processing and displaying the augmented reality scene where the first positioning tag is located under the condition that the electronic equipment determines that the second distance is larger than the first distance.
19. The apparatus of claim 17, wherein
the processor is further configured to process and display an augmented reality scene where the second positioning tag is located when the positioning sensor detects the second positioning tag within the preset detection range.
20. An electronic device, comprising: one or more processors and memory; the memory being coupled to the one or more processors, the memory being for storing computer program code, the computer program code comprising computer instructions that the one or more processors call to cause the electronic device to perform the method of any one of claims 1-10.
21. A computer storage medium, characterized in that the computer storage medium stores a computer program comprising program instructions for implementing the method of any of claims 1-10 when executed by a processor.
22. A computer program product which, when run on an electronic device, causes the electronic device to perform the method of any one of claims 1-10.
CN202011206535.6A 2020-11-02 2020-11-02 Augmented reality function control method and electronic equipment Active CN114531582B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011206535.6A CN114531582B (en) 2020-11-02 2020-11-02 Augmented reality function control method and electronic equipment
PCT/CN2021/127781 WO2022089625A1 (en) 2020-11-02 2021-10-30 Augmented reality function control method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011206535.6A CN114531582B (en) 2020-11-02 2020-11-02 Augmented reality function control method and electronic equipment

Publications (2)

Publication Number Publication Date
CN114531582A CN114531582A (en) 2022-05-24
CN114531582B true CN114531582B (en) 2023-06-13

Family

ID=81381893

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011206535.6A Active CN114531582B (en) 2020-11-02 2020-11-02 Augmented reality function control method and electronic equipment

Country Status (2)

Country Link
CN (1) CN114531582B (en)
WO (1) WO2022089625A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11935093B1 (en) 2023-02-19 2024-03-19 Toyota Motor Engineering & Manufacturing North America, Inc. Dynamic vehicle tags

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014169692A1 (en) * 2013-04-15 2014-10-23 Tencent Technology (Shenzhen) Company Limited Method,device and storage medium for implementing augmented reality
CN107767460A (en) * 2016-08-18 2018-03-06 深圳市劲嘉数媒科技有限公司 The methods of exhibiting and device of augmented reality
CN107967054A (en) * 2017-11-16 2018-04-27 中国人民解放军陆军装甲兵学院 The immersion three-dimensional electronic sand table that a kind of virtual reality is coupled with augmented reality
CN109129507A (en) * 2018-09-10 2019-01-04 北京联合大学 A kind of medium intelligent introduction robot and explanation method and system
CN109614785A (en) * 2018-11-01 2019-04-12 Oppo广东移动通信有限公司 Using the management-control method of operation, device, storage medium and electronic equipment
CN111722710A (en) * 2020-06-02 2020-09-29 广东小天才科技有限公司 Method for starting augmented reality AR interactive learning mode and electronic equipment

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120212484A1 (en) * 2010-02-28 2012-08-23 Osterhout Group, Inc. System and method for display content placement using distance and location information
US10037465B2 (en) * 2016-03-18 2018-07-31 Disney Enterprises, Inc. Systems and methods for generating augmented reality environments
CN106204743B (en) * 2016-06-28 2020-07-31 Oppo广东移动通信有限公司 Control method and device for augmented reality function and mobile terminal
CN106126066A (en) * 2016-06-28 2016-11-16 广东欧珀移动通信有限公司 Control method, device and the mobile terminal of a kind of augmented reality function
CN106406520A (en) * 2016-08-30 2017-02-15 徐丽芳 Position identification method of virtual reality system
US11347054B2 (en) * 2017-02-16 2022-05-31 Magic Leap, Inc. Systems and methods for augmented reality
CN108962098A (en) * 2018-08-02 2018-12-07 合肥市徽马信息科技有限公司 A kind of guide system based on AR augmented reality

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014169692A1 (en) * 2013-04-15 2014-10-23 Tencent Technology (Shenzhen) Company Limited Method,device and storage medium for implementing augmented reality
CN107767460A (en) * 2016-08-18 2018-03-06 深圳市劲嘉数媒科技有限公司 The methods of exhibiting and device of augmented reality
CN107967054A (en) * 2017-11-16 2018-04-27 中国人民解放军陆军装甲兵学院 The immersion three-dimensional electronic sand table that a kind of virtual reality is coupled with augmented reality
CN109129507A (en) * 2018-09-10 2019-01-04 北京联合大学 A kind of medium intelligent introduction robot and explanation method and system
CN109614785A (en) * 2018-11-01 2019-04-12 Oppo广东移动通信有限公司 Using the management-control method of operation, device, storage medium and electronic equipment
CN111722710A (en) * 2020-06-02 2020-09-29 广东小天才科技有限公司 Method for starting augmented reality AR interactive learning mode and electronic equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
IoT data presentation and interaction based on augmented reality technology; Sun Xiaohua et al.; Packaging Engineering; 2017-10-20 (No. 20); full text *
Research and implementation of a mobile phone augmented reality indoor guide; Shi Xiaolin et al.; Computer Applications and Software; 2013-02-15 (No. 02); full text *
Design of eRFID tags for online sensing of power assets; Zhang Jun et al.; Transactions of China Electrotechnical Society; 2020-06-10 (No. 11); full text *

Also Published As

Publication number Publication date
CN114531582A (en) 2022-05-24
WO2022089625A1 (en) 2022-05-05

Similar Documents

Publication Publication Date Title
CN109582141B (en) Method for controlling display screen according to eyeball focus and head-mounted electronic equipment
CN110045908B (en) Control method and electronic equipment
CN110798568B (en) Display control method of electronic equipment with folding screen and electronic equipment
CN116070684B (en) Integrated chip and method for processing sensor data
CN114202000A (en) Service processing method and device
CN110401768B (en) Method and device for adjusting working state of electronic equipment
CN112947755A (en) Gesture control method and device, electronic equipment and storage medium
CN113395382B (en) Method for data interaction between devices and related devices
CN113986002B (en) Frame processing method, device and storage medium
CN112799508A (en) Display method and device, electronic equipment and storage medium
CN115589051B (en) Charging method and terminal equipment
CN112860428A (en) High-energy-efficiency display processing method and equipment
CN113448482A (en) Sliding response control method and device of touch screen and electronic equipment
CN114531582B (en) Augmented reality function control method and electronic equipment
US20230162529A1 (en) Eye bag detection method and apparatus
CN110058729B (en) Method and electronic device for adjusting sensitivity of touch detection
CN113496477A (en) Screen detection method and electronic equipment
CN111880661A (en) Gesture recognition method and device
CN110673694A (en) Application opening method and electronic equipment
CN115665632A (en) Audio circuit, related device and control method
CN115150542B (en) Video anti-shake method and related equipment
CN113325948B (en) Air-isolated gesture adjusting method and terminal
CN117093068A (en) Vibration feedback method and system based on wearable device, wearable device and electronic device
CN114221402A (en) Charging method and device of terminal equipment and terminal equipment
CN115393676A (en) Gesture control optimization method and device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant