CN117990073A - Direction identification method, apparatus, device, storage medium, and computer program product - Google Patents

Direction identification method, apparatus, device, storage medium, and computer program product

Info

Publication number
CN117990073A
CN117990073A (application CN202211349687.0A)
Authority
CN
China
Prior art keywords: azimuth, dial, included angle, ray, information
Prior art date
Legal status: Pending (an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number
CN202211349687.0A
Other languages
Chinese (zh)
Inventor
李潇然 (Li Xiaoran)
刘晓波 (Liu Xiaobo)
汪芳山 (Wang Fangshan)
Current Assignee: Huawei Technologies Co Ltd (the listed assignee may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202211349687.0A priority Critical patent/CN117990073A/en
Publication of CN117990073A publication Critical patent/CN117990073A/en
Pending legal-status Critical Current

Landscapes

  • Telephone Function (AREA)

Abstract

An azimuth identification method, apparatus, device, storage medium, and computer program product, applicable to scenarios in which the earth's magnetic field is disturbed or otherwise unavailable. The method may include: controlling a dial on an electronic device to rotate so that a reference ray on the rotated dial points toward a first reference azimuth; determining a target included angle based on the angle between the ray on which the hour hand lies in the rotated dial and the ray on which a preset scale mark lies; and determining the azimuth pointed to by the angular bisector of the target included angle as a first azimuth. The relative positions of the hour hand, the minute hand, and the reference ray on the dial remain unchanged during rotation, and the reference ray, the ray on which the hour hand lies in the rotated dial, and the ray on which the preset scale mark lies all have their end points at the origin of the dial. With the present application, an azimuth can be intelligently and accurately identified when the earth's magnetic field is disturbed or unavailable.

Description

Direction identification method, apparatus, device, storage medium, and computer program product
Technical Field
The present application relates to the field of positioning technology, and in particular to an azimuth identification method, apparatus, device, storage medium, and computer program product.
Background
Electronic devices typically identify an azimuth by sensing the earth's magnetic field with a magnetic sensor. However, the earth's magnetic field is easily disturbed, and the magnetic poles deviate from true north by a certain angle, so the accuracy of azimuth identification based on a geomagnetic sensor is difficult to guarantee. How to accurately identify an azimuth when the earth's magnetic field is disturbed or unavailable is therefore a problem to be solved.
Disclosure of Invention
Embodiments of the present application provide an azimuth identification method, apparatus, device, storage medium, and computer program product, which can intelligently and accurately identify an azimuth in scenarios where the earth's magnetic field is disturbed or unavailable.
In a first aspect, an embodiment of the present application provides an azimuth identification method, including: controlling a dial on an electronic device to rotate based on a first reference azimuth, such that the azimuth pointed to by a reference ray on the rotated dial is the first reference azimuth, where the positions of the hour hand, the minute hand, and each scale mark relative to the dial remain unchanged during rotation; determining a target included angle based on the rotated dial, the target included angle being the angle between the ray on which the hour hand lies in the rotated dial and the ray on which a preset scale mark lies; and determining a first azimuth from the target included angle, the first azimuth being the azimuth pointed to by the angular bisector of the target included angle. The end points of the reference ray, of the ray on which the hour hand lies in the rotated dial, and of the ray on which the preset scale mark lies are all the origin of the dial.
Because the accuracy of the first azimuth determined from the angular bisector of the target included angle does not depend on the earth's magnetic field, an azimuth can be intelligently and accurately identified even when the earth's magnetic field is disturbed or unavailable.
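The dial rotation described above is geometrically equivalent to the classic watch-compass rule. As a hedged sketch (not the patent's literal implementation): in the northern hemisphere the sun lies due south at local solar noon and drifts roughly 15° of azimuth per hour, so south sits on the bisector between the sun direction and the dial's noon direction. The function below assumes local solar time on a 24-hour clock, an idealization; real clock time would need longitude and equation-of-time corrections.

```python
def estimate_north(sun_bearing_deg: float, local_solar_hour: float) -> float:
    """Bearing of true north, in the same device-relative frame as
    `sun_bearing_deg` (degrees, clockwise positive).

    Northern-hemisphere watch-compass rule: south = sun bearing corrected
    back by 15 deg per hour away from solar noon, which is exactly the
    hour-hand/12-mark bisector once the hour hand is aimed at the sun.
    """
    south = (sun_bearing_deg - 15.0 * (local_solar_hour - 12.0)) % 360.0
    return (south + 180.0) % 360.0
```

For example, at 15:00 solar time with the sun observed at bearing 225°, south comes out at 180° and north at 0° in the device frame.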
In one possible implementation, determining the target included angle based on the rotated dial includes: obtaining a first included angle and a second included angle based on the rotated dial, where the first included angle is the angle from the ray on which the hour hand lies in the rotated dial to the ray on which the preset scale mark lies, and the second included angle is the angle from the ray on which the preset scale mark lies to the ray on which the hour hand lies in the rotated dial; if the time currently displayed on the dial is 12 noon, determining the target included angle to be either the first included angle or the second included angle; if the displayed time is earlier than 12 noon, determining the target included angle to be the first included angle; and if the displayed time is later than 12 noon, determining the target included angle to be the second included angle.
In this way, choosing between the first and second included angles formed between the ray on which the hour hand lies and the ray on which the preset scale mark lies according to the time currently displayed on the dial, i.e., the current actual time, improves the pointing accuracy of the resulting angular bisector, and thus the accuracy of the determined first azimuth.
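The AM/PM selection above can be sketched as follows, under the same local-solar-time idealization and with illustrative names not taken from the source: measure the first angle clockwise from the hour hand to the 12 o'clock mark, the second clockwise from the mark to the hour hand, and bisect whichever the rule selects.

```python
def bisector_from_twelve(hour24: int, minute: int) -> float:
    """Dial angle (degrees clockwise from the 12 o'clock mark) of the
    bisector that points south once the hour hand is aimed at the sun.

    Before noon: bisect the clockwise arc from the hour hand to the 12
    mark (the "first" angle). At or after noon: bisect the clockwise arc
    from the 12 mark to the hour hand (the "second" angle); at exactly
    noon both choices coincide.
    """
    hand = (hour24 % 12) * 30.0 + minute * 0.5  # hour-hand dial angle
    if hour24 < 12:
        # Morning: halfway along the arc hand -> 12, clockwise.
        return (hand + (360.0 - hand) / 2.0) % 360.0
    # Noon or later: halfway along the arc 12 -> hand, clockwise.
    return hand / 2.0
```

At 9:00 the hour hand sits at 270° and the bisector at 315°; at 15:00 the hand sits at 90° and the bisector at 45°, matching the mirror symmetry of the sun's path about noon.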
In one possible implementation, the method further includes determining the first reference azimuth.
In one possible implementation, determining the first reference azimuth may include: acquiring an image containing a reference azimuth mark and determining a second reference azimuth from the image; and correcting the second reference azimuth according to attitude sensor information read from the device, to obtain the first reference azimuth.
Correcting the second reference azimuth with attitude sensor information thus yields an accurate first reference azimuth, which improves the accuracy of azimuth identification.
In one possible implementation, acquiring the image containing the reference azimuth mark and determining the second reference azimuth from the image may include: capturing an image containing the reference azimuth mark with a camera; performing image processing on the image to determine position information of the reference azimuth mark; invoking a light-sensitive sensor to sense the reference azimuth mark and obtain a sensing result; and determining the second reference azimuth from the position information and the sensing result.
Capturing and processing the reference azimuth mark in this way yields an accurate second reference azimuth, which improves the accuracy of the determined first reference azimuth and, in turn, the accuracy of azimuth identification.
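The source does not spell out how the mark's position in the image becomes a bearing. One plausible piece, assuming a simple pinhole camera model and a sun/mark centroid produced by the image-processing step (illustrative names, not from the source), maps the horizontal pixel offset to an angular offset from the camera's optical axis:

```python
import math

def mark_azimuth_offset(mark_px_x: float, image_width: int, hfov_deg: float) -> float:
    """Horizontal angular offset (degrees) of the reference azimuth mark
    from the camera's optical axis, positive = right of center.

    Assumes a pinhole camera with horizontal field of view `hfov_deg`;
    the focal length in pixels follows from the FOV geometry.
    """
    f_px = (image_width / 2.0) / math.tan(math.radians(hfov_deg / 2.0))
    return math.degrees(math.atan((mark_px_x - image_width / 2.0) / f_px))
```

A mark at the image center gives 0°, and one at the right edge gives half the field of view, as expected from the pinhole geometry.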
In one possible implementation, controlling the dial on the electronic device to rotate based on the first reference azimuth may include: controlling the dial to rotate based on the first reference azimuth when a selection instruction for a sun mode is received on a mode selection interface.
In one possible implementation, the method may further include: acquiring calibrated geomagnetic information when a selection instruction for a geomagnetic mode is received on the mode selection interface; and determining a second azimuth from the calibrated geomagnetic information.
In one possible implementation, the method further includes: when the calibrated geomagnetic information meets an accuracy condition, determining an azimuth identification result from the second azimuth, the azimuth identification result being used to describe the azimuth information of the electronic device.
Thus, in the geomagnetic mode, if the calibrated geomagnetic information meets the accuracy condition, the azimuth identification result can be determined from the second azimuth alone.
In one possible implementation, the method further includes: when the calibrated geomagnetic information does not meet the accuracy condition, determining the azimuth identification result from both the first azimuth and the second azimuth, the azimuth identification result being used to describe the azimuth information of the electronic device.
Thus, in the geomagnetic mode, if the calibrated geomagnetic information does not meet the accuracy condition, the azimuth identification result can be determined by combining the first and second azimuths, which reduces the influence of the earth's magnetic field on azimuth identification and thereby improves its accuracy.
In one possible implementation, the method further includes: when the calibrated geomagnetic information does not meet the accuracy condition, acquiring a third azimuth from an auxiliary device; and determining the azimuth identification result from the second azimuth and the third azimuth, the azimuth identification result being used to describe the azimuth information of the electronic device.
Thus, in the geomagnetic mode, if the calibrated geomagnetic information does not meet the accuracy condition, a third azimuth can be obtained from an auxiliary device and combined with the second azimuth to determine the azimuth identification result, again reducing the influence of the earth's magnetic field and improving the accuracy of azimuth identification.
In one possible implementation, the method further includes: determining the azimuth identification result from the first azimuth, the azimuth identification result being used to describe the azimuth information of the electronic device.
In one possible implementation, the method further includes one or more of: displaying the azimuth identification result on an azimuth information display interface; providing the azimuth identification result to an associated application through an application programming interface; and transmitting the azimuth identification result to an associated device through a wireless transmission module.
The azimuth identification result can thus be output in multiple ways and shared with associated devices, enriching the usage scenarios of azimuth identification.
In one possible implementation, the method further includes: acquiring an actual traveling azimuth and a set traveling azimuth determined based on the azimuth identification result; determining error information between the actual traveling azimuth and the set traveling azimuth; and displaying deviation-correction information on the azimuth information display interface according to the error information, the deviation-correction information being used to correct the actual traveling azimuth.
Thus, from the error between the set traveling azimuth and the actual traveling azimuth, deviation-correction information can be generated that prompts the user to correct the actual traveling azimuth, realizing a real-time navigation function.
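A minimal sketch of the error computation behind such a prompt (illustrative names; the source does not give the formula): wrap the signed difference into [-180°, 180°) so the correction always suggests the shorter turn.

```python
def heading_error(actual_deg: float, target_deg: float) -> float:
    """Signed error from the actual traveling azimuth to the set one,
    in degrees wrapped to [-180, 180). Positive means turn clockwise
    (to the right) to correct course."""
    return (target_deg - actual_deg + 180.0) % 360.0 - 180.0
```

For example, heading 350° with a set azimuth of 10° yields +20° (turn right 20°), not -340°.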
In one possible implementation, when the longitude and latitude information of the electronic device indicates the northern hemisphere, the reference ray is the ray on which the hour hand lies; when it indicates the southern hemisphere, the reference ray is the ray on which the preset scale mark lies.
In a second aspect, an embodiment of the present application provides an azimuth identification apparatus, including a processing unit configured to: control a dial on an electronic device to rotate based on a first reference azimuth, such that the azimuth pointed to by a reference ray on the rotated dial is the first reference azimuth, where the positions of the hour hand, the minute hand, and each scale mark relative to the dial remain unchanged during rotation; determine a target included angle based on the rotated dial, the target included angle being the angle between the ray on which the hour hand lies in the rotated dial and the ray on which a preset scale mark lies; and determine a first azimuth from the target included angle, the first azimuth being the azimuth pointed to by the angular bisector of the target included angle. The end points of the ray on which the hour hand lies in the rotated dial and of the ray on which the preset scale mark lies are the origin of the dial.
In a third aspect, embodiments of the present application provide an electronic device comprising one or more processors and one or more memories. The one or more memories are coupled with the one or more processors, the one or more memories being for storing computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of the first aspect or any of the possible implementations of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program comprising program instructions which, when run on an electronic device, cause the electronic device to perform the method of the first aspect or any of the possible implementations of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program comprising instructions which, when executed by a computer, cause the computer to perform the method of the first aspect or any of the possible implementations of the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product for, when run on a computer, causing the computer to perform the method of any one of the possible implementations of the above aspect.
Drawings
FIG. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 2A is a schematic diagram of an implementation environment of an azimuth identification method according to an embodiment of the present application;
FIG. 2B is a schematic block diagram of an electronic device according to an embodiment of the present application;
FIG. 3 is a schematic flowchart of an azimuth identification method according to an embodiment of the present application;
FIG. 4A is a schematic diagram of an interactive interface of an electronic device according to an embodiment of the present application;
FIG. 4B is a schematic illustration of a first azimuth according to an embodiment of the present application;
FIG. 4C is an interface diagram of an azimuth result display interface according to an embodiment of the present application;
FIG. 4D is a schematic diagram of an application scenario of an azimuth identification result according to an embodiment of the present application;
FIG. 5 is a schematic flowchart of an azimuth identification method according to an embodiment of the present application;
FIG. 6A is a schematic view of a scenario of controlling dial rotation according to an embodiment of the present application;
FIG. 6B is a schematic diagram of an image containing a reference azimuth mark and of its image processing according to an embodiment of the present application;
FIG. 6C is a schematic diagram of a light-sensitive sensor according to an embodiment of the present application;
FIG. 6D is a schematic diagram of a ground coordinate system according to an embodiment of the present application;
FIG. 6E is a schematic diagram of a target included angle according to an embodiment of the present application;
FIG. 7 is a flowchart of another azimuth identification method according to an embodiment of the present application;
FIG. 8 is a schematic diagram of an azimuth identification result according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of an azimuth identification apparatus according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described below with reference to the accompanying drawings in the embodiments of the present application.
The terms "first," "second," "third," and "fourth" and the like in the description, claims, and drawings are used to distinguish between different objects, not necessarily to describe a particular sequence or chronological order. Moreover, the terms "comprise" and "have," and any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to those listed steps or elements, but may include other steps or elements not listed or inherent to such a process, method, article, or apparatus.
It should be understood that in the present application, "at least one (item)" means one or more, and "a plurality" means two or more. "And/or" describes the association relationship of associated objects and represents three possible relationships; for example, "A and/or B" may represent: only A, only B, or both A and B, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship. "At least one of" and the like means any combination of the listed items, including any combination of single items or plural items. For example, at least one (one) of a, b, or c may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c may each be single or plural.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
As used in this specification, the terms "component," "module," "system," and the like are intended to refer to a computer-related entity, either hardware, firmware, a combination of hardware and software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a computing device and the computing device can be a component. One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between 2 or more computers. Furthermore, these components can execute from various computer readable media having various data structures stored thereon. The components may communicate by way of local and/or remote processes such as in accordance with a signal having one or more data packets (e.g., data from two components interacting with one another in a local system, distributed system, and/or across a network such as the internet with other systems by way of the signal).
Fig. 1 shows a schematic configuration of an electronic device 100.
The embodiment will be specifically described below taking the electronic device 100 as an example. It should be understood that electronic device 100 may have more or fewer components than shown in fig. 1, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 1 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device 100 may include: processor 110, external memory interface 120, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
Wherein some of the components shown in fig. 1 (e.g., processor 110, internal memory 121) may be integrated in a system-on-chip (SOC).
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown in FIG. 1, or may combine certain components, or split certain components, or a different arrangement of components. The components shown in fig. 1 may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini-LED, a micro-LED, a micro-OLED, a quantum-dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened and light is transmitted through the lens to the camera's photosensitive element, which converts the optical signal into an electric signal and passes it to the ISP for processing and conversion into an image visible to the naked eye.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects it onto the photosensitive element. The photosensitive element may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
The digital signal processor is used to process digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM).
The random access memory may include a static random-access memory (SRAM), a dynamic random-access memory (DRAM), a synchronous dynamic random-access memory (SDRAM), a double data rate synchronous dynamic random-access memory (DDR SDRAM; the fifth generation is commonly referred to as DDR5 SDRAM), etc. The nonvolatile memory may include a disk storage device and flash memory.
The flash memory may be divided by operating principle into NOR flash, NAND flash, 3D NAND flash, etc.; by memory-cell level into single-level cell (SLC), multi-level cell (MLC), triple-level cell (TLC), quad-level cell (QLC), etc.; and by storage specification into universal flash storage (UFS), embedded multimedia card (eMMC), etc.
The random access memory may be read from and written to directly by the processor 110, may be used to store executable programs (e.g., machine instructions) of an operating system or other running programs, and may also be used to store data of users and applications, and the like.
The nonvolatile memory may store executable programs, store data of users and applications, and the like, and may be loaded into the random access memory in advance for the processor 110 to directly read and write.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, a file such as a compressed drive file is stored in an external memory card.
The electronic device may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The gyro sensor 180B is an angular motion detection sensor that can be used to determine the motion gesture of the electronic device. In some embodiments, the angular velocities of the electronic device about three axes of the reference coordinate system may be determined by gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device through the reverse motion, thereby realizing anti-shake. The gyro sensor 180B may also be used for navigating, somatosensory game scenes.
The magnetic sensor 180D includes a hall sensor.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device in various directions, and can detect the magnitude and direction of gravity when the electronic device is stationary. It can also be used to recognize the attitude of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, such as shooting a scene, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object in the vicinity of the electronic device 100.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The touch sensor 180K may also be referred to as a touch panel or touch sensitive surface. The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type.
The bone conduction sensor 180M may acquire a vibration signal.
The keys 190 include a power key, a volume key, etc. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function controls of the electronic device 100.
The indicator 192 may be an indicator light and may be used to indicate charging status, battery level changes, messages, missed calls, notifications, etc. In some embodiments, the indicator 192 may be used to indicate that the camera 193 is in use.
The electronic device 100 as shown in fig. 1 may display various user interfaces described in various embodiments below through the display 194. The electronic apparatus 100 may detect a touch operation in each user interface through the touch sensor 180K, such as a click operation (e.g., a touch operation on an icon, a double click operation) in each user interface, a slide operation up or down in each user interface, and so on.
In some embodiments, the electronic device 100 may detect a motion gesture performed by the user holding the electronic device 100, for example, shake the electronic device, through the gyro sensor 180B, the acceleration sensor 180E, and the like, so as to obtain gesture information of the electronic device 100.
The following is a description of some of the principles involved in the present application to facilitate understanding by those skilled in the art.
A compass is mainly composed of a magnetic needle mounted on a pivot. Under the earth's natural magnetic field, the needle rotates freely and settles along the tangent direction of the magnetic meridian, with its south pole pointing toward the geographic south pole (the earth's magnetic north pole); this property can be used to distinguish azimuths. Compasses are often used for navigation, geodesy, travel, and the like. Compasses that physically indicate azimuth commonly take three forms: the sinan (an ancient south-pointing ladle), the luopan (compass dial), and the magnetic needle. A compass in an electronic device such as a smart phone or a smart wearable device is typically an electronic compass implemented with a Hall-effect sensor, which detects the earth's magnetic field based on the Hall effect and thereby indicates azimuth. Due to the nature of the earth's magnetic field itself, as well as the presence of other magnetic fields and ferrous metals, the accuracy of compass azimuth recognition is inevitably compromised, for example by electromagnetic interference from nearby high-voltage power lines. Meanwhile, a certain angular deviation exists between the earth's magnetic poles and the true directions. Even if compensation is applied in circuitry, the accuracy is still not ideal, and the electrical characteristics of the measuring device itself may introduce new factors affecting accuracy.
The earth rotates 360 degrees in 24 hours, i.e., 15 degrees per hour; the hour hand of a watch rotates 360 degrees in 12 hours, i.e., 30 degrees per hour. Based on this principle, a watch and the sun can be used to roughly determine azimuth.
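The rule above can be sketched numerically. The function name and the local-solar-time assumption below are illustrative, not from the patent; the sketch assumes the northern hemisphere, where the sun crosses due south at 12:00 local solar time.

```python
# Illustrative sketch (not from the patent): since the sun moves 15 deg/h
# and the hour hand moves 30 deg/h, pointing the hour hand at the sun and
# bisecting the angle to the 12 mark gives due south (northern hemisphere).

def south_offset_from_sun(hour_24: float, minute: float = 0.0) -> float:
    """Signed clockwise degrees from due south to the sun's bearing;
    this equals half the signed hour-hand angle past the 12 mark."""
    return (hour_24 - 12.0 + minute / 60.0) * 15.0

print(south_offset_from_sun(14))  # 30.0  -> sun is 30 deg west of south at 14:00
print(south_offset_from_sun(9))   # -45.0 -> sun is 45 deg east of south at 09:00
```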
Referring to fig. 2A, fig. 2A is a schematic diagram of an implementation environment of azimuth recognition according to an embodiment of the present application. As shown in fig. 2A, the implementation environment includes at least one electronic device 210 and a reference azimuth mark 220. The electronic device 210 may be a smart phone, a wearable device such as a smart watch, a tablet computer, a notebook computer, a desktop computer, a vehicle-mounted terminal, etc., but is not limited thereto. The reference azimuth mark 220 may be a reference object for azimuth recognition, for example, a celestial body such as the sun, the moon, or Polaris, or a reference object such as a building. The azimuth recognition method provided by the embodiment of the present application may be performed by the electronic device 210, or by a chip, a component, or a set of components in the electronic device 210, such as a system on a chip (SOC).
For example, referring to fig. 2B, fig. 2B is a schematic functional block diagram of an electronic device according to an embodiment of the present application. As shown in fig. 2B, the electronic device may include an interaction module 240 and a computing module 250. The interaction module 240 includes a display module 241 and a transmission module 242. The computing module 250 includes a sensor module 251 and a processor module 252. The sensor module 251 may include a gyro sensor, an acceleration sensor, an azimuth sensor, a magnetic sensor, and a Global Positioning System (GPS). The display module 241 may be used for interacting with a user, for example, obtaining an instruction of the user, displaying an azimuth recognition result, and the like. The transmission module 242 may be configured to transmit azimuth information, such as information about a reference azimuth or an azimuth recognition result. The gyro sensor and the acceleration sensor in the sensor module 251 can be used to determine attitude information of the electronic device; the azimuth sensor may be used to determine a reference azimuth, which may be the azimuth of a reference azimuth mark; the magnetic sensor may be used to identify azimuth by sensing the earth's magnetic field; the GPS may be used to determine latitude and longitude information of the electronic device. The processor module 252 may include an SOC for processing information from the interaction module 240 and the sensor module 251 to determine an azimuth recognition result.
Taking a smart watch as an example of the electronic device, the implementation of azimuth recognition by the electronic device will be exemplarily described through the flowchart shown in fig. 3.
1. Selecting a mode
In some embodiments, the user may turn on azimuth recognition in the electronic device through a compass or other application (APP). For example, as shown in (a) of fig. 4A, a plurality of APPs may be installed in the electronic device, such as APP1, APP2, APP3, and compass. The user may select "compass" in this interface, and in response, the electronic device may display a mode selection interface as shown in (b) of fig. 4A. The mode selection interface may include two modes, namely a solar mode and a geomagnetic mode. The solar mode identifies azimuth by means of a reference azimuth mark, which may be a reference object such as the sun. The geomagnetic mode identifies azimuth by sensing the earth's magnetic field.
If the user selects the solar mode in the interface as shown in (b) of fig. 4A, the electronic device may generate a selection instruction for the solar mode in response to the user's selection. The electronic device may perform step 2 of the following steps based on the selection instruction for the solar mode. Similarly, if the user selects the geomagnetic mode, the electronic device may generate a selection instruction for the geomagnetic mode in response to the selection of the user. The electronic device may perform step 3 of the following steps based on the selection instruction for the geomagnetic mode.
2. If the sun mode is selected, determining a first orientation by the sun mode
In response to a user's selection instruction for the solar mode, the electronic device may invoke the shooting device to acquire an image containing the reference azimuth mark, and the first reference azimuth may be determined from the image. The dial on the electronic device may be controlled to rotate based on the first reference azimuth, so that the reference ray on the rotated dial points to the first reference azimuth, and a target included angle between the ray where the hour hand is located and the ray where the preset scale mark is located on the rotated dial is obtained. According to the target included angle, the azimuth pointed to by the angular bisector of the target included angle can be determined as the first azimuth. During the rotation, the relative positions of the hour hand, the minute hand, and the scale marks on the dial remain unchanged. For example, if the reference ray is the ray where the hour hand is located on the rotated dial, pointing the reference ray to the first reference azimuth yields the target included angle shown in fig. 4B. From the angular bisector of the target included angle, the first azimuth is obtained, as shown by the dashed line in fig. 4B.
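As a minimal numeric sketch of this step (function and parameter names are assumptions, not from the patent), the first azimuth can be computed directly from the sun's bearing and the dial's current display time, since the angular bisector sits half the hour-hand angle back from the sun:

```python
# Hypothetical sketch of the bisector step: with the hour-hand ray rotated
# onto the sun's bearing, due south lies half the signed hour-hand angle
# (15 deg per hour from 12:00) back from the sun (northern hemisphere).

def first_orientation(sun_bearing_deg: float, hour_24: float, minute: float = 0.0) -> float:
    """Compass bearing (0-360, clockwise from north) of due south."""
    return (sun_bearing_deg - (hour_24 - 12.0 + minute / 60.0) * 15.0) % 360.0

# Sun at bearing 225 deg (south-west) at 15:00 -> south is at 180 deg.
print(first_orientation(225.0, 15))  # 180.0
# Sun at bearing 135 deg (south-east) at 09:00 -> also 180 deg.
print(first_orientation(135.0, 9))   # 180.0
```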
The first azimuth is used to describe due south as determined by the solar mode; further, due north as determined by the solar mode may also be described. The first azimuth is exemplarily shown in (a) of fig. 4C.
Alternatively, the azimuth recognition result may be determined based on the first azimuth. The azimuth recognition result is used to describe azimuth information of the electronic device. The azimuth information of the electronic device may include an azimuth value of any point in the space spanned by the east, south, west, and north directions. For example, assuming the electronic device is to be moved toward point A, using the current location of the electronic device as a starting point, the azimuth information of the electronic device may describe an azimuth value of point A relative to the electronic device (e.g., 20° east of north).
Optionally, the shooting device may be a camera in the electronic device, which the electronic device may invoke to capture an image. Taking the sun as the reference azimuth mark, if the sun cannot be identified in the collected image, prompt information as shown in (c) of fig. 4A may be displayed on the dial to remind the user to adjust the attitude of the electronic device or move its position, so that the camera can capture an image containing the sun.
3. If the geomagnetic mode is selected, determining a second azimuth through the geomagnetic mode
In response to a user's selection instruction for the geomagnetic mode, the electronic device may obtain calibrated geomagnetic information, that is, geomagnetic information after geomagnetic calibration, which can be obtained through the magnetic sensor in the electronic device. The geomagnetic information may include geomagnetic data such as the magnetic field strength at the location of the electronic device, or data such as whether an interfering magnetic field exists and the strength of that interfering magnetic field. The geomagnetic calibration method is not limited in this application. The electronic device may determine the second azimuth based on the calibrated geomagnetic information. The second azimuth is used to describe due north as determined by sensing the earth's magnetic field; further, due south determined in the same way may also be described.
If the calibrated geomagnetic information meets the accuracy condition, determining an azimuth recognition result according to the second azimuth. If the calibrated geomagnetic information does not meet the accuracy condition, optionally, the electronic device may generate a prompt message as shown in (d) of fig. 4A, where the prompt message is used to remind the user that the current earth magnetic field is abnormal, and recommend that the user select the sun mode. If the user selects the sun mode on the interface as in (d) of fig. 4A, the electronic device may perform step 2 in response to the selection instruction. If the user still selects geomagnetic mode on the interface, the azimuth recognition result may be determined according to the following manner:
Mode 1: step 2 is executed to obtain the first azimuth determined by the solar mode. And determining an azimuth recognition result according to the first azimuth and the second azimuth.
Mode 2: and acquiring a third position from the auxiliary equipment, and determining a position identification result according to the second position and the third position. Wherein the third orientation is used to describe the direct north and south determined by the auxiliary device. The auxiliary device may be one or more, and each auxiliary device may determine a third orientation. The auxiliary device may determine the third position by a solar mode or a geomagnetic mode, or the auxiliary device may determine the third position by other means (e.g., GPS), which is not limited by the present application.
Mode 3: a first position determined by the solar mode is obtained, and a third position is obtained from the auxiliary device. And determining an azimuth recognition result according to the first azimuth, the second azimuth and the third azimuth.
4. Outputting the direction recognition result
After the azimuth recognition result is obtained, the azimuth recognition result can be displayed or output in various modes.
Optionally, the azimuth recognition result may be displayed on an azimuth information display interface. As shown in fig. 4C (a), the azimuth information presentation interface may be a virtual compass interface on the electronic device, on which both the north and south directions may be displayed.
Optionally, the location identification results may be provided to the associated application through an application programming interface (application programming interface, API). The associated application can be map APP, navigation APP and other applications. For example, as shown in fig. 4D, the map APP may call the direction recognition result through the API to correct the display information. The direction of the arrow indicates the direction faced by the user, and before invoking the direction recognition result, the direction faced by the user displayed by the map APP is shown as (a) in fig. 4D; after the direction recognition result is called, the direction faced by the user displayed by the map APP is shown in (b) of fig. 4D. Therefore, the API provides the direction recognition result for the map APP, so that the accuracy of guiding the direction or navigation for the user in the map APP can be improved.
Alternatively, the azimuth recognition result may be transmitted to an associated device through a wireless transmission module. The wireless transmission module may support wireless fidelity (Wi-Fi), Bluetooth (BT), and the like. The associated device may be a terminal device such as a smart phone or a smart watch, and can use the azimuth recognition result to implement related functions such as navigation and positioning. For example, the electronic device may share the determined azimuth recognition result via Bluetooth with an associated device that lacks azimuth recognition capability (e.g., does not support a compass function), so that the associated device can achieve azimuth recognition through the electronic device.
In one possible implementation, after the azimuth recognition result is determined, an actual traveling azimuth and a set traveling azimuth determined based on the azimuth recognition result may be obtained; error information between the actual traveling azimuth and the set traveling azimuth is determined; and deviation-correcting information is displayed on the azimuth information display interface according to the error information, where the deviation-correcting information is used to correct the actual traveling azimuth. The actual traveling azimuth is the actual movement azimuth of the user; the set traveling azimuth may be a desired azimuth input to the electronic device by the user, such as straight ahead. Illustratively, as shown in (b) of fig. 4C, the current display time is 16:00, and the dashed arrow indicates the first azimuth. The set traveling azimuth is straight ahead (indicated by solid arrow ①), and the electronic device recognizes that the actual traveling azimuth is 45° southeast (indicated by solid arrow ②). Based on this, the electronic device may generate deviation-correcting information on the azimuth information display interface, such as the text prompt "please walk left", or graphic prompt information such as arrow ③. The user can correct the traveling direction according to the deviation-correcting information; after one or more corrections, the user is helped to advance toward the desired azimuth.
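The correction step can be sketched as follows; the function name, the 5° tolerance, and the prompt strings are illustrative assumptions, not values from the patent:

```python
def correction_hint(set_bearing: float, actual_bearing: float) -> str:
    """Turn the error between set and actual travel bearings into a prompt."""
    # Signed error in [-180, 180): positive means the user drifted clockwise.
    error = (actual_bearing - set_bearing + 180.0) % 360.0 - 180.0
    if abs(error) < 5.0:  # assumed tolerance: close enough to on-course
        return "on course"
    return "please walk left" if error > 0 else "please walk right"

# Set bearing straight ahead (0 deg), actual travel drifted 45 deg clockwise
# (as in the Fig. 4C example): steer back to the left.
print(correction_hint(0.0, 45.0))  # please walk left
```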
In summary, the embodiment of the present application provides two modes of azimuth recognition, namely the solar mode and the geomagnetic mode, through which the azimuth recognition result can be obtained, so that azimuth can be recognized intelligently and accurately. In addition, the embodiment of the present application provides multiple ways of outputting the azimuth recognition result, realizing azimuth sharing across multiple applications and multiple devices.
The foregoing briefly introduced the azimuth recognition method provided by the embodiment of the present application. Taking the electronic device as a smart watch as an example, the specific implementation of azimuth recognition through the solar mode is described in detail below.
Fig. 5 is a schematic flow chart of a direction recognition method according to an embodiment of the present application, which is applicable to direction recognition in a scene where a magnetic field is disturbed or unavailable. The azimuth recognition method includes, but is not limited to, the following steps:
S501: and controlling the dial on the electronic equipment to rotate based on the first reference azimuth, wherein the azimuth pointed by the reference ray in the rotated dial is the first reference azimuth.
The relative positions of the hour hand, the minute hand, and each scale mark on the dial remain unchanged during the rotation. The end point of the reference ray is the origin of the dial, which is the intersection point of the hour hand and the minute hand. The first reference azimuth is the azimuth where the reference azimuth mark is located. The reference azimuth mark may be a celestial body such as the sun, the moon, or Polaris, or may be a reference object such as a building. In the present application, the sun is taken as an example of the reference azimuth mark; in this case, the first reference azimuth is the azimuth of the sun.
The electronic device includes a dial, which may be the dial of a virtual timepiece displayed on the electronic device. Time is displayed by the hour hand, the minute hand, and the scale marks on the dial. The electronic device may determine whether the current time is in the morning, at noon, or in the afternoon, so as to determine the current actual time, i.e., the current display time. For example, when the hour hand points to the 5 scale mark and the minute hand points to the 6 scale mark, the time may be 5:30 or 17:30; if it is afternoon, the electronic device may determine that the current display time is 17:30.
While the dial rotates, the current display time on the dial is the local time of the time zone where the electronic device is located, i.e., the scale marks respectively indicated by the hour hand and the minute hand represent the local time of that time zone. If the time displayed on the dial is not the local time, the electronic device may determine its time zone from its latitude and longitude information, automatically calculate the time difference, and adjust the directions of the hour hand and the minute hand so that the current display time is the local time of the electronic device. Assume that region A uses time A, i.e., the time of the east-eighth zone, and that the electronic device is located in city B within region A, where city B lies in the east-sixth zone. Illustratively, taking the 120° E meridian of the east-eighth zone as the reference, for every 15° of longitude eastward (one time zone per 15°), 1 hour is added to the current time A; for every 15° westward, 1 hour is subtracted, to obtain the current local time. For example, if the longitude of city B is 87°40′, the time difference between the local time of city B and time A is calculated to be 2 hours and 9 minutes. If the current time A is 12:09, the local time of city B is 10:00. In this case, the current display time of the dial is 10:00: the hour hand points to the 10 scale mark and the minute hand points to the 12 scale mark.
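The time-difference calculation in the worked example above can be sketched as follows (the function name and the minutes-since-midnight convention are assumptions; the 120° E reference meridian and the 15°-per-hour rate are from the text):

```python
def local_time_minutes(zone_minutes: float, longitude_deg: float,
                       zone_meridian_deg: float = 120.0) -> float:
    """Local mean solar time in minutes since midnight, from zone time:
    each degree of longitude is 4 minutes (15 degrees per hour)."""
    offset = (longitude_deg - zone_meridian_deg) * 4.0
    return (zone_minutes + offset) % (24 * 60)

# City B at 87 deg 40' E with zone time 12:09 -> local time 10:00,
# a difference of about 2 hours 9 minutes, matching the worked example.
t = round(local_time_minutes(12 * 60 + 9, 87 + 40 / 60))
print(f"{t // 60:02d}:{t % 60:02d}")  # 10:00
```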
The reference ray in the dial plate can be the ray in which the hour hand is located or the ray in which the preset scale mark is located after the dial plate rotates. The preset graduation marks can be 12 graduation marks in the dial plate. The end points of the rays where the hour hand is located and the end points of the rays where the preset scale marks are located are all the origins of the dial plate. For example, as shown in fig. 6A, the current display time of the dial is 3 pm, the hour hand of the dial of the smart watch points to the 3 scale mark, and the minute hand points to the 12 scale mark. The dial is controlled to rotate so that the ray of the hour hand is directed to the first reference direction (as shown in (a) of fig. 6A), or so that the ray of the preset graduation mark is directed to the first reference direction (as shown in (b) of fig. 6A).
In one possible implementation, the reference ray may be determined from latitude and longitude information of the electronic device. Optionally, in the case that the latitude and longitude information of the electronic device belongs to the northern hemisphere, the reference ray is a ray in which the hour hand is located; under the condition that longitude and latitude information of the electronic equipment belongs to the southern hemisphere, the reference ray is the ray where the preset scale line is located. The longitude and latitude information is used for describing that the electronic equipment is located in the southern hemisphere or the northern hemisphere. Alternatively, the latitude and longitude information may be determined by a GPS in the electronic device, or may be determined by information such as a name of a place where the user inputs, or may be directly input by the user that the place where the user inputs is in the southern hemisphere or the northern hemisphere.
In one possible implementation, the manner of determining the first reference position may be: acquiring an image containing a reference azimuth mark, and determining a second reference azimuth according to the image; and correcting the second reference azimuth according to the read attitude sensor information to obtain a first reference azimuth. The second reference azimuth is the azimuth of the reference azimuth mark. Typically, the second reference position is less accurate than the first reference position. Optionally, the second reference azimuth is a rough azimuth in which the reference azimuth mark is located; and the first reference azimuth obtained after correcting the second reference azimuth through the attitude sensor information is the accurate azimuth of the reference azimuth mark. Alternatively, the second reference position may be taken as the first reference position.
In one possible implementation, an image containing the reference azimuth mark can be acquired through a shooting device, and the position information of the reference azimuth mark can be determined by processing the image; calling a light-sensitive sensor to sense the reference azimuth mark to obtain a sensing result; the second reference position can be determined based on the position information of the reference position mark and the sensing result. Based on the read attitude sensor information, the second reference orientation may be corrected to determine the first orientation.
In one possible implementation manner, the manner of determining the second reference azimuth may specifically be: acquiring an image containing a reference azimuth mark through a shooting device; performing image processing on the image, and determining the position information of the reference azimuth mark; calling a light-sensitive sensor to sense the reference azimuth mark to obtain a sensing result; and determining a second reference azimuth according to the position information and the sensing result.
Alternatively, the shooting device may be a camera in the electronic device. Taking the sun as the reference azimuth mark, for an acquired image containing the sun, the electronic device may perform image processing on the image; optionally, the system on chip SOC may be invoked for this. For example, binarization processing is performed on the image to obtain a corresponding binarized image, from which the position information of the solar light source in the image can be determined. Optionally, if an interfering light source exists, such as an artificial light source like an LED lamp, the interfering light source may be filtered out based on the difference between the solar spectrum and the spectra of other light sources. For example, referring to fig. 6B, an image captured by the camera may be as shown in (a) of fig. 6B; after binarization processing, three light sources as shown in (b) of fig. 6B can be obtained; the interfering light sources are filtered out based on the solar spectrum, and if part of the photographed sun is blocked, image inpainting may be performed through AI image processing, yielding the binarized image shown in (c) of fig. 6B; based on the binarized image, the rough position of the sun relative to the electronic device can be determined, as shown in (d) of fig. 6B.
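A minimal sketch of the binarization step, assuming a grayscale frame as a NumPy array (the threshold value and function name are illustrative; spectral filtering and inpainting are omitted):

```python
import numpy as np

def sun_centroid(gray: np.ndarray, thresh: int = 200):
    """Binarize a grayscale frame and return the (row, col) centroid of
    the bright pixels as the rough position of the sun, or None."""
    mask = gray >= thresh              # binarized image: bright -> True
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:                   # no bright light source found
        return None
    return float(ys.mean()), float(xs.mean())

frame = np.zeros((8, 8), dtype=np.uint8)
frame[2:4, 5:7] = 255                  # a small bright blob standing in for the sun
print(sun_centroid(frame))  # (2.5, 5.5)
```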
The shooting device shoots the reference azimuth mark in real time as the electronic device moves, and the detection is updated in real time based on the acquired images, so as to ensure that the first reference azimuth is determined based on the current position of the electronic device and thereby ensure its accuracy. Optionally, before the camera is invoked for shooting, the camera use permission may be obtained from the user; while images are being collected, prompt information such as "camera in use" may be displayed on the electronic device to protect the user's privacy.
Alternatively, the light-sensitive sensor is a sensitive device in the electronic apparatus that responds to or converts external light signals or optical radiation. Based on its photosensitive characteristic, the azimuth of the sun can be judged from the illumination intensity, that is, the azimuth of the strongest illumination sensed by the light-sensitive sensor is determined as the second reference azimuth. For example, as shown in fig. 6C, the light-sensitive sensor may be located on the outer ring of the smart watch dial, represented as a circular ring. The sensed illumination intensity increases gradually from the side of the ring facing away from the sun to the side facing the sun; in fig. 6C, the stronger the sunlight, the lighter the color of the ring. Based on this, as shown by the broken line in fig. 6C, the azimuth indicated by the ray passing through the lightest-colored portion of the ring can be determined as the second reference azimuth.
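The ring-sensor reading can be sketched as an argmax over per-segment light readings (the segment layout and names below are assumptions for illustration):

```python
def strongest_light_bearing(readings: list[float]) -> float:
    """Bearing (degrees clockwise from the dial's 12 mark) of the ring
    segment with the strongest illumination; segment i of n sits at
    i * 360 / n degrees."""
    n = len(readings)
    brightest = max(range(n), key=lambda i: readings[i])
    return brightest * 360.0 / n

# 8 segments around the dial; segment 2 (the 3 o'clock side, 90 deg) is
# brightest, so the second reference azimuth points that way.
print(strongest_light_bearing([1.0, 3.0, 9.0, 4.0, 1.0, 0.5, 0.4, 0.8]))  # 90.0
```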
Alternatively, the read attitude sensor information may include attitude information such as the pitch angle, yaw angle, and roll angle of the electronic device. With the read attitude sensor information, the azimuth of the reference azimuth mark determined from the image can be corrected, making it more accurate.
The attitude sensor may include a gyroscope sensor, an acceleration sensor, and the like, through which the attitude information of the electronic device can be acquired. The attitude information can be determined by attitude angles such as the pitch angle, yaw angle, and roll angle, which generally refer to the angles of rotation of the electronic device about the three axes of the ground coordinate system.
In one possible implementation, the pitch angle may be the angle between the Y axis of the electronic device's coordinate system and the local horizontal plane; the yaw angle may be the angle between the projection of that Y axis on the local horizontal plane and the Yg axis of the ground coordinate system; and the roll angle may be the angle between the XY plane of the electronic device's coordinate system and the Zg axis of the ground coordinate system. In the embodiment of the present application, the electronic device may determine its three attitude angles based on the angular velocities about the three axes of the ground coordinate system acquired by the gyroscope sensor, so as to determine its current attitude. Optionally, the attitude information may also be determined in combination with the attitude recognized by the acceleration sensor, which can detect the acceleration of the electronic device in various directions (e.g., along the three axes of the ground coordinate system).
Illustratively, the three-axis (Xg, Yg, and Zg) coordinate system shown in FIG. 6D is the ground coordinate system in an embodiment of the present application. The Xg axis points east along the local latitude line, the Yg axis points north along the local meridian line, and the Zg axis points upward along the local vertical, forming a right-handed rectangular coordinate system with the Xg and Yg axes. The plane formed by the Xg and Yg axes is the local horizontal plane, and the plane formed by the Yg and Zg axes is the local meridian plane.
The embodiment of the present application is not limited to a gyroscope sensor or an acceleration sensor; the attitude information of the electronic device may also be determined through other hardware, which is not limited herein.
S502: determining a target included angle based on the rotated dial; the target included angle is the included angle between the ray of the hour hand in the dial after rotation and the ray of the preset scale mark.
The end points of the rays of the dial after rotation, where the hour hand is located, and the rays of the preset scale marks are located, are the origins of the dial.
In one possible implementation, a first included angle and a second included angle may be obtained based on the rotated dial. The first included angle is the angle measured from the ray where the hour hand is located in the rotated dial to the ray where the preset scale mark is located; the second included angle is the angle measured from the ray where the preset scale mark is located to the ray where the hour hand is located in the rotated dial. If the time currently displayed on the dial is earlier than 12 noon, the target included angle is determined to be the first included angle. If the time currently displayed on the dial is later than 12 noon, the target included angle is determined to be the second included angle. If the time currently displayed on the dial is exactly 12 noon, the target included angle may be either the first or the second included angle, both of which are zero.
For example, assume the latitude and longitude information of the electronic device belongs to the northern hemisphere, so that the reference ray is the ray where the hour hand is located. As shown in FIG. 6E, the preset scale mark is the 12 scale mark. When the hour hand of the rotated dial points to the 6 scale mark and the minute hand points to the 12 scale mark, the time is 6:00 or 18:00. In this case, pointing the ray where the hour hand is located toward the first reference azimuth yields two included angles: an included angle A from the ray where the hour hand is located to the ray where the 12 scale mark is located, and an included angle B from the ray where the 12 scale mark is located to the ray where the hour hand is located. If the time currently displayed on the dial is 6:00, i.e., earlier than 12 noon, the target included angle is angle A. If the time currently displayed on the dial is 18:00, i.e., later than 12 noon, the target included angle is angle B.
S503: determining a first azimuth according to the target included angle, where the azimuth pointed to by the angular bisector of the target included angle is the first azimuth.
Optionally, during movement of the electronic device, the dial may be rotated in real time so that the reference ray always points to the first reference azimuth, thereby improving the accuracy of azimuth identification.
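Steps S501 to S503 together amount to the classic watch-compass rule. The sketch below is a hypothetical illustration for the northern hemisphere case described above (reference ray = hour-hand ray, preset scale mark = the 12 mark); it computes the signed offset from the sun direction (hour-hand ray) to the angular bisector, i.e., to due south, with positive values meaning clockwise.

```python
def south_offset_from_sun(hour: int, minute: int) -> float:
    """Clockwise angle (degrees) from the sun direction (the hour-hand ray)
    to due south, northern hemisphere; negative values are counter-clockwise.
    Illustrative only; conventions are assumptions, not the embodiment's code."""
    # Hour-hand position, measured clockwise from the 12 scale mark:
    # 30 degrees per hour plus 0.5 degrees per minute.
    hour_angle = (hour % 12) * 30.0 + minute * 0.5
    if hour < 12:
        # Before noon: target angle A runs clockwise from the hour-hand ray
        # to the 12-mark ray; its bisector lies clockwise from the hour hand.
        return (360.0 - hour_angle) % 360.0 / 2.0
    # After noon: target angle B runs clockwise from the 12-mark ray to the
    # hour-hand ray; its bisector lies counter-clockwise from the hour hand.
    return -hour_angle / 2.0
```

At 6:00 the offset is +90 degrees (sun in the east, south 90 degrees clockwise), at 18:00 it is -90 degrees, and at exactly 12 noon the target included angle and hence the offset are zero, matching the example around FIG. 6E.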
In one possible implementation, after the azimuth identification result is obtained, it may be displayed or output in various manners. Optionally, the azimuth identification result may be displayed on an azimuth information display interface. As shown in (a) of FIG. 4C, the azimuth information display interface may be a virtual compass interface on the electronic device, on which both due north and due south may be displayed. Optionally, the electronic device may include a first camera and a second camera. The first camera is used to photograph the reference azimuth mark for azimuth identification. The second camera is used to implement augmented reality (AR) live-view navigation: for example, while the user is moving, live-view images captured by the second camera serve as the background of the azimuth information display interface shown in (b) of FIG. 4C, making it convenient for the user to view the azimuth identification result against the live view.
For other manners of displaying or outputting the azimuth identification result, reference may be made to the description of step 4 shown in FIG. 3, which is not repeated here.
In this way, the first reference azimuth can be determined through the reference azimuth mark, and based on it the dial on the electronic device can be controlled to rotate so that the reference ray on the dial points to the first reference azimuth, from which the target included angle and its angular bisector are obtained. The first azimuth pointed to by the angular bisector is due south. Because the electronic device performs these steps automatically, errors from manual operations such as rotating the dial and aligning it with the reference azimuth mark can be avoided. Further, the azimuth identification result may be determined according to the first azimuth. Implementing this embodiment of the present application can therefore improve the accuracy of azimuth identification. Meanwhile, the azimuth identification result can be displayed on the azimuth information display interface, shared with associated applications in the electronic device through an API, or shared with associated devices through wireless transmission, which effectively expands the usage scenarios of azimuth identification. Since this embodiment does not use information related to the earth's magnetic field, the azimuth identification result can be determined intelligently and accurately even when the geomagnetic field is disturbed or unavailable.
FIG. 7 is a flowchart of another azimuth identification method according to an embodiment of the present application. As shown in FIG. 7, the method includes, but is not limited to, the following steps:
s701: and receiving a mode selection result at a mode selection interface.
S702: and under the condition that the mode selection interface receives a selection instruction for the sun mode, controlling the dial plate on the electronic equipment to rotate, wherein the direction pointed by the reference ray in the dial plate after the rotation is the first reference direction.
Based on the same inventive concept, for the specific implementation of step S702, reference may be made to the specific description of step S501 in the embodiment shown in FIG. 5, which is not repeated here.
S703: determining a target included angle based on the rotated dial, and determining the azimuth pointed by an angular bisector of the target included angle as a first azimuth; and determining an azimuth recognition result according to the first azimuth.
Based on the same inventive concept, for the specific implementation of step S703, reference may be made to steps S502 to S503 in the embodiment shown in FIG. 5, which is not repeated here.
S704: acquiring calibration geomagnetic information under the condition that a mode selection interface receives a selection instruction aiming at a geomagnetic mode; and determining a second azimuth according to the calibrated geomagnetic information.
The calibrated geomagnetic information is geomagnetic information after calibration; the geomagnetic information can be obtained through a magnetic sensor.
S705: in a case where the calibrated geomagnetic information satisfies an accuracy condition, determining an azimuth identification result according to the second azimuth, where the azimuth identification result is used to describe azimuth information of the electronic device.
For an example of the azimuth identification result, reference may be made to (a) of FIG. 4C.
In one possible implementation, whether the calibrated geomagnetic information satisfies the accuracy condition may be determined in the following ways:
Mode 1: and performing data fitting on geomagnetic data in the calibrated geomagnetic information through a circle fitting algorithm (such as a least square method circle fitting algorithm), and if the data fitting result indicates that the geomagnetic data cannot be fitted into a function representing a sphere, determining that the calibrated geomagnetic information does not meet the accuracy condition. If the data fitting result indicates that geomagnetic data can be fitted into a function representing a sphere, it can be determined that the calibrated geomagnetic information meets the accuracy condition.
Mode 2: judging whether an interference source exists near the electronic equipment according to the variation amplitude of the calibrated geomagnetic information, and if the interference source exists, determining that the calibrated geomagnetic information does not meet the accuracy condition; and if no interference source exists or the influence of the interference source is small, determining that the calibrated geomagnetic information meets the precision condition. For example, if geomagnetic data in the calibrated geomagnetic information changes substantially (e.g., the geomagnetic data jumps) within an extremely short time, it may indicate that an interference source exists near the electronic device.
S706: in a case where the calibrated geomagnetic information does not satisfy the accuracy condition, determining the azimuth identification result according to the first azimuth and the second azimuth.
That is, the azimuth identification result is determined according to the second azimuth together with the first azimuth obtained through the sun mode. The electronic device can determine the confidence of the first azimuth (denoted n) and the confidence of the second azimuth (denoted m), and fuse the two azimuths through these confidences to determine due north and due south; based on due north and due south, the azimuth identification result can be determined. Illustratively, an azimuth vector $\vec{a}$ corresponding to the first azimuth and an azimuth vector $\vec{b}$ corresponding to the second azimuth may be determined. The azimuth vector corresponding to the first azimuth may be represented as a directed line segment starting from the origin of the dial and ending at the edge of the dial along the first azimuth; similarly, the azimuth vector corresponding to the second azimuth may be represented as a directed line segment starting from the origin of the dial and ending at the edge of the dial along the second azimuth. On this basis, the azimuth vector $\vec{c}$ corresponding to due south can be determined by the following formula:
$$\vec{c} = n\vec{a} + m\vec{b}$$
where the direction of $\vec{c}$ is due south. Illustratively, as shown in FIG. 8, ① represents the azimuth vector corresponding to the first azimuth and ② represents the azimuth vector corresponding to the second azimuth. If the confidence of the first azimuth and the confidence of the second azimuth are both 0.5, it follows from the above formula that the azimuth vector corresponding to due south can be represented by ③ in FIG. 8, i.e., the azimuth indicated by the angular bisector ④ of ① and ② is due south. The azimuth identification result is then determined according to the due south so determined.
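Assuming the fusion is the confidence-weighted vector sum described above, it can be sketched for an arbitrary number of azimuth estimates by converting each bearing to a unit vector, scaling it by its confidence, summing, and reading off the direction of the sum. This is an illustrative sketch, not the embodiment's code; the function name and the bearing-in-degrees convention are assumptions.

```python
import math

def fuse_azimuths(bearings_conf):
    """Confidence-weighted fusion of azimuth estimates.

    bearings_conf: list of (bearing_deg, confidence) pairs. Each bearing is
    turned into a unit vector on the dial plane, scaled by its confidence,
    and summed; the direction of the resulting vector is the fused bearing."""
    x = sum(c * math.cos(math.radians(b)) for b, c in bearings_conf)
    y = sum(c * math.sin(math.radians(b)) for b, c in bearings_conf)
    return math.degrees(math.atan2(y, x)) % 360.0
```

With two bearings of 170 and 190 degrees at equal confidence 0.5, the fused result is 180 degrees, the angular bisector, matching the FIG. 8 example; the same function also covers fusing three or more azimuths.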
In one possible implementation, the azimuth identification result may also be determined according to the second azimuth, the first azimuth determined through the sun mode, and a third azimuth obtained from an auxiliary device. Specifically, the electronic device may determine the confidence of the first azimuth, the confidence of the second azimuth, and the confidence of the third azimuth, and calculate the azimuth corresponding to due south based on the principle of the above formula; the azimuth identification result is then determined according to that due south.
In the case where the calibrated geomagnetic information does not satisfy the accuracy condition, either of steps S706 and S707 may be selected and executed.
S707: in a case where the calibrated geomagnetic information does not satisfy the accuracy condition, acquiring a third azimuth from an auxiliary device, and determining the azimuth identification result according to the second azimuth and the third azimuth.
The third azimuth is used to describe due south as obtained from the auxiliary device and, further, due north can be described from it. The number of auxiliary devices may be one or more, one third azimuth may be obtained from each auxiliary device, and thus the number of third azimuths may also be one or more. The auxiliary device may determine the third azimuth through the sun mode or the geomagnetic mode, or through other means (e.g., GPS), which is not limited in the present application.
Illustratively, taking the number of third azimuths as 1 as an example, the electronic device may determine the confidence of the second azimuth (denoted m) and the confidence of the third azimuth (denoted p). The confidences may be obtained through laboratory data or in other manners, which is not limited in the present application. The azimuth vector corresponding to the second azimuth may be represented as a directed line segment starting from the origin of the dial and ending at the edge of the dial along the second azimuth; similarly, the azimuth vector corresponding to the third azimuth may be represented as a directed line segment starting from the origin of the dial and ending at the edge of the dial along the third azimuth. According to the azimuth vector $\vec{b}$ corresponding to the second azimuth and the azimuth vector $\vec{d}$ corresponding to the third azimuth, the azimuth vector $\vec{c}$ corresponding to due south can be determined by the following formula:
$$\vec{c} = m\vec{b} + p\vec{d}$$
where the direction of $\vec{c}$ is due south. The azimuth identification result is then determined according to the due south so determined.
In one possible implementation, after the azimuth identification result is obtained, it may be displayed or output in various manners; for details, reference may be made to the description of step 4 shown in FIG. 3, which is not repeated here.
In this way, the embodiment of the present application can provide two modes for azimuth identification: a sun mode and a geomagnetic mode. When the calibrated geomagnetic information satisfies the accuracy condition, an accurate azimuth identification result can be determined through either the sun mode or the geomagnetic mode. When the earth's magnetic field is disturbed or unavailable, azimuth identification can, on the one hand, be achieved through the sun mode; on the other hand, the azimuth identification result can be determined by combining the second azimuth with the first azimuth obtained through the sun mode; and in yet another aspect, the azimuth identification result can be determined by a fusion calculation based on the second azimuth and a third azimuth obtained from an auxiliary device. Implementing this embodiment can therefore determine an accurate azimuth by fusing multiple sources of information, avoid the influence of disturbances of the earth's magnetic field on azimuth identification, and improve its accuracy. In addition, the embodiment can provide multiple output manners for the azimuth identification result, enabling the azimuth to be shared among multiple applications and multiple devices.
Referring to FIG. 9, FIG. 9 is a schematic structural diagram of an azimuth identification apparatus according to an embodiment of the present application. The azimuth identification apparatus includes:
a processing unit 910, configured to: control, based on a first reference azimuth, rotation of a dial on the electronic device, where the azimuth pointed to by a reference ray in the rotated dial is the first reference azimuth, and the relative positions of the hour hand, the minute hand, and each scale mark on the dial with respect to the dial remain unchanged during rotation; determine a target included angle based on the rotated dial, the target included angle being the angle between the ray where the hour hand is located in the rotated dial and the ray where a preset scale mark is located, where the end points of both rays are the origin of the dial; and determine a first azimuth according to the target included angle, where the azimuth pointed to by the angular bisector of the target included angle is the first azimuth.
In a possible implementation, the processing unit 910 is further configured to: obtain a first included angle and a second included angle based on the rotated dial, where the first included angle is the angle measured from the ray where the hour hand is located in the rotated dial to the ray where the preset scale mark is located, and the second included angle is the angle measured from the ray where the preset scale mark is located to the ray where the hour hand is located in the rotated dial; determine the target included angle to be the first included angle or the second included angle if the time currently displayed on the dial is 12 noon; determine the target included angle to be the first included angle if the time currently displayed on the dial is earlier than 12 noon; and determine the target included angle to be the second included angle if the time currently displayed on the dial is later than 12 noon.
In one possible implementation, the processing unit 910 is further configured to determine the first reference azimuth.
In a possible implementation, the azimuth identification apparatus further includes an acquisition unit 920, configured to acquire an image containing the reference azimuth mark. The processing unit 910 is configured to determine a second reference azimuth according to the image, and correct the second reference azimuth according to the read attitude sensor information to obtain the first reference azimuth.
In a possible implementation, the acquisition unit 920 is further configured to acquire, through a photographing apparatus, an image containing the reference azimuth mark. The processing unit 910 is configured to: perform image processing on the image and determine position information of the reference azimuth mark; invoke a light-sensitive sensor to sense the reference azimuth mark to obtain a sensing result; and determine the second reference azimuth according to the position information and the sensing result.
In a possible implementation, the acquisition unit 920 is further configured to receive, at a mode selection interface, a selection instruction for the sun mode. The processing unit 910 is further configured to control, based on the first reference azimuth, the dial on the electronic device to rotate in a case where the mode selection interface receives the selection instruction for the sun mode.
In a possible implementation, the acquisition unit 920 is further configured to receive, at the mode selection interface, a selection instruction for the geomagnetic mode. The processing unit 910 is further configured to acquire calibrated geomagnetic information in a case where the mode selection interface receives the selection instruction for the geomagnetic mode, and determine a second azimuth according to the calibrated geomagnetic information.
In a possible implementation, the processing unit 910 is further configured to determine, in a case where the calibrated geomagnetic information satisfies the accuracy condition, an azimuth identification result according to the second azimuth, where the azimuth identification result is used to describe azimuth information of the electronic device.
In a possible implementation, the processing unit 910 is further configured to determine, in a case where the calibrated geomagnetic information does not satisfy the accuracy condition, the azimuth identification result according to the first azimuth and the second azimuth, where the azimuth identification result is used to describe azimuth information of the electronic device.
In a possible implementation, the processing unit 910 is further configured to acquire a third azimuth from an auxiliary device in a case where the calibrated geomagnetic information does not satisfy the accuracy condition, and determine the azimuth identification result according to the second azimuth and the third azimuth, where the azimuth identification result is used to describe azimuth information of the electronic device.
In a possible implementation, the processing unit 910 is further configured to determine the azimuth identification result according to the first azimuth, where the azimuth identification result is used to describe azimuth information of the electronic device.
In a possible implementation, the azimuth identification apparatus further includes an output unit 930, configured to: display the azimuth identification result on the azimuth information display interface; provide the azimuth identification result to an associated application through an application programming interface; or transmit the azimuth identification result to an associated device through a wireless transmission module.
In one possible implementation, the acquisition unit 920 is further configured to acquire an actual traveling azimuth and a set traveling azimuth determined based on the azimuth identification result. The processing unit 910 is further configured to determine error information between the actual traveling azimuth and the set traveling azimuth. The output unit 930 is further configured to display deviation-correcting information on the azimuth information display interface according to the error information, the deviation-correcting information being used to correct the actual traveling azimuth.
In one possible implementation, in the case where the latitude and longitude information of the electronic device belongs to the northern hemisphere, the reference ray is the ray where the hour hand is located; or, in the case where the latitude and longitude information of the electronic device belongs to the southern hemisphere, the reference ray is the ray where the preset scale mark is located.
According to an embodiment of the present application, the units in the azimuth identification apparatus shown in FIG. 9 may be separately or completely combined into one or several units, or some unit(s) thereof may be further split into a plurality of sub-units with smaller functions, and the same operations can still be implemented without affecting the technical effects of the embodiment of the present application. The above units are divided based on logical functions; in practical applications, the function of one unit may be implemented by a plurality of units, or the functions of a plurality of units may be implemented by one unit. In other embodiments of the present application, the azimuth identification apparatus may also include other units, and in practice these functions may also be implemented with the assistance of other units or by the cooperation of multiple units.
It may be understood that the functions of each functional unit of the azimuth identifying device described in the embodiments of the present application may be specifically implemented according to the method in the above method embodiments, and the specific implementation process may refer to the relevant description of the above method embodiments, which is not repeated herein.
In this way, the embodiment of the present application can provide two modes for azimuth identification: a sun mode and a geomagnetic mode. When the calibrated geomagnetic information satisfies the accuracy condition, an accurate azimuth identification result can be determined through either the sun mode or the geomagnetic mode. When the earth's magnetic field is disturbed or unavailable, azimuth identification can, on the one hand, be achieved through the sun mode; on the other hand, the azimuth identification result can be determined by combining the second azimuth with the first azimuth obtained through the sun mode; and in yet another aspect, the azimuth identification result can be determined by a fusion calculation based on the second azimuth and a third azimuth obtained from an auxiliary device. Implementing this embodiment can therefore determine an accurate azimuth by fusing multiple sources of information, avoid the influence of disturbances of the earth's magnetic field on azimuth identification, and improve its accuracy. In addition, the embodiment can provide multiple output manners for the azimuth identification result, enabling the azimuth to be shared among multiple applications and multiple devices.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present application is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present application.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, such as the above-described division of units, merely a division of logic functions, and there may be additional manners of dividing in actual implementation, such as multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, or may be in electrical or other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on this understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server or a network device, etc., in particular may be a processor in the computer device) to perform all or part of the steps of the above-mentioned method according to the embodiments of the present application. Wherein the aforementioned storage medium may comprise: various media capable of storing program codes, such as a U disk, a removable hard disk, a magnetic disk, a compact disk, a read-only memory (ROM), or a random access memory (random access memory, RAM), are provided.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (18)

1. A method of orientation identification, the method comprising:
based on a first reference azimuth, controlling a dial on an electronic device to rotate, wherein the azimuth pointed to by a reference ray in the rotated dial is the first reference azimuth; the relative positions of the hour hand, the minute hand, and each scale mark on the dial with respect to the dial remain unchanged during rotation; and the end point of the reference ray is the origin of the dial;
determining a target included angle based on the rotated dial; the target included angle is the angle between the ray where the hour hand is located in the rotated dial and the ray where a preset scale mark is located; the end points of the ray where the hour hand is located and the ray where the preset scale mark is located in the rotated dial are both the origin of the dial;
and determining a first azimuth according to the target included angle, wherein the azimuth pointed to by the angular bisector of the target included angle is the first azimuth.
2. The method of claim 1, wherein the determining a target included angle based on the rotated dial comprises:
obtaining a first included angle and a second included angle based on the rotated dial; the first included angle is the angle measured from the ray where the hour hand is located in the rotated dial to the ray where the preset scale mark is located; the second included angle is the angle measured from the ray where the preset scale mark is located to the ray where the hour hand is located in the rotated dial;
If the current display time of the dial is 12 pm, determining that the target included angle is the first included angle or the second included angle;
if the current display time of the dial is earlier than 12 pm, determining a target included angle as the first included angle;
And if the current display time of the dial is later than 12 pm, determining the target included angle as the second included angle.
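As a concrete illustration of claims 1 and 2, the classic watch-compass computation can be sketched as follows. The assumptions here are not stated in the claims: the dial shows local solar time, the dial has been rotated so the hour hand points at the sun, "from ray A to ray B" means a clockwise sweep, and in the northern hemisphere the bisector marks due south; all function and variable names are illustrative.

```python
from datetime import datetime

def south_azimuth_deg(sun_azimuth_deg: float, t: datetime) -> float:
    """Azimuth pointed to by the angular bisector of the target included angle.

    Assumes the hour hand has been pointed at the sun (the dial is already
    rotated) and that t is local solar time.
    """
    # Clockwise angle of the hour hand measured from the 12 o'clock mark:
    # 30 degrees per hour plus 0.5 degrees per minute.
    h = ((t.hour % 12) + t.minute / 60.0) * 30.0

    if t.hour < 12:
        # Before noon: bisect the angle swept clockwise from the hour hand
        # to the 12 o'clock mark (the "first included angle" of claim 2).
        return (sun_azimuth_deg + (360.0 - h) / 2.0) % 360.0
    # Noon or later: bisect the angle swept clockwise from the mark to the
    # hour hand (the "second included angle" of claim 2).
    return (sun_azimuth_deg - h / 2.0) % 360.0

# At 14:00 solar time the sun sits about 30 degrees west of due south
# (azimuth 210); halving the 60-degree included angle recovers south.
print(south_azimuth_deg(210.0, datetime(2022, 6, 1, 14, 0)))  # 180.0
```

Note how the morning and afternoon branches bisect oppositely directed angles, which is exactly why claim 2 distinguishes the first and second included angles around noon.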
3. The method of claim 1 or 2, wherein the method further comprises:
determining the first reference azimuth.
4. The method of claim 3, wherein said determining the first reference azimuth comprises:
acquiring an image containing a reference azimuth mark, and determining a second reference azimuth according to the image;
and correcting the second reference azimuth according to read attitude-sensor information to obtain the first reference azimuth.
5. The method of claim 4, wherein said acquiring an image containing a reference azimuth mark and determining a second reference azimuth according to the image comprises:
acquiring an image containing the reference azimuth mark through a camera;
performing image processing on the image to determine position information of the reference azimuth mark;
invoking a light sensor to sense the reference azimuth mark and obtain a sensing result;
and determining the second reference azimuth according to the position information and the sensing result.
6. The method of any one of claims 1 to 5, wherein said controlling, based on the first reference azimuth, a dial on the electronic device to rotate comprises:
controlling the dial on the electronic device to rotate based on the first reference azimuth when a mode selection interface receives a selection instruction for a sun mode.
7. The method of any one of claims 1 to 5, further comprising:
acquiring calibrated geomagnetic information when a mode selection interface receives a selection instruction for a geomagnetic mode;
and determining a second azimuth according to the calibrated geomagnetic information.
8. The method of claim 7, wherein the method further comprises:
if the calibrated geomagnetic information meets an accuracy condition, determining an azimuth recognition result according to the second azimuth, wherein the azimuth recognition result is used for describing azimuth information of the electronic device.
9. The method of claim 7, wherein the method further comprises:
determining an azimuth recognition result according to the first azimuth and the second azimuth, wherein the azimuth recognition result is used for describing azimuth information of the electronic device.
10. The method of claim 7, wherein the method further comprises:
if the calibrated geomagnetic information does not meet the accuracy condition, acquiring a third azimuth from an auxiliary device;
and determining an azimuth recognition result according to the second azimuth and the third azimuth, wherein the azimuth recognition result is used for describing azimuth information of the electronic device.
11. The method of any one of claims 1 to 6, further comprising:
determining an azimuth recognition result according to the first azimuth, wherein the azimuth recognition result is used for describing azimuth information of the electronic device.
12. The method of any one of claims 8 to 11, further comprising one or more of:
displaying the azimuth recognition result on an azimuth information display interface;
providing the azimuth recognition result to an associated application through an application programming interface;
and transmitting the azimuth recognition result to an associated device through a wireless transmission module.
13. The method of any one of claims 8 to 11, wherein the method further comprises:
acquiring an actual traveling azimuth and a set traveling azimuth determined based on the azimuth recognition result;
determining error information between the actual traveling azimuth and the set traveling azimuth;
and displaying deviation-correction information on the azimuth information display interface according to the error information, wherein the deviation-correction information is used for correcting the actual traveling azimuth.
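Claim 13's deviation-correction step needs a signed difference between the actual and set traveling azimuths. A common way to compute the smallest signed error is sketched below; the function name, sign convention, and "turn clockwise" reading are illustrative assumptions, not the patent's formula.

```python
def heading_error_deg(actual_deg: float, set_deg: float) -> float:
    # Smallest signed difference in [-180, 180): a positive value means the
    # set traveling azimuth lies clockwise of the actual one, so a natural
    # correction hint is "turn clockwise by this many degrees".
    return (set_deg - actual_deg + 180.0) % 360.0 - 180.0

# Wrapping around north: heading 350 versus a set azimuth of 10 is only a
# 20-degree clockwise correction, not a 340-degree one.
print(heading_error_deg(350.0, 10.0))  # 20.0
```

The modulo trick avoids the branch-heavy handling of the 0/360 wrap-around that a naive subtraction would require.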
14. The method of any one of claims 1 to 13, wherein the reference ray is the ray on which the hour hand lies when the latitude and longitude information of the electronic device indicates the northern hemisphere; or
the reference ray is the ray on which the preset scale mark lies when the latitude and longitude information of the electronic device indicates the southern hemisphere.
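Claim 14's hemisphere-dependent choice of reference ray can be sketched as a one-line selector. The string labels and the treatment of latitude 0 as "northern" are illustrative assumptions; the claims do not address the equator boundary.

```python
def reference_ray(latitude_deg: float) -> str:
    # Claim 14's rule: in the northern hemisphere point the hour hand at
    # the sun; in the southern hemisphere point the preset (12 o'clock)
    # scale mark at it. Latitude >= 0 treated as northern (assumption).
    return "hour_hand" if latitude_deg >= 0.0 else "preset_scale_mark"
```

Swapping the reference ray between hemispheres keeps the bisector construction of claim 1 valid, because the sun's apparent motion across the sky reverses direction between the two hemispheres.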
15. An azimuth recognition apparatus, the apparatus comprising:
a processing unit configured to: control, based on a first reference azimuth, a dial on an electronic device to rotate, so that the azimuth pointed to by a reference ray in the rotated dial is the first reference azimuth, wherein the relative positions between the hour hand, the minute hand, each scale mark, and the dial remain unchanged during the rotation, and the end point of the reference ray is the origin of the dial; determine a target included angle based on the rotated dial, the target included angle being the included angle between the ray on which the hour hand lies in the rotated dial and the ray on which a preset scale mark lies, the end points of both rays being the origin of the dial; and determine a first azimuth according to the target included angle, the first azimuth being the azimuth pointed to by the angular bisector of the target included angle.
16. An electronic device, comprising: one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors and are configured to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any one of claims 1-14.
17. A computer-readable storage medium, wherein the storage medium stores a computer program comprising program instructions which, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1-14.
18. A computer program product which, when run on a computer, causes the computer to perform the method of any one of claims 1-14.
CN202211349687.0A 2022-10-31 2022-10-31 Direction identification method, apparatus, device, storage medium, and computer program product Pending CN117990073A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211349687.0A CN117990073A (en) 2022-10-31 2022-10-31 Direction identification method, apparatus, device, storage medium, and computer program product

Publications (1)

Publication Number Publication Date
CN117990073A true CN117990073A (en) 2024-05-07

Family

ID=90898112

Legal Events

Date Code Title Description
PB01 Publication