CN112559910A - Direction identification method and device, electronic equipment and readable storage medium


Info

Publication number
CN112559910A
CN112559910A (application number CN202011561733.4A)
Authority
CN
China
Prior art keywords
entity
picture
identified
azimuth
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011561733.4A
Other languages
Chinese (zh)
Inventor
李伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202011561733.4A
Publication of CN112559910A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9537 Spatial or temporal dependent retrieval, e.g. spatiotemporal queries

Abstract

The application discloses a direction identification method, a direction identification device, electronic equipment and a readable storage medium, relating to the technical field of computer vision. The scheme adopted for identifying a direction is as follows: acquire the current date, time and geographic position; acquire a picture to be identified, and identify the entity category of the entity in the picture to be identified; obtain an entity azimuth angle from the date, the time and the geographic position by using a calculation method corresponding to the entity category; and determine the direction in which the user faces the entity according to the entity azimuth angle. The method and device can improve the success rate and accuracy of direction identification.

Description

Direction identification method and device, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for identifying a direction, an electronic device, and a readable storage medium in the field of computer vision technologies.
Background
When a terminal device in the prior art identifies a direction, it generally uses a geomagnetic sensor to sense the earth's magnetic field, or uses a navigation and positioning chip to identify the direction based on changes of position. However, when the surrounding magnetic field is strongly interfered with, the user is stationary, or the satellite signal is blocked, no good direction identification method is available.
Disclosure of Invention
To solve this technical problem, the application provides a direction identification method, comprising the following steps: acquiring the current date, time and geographic position; acquiring a picture to be identified, and identifying the entity category of the entity in the picture to be identified; obtaining an entity azimuth angle according to the date, the time and the geographic position by using a calculation method corresponding to the entity category; and determining the direction of the user facing the entity according to the entity azimuth angle.
The application further provides a direction identification device, comprising: an acquisition unit configured to acquire the current date, time and geographic position; an identification unit configured to acquire a picture to be identified and identify the entity category of the entity in the picture to be identified; a processing unit configured to obtain an entity azimuth angle according to the date, the time and the geographic position by using a calculation method corresponding to the entity category; and a determining unit configured to determine the direction of the user facing the entity according to the entity azimuth angle.
An electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the above method.
A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the above method.
A computer program product comprising a computer program which, when executed by a processor, implements the above-described method.
One embodiment of the above application has the following advantages or benefits: it can improve the success rate and accuracy of direction identification. Because direction identification is performed based on the acquired date, time, geographic position and an entity existing in nature, the direction can be identified even under conditions in which it otherwise could not be distinguished, such as weak satellite signals, a stationary user or severe magnetic field interference; this greatly reduces the interference of external factors during direction identification and improves the success rate and accuracy of direction identification.
Other effects of the above alternative implementations will be described below in conjunction with specific embodiments.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
FIG. 1 is a schematic diagram according to a first embodiment of the present application;
FIG. 2 is a schematic diagram according to a second embodiment of the present application;
fig. 3 is a block diagram of an electronic device for implementing the direction identification method according to the embodiment of the present application.
Detailed Description
The following describes exemplary embodiments of the present application with reference to the accompanying drawings, including various details of the embodiments to aid understanding; these details should be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted below for clarity and conciseness.
Fig. 1 is a schematic diagram according to a first embodiment of the present application. As shown in fig. 1, the direction identification method of this embodiment may specifically include the following steps:
s101, acquiring the current date, time and geographic position;
s102, acquiring a picture to be identified, and identifying the entity type of an entity in the picture to be identified;
s103, obtaining an entity azimuth angle according to the date, the time and the geographic position by using a calculation method corresponding to the entity type;
and S104, determining the direction of the user facing the entity according to the entity azimuth.
The execution body of the direction identification method of this embodiment is a terminal device. The terminal device identifies the direction based on the acquired date, time, geographic position and an entity existing in nature, so that it can identify the direction even when the satellite signal is weak, the user is stationary or the magnetic field interference is severe; this greatly reduces the interference of external factors during direction identification and improves the success rate and accuracy of direction identification.
In this embodiment, when S101 is executed, the current date, time and geographic position may be obtained through a system interface of the terminal device, where the obtained geographic position is the longitude and latitude where the terminal device is currently located.
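As a minimal sketch of S101, the following snippet shows the shape of the acquired context; the location values are a placeholder assumption, since the real geolocation interface is device-specific:

```python
from datetime import datetime, timezone

def acquire_context() -> tuple[datetime, float, float]:
    """Acquire the current date/time plus latitude and longitude.

    The fixed coordinates below stand in for a real system call
    (e.g. a platform geolocation service), which is device-specific.
    """
    now = datetime.now(timezone.utc)
    latitude, longitude = 39.9, 116.4  # placeholder coordinates (Beijing)
    return now, latitude, longitude
```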
After the current date, time and geographic position are obtained in step S101, step S102 is executed: first a picture to be identified is acquired, and then the entity category of the entity in the picture to be identified is identified. In this embodiment, the entity category identified in S102 is one of the sun, the moon, Polaris (the North Star) and a shadow.
In this embodiment, when S102 is executed to acquire a picture to be recognized, the picture taken in real time may be taken as the picture to be recognized, for example, the picture taken by a user in real time through a camera of a terminal device is acquired as the picture to be recognized; the embodiment may also present, to the user, a plurality of pictures respectively corresponding to different entities, and use a picture selected by the user from the plurality of pictures as a picture to be identified.
In addition, when the entity category of the entity in the picture to be recognized is recognized in S102, the embodiment may adopt an optional implementation manner as follows: and inputting the picture to be recognized into a pre-trained entity recognition model, and taking an output result of the entity recognition model as an entity category of an entity in the picture to be recognized. That is, the entity recognition model in this embodiment can output the entity category of the entity in the picture according to the inputted picture.
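The patent does not specify the model architecture, so the following is only an illustrative sketch of this recognition step, assuming a pre-trained four-class PyTorch image classifier whose output classes match the entity categories above; the class list, preprocessing and function name are assumptions:

```python
import torch
from torchvision import transforms
from PIL import Image

ENTITY_CLASSES = ["sun", "moon", "polaris", "shadow"]  # assumed label order

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def classify_entity(image_path: str, model: torch.nn.Module) -> str:
    """Return the entity category of the entity in the picture to be identified."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # shape: (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(batch)
    return ENTITY_CLASSES[int(logits.argmax(dim=1))]
```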
That is to say, this embodiment helps the user select an appropriate entity and improves the user's flexibility in selecting an entity, so that direction identification can be completed according to the selected entity in different environments, thereby improving the success rate of direction identification.
In this embodiment, after the entity category of the entity in the picture to be recognized is obtained by performing S102, S103 is performed to obtain the entity azimuth angle by using a calculation method corresponding to the entity category obtained by recognition according to the date, the time and the geographic position.
Since different entity types in this embodiment correspond to different calculation methods, a correspondence table between entity types and calculation methods may be preset in this embodiment, and the calculation method used when calculating different entity azimuth angles may be determined according to the correspondence table.
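Such a correspondence table can be as simple as a dictionary dispatch. A minimal sketch, with assumed function names (the shadow category is handled separately, as described below):

```python
from datetime import datetime
from typing import Callable

# Placeholder signatures; a concrete solar formula is sketched after
# the next paragraph, and Polaris is approximated as due north.
def solar_azimuth(when: datetime, lat: float, lon: float) -> float: ...
def lunar_azimuth(when: datetime, lat: float, lon: float) -> float: ...

def polaris_azimuth(when: datetime, lat: float, lon: float) -> float:
    # Polaris lies within about 1 degree of the north celestial pole,
    # so from the northern hemisphere its azimuth is roughly due north.
    return 0.0

AZIMUTH_CALCULATORS: dict[str, Callable[[datetime, float, float], float]] = {
    "sun": solar_azimuth,
    "moon": lunar_azimuth,
    "polaris": polaris_azimuth,
}

def entity_azimuth(category: str, when: datetime, lat: float, lon: float) -> float:
    """Look up the calculation method for the entity category and apply it."""
    return AZIMUTH_CALCULATORS[category](when, lat, lon)
```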
It can be understood that, in this embodiment, the calculation methods corresponding to the different entity categories can calculate the corresponding entity azimuth angle according to the input time, date and geographic position; for example, the solar azimuth calculation method is used to obtain the solar azimuth angle, the lunar azimuth calculation method is used to obtain the lunar azimuth angle, and the Polaris azimuth calculation method is used to obtain the Polaris azimuth angle. The calculation processes of these methods belong to the prior art and are not described here again.
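The patent treats these formulas as prior art; purely for illustration, here is a rough sketch of the solar_azimuth placeholder above, using a common textbook approximation (no equation-of-time or atmospheric-refraction correction, so it is only accurate to a few degrees, which is still enough to tell a user which way they face):

```python
import math
from datetime import datetime, timezone

def solar_azimuth(when: datetime, lat: float, lon: float) -> float:
    """Approximate solar azimuth in degrees clockwise from true north."""
    day = when.timetuple().tm_yday
    # Solar declination (degrees), simple cosine model of the seasons.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day + 10)))
    # Mean local solar time from UTC and longitude (15 degrees per hour).
    utc = when.astimezone(timezone.utc)
    solar_time = (utc.hour + utc.minute / 60.0 + lon / 15.0) % 24.0
    hour_angle = 15.0 * (solar_time - 12.0)  # negative before solar noon

    lat_r, decl_r, ha_r = map(math.radians, (lat, decl, hour_angle))
    # Solar elevation above the horizon.
    sin_alt = (math.sin(lat_r) * math.sin(decl_r)
               + math.cos(lat_r) * math.cos(decl_r) * math.cos(ha_r))
    alt_r = math.asin(sin_alt)
    # Azimuth measured clockwise from true north; acos gives 0-180 degrees,
    # so mirror it in the afternoon (sun in the western half of the sky).
    cos_az = ((math.sin(decl_r) - math.sin(alt_r) * math.sin(lat_r))
              / (math.cos(alt_r) * math.cos(lat_r)))
    az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    return az if hour_angle < 0 else 360.0 - az
```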
In addition, if the entity category obtained by performing S102 is a shadow, then when performing S103 to obtain the shadow azimuth angle by using the calculation method corresponding to the shadow, an optional implementation manner that can be adopted is as follows: determine whether it is currently day or night according to the time; if it is currently day, obtain the solar azimuth angle by using the calculation method corresponding to the sun, and take the azimuth angle opposite to the calculated solar azimuth angle as the shadow azimuth angle; if it is currently night, obtain the lunar azimuth angle by using the calculation method corresponding to the moon, and take the azimuth angle opposite to the calculated lunar azimuth angle as the shadow azimuth angle.
For example, if the solar azimuth angle obtained by executing S103 in this embodiment is 90°, then 270° is taken as the shadow azimuth angle; if the lunar azimuth angle obtained by executing S103 is 0°, then 180° is taken as the shadow azimuth angle.
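The "opposite azimuth" rule in these examples is just a 180° rotation on the compass circle; a one-line sketch:

```python
def opposite_azimuth(azimuth: float) -> float:
    """Azimuth directly opposite the given one, e.g. 90 -> 270, 0 -> 180."""
    return (azimuth + 180.0) % 360.0
```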
In this embodiment, after the entity azimuth in the picture to be recognized is obtained in step S103, step S104 is performed to determine the direction of the user when the user faces the entity according to the obtained entity azimuth.
Generally, the azimuth angle represents the angle between the projection of the direction toward the entity onto the ground plane and the true north direction. Its value ranges from 0° to 360°: taking true north as the starting direction, i.e. 0°, and rotating clockwise through one full circle, the azimuth angle gradually increases to 360°. That is, the entity is located due north when the azimuth angle is 0°, due east when it is 90°, due south when it is 180°, and due west when it is 270°.
Therefore, when the S104 is executed to determine the direction of the user when the user faces the entity according to the entity azimuth, the present embodiment may adopt an optional implementation manner as follows: and according to a preset azimuth angle-direction corresponding relation table, taking the direction corresponding to the entity azimuth angle as the direction when the user faces the entity.
For example, if the entity azimuth angle obtained by executing S103 in this embodiment is 90°, and in the preset azimuth-direction correspondence table the direction corresponding to the azimuth angle 90° is due east 90°, then due east 90° is taken as the direction when the user faces the entity when S104 is executed; if the entity azimuth angle obtained by executing S103 is 230°, and in the preset azimuth-direction correspondence table the direction corresponding to the azimuth angle 230° is southwest 230°, then southwest 230° is taken as the direction when the user faces the entity when S104 is executed.
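A minimal sketch of such an azimuth-direction correspondence, quantized here to eight compass directions (the granularity and labels of the table are assumptions; the patent does not fix its contents):

```python
DIRECTIONS = ["north", "northeast", "east", "southeast",
              "south", "southwest", "west", "northwest"]

def direction_of(azimuth: float) -> str:
    """Map an azimuth angle to a direction label, e.g. 90 -> 'east 90°'."""
    index = int(((azimuth % 360.0) + 22.5) // 45.0) % 8
    return f"{DIRECTIONS[index]} {azimuth:.0f}°"
```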
After S104 is executed to determine the direction when the user faces the entity, this embodiment may display the determined direction to the user as text, for example a text prompt message such as "you face the sun at due east 90°" or "you face the sun at southwest 230°"; the determined direction may also be presented to the user as a direction indication graphic, for example in the form of a compass dial that shows the user his or her direction toward the entity.
In addition, after executing S104 to determine the direction of the user when facing the entity, the present embodiment may further include the following: taking the direction of the user facing the entity as a reference value; the reference value is used to correct the value of the orientation sensor in the terminal device.
When S104 is executed to correct the value of the direction sensor in the terminal device using the reference value, the present embodiment may adopt an optional implementation manner as follows: adjusting the value of the direction sensor to a reference value upon determining that the difference between the reference value and the value of the direction sensor exceeds a preset angle threshold; otherwise, the value of the direction sensor is not adjusted.
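A sketch of this correction rule; the 10° threshold is an illustrative assumption, since the patent only specifies "a preset angle threshold":

```python
def correct_direction_sensor(sensor_value: float, reference: float,
                             threshold: float = 10.0) -> float:
    """Return the corrected direction-sensor value in degrees."""
    # Smallest angular difference between the two headings (0-180 degrees).
    diff = abs((reference - sensor_value + 180.0) % 360.0 - 180.0)
    return reference if diff > threshold else sensor_value
```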
That is to say, in this embodiment, the value of the direction sensor in the terminal device is adjusted according to the direction of the user facing the entity determined by the entity azimuth, so that the direction sensor has a more accurate direction value, and the accuracy of the navigation service based on the value of the direction sensor is further improved.
According to the method provided by this embodiment, the direction is identified based on the acquired date, time, geographic position and an entity existing in nature, so that the terminal device can identify the direction even when the satellite signal is weak, the user is stationary or the magnetic field interference is severe; this greatly reduces the interference of external factors during direction identification and improves the success rate and accuracy of direction identification.
Fig. 2 is a schematic diagram according to a second embodiment of the present application. As shown in fig. 2, the direction recognition apparatus of the present embodiment includes:
an obtaining unit 201, configured to obtain a current date, time, and geographic location;
the identification unit 202 is configured to acquire a picture to be identified, and identify an entity type of an entity in the picture to be identified;
the processing unit 203 is configured to obtain an entity azimuth according to the date, the time and the geographic location by using a calculation method corresponding to the entity type;
the determining unit 204 is configured to determine a direction in which the user faces the entity according to the entity azimuth.
The obtaining unit 201 may obtain the current date, time, and geographic location through a system interface of the terminal device, where the obtained geographic location is the longitude and latitude where the terminal device is currently located.
In the embodiment, after the acquiring unit 201 acquires the current date, time and geographic position, the identifying unit 202 first acquires the picture to be identified, and then identifies the entity category of the entity in the picture to be identified. The entity category identified by the identifying unit 202 is one of sun, moon, Polaris and shadow.
When acquiring a picture to be recognized, the recognition unit 202 may acquire a picture taken in real time as the picture to be recognized; the identifying unit 202 may also present a plurality of pictures respectively corresponding to different entities to the user, and use a picture selected by the user from the plurality of pictures as a picture to be identified.
In addition, when the identifying unit 202 identifies the entity category of the entity in the picture to be identified, the optional implementation manner that can be adopted is as follows: and inputting the picture to be recognized into a pre-trained entity recognition model, and taking an output result of the entity recognition model as an entity category of an entity in the picture to be recognized. That is, the entity recognition model in the recognition unit 202 can output the entity category of the entity in the picture according to the inputted picture.
That is to say, the identification unit 202 facilitates the user to select an appropriate entity, improves the flexibility of the user in selecting the entity, and achieves the purpose of completing direction identification according to the selected entity in different environments, thereby improving the success rate of direction identification.
In this embodiment, after the identification unit 202 identifies and obtains the entity category of the entity in the picture to be identified, the processing unit 203 obtains the entity azimuth angle by using a calculation method corresponding to the identified entity category according to the date, time and geographic location.
Since different entity classes in the processing unit 203 correspond to different calculation methods, the processing unit 203 may preset a correspondence table between the entity classes and the calculation methods, and may determine the calculation methods used in calculating the different entity azimuth angles according to the correspondence table.
It can be understood that the calculation methods corresponding to different entity types in the processing unit 203 can calculate the corresponding entity azimuth according to the input time, date and geographic location, and the calculation processes of the calculation methods belong to the prior art and are not described herein.
In addition, if the entity category identified by the identifying unit 202 is a shadow, then when the processing unit 203 obtains the shadow azimuth angle by using the calculation method corresponding to the shadow, the following optional implementation manner may be adopted: determine whether it is currently day or night according to the time; if it is currently day, obtain the solar azimuth angle by using the calculation method corresponding to the sun, and take the azimuth angle opposite to the calculated solar azimuth angle as the shadow azimuth angle; if it is currently night, obtain the lunar azimuth angle by using the calculation method corresponding to the moon, and take the azimuth angle opposite to the calculated lunar azimuth angle as the shadow azimuth angle.
In the embodiment, after the processing unit 203 obtains the azimuth of the entity in the picture to be identified, the determining unit 204 determines the direction of the user when the user faces the entity according to the obtained azimuth of the entity.
When the determining unit 204 determines the direction of the user towards the entity according to the entity azimuth, the optional implementation manners that can be adopted are as follows: and according to a preset azimuth angle-direction corresponding relation table, taking the direction corresponding to the entity azimuth angle as the direction when the user faces the entity.
After determining the direction of the user when facing the entity, the determining unit 204 may display the determined direction to the user in a text manner or in a direction indication graphic manner.
In addition, the determining unit 204 may further include the following after determining the direction of the user when facing the entity: taking the direction of the user facing the entity as a reference value; the reference value is used to correct the value of the orientation sensor in the terminal device.
When the determining unit 204 corrects the value of the direction sensor in the terminal device by using the reference value, the optional implementation manner that can be adopted is as follows: adjusting the value of the direction sensor to a reference value upon determining that the difference between the reference value and the value of the direction sensor exceeds a preset angle threshold; otherwise, the value of the direction sensor is not adjusted.
That is to say, the determining unit 204 adjusts the value of the direction sensor in the terminal device according to the direction of the user towards the entity determined by the entity azimuth, so that the direction sensor has a more accurate direction value, and the accuracy of the navigation service based on the direction sensor value is improved.
There is also provided, in accordance with an embodiment of the present application, an electronic device, a computer-readable storage medium, and a computer program product.
FIG. 3 illustrates a schematic block diagram of an example electronic device 300 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 3, the device 300 includes a computing unit 301 that can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 302 or a computer program loaded from a storage unit 308 into a random access memory (RAM) 303. In the RAM 303, various programs and data required for the operation of the device 300 can also be stored. The computing unit 301, the ROM 302 and the RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to the bus 304.
Various components in device 300 are connected to I/O interface 305, including: an input unit 306 such as a keyboard, a mouse, or the like; an output unit 307 such as various types of displays, speakers, and the like; a storage unit 308 such as a magnetic disk, optical disk, or the like; and a communication unit 309 such as a network card, modem, wireless communication transceiver, etc. The communication unit 309 allows the device 300 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The computing unit 301 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 301 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The computing unit 301 executes the respective methods and processes described above, such as the direction identification method. For example, in some embodiments, the direction identification method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 308. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 300 via the ROM 302 and/or the communication unit 309. When the computer program is loaded into the RAM 303 and executed by the computing unit 301, one or more steps of the direction identification method described above may be performed. Alternatively, in other embodiments, the computing unit 301 may be configured to perform the direction identification method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chips (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), the internet, and blockchain networks.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or cloud host, a host product in a cloud computing service system that overcomes the defects of high management difficulty and weak service scalability in traditional physical hosts and VPS ("Virtual Private Server") services. The server may also be a server of a distributed system, or a server incorporating a blockchain.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved, and the present disclosure is not limited herein.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (13)

1. A direction identification method, comprising:
acquiring the current date, time and geographic position;
acquiring a picture to be identified, and identifying entity types of entities in the picture to be identified;
obtaining an entity azimuth angle according to the date, the time and the geographic position by using a calculation method corresponding to the entity category;
and determining the direction of the user facing the entity according to the entity azimuth.
2. The method of claim 1, wherein the obtaining the picture to be recognized comprises:
and acquiring a picture shot in real time as the picture to be identified.
3. The method of claim 1, wherein the identifying the entity category of the entity in the picture to be identified comprises:
and inputting the picture to be recognized into an entity recognition model obtained by pre-training, and taking an output result of the entity recognition model as an entity category of an entity in the picture to be recognized.
4. The method of claim 1, wherein the determining a direction in which a user is facing the entity from the entity azimuth angle comprises:
and according to a preset azimuth angle-direction corresponding relation table, taking the direction corresponding to the entity azimuth angle as the direction when the user faces the entity.
5. The method of claim 1, further comprising,
after determining the direction of the user facing the entity according to the entity azimuth, taking the direction of the user facing the entity as a reference value;
and correcting the value of the direction sensor in the terminal equipment by using the reference value.
6. A direction recognition apparatus comprising:
the acquisition unit is used for acquiring the current date, time and geographic position;
the identification unit is used for acquiring a picture to be identified and identifying the entity type of an entity in the picture to be identified;
the processing unit is used for obtaining an entity azimuth angle according to the date, the time and the geographic position by using a calculation method corresponding to the entity type;
and the determining unit is used for determining the direction of the user facing the entity according to the entity azimuth.
7. The apparatus according to claim 6, wherein the identifying unit, when acquiring the picture to be identified, specifically performs:
and acquiring a picture shot in real time as the picture to be identified.
8. The apparatus according to claim 6, wherein the identifying unit, when identifying the entity category of the entity in the picture to be identified, specifically performs:
and inputting the picture to be recognized into an entity recognition model obtained by pre-training, and taking an output result of the entity recognition model as an entity category of an entity in the picture to be recognized.
9. The apparatus according to claim 6, wherein the determining unit, when determining the direction of the user towards the entity according to the entity azimuth, specifically performs:
and according to a preset azimuth angle-direction corresponding relation table, taking the direction corresponding to the entity azimuth angle as the direction when the user faces the entity.
10. The apparatus of claim 6, the determination unit further to perform,
after determining the direction of the user facing the entity according to the entity azimuth, taking the direction of the user facing the entity as a reference value;
and correcting the value of the direction sensor in the terminal equipment by using the reference value.
11. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-5.
12. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-5.
13. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-5.
CN202011561733.4A 2020-12-25 2020-12-25 Direction identification method and device, electronic equipment and readable storage medium Pending CN112559910A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011561733.4A CN112559910A (en) 2020-12-25 2020-12-25 Direction identification method and device, electronic equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN112559910A true CN112559910A (en) 2021-03-26

Family

ID=75032646

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011561733.4A Pending CN112559910A (en) 2020-12-25 2020-12-25 Direction identification method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112559910A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104092948A (en) * 2014-07-29 2014-10-08 小米科技有限责任公司 Method and device for processing image
CN107246865A (en) * 2017-05-26 2017-10-13 郭宇光 A kind of method and device based on intelligent terminal fixation and recognition building
CN111093266A (en) * 2019-12-20 2020-05-01 维沃移动通信有限公司 Navigation calibration method and electronic equipment


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination