CN112558129B - Method for determining indoor and outdoor scenes, related device, equipment and storage medium


Info

Publication number
CN112558129B
CN112558129B (application number CN202011406545.4A)
Authority
CN
China
Prior art keywords
satellite
target
included angle
information
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011406545.4A
Other languages
Chinese (zh)
Other versions
CN112558129A (en)
Inventor
苏景岚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202011406545.4A priority Critical patent/CN112558129B/en
Publication of CN112558129A publication Critical patent/CN112558129A/en
Application granted granted Critical
Publication of CN112558129B publication Critical patent/CN112558129B/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position

Abstract

The application discloses an indoor and outdoor scene determining method applicable to an electronic map, which comprises the steps of: obtaining N satellite positions corresponding to a target moment; acquiring the device position corresponding to the terminal device at the target moment; acquiring an included angle between each of the N satellite positions and the device position to obtain N included angles; determining a maximum included angle according to the N included angles; and determining, according to the maximum included angle, a target scene corresponding to the terminal device at the target moment, wherein the target scene is an indoor scene or an outdoor scene. A related apparatus, device and storage medium are also disclosed. The method and the device can avoid the influence of environmental factors and can effectively reduce the misjudgment rate when the terminal device is located in a scene, such as near a window or on a balcony, where satellite signals can still be received, thereby improving the accuracy of indoor and outdoor scene determination.

Description

Method for determining indoor and outdoor scenes, related device, equipment and storage medium
Technical Field
The present application relates to the field of positioning and navigation technologies, and in particular, to a method, a related apparatus, a device, and a storage medium for determining an indoor scene and an outdoor scene.
Background
Positioning can be divided into two categories, namely outdoor positioning and indoor positioning, according to the application scene, and the positioning requirements differ with the scene. Outdoor positioning technology has the characteristics of wide coverage and stable signals, but its precision is affected by weather, orientation and the like. Indoor positioning can adopt geomagnetic positioning, Bluetooth positioning and the like, according to different user requirements.
Currently, a method for determining indoor and outdoor scenes is provided, in which an exposure value, a gain value, a brightness value, a sharpness evaluation value, and a Red Green Blue (RGB) value of an image are analyzed to obtain a light intensity characteristic, a light source characteristic, and a color temperature characteristic of a current scene, thereby determining the indoor and outdoor scenes.
However, when the exposure value, the gain value, the brightness value and the RGB values are used as the basis for analyzing the current scene, they are easily affected by changes in the illumination environment and indoor lighting, so the accuracy of determining indoor and outdoor scenes is low.
Disclosure of Invention
The embodiments of the application provide a method for determining indoor and outdoor scenes, a related apparatus, a device and a storage medium, which can avoid the influence of environmental factors and can effectively reduce the misjudgment rate when the terminal device is located in a scene, such as near a window or on a balcony, where satellite signals can still be received, thereby improving the accuracy of indoor and outdoor scene determination.
In view of the above, an aspect of the present application provides a method for determining indoor and outdoor scenes, including:
acquiring N satellite positions corresponding to a target moment, wherein each satellite position corresponds to a satellite, and N is an integer greater than or equal to 1;
acquiring a device position corresponding to the terminal device at a target moment;
acquiring an included angle between each satellite position in the N satellite positions and the equipment position to obtain N included angles;
determining a maximum included angle according to the N included angles;
and determining a target scene corresponding to the terminal equipment at the target moment according to the maximum included angle, wherein the target scene is an indoor scene or an outdoor scene.
Another aspect of the present application provides an indoor/outdoor scene determining apparatus, including:
the acquisition module is used for acquiring N satellite positions corresponding to a target moment, wherein each satellite position corresponds to a satellite, and N is an integer greater than or equal to 1;
the acquisition module is also used for acquiring the equipment position corresponding to the terminal equipment at the target moment;
the acquisition module is further used for acquiring an included angle between each satellite position in the N satellite positions and the equipment position to obtain N included angles;
the determining module is used for determining the maximum included angle according to the N included angles;
the determining module is further configured to determine a target scene corresponding to the terminal device at the target time according to the maximum included angle, where the target scene is an indoor scene or an outdoor scene.
In one possible design, in another implementation of another aspect of an embodiment of the present application,
the acquisition module is further used for executing the step of acquiring N satellite positions corresponding to the target moment if the satellite signals are detected at the target moment;
the determining module is further configured to determine that a target scene corresponding to the terminal device at the target moment is an indoor scene if the satellite signal is not detected at the target moment and the WIFI signal is detected;
the determining module is further configured to determine that the determination fails if the satellite signal is not detected at the target moment and the WIFI signal is not detected.
In one possible design, in another implementation of another aspect of an embodiment of the present application,
the acquisition module is specifically used for sending an ephemeris issue request to the server so that the server responds to the ephemeris issue request and acquires broadcast ephemeris information corresponding to a target moment, wherein the broadcast ephemeris information comprises ephemeris parameters corresponding to the N satellites;
receiving broadcast ephemeris information corresponding to a target moment sent by a server;
and determining N satellite positions corresponding to the target time according to the broadcast ephemeris information corresponding to the target time.
In one possible design, in another implementation of another aspect of an embodiment of the present application,
an obtaining module, configured to specifically obtain, for an ith satellite of the N satellites, ephemeris parameters corresponding to the ith satellite according to broadcast ephemeris information corresponding to a target time, where i is an integer greater than or equal to 1 and less than or equal to N;
for an ith satellite in the N satellites, determining a satellite position corresponding to the ith satellite at a target moment according to ephemeris parameters corresponding to the ith satellite;
and acquiring N satellite positions according to the satellite position corresponding to the ith satellite at the target moment.
In one possible design, in another implementation of another aspect of the embodiments of the present application, N is an integer greater than or equal to 4;
the acquisition module is specifically used for acquiring first signal information, wherein the first signal information comprises coordinate values of a first satellite and time for a signal of the first satellite to reach the terminal device;
acquiring second signal information, wherein the second signal information comprises coordinate values of a second satellite and time for a signal of the second satellite to reach the terminal equipment;
acquiring third signal information, wherein the third signal information comprises coordinate values of a third satellite and time for a signal of the third satellite to reach the terminal equipment;
acquiring fourth signal information, wherein the fourth signal information comprises coordinate values of a fourth satellite and time for a signal of the fourth satellite to reach the terminal equipment;
and determining the device position corresponding to the terminal device at the target moment according to the first signal information, the second signal information, the third signal information and the fourth signal information.
In one possible design, in another implementation of another aspect of an embodiment of the present application,
the terminal device comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is specifically used for acquiring first position information, and the first position information comprises position information of a first base station and a first distance between the terminal device and the first base station;
acquiring second position information, wherein the second position information comprises position information of a second base station and a second distance between the terminal equipment and the second base station;
acquiring third position information, wherein the third position information comprises position information of a third base station and a third distance between the terminal equipment and the third base station;
and determining the device position corresponding to the terminal device at the target moment according to the first position information, the second position information and the third position information.
In one possible design, in another implementation of another aspect of an embodiment of the present application,
the acquisition module is specifically used for constructing a plane coordinate system by taking the position of the equipment as a coordinate origin;
acquiring longitude information and latitude information corresponding to terminal equipment;
for the ith satellite position of the N satellite positions, determining a coordinate value of the ith satellite in a plane coordinate system according to longitude information and latitude information corresponding to the terminal equipment, the equipment position and the ith satellite position, wherein i is an integer which is greater than or equal to 1 and less than or equal to N;
determining an included angle between the ith satellite and the position of the equipment according to the coordinate value of the ith satellite in the plane coordinate system;
and obtaining N included angles according to the included angle between the ith satellite and the equipment position.
In one possible design, in another implementation of another aspect of an embodiment of the present application,
the determining module is specifically configured to determine that a maximum included angle of the N included angles is 0 if N is an integer less than or equal to 3;
and if N is an integer greater than 3, determining the maximum included angle according to K included angle differences in the N included angles, wherein the K included angle differences are the differences between every two included angles in the N included angles, and K is an integer greater than or equal to 6.
In one possible design, in another implementation of another aspect of an embodiment of the present application,
the determining module is specifically configured to determine an angle difference between an ith included angle and a jth included angle in the N included angles, where i and j are integers greater than or equal to 1 and less than or equal to N, and i and j are not equal to each other;
acquiring K included angle difference values according to the ith included angle and the jth included angle;
and selecting the maximum value from the K included angle differences as the maximum included angle.
In one possible design, in another implementation of another aspect of an embodiment of the present application,
the determining module is specifically configured to determine that a target scene corresponding to the terminal device at the target moment is an outdoor scene if the maximum included angle is greater than the included angle threshold;
and if the maximum included angle is smaller than the included angle threshold value, determining that the target scene corresponding to the terminal equipment at the target moment is an indoor scene.
In one possible design, in another implementation manner of another aspect of the embodiment of the present application, the indoor and outdoor scene determining apparatus further includes a sending module, a receiving module, and a displaying module;
the acquisition module is also used for acquiring a target position after the determination module determines that the target scene corresponding to the terminal equipment at the target moment is an outdoor scene;
the system comprises a sending module, a service server and a service processing module, wherein the sending module is used for sending a target position and an outdoor positioning identifier to the service server so that the service server generates first navigation information according to a device position, the target position and the outdoor positioning identifier corresponding to a terminal device at a target moment, and the outdoor positioning identifier corresponds to an outdoor scene;
the receiving module is used for receiving first navigation information sent by the service server;
and the display module is used for displaying the first navigation information.
In one possible design, in another implementation manner of another aspect of the embodiment of the present application, the indoor and outdoor scene determining apparatus further includes a sending module, a receiving module, and a displaying module;
the acquisition module is further used for acquiring a target position after determining that a target scene corresponding to the terminal equipment at the target moment is an indoor scene;
the sending module is used for sending the target position and the indoor positioning identifier to the service server so that the service server generates second navigation information according to the equipment position, the target position and the indoor positioning identifier corresponding to the terminal equipment at the target moment, wherein the indoor positioning identifier corresponds to an indoor scene;
the receiving module is used for receiving second navigation information sent by the service server;
and the display module is used for displaying the second navigation information.
Another aspect of the present application provides a terminal device, including: a memory, a processor, and a bus system;
wherein, the memory is used for storing programs;
the processor is used for executing the program in the memory, and the processor is used for executing the method provided by the aspects according to the instructions in the program code;
the bus system is used for connecting the memory and the processor so as to enable the memory and the processor to communicate.
Another aspect of the present application provides a computer-readable storage medium having stored therein instructions, which when executed on a computer, cause the computer to perform the method of the above-described aspects.
In another aspect of the application, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided by the above aspects.
According to the technical scheme, the embodiment of the application has the following advantages:
the embodiment of the application provides a method for determining indoor and outdoor scenes, which includes the steps of firstly obtaining N satellite positions corresponding to a target moment and an equipment position corresponding to a terminal equipment at the target moment, then obtaining an included angle between each satellite position in the N satellite positions and the equipment position to obtain N included angles, then determining a maximum included angle according to the N included angles, and finally determining a target scene corresponding to the terminal equipment at the target moment according to the maximum included angle, wherein the target scene is an indoor scene or an outdoor scene. By the method, indoor and outdoor scenes are judged by utilizing the space distribution condition of the satellite, and the influence of environmental factors such as temperature, illumination intensity and indoor light can be avoided, so that the accuracy of judging the indoor and outdoor scenes is improved. In addition, when the terminal equipment is positioned in a window or a balcony and can receive a satellite signal scene, the misjudgment rate can be effectively reduced, and the accuracy of judging indoor and outdoor scenes is further improved.
Drawings
Fig. 1 is a schematic diagram of an architecture of an indoor/outdoor scene determining system according to an embodiment of the present application;
fig. 2 is a schematic interaction flow diagram of an indoor and outdoor scene determination method according to an embodiment of the present application;
fig. 3 is a schematic diagram of an embodiment of an indoor and outdoor scene determining method in an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a GNSS navigation chip in an embodiment of the present application;
fig. 5 is a schematic overall flow chart of an indoor and outdoor scene determining method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a process for determining satellite positions in an embodiment of the present application;
FIG. 7 is a schematic diagram illustrating the acquisition of broadcast ephemeris information in an embodiment of the application;
FIG. 8 is a schematic diagram of a GPS-based device location determination in an embodiment of the present application;
FIG. 9 is a schematic diagram of determining the location of a device based on the location of a base station in an embodiment of the present application;
FIG. 10 is a schematic diagram of N included angles in a plane coordinate system in the embodiment of the present application;
FIG. 11 is a schematic diagram of satellite distribution in an outdoor scene according to an embodiment of the present application;
FIG. 12 is a schematic view of a positioning interface based on an outdoor scene in an embodiment of the present application;
FIG. 13 is a schematic diagram illustrating satellite distribution in an indoor and outdoor interfacing scenario according to an embodiment of the present application;
FIG. 14 is a schematic view of another satellite distribution under an indoor and outdoor interfacing scenario in an embodiment of the present application;
FIG. 15 is a schematic view of another satellite distribution under an indoor and outdoor interfacing scenario in an embodiment of the present application;
FIG. 16 is a schematic view of another exemplary positioning interface based on an indoor and outdoor interface scenario according to the present application;
FIG. 17 is a schematic diagram of a satellite distribution in an indoor scenario according to an embodiment of the present application;
FIG. 18 is a schematic view of a positioning interface based on an indoor scene in an embodiment of the present application;
FIG. 19 is a schematic diagram of an interface showing first navigation information in an embodiment of the present application;
FIG. 20 is a schematic view of an interface showing second navigation information in an embodiment of the present application;
fig. 21 is a schematic diagram of an embodiment of an indoor and outdoor scene determination apparatus according to an embodiment of the present application;
fig. 22 is a schematic structural diagram of a terminal device in the embodiment of the present application.
Detailed Description
The embodiments of the application provide a method for determining indoor and outdoor scenes, a related apparatus, a device and a storage medium, which can avoid the influence of environmental factors and can effectively reduce the misjudgment rate when the terminal device is located in a scene, such as near a window or on a balcony, where satellite signals can still be received, thereby improving the accuracy of indoor and outdoor scene determination.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "corresponding" and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
With the progress of society and the development of science and technology, positioning technology has made qualitative leaps in technical means, positioning accuracy, usability and other aspects. From the fields of navigation, aerospace, aviation, surveying and mapping, military affairs and natural disaster prevention, it has gradually permeated every aspect of social life and has become an indispensable application in people's daily life, such as personnel search, location finding, traffic management, vehicle navigation and route planning. Positioning can be divided into two categories, namely outdoor positioning and indoor positioning, according to the application scene, and the positioning requirements differ with the scene. People's navigation needs no longer stop at the building entrance, and indoor positioning and navigation has become a new trend. Several categories of indoor navigation scenes are introduced below.
Firstly, commercial application;
when it is detected that the customer enters the mall, indoor navigation can be provided for the customer, for example, to help the customer find the location of the mall, or to facilitate the customer to share the current location with other users, or to guide the customer to find the location where the vehicle is parked, and the like.
Secondly, industrial application;
when it is detected that a worker enters the plant, indoor navigation can be provided for the worker, for example, the position of placing articles is provided, or the position of the plant area is provided, or a fence alarm is triggered, or the position of a safety exit is provided, and the like.
Thirdly, medical application;
the indoor positioning navigation service meets the main application requirements of patient medical guidance in the medical industry, can provide the patient with the quick query function of a service area and a consulting room in a hospital in a map, and can help the patient to realize the intelligent medical guidance function.
In order to determine whether a user is currently in an indoor scene or an outdoor scene in the above-mentioned scenes, an embodiment of the present application provides a method for determining an indoor scene and an outdoor scene, where the method is applied to an indoor and outdoor scene determination system shown in fig. 1, as shown in the figure, the indoor and outdoor scene determination system includes a Continuously Operating Reference Station (CORS) server, a service server, and a terminal device, and a client is deployed on the terminal device. The service server may specifically be a navigation service server, and correspondingly, the client may be a navigation application. The service server related to the present application may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, a middleware service, a domain name service, a security service, a Content Delivery Network (CDN), a big data and artificial intelligence platform, and the like. The terminal device may be, but is not limited to, a smart phone, a tablet computer, a notebook computer, a palm computer, a personal computer, a smart television, a smart watch, and the like. The terminal device and the server may be directly or indirectly connected through wired or wireless communication, and the application is not limited herein. The number of servers and terminal devices is not limited.
The CORS server consists of four parts, namely a reference station part, a data center part, a data communication part and a user application part. Together, these parts form a dedicated network distributed throughout the city.
The reference station part consists of reference stations which are uniformly distributed in a control area. The reference station is composed of a Global Navigation Satellite System (GNSS) device, a computer, a meteorological device, a communication device, a power supply device, an observation field and the like, has the capability of continuously tracking and recording Satellite signals for a long time, is a data source of the CORS, and has the main functions of capturing, tracking, recording and transmitting the Satellite signals, monitoring the integrity of the device and the like.
The data center part is composed of computers, networks and a software system. It comprises a system control center and a user data center. The system control center is the nerve center of the CORS: based on the real-time observation data collected by each reference station, it continuously performs overall modeling calculations for the area 24 hours a day and provides code-phase/carrier-phase differential correction information in an internationally universal format to various measurement and navigation users through existing data communication networks and wireless data broadcasting networks, so that the accurate position of a mobile station can be calculated in real time. The user data center provides the downlink of the CORS service and transmits the data results of the control center to the users.
The data communication part is formed by public or private communication networks and comprises data transmission hardware and software control modules. Its main functions are to transmit the GNSS observation data of the reference stations to the system control center and to transmit the system differential information to the users.
The user application part consists of a receiver, a demodulator for wireless communication and related equipment. The main function of the user application part is to perform different precision positioning according to the requirements of users.
For convenience of description, referring to fig. 2, an indoor and outdoor scene determining method provided in the present application will be described below with reference to fig. 2, where fig. 2 is an interaction flow diagram of the indoor and outdoor scene determining method in the embodiment of the present application, and as shown in the figure, specifically:
in step S1, the terminal device sends an ephemeris delivery request to the CORS server.
In step S2, the CORS server sends broadcast ephemeris information to the terminal device.
In step S3, the terminal device calculates the satellite position at the current time based on the broadcast ephemeris information.
In step S4, the spatial distribution of the satellites is acquired from the satellite positions of the respective satellites at the current time.
In step S5, the maximum included angle between the satellites is calculated according to the spatial distribution of the satellites, whether the terminal device is in an indoor scene or an outdoor scene at the current time is determined according to the maximum included angle, and a scene positioning identifier, such as an indoor scene positioning identifier or an outdoor scene positioning identifier, is output according to the scene type.
With reference to fig. 3, a method for determining indoor and outdoor scenes in the present application will be described below, and an embodiment of the method for determining indoor and outdoor scenes in the present application includes:
101. the method comprises the steps that terminal equipment obtains N satellite positions corresponding to a target moment, wherein each satellite position corresponds to a satellite, and N is an integer greater than or equal to 1;
in this embodiment, the terminal device may obtain the positions of the N satellites at the target time, that is, obtain N satellite positions, where the target time represents the current time based on the system time of the terminal device, for example, 20:04:45 on November 29, 2020.
Specifically, the terminal device may acquire the N satellite positions based on a GNSS device. Taking a smartphone as an example of the terminal device, a navigation chip is disposed in the smartphone; the navigation chip processes the satellite signals and provides the user's estimated position using a Position, Velocity and Time (PVT) algorithm, where the PVT solution is calculated based on the raw observation values, the real-time navigation ephemeris and other information provided by the chip. The structure of a common GNSS navigation chip is shown in fig. 4, which is a schematic structural diagram of a GNSS navigation chip in the embodiment of the present application; the raw observation values can thus be extracted from the GNSS navigation chip.
The GNSS device can provide all-weather three-dimensional coordinates, velocity and time information for users anywhere on the earth's surface or in near-earth space. Common systems include the Global Positioning System (GPS) of the United States, the BeiDou Navigation Satellite System (BDS) of China, the GLONASS system of Russia, and the GALILEO satellite navigation system of Europe. The earliest of these was the GPS of the United States, which is also the most technologically mature. With the recent opening of full services of the BDS and GLONASS systems in the Asia-Pacific region, the BDS system has been growing rapidly, particularly in the civilian field. Satellite navigation systems have been widely used in aviation, navigation, communications, personnel tracking, consumer entertainment, surveying and mapping, time service, vehicle monitoring and management, and car navigation and information services, and the general trend is to provide high-precision services for real-time applications.
102. The method comprises the steps that terminal equipment obtains an equipment position corresponding to the terminal equipment at a target moment;
in this embodiment, the terminal device may obtain a position of the terminal device at the target time, that is, a device position.
Specifically, the terminal device may obtain the device position based on Location Based Services (LBS). LBS is a location-related service provided by wireless carriers for users: it obtains the current location of the positioning device by using various positioning technologies and provides information resources and basic services to the device through the mobile internet. LBS integrates various information technologies such as mobile communication, the internet, spatial positioning, location information and big data, and uses a mobile internet service platform to update and exchange data, so that users can obtain corresponding services through spatial positioning.
Taking a smartphone running the Android platform as an example of the terminal device, the Android platform provides an LBS Software Development Kit (SDK); the user's current position, the positioning accuracy and other related information can be obtained through the Application Programming Interfaces (APIs) provided by the SDK, and the SDK also provides the raw satellite observation values and the real-time navigation ephemeris used for PVT.
103. The terminal equipment acquires an included angle between each satellite position in the N satellite positions and the equipment position to obtain N included angles;
in this embodiment, the terminal device obtains an included angle between each satellite position in the N satellite positions and the device position, so as to obtain N included angles.
Specifically, a plane coordinate system is constructed with the device position of the terminal device as the origin, and the satellite position of each satellite is mapped into this plane coordinate system, so as to obtain the coordinate values of each satellite in the plane coordinate system; a corresponding included angle is then generated according to the coordinate values corresponding to each satellite, and finally N included angles are obtained. The satellite positions and the device position are both position coordinates in the Earth-Centered Earth-Fixed (ECEF) coordinate system, which is an earth-fixed Cartesian coordinate system with the earth center as its origin. The origin is the earth centroid, the Z axis is parallel to the earth's rotation axis and points to the north pole, the X axis points to the intersection of the prime meridian and the equator, and the Y axis is perpendicular to the XOZ plane (pointing to the intersection of the 90° east meridian and the equator), forming a right-handed coordinate system.
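For illustration, the following is a minimal Python sketch of this step. It assumes that the "included angle" of each satellite is taken as its azimuth in a local horizontal (east-north) plane centered at the device position; the function and variable names are illustrative and are not taken from the patent.

```python
import math

def satellite_azimuth_deg(device_ecef, device_lat_deg, device_lon_deg, sat_ecef):
    """Map one satellite ECEF position into a plane coordinate system whose
    origin is the device position, and return its angle (azimuth) in degrees."""
    lat = math.radians(device_lat_deg)
    lon = math.radians(device_lon_deg)
    dx = sat_ecef[0] - device_ecef[0]
    dy = sat_ecef[1] - device_ecef[1]
    dz = sat_ecef[2] - device_ecef[2]
    # Express the ECEF difference vector in local east/north components.
    east = -math.sin(lon) * dx + math.cos(lon) * dy
    north = (-math.sin(lat) * math.cos(lon) * dx
             - math.sin(lat) * math.sin(lon) * dy
             + math.cos(lat) * dz)
    azimuth = math.degrees(math.atan2(east, north))  # 0 deg = north, clockwise
    return azimuth % 360.0

def included_angles(device_ecef, device_lat_deg, device_lon_deg, sat_ecefs):
    """Return the N included angles, one per satellite position."""
    return [satellite_azimuth_deg(device_ecef, device_lat_deg, device_lon_deg, s)
            for s in sat_ecefs]
```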
104. The terminal equipment determines the maximum included angle according to the N included angles;
in this embodiment, the terminal device determines a maximum included angle according to the N included angles. The maximum included angle is related to a value of N, and if N is an integer less than or equal to 3, the maximum included angle is 0, for example, N included angles are "50 degrees" and "70 degrees", respectively, and the maximum included angle is "0 degree". If N is an integer greater than 3, the maximum included angle is the maximum of the N included angles, for example, if the N included angles are "50 degrees", "70 degrees", "130 degrees", and "200 degrees", respectively, the maximum included angle is "200 degrees".
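A minimal sketch of this step under the rule described in the preceding paragraph (a maximum included angle of 0 when N is less than or equal to 3, and otherwise the largest of the N included angles); the claims also describe forming pairwise angle differences, which is not shown here. The function name is illustrative.

```python
def max_included_angle(angles_deg):
    """Determine the maximum included angle from the N included angles."""
    if len(angles_deg) <= 3:
        return 0.0
    return max(angles_deg)
```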
105. And the terminal equipment determines a target scene corresponding to the terminal equipment at the target moment according to the maximum included angle, wherein the target scene is an indoor scene or an outdoor scene.
In this embodiment, the terminal device determines whether the target scene corresponding to the target time is an indoor scene or an outdoor scene according to the maximum included angle, specifically, if the maximum included angle is greater than 180 degrees, it is determined that the terminal device is in the outdoor scene at the target time, and if the maximum included angle is less than 180 degrees, it is determined that the terminal device is in the indoor scene at the target time. It should be noted that, when the maximum included angle is equal to 180 degrees, it may be determined that the terminal device is in an indoor scene or an outdoor scene at the target time, which is not limited herein.
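The threshold decision described above can be sketched as follows, with 180 degrees as the included angle threshold; since the paragraph notes that a maximum included angle of exactly 180 degrees may be assigned to either scene, that choice is left as a parameter. Names are illustrative.

```python
def determine_scene(max_angle_deg, threshold_deg=180.0, boundary_is_outdoor=True):
    """Classify the target scene from the maximum included angle."""
    if max_angle_deg > threshold_deg:
        return "outdoor"
    if max_angle_deg < threshold_deg:
        return "indoor"
    # Exactly equal to the threshold: the patent allows either assignment.
    return "outdoor" if boundary_is_outdoor else "indoor"
```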
The embodiment of the application provides a method for determining indoor and outdoor scenes, which includes: first obtaining N satellite positions corresponding to a target moment and the device position corresponding to the terminal device at the target moment, then obtaining an included angle between each of the N satellite positions and the device position to obtain N included angles, then determining a maximum included angle according to the N included angles, and finally determining, according to the maximum included angle, a target scene corresponding to the terminal device at the target moment, wherein the target scene is an indoor scene or an outdoor scene. In this way, indoor and outdoor scenes are determined by using the spatial distribution of the satellites, which avoids the influence of environmental factors such as temperature, illumination intensity and indoor light, thereby improving the accuracy of indoor and outdoor scene determination. In addition, when the terminal device is located in a scene, such as near a window or on a balcony, where satellite signals can still be received, the misjudgment rate can be effectively reduced, further improving the accuracy of indoor and outdoor scene determination.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, another optional embodiment provided in the embodiments of the present application may further include:
if the satellite signal is detected at the target moment, the terminal equipment executes the step of acquiring N satellite positions corresponding to the target moment;
if the satellite signal is not detected at the target time and the wireless fidelity (WIFI) signal is detected, the terminal device determines that a target scene corresponding to the terminal device at the target time is an indoor scene;
if the satellite signal is not detected at the target moment and the WIFI signal is not detected, the terminal device determines that the judgment fails.
In this embodiment, a method for preliminarily determining a target scene based on a satellite signal and a WIFI signal is introduced. When determining a target scene where the terminal device is located, the terminal device needs to first determine whether a satellite signal or a wireless fidelity (WIFI) signal can be received at a target time, and then further determine the target scene according to a type of the received signal, where the satellite signal may be a GNSS signal.
For convenience of explanation, please refer to fig. 5, fig. 5 is a schematic overall flow chart of the indoor and outdoor scene determining method in the embodiment of the present application, and as shown in the figure, specifically:
In step A1, the indoor/outdoor scene determination is started.
In step A2, the terminal device determines whether a GNSS signal or a WIFI signal is available at the target time; if so, step A4 is performed, otherwise, if neither a GNSS signal nor a WIFI signal is available, step A3 is performed.
In step A3, the terminal device determines that this round of indoor/outdoor scene determination has failed.
In step A4, the terminal device determines whether a GNSS signal is available at the target time; if the GNSS signal is available, step A6 is performed, otherwise, if the GNSS signal is not available, step A5 is performed.
In step A5, the terminal device continues to determine whether a WIFI signal is available at the target time; if the WIFI signal is available, step A10 is performed, otherwise, if the WIFI signal is not available, step A3 is performed.
In step A6, once the GNSS signals are determined to be available, the terminal device calculates the satellite positions at the current time based on the broadcast ephemeris information.
In step A7, the terminal device calculates the maximum included angle between the satellites.
In step A8, the terminal device determines the indoor or outdoor scene by using the maximum included angle.
In step A9, the terminal device determines whether the maximum included angle is greater than 180 degrees; if the maximum included angle is greater than 180 degrees, step A11 is performed, otherwise, if the maximum included angle is less than 180 degrees, step A10 is performed.
In step A10, the terminal device determines that the target scene is an indoor scene.
In step A11, the terminal device determines that the target scene is an outdoor scene.
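The flow of steps A1 to A11 can be summarized with the sketch below. It reuses the hypothetical helpers from the earlier sketches (included_angles and max_included_angle); signal availability is supplied by the caller as booleans, since the patent does not specify how availability is detected.

```python
def preliminary_scene_determination(gnss_available, wifi_available,
                                    device_ecef=None, device_lat_deg=None,
                                    device_lon_deg=None, sat_ecefs=None):
    """Preliminary indoor/outdoor determination following steps A1-A11."""
    if not gnss_available and not wifi_available:
        return "determination_failed"              # step A3
    if not gnss_available:                         # WIFI only: steps A5 -> A10
        return "indoor"
    # GNSS available: steps A6-A9 use the satellite spatial distribution.
    angles = included_angles(device_ecef, device_lat_deg, device_lon_deg, sat_ecefs)
    max_angle = max_included_angle(angles)
    return "outdoor" if max_angle > 180.0 else "indoor"
```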
Secondly, in the embodiment of the application, a mode of primarily judging the target scene based on the satellite signals and the WIFI signals is provided, and through the mode, different scene judgment results are obtained for different types of signals, so that the flexibility of the scheme is improved.
Optionally, on the basis of each embodiment corresponding to fig. 3, in another optional embodiment provided in this embodiment of the present application, the acquiring, by the terminal device, N satellite positions corresponding to the target time may include:
the method comprises the steps that terminal equipment sends an ephemeris issuing request to a server, so that the server responds to the ephemeris issuing request and obtains broadcast ephemeris information corresponding to a target moment, wherein the broadcast ephemeris information comprises ephemeris parameters corresponding to N satellites;
the method comprises the steps that terminal equipment receives broadcast ephemeris information corresponding to target time sent by a server;
and the terminal equipment determines N satellite positions corresponding to the target time according to the broadcast ephemeris information corresponding to the target time.
In this embodiment, a method for determining the position of a satellite in real time is described. Before calculating the satellite positions, the terminal device needs to send an ephemeris issue request to the server in order to receive the broadcast ephemeris information sent by the server. The broadcast ephemeris information is the broadcast ephemeris information corresponding to the target time and mainly includes a header file and satellite-related parameters. The header file contains basic information of the file, for example, ionospheric parameters, the data type, the time increment caused by leap seconds, and almanac parameters for calculating Coordinated Universal Time (UTC). The satellite-related parameters include the pseudo random noise (PRN) code, the clock time, the satellite clock bias, the orbital eccentricity, the ephemeris reference time, and so on. Based on the above, the terminal device may calculate the satellite position corresponding to each satellite at the target time according to the broadcast ephemeris information corresponding to the target time.
For convenience of illustration, please refer to fig. 6, fig. 6 is a schematic flowchart of a process for determining satellite positions according to an embodiment of the present application, and specifically as shown in the figure:
in step B1, the terminal device sends an ephemeris issue request to the CORS server through a fourth generation mobile communication technology (4G) network or a WIFI network, where the ephemeris issue request may carry a time identifier corresponding to the target time.
In step B2, the CORS server may broadcast real-time broadcast ephemeris information to the terminal device through a 4G network or a WIFI network based on the ephemeris issue request, and the method for sending the broadcast ephemeris information includes, but is not limited to, sending the broadcast ephemeris information in the form of a binary stream, or sending the broadcast ephemeris information in the form of a data packet. The broadcast ephemeris information includes ephemeris parameter tables for different satellites, and each ephemeris parameter table may be represented as a set of ephemeris parameters for calculating the positions of the satellites, that is, the broadcast ephemeris information includes ephemeris parameters corresponding to N satellites.
In step B3, the terminal device calculates satellite positions according to the broadcast ephemeris information, thereby obtaining N satellite positions corresponding to the target time.
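As a rough illustration of steps B1 to B3, the sketch below requests broadcast ephemeris information carrying the target time and hands the returned parameter tables to a caller-supplied position solver. The URL, payload format and JSON encoding are hypothetical assumptions for the sketch only; an actual CORS service might instead stream the ephemeris as a binary stream or data packets, as the patent notes.

```python
import json
import urllib.request

CORS_EPHEMERIS_URL = "https://cors.example.com/ephemeris"  # hypothetical endpoint

def fetch_broadcast_ephemeris(target_time_s):
    """Step B1/B2: send an ephemeris issue request with the target-time
    identifier and receive the broadcast ephemeris information (assumed JSON)."""
    url = f"{CORS_EPHEMERIS_URL}?t={int(target_time_s)}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))

def satellite_positions(target_time_s, ephemeris_info, solver):
    """Step B3: compute one satellite position per ephemeris parameter table
    using a caller-supplied solver(params, target_time_s) function."""
    return {sat_id: solver(params, target_time_s)
            for sat_id, params in ephemeris_info.items()}
```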
More specifically, please refer to fig. 7, which is a schematic diagram of obtaining broadcast ephemeris information in the embodiment of the present application. As shown in the figure, taking the N satellites as the GPS satellite No. G01, the BeiDou satellite No. C01, the GLONASS satellite No. R01 and the GALILEO satellite No. E01 as an example (that is, N is equal to 4), based on the current system time of the terminal device (i.e., the target time), the ephemeris parameter table of each satellite corresponding to that time can be obtained, that is, the ephemeris parameters are obtained, and finally the satellite position corresponding to each satellite is calculated by using the ephemeris parameters of each satellite and the target time.
It should be noted that the satellite position can be calculated from just the current system time of the terminal device and the satellite number. For ease of understanding, please refer to table 1, which illustrates the satellite types and satellite numbers.
TABLE 1
(Table 1 is provided as an image in the original publication; it lists each satellite type together with its corresponding satellite numbers.)
Therefore, each satellite type can correspond to at least one satellite number, the satellite number and the satellite have a unique corresponding relation, and different satellites have corresponding ephemeris parameters.
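A small sketch of this correspondence, using the number prefixes shown in fig. 7 (G for GPS, C for BeiDou, R for GLONASS, E for GALILEO); the helper name is illustrative.

```python
SATELLITE_SYSTEM_BY_PREFIX = {
    "G": "GPS",
    "C": "BeiDou",
    "R": "GLONASS",
    "E": "GALILEO",
}

def satellite_system(satellite_number):
    """Return the satellite system for a satellite number such as 'G01' or 'C01'."""
    return SATELLITE_SYSTEM_BY_PREFIX.get(satellite_number[:1].upper(), "unknown")
```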
Secondly, in the embodiment of the application, a method for determining the positions of satellites in real time is provided, and through the method, a server can issue real-time broadcast ephemeris information to terminal equipment, so that the terminal equipment can determine the position of each satellite at the current moment according to the broadcast ephemeris information. Because the broadcast ephemeris information has higher real-time performance, the satellite position obtained by calculation is more accurate.
Optionally, on the basis of each embodiment corresponding to fig. 3, in another optional embodiment provided by this application embodiment, the determining, by the terminal device, the N satellite positions corresponding to the target time according to the broadcast ephemeris information corresponding to the target time may include:
for an ith satellite of the N satellites, the terminal device obtains ephemeris parameters corresponding to the ith satellite according to broadcast ephemeris information corresponding to a target time, wherein i is an integer which is greater than or equal to 1 and less than or equal to N;
for an ith satellite in the N satellites, the terminal equipment determines the satellite position corresponding to the ith satellite at the target moment according to the ephemeris parameters corresponding to the ith satellite;
and the terminal equipment acquires N satellite positions according to the satellite position corresponding to the ith satellite at the target moment.
In this embodiment, a method for calculating a satellite position based on broadcast ephemeris information is introduced. For N satellites, each satellite can calculate its corresponding satellite position in a similar manner, and therefore, the calculation of the satellite position of the ith satellite will be described as an example. It can be understood that the satellite position calculation method corresponding to other satellites is similar, and therefore, the details are not described herein. After N times of calculation, the satellite positions corresponding to the N satellites at the target moment are obtained, and therefore the N satellite positions are obtained.
For example, for a GPS satellite, a BeiDou satellite or a GALILEO satellite, the satellite position may be calculated as follows. Specifically, based on the ephemeris parameters of the ith satellite, the average angular velocity of the satellite motion may be calculated as follows:
n0 = sqrt(GM / A^3);
n = n0 + Δn;
where GM represents the product of the gravitational constant G and the total mass M of the earth, A represents the square of the orbital semi-major-axis square root sqrt(A) included in the ephemeris parameters, n0 represents the average angular velocity at the ephemeris reference time (TOE), Δn represents the perturbation correction included in the ephemeris parameters, and n represents the average angular velocity of the satellite at the target time.
The mean anomaly of the satellite at the time of signal transmission is calculated as follows:
Δt = a0 + a1·(t' − toc) + a2·(t' − toc)^2;
t = t' − Δt;
tk = t − toe;
M = M0 + n·tk;
where Δt represents the satellite clock error, a0 represents the satellite clock bias included in the ephemeris parameters, a1 represents the satellite clock drift included in the ephemeris parameters, a2 represents the satellite clock drift rate included in the ephemeris parameters, t' represents the GPS second of week corresponding to the target time, toc represents the GPS second of week corresponding to the clock reference time, t represents the time corrected by the satellite clock, toe represents the ephemeris reference time, tk represents the normalized time, M0 represents the mean anomaly at the TOE, and M represents the mean anomaly of the satellite at the time of signal transmission.
The eccentric anomaly and the true anomaly are calculated as follows:
E = M + e·sin(E);
Vk = arctan[ sqrt(1 − e^2)·sin(E) / (cos(E) − e) ];
where E represents the eccentric anomaly, whose initial value is M and which converges after several iterations, e represents the satellite orbital eccentricity included in the ephemeris parameters, and Vk represents the true anomaly.
The argument of latitude is calculated as follows:
u = ω + Vk;
where u represents the argument of latitude and ω represents the argument of perigee included in the ephemeris parameters.
The perturbation correction terms are calculated as follows:
δu = Cuc·cos(2u) + Cus·sin(2u);
δr = Crc·cos(2u) + Crs·sin(2u);
δi = Cic·cos(2u) + Cis·sin(2u);
where Cuc, Cus, Crc, Crs, Cic and Cis represent the six perturbation correction parameters included in the ephemeris parameters, δu represents the perturbation correction term of the argument of latitude, δr represents the perturbation correction term of the satellite radius, and δi represents the perturbation correction term of the satellite orbital inclination.
The perturbation-corrected argument of latitude, satellite radius and orbital inclination are calculated as follows:
uk = u + δu;
rk = a·(1 − e·cos(E)) + δr;
ik = i0 + δi + İ·tk;
where a represents the orbital semi-major axis of the satellite, i0 represents the orbital inclination at the TOE, İ represents the rate of change of the orbital inclination, uk represents the perturbation-corrected argument of latitude, rk represents the perturbation-corrected satellite radius, and ik represents the perturbation-corrected orbital inclination.
The coordinates of the satellite in the orbital plane coordinate system are calculated as follows:
x = rk·cos(uk);
y = rk·sin(uk);
where x represents the abscissa and y represents the ordinate of the satellite in the orbital plane coordinate system.
The longitude of the ascending node at the transmission time is calculated as follows:
L = Ω0 + Ω̇·tk − ωe·(tk + toe);
where L represents the longitude of the ascending node at the transmission time, Ω0 represents the right ascension of the ascending node at the TOE, Ω̇ represents the rate of change of the ascending node longitude with respect to time, and ωe represents the earth rotation rate.
The coordinates of the satellite under ECEF are calculated as follows:
xi = x·cos(L) − y·cos(ik)·sin(L);
yi = x·sin(L) + y·cos(ik)·cos(L);
zi = y·sin(ik);
where xi represents the X-axis coordinate value of the ith satellite under ECEF, yi represents the Y-axis coordinate value of the ith satellite under ECEF, and zi represents the Z-axis coordinate value of the ith satellite under ECEF. That is, the satellite position corresponding to the ith satellite at the target time is expressed as Pi = (xi, yi, zi).
At the target time, the terminal device receives N valid satellite signals (a satellite signal is considered valid when its carrier-to-noise ratio is greater than 7.0 dB-Hz), and the satellite position of each satellite can be calculated for the target time through the above formulas.
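The Keplerian computation above can be sketched in Python as follows. The ephemeris field names are illustrative, the GM value shown is the one used by GPS, and refinements of the full broadcast models (week rollovers, relativistic clock terms, the special rotation for BeiDou GEO satellites, etc.) are omitted.

```python
import math

GM = 3.986005e14           # earth gravitational constant used by GPS (m^3/s^2)
OMEGA_E = 7.2921151467e-5  # earth rotation rate (rad/s)

def kepler_satellite_ecef(eph, t_prime):
    """Compute the ECEF position (m) of a GPS/BeiDou/GALILEO satellite at the
    second-of-week t_prime from its broadcast ephemeris parameters `eph`."""
    # Satellite clock correction and normalized time
    dt = eph["a0"] + eph["a1"] * (t_prime - eph["toc"]) + eph["a2"] * (t_prime - eph["toc"]) ** 2
    t = t_prime - dt
    tk = t - eph["toe"]

    # Mean motion and mean anomaly
    A = eph["sqrtA"] ** 2
    n = math.sqrt(GM / A ** 3) + eph["delta_n"]
    M = eph["M0"] + n * tk

    # Eccentric anomaly by fixed-point iteration, then true anomaly
    e = eph["e"]
    E = M
    for _ in range(10):
        E = M + e * math.sin(E)
    v = math.atan2(math.sqrt(1.0 - e * e) * math.sin(E), math.cos(E) - e)

    # Argument of latitude, radius and inclination with perturbation corrections
    u0 = eph["omega"] + v
    du = eph["Cuc"] * math.cos(2 * u0) + eph["Cus"] * math.sin(2 * u0)
    dr = eph["Crc"] * math.cos(2 * u0) + eph["Crs"] * math.sin(2 * u0)
    di = eph["Cic"] * math.cos(2 * u0) + eph["Cis"] * math.sin(2 * u0)
    u = u0 + du
    r = A * (1.0 - e * math.cos(E)) + dr
    i = eph["i0"] + di + eph["idot"] * tk

    # Orbital-plane coordinates and longitude of the ascending node
    x_op, y_op = r * math.cos(u), r * math.sin(u)
    L = eph["Omega0"] + eph["Omega_dot"] * tk - OMEGA_E * (tk + eph["toe"])

    # Rotate into the ECEF frame
    x = x_op * math.cos(L) - y_op * math.cos(i) * math.sin(L)
    y = x_op * math.sin(L) + y_op * math.cos(i) * math.cos(L)
    z = y_op * math.sin(i)
    return (x, y, z)
```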
For example, for GLONASS satellites, the satellite positions can be calculated as follows, specifically, the ephemeris parameters of the GLONASS satellites are:
P eph (t b )=(x,y,z,v x ,v y ,v z ,a x ,a y ,a znn );
wherein, t b Representing the reference time of the GLONASS satellite, x, y and z representing the reference time t b ECEF coordinate value, v, of a time GLONASS satellite x 、v y And v z Is shown at a reference instant t b ECEF velocity value of a time GLONASS satellite, a x 、a y And a z Is shown at a reference instant t b ECEF acceleration value, tau, of a geostationary GLONASS satellite n And gamma n For calculating GLONASS satellite clock error.
GLONASS satellite positions r in the ECEF (PZ-90) coordinate system s (t)=(x,y,z) T And velocity v s (t)=(v x ,v y ,v z ) T The differential equation with respect to time is:
Figure BDA0002818635450000131
Figure BDA0002818635450000132
Figure BDA0002818635450000133
Figure BDA0002818635450000134
Figure BDA0002818635450000135
wherein, a e Represents the semi-major axis (6378136.0m) of the earth, and μ represents the earth's gravitational constant, which is 398600.44 × 10 9 m3/s2,ω e Representing the rotational angular velocity of the earth, and having a value of 7.292115 × 10 -5 rad/s,J 2 Representing second order band harmonic coefficients representing the oblateness of the earth, with values of 1082625.7 × 10 -9
The GLONASS satellite and velocity values can be solved by adopting a four-order Runge-Kutta (RK4) numerical integration method, and the specific steps are as follows:
First, the integration step is determined: if t_Rx − t_b > 0, the integration step dh is 60 s; if t_Rx − t_b < 0, the integration step dh is −60 s. The last integration step is
dh' = (t_Rx − t_b) − n·dh;
Then, taking t_b as the initial moment, the integration is stepped forward by dh until t_Rx is reached, and the number of integrations is
n = Int[(t_Rx − t_b)/dh];
where Int is the rounding operation. That is, n loop operations are performed, and each loop uses the following formulas:
x_{i+1} = x_i + (dh/6)·(k_11 + 2·k_12 + 2·k_13 + k_14);
y_{i+1} = y_i + (dh/6)·(k_21 + 2·k_22 + 2·k_23 + k_24);
z_{i+1} = z_i + (dh/6)·(k_31 + 2·k_32 + 2·k_33 + k_34);
v_x,i+1 = v_x,i + (dh/6)·(k_41 + 2·k_42 + 2·k_43 + k_44);
v_y,i+1 = v_y,i + (dh/6)·(k_51 + 2·k_52 + 2·k_53 + k_54);
v_z,i+1 = v_z,i + (dh/6)·(k_61 + 2·k_62 + 2·k_63 + k_64);
When i = 1, the position and velocity in P_eph(t_b) are used as the initial values, where k_11 ~ k_61 are calculated using the formulas:
k_11 = v_x,i, k_21 = v_y,i, k_31 = v_z,i;
k_41 = f_1(x_i, y_i, z_i, v_x,i, v_y,i, v_z,i, a_x);
k_51 = f_2(x_i, y_i, z_i, v_x,i, v_y,i, v_z,i, a_y);
k_61 = f_3(x_i, y_i, z_i, v_x,i, v_y,i, v_z,i, a_z);
k_12 ~ k_62 are calculated using the formulas:
k_12 = v_x,i + (dh/2)·k_41, k_22 = v_y,i + (dh/2)·k_51, k_32 = v_z,i + (dh/2)·k_61;
k_42 = f_1(x_i + (dh/2)·k_11, y_i + (dh/2)·k_21, z_i + (dh/2)·k_31, v_x,i + (dh/2)·k_41, v_y,i + (dh/2)·k_51, v_z,i + (dh/2)·k_61, a_x);
k_52 = f_2(x_i + (dh/2)·k_11, y_i + (dh/2)·k_21, z_i + (dh/2)·k_31, v_x,i + (dh/2)·k_41, v_y,i + (dh/2)·k_51, v_z,i + (dh/2)·k_61, a_y);
k_62 = f_3(x_i + (dh/2)·k_11, y_i + (dh/2)·k_21, z_i + (dh/2)·k_31, v_x,i + (dh/2)·k_41, v_y,i + (dh/2)·k_51, v_z,i + (dh/2)·k_61, a_z);
k_13 ~ k_63 are calculated using the formulas:
k_13 = v_x,i + (dh/2)·k_42, k_23 = v_y,i + (dh/2)·k_52, k_33 = v_z,i + (dh/2)·k_62;
k_43 = f_1(x_i + (dh/2)·k_12, y_i + (dh/2)·k_22, z_i + (dh/2)·k_32, v_x,i + (dh/2)·k_42, v_y,i + (dh/2)·k_52, v_z,i + (dh/2)·k_62, a_x);
k_53 = f_2(x_i + (dh/2)·k_12, y_i + (dh/2)·k_22, z_i + (dh/2)·k_32, v_x,i + (dh/2)·k_42, v_y,i + (dh/2)·k_52, v_z,i + (dh/2)·k_62, a_y);
k_63 = f_3(x_i + (dh/2)·k_12, y_i + (dh/2)·k_22, z_i + (dh/2)·k_32, v_x,i + (dh/2)·k_42, v_y,i + (dh/2)·k_52, v_z,i + (dh/2)·k_62, a_z);
k_14 ~ k_64 are calculated using the formulas:
k_14 = v_x,i + dh·k_43, k_24 = v_y,i + dh·k_53, k_34 = v_z,i + dh·k_63;
k_44 = f_1(x_i + dh·k_13, y_i + dh·k_23, z_i + dh·k_33, v_x,i + dh·k_43, v_y,i + dh·k_53, v_z,i + dh·k_63, a_x);
k_54 = f_2(x_i + dh·k_13, y_i + dh·k_23, z_i + dh·k_33, v_x,i + dh·k_43, v_y,i + dh·k_53, v_z,i + dh·k_63, a_y);
k_64 = f_3(x_i + dh·k_13, y_i + dh·k_23, z_i + dh·k_33, v_x,i + dh·k_43, v_y,i + dh·k_53, v_z,i + dh·k_63, a_z);
by combining the above equations, the satellite position corresponding to the ith satellite at the target time can be obtained, which is expressed as:
r_i = (x_i, y_i, z_i)^T;
At the target time, the terminal device receives N effective satellite signals (that is, a satellite signal whose carrier-to-noise ratio is greater than 7.0 dBHz is regarded as effective), and the satellite position of each satellite can be calculated for the target time through the above formulas.
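For illustration only, a minimal sketch of the RK4 propagation for the GLONASS branch is shown below, assuming the J2-perturbed equations of motion and the 60 s step selection described above; the handling of the final fractional step, the constants layout and all names are illustrative assumptions rather than a definitive implementation.

```python
import math

MU = 398600.44e9        # Earth's gravitational constant, m^3/s^2
A_E = 6378136.0         # Earth's semi-major axis, m
OMEGA_E = 7.292115e-5   # Earth's rotation rate, rad/s
J2 = 1082625.7e-9       # second-order zonal harmonic coefficient

def derivatives(state, acc):
    """Time derivatives of the PZ-90 state (x, y, z, vx, vy, vz)."""
    x, y, z, vx, vy, vz = state
    ax, ay, az = acc                      # luni-solar accelerations from the ephemeris
    r = math.sqrt(x * x + y * y + z * z)
    c1 = -MU / r**3
    c2 = -1.5 * J2 * MU * A_E**2 / r**5
    dvx = c1 * x + c2 * x * (1 - 5 * z * z / (r * r)) + OMEGA_E**2 * x + 2 * OMEGA_E * vy + ax
    dvy = c1 * y + c2 * y * (1 - 5 * z * z / (r * r)) + OMEGA_E**2 * y - 2 * OMEGA_E * vx + ay
    dvz = c1 * z + c2 * z * (3 - 5 * z * z / (r * r)) + az
    return [vx, vy, vz, dvx, dvy, dvz]

def rk4_propagate(state, acc, t_b, t_rx):
    """Step the state from the reference time t_b to the reception time t_rx."""
    dh = 60.0 if t_rx >= t_b else -60.0
    n = int((t_rx - t_b) / dh)            # number of whole integration steps
    last = (t_rx - t_b) - n * dh          # fractional final step (an assumption here)
    for step in [dh] * n + [last]:
        k1 = derivatives(state, acc)
        k2 = derivatives([s + 0.5 * step * k for s, k in zip(state, k1)], acc)
        k3 = derivatives([s + 0.5 * step * k for s, k in zip(state, k2)], acc)
        k4 = derivatives([s + step * k for s, k in zip(state, k3)], acc)
        state = [s + step / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4)]
    return state[:3], state[3:]           # position (x, y, z) and velocity (vx, vy, vz)
```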
In the embodiment of the present application, a method for calculating satellite positions based on broadcast ephemeris information is provided, and in the above manner, a terminal device may calculate a satellite position corresponding to each satellite by using the broadcast ephemeris information and a target time, so as to improve feasibility and operability of a scheme.
Optionally, on the basis of each of the embodiments corresponding to fig. 3, in another optional embodiment provided in this embodiment of the present application, N is an integer greater than or equal to 4;
the obtaining, by the terminal device, a device position corresponding to the terminal device at the target time may include:
the method comprises the steps that terminal equipment obtains first signal information, wherein the first signal information comprises coordinate values of a first satellite and time when a signal of the first satellite reaches the terminal equipment;
the terminal equipment acquires second signal information, wherein the second signal information comprises coordinate values of a second satellite and time for a signal of the second satellite to reach the terminal equipment;
the terminal equipment acquires third signal information, wherein the third signal information comprises coordinate values of a third satellite and time for reaching the terminal equipment by the signal of the third satellite;
the terminal equipment acquires fourth signal information, wherein the fourth signal information comprises coordinate values of a fourth satellite and time for a signal of the fourth satellite to reach the terminal equipment;
and the terminal equipment determines the equipment position corresponding to the terminal equipment at the target moment according to the first signal information, the second signal information, the third signal information and the fourth signal information.
In this embodiment, a method for implementing device positioning based on satellite positioning is introduced. In the satellite positioning process, signal information corresponding to at least four satellites at the target time needs to be obtained, that is, first signal information, second signal information, third signal information and fourth signal information are obtained. The first signal information includes the coordinate values (e.g., (x1, y1, z1)) of the first satellite (e.g., satellite 1) and the time (e.g., V_t1) at which the signal of the first satellite reaches the terminal device. The second signal information includes the coordinate values (e.g., (x2, y2, z2)) of the second satellite (e.g., satellite 2) and the time (e.g., V_t2) at which the signal of the second satellite reaches the terminal device. The third signal information includes the coordinate values (e.g., (x3, y3, z3)) of the third satellite (e.g., satellite 3) and the time (e.g., V_t3) at which the signal of the third satellite reaches the terminal device. The fourth signal information includes the coordinate values (e.g., (x4, y4, z4)) of the fourth satellite (e.g., satellite 4) and the time (e.g., V_t4) at which the signal of the fourth satellite reaches the terminal device.
Specifically, the device position of the terminal device is determined by spatial distance resection, using the instantaneous positions of the high-speed moving satellites as known calculation data. For convenience of introduction, please refer to fig. 8, which is a schematic diagram of determining a device position based on a global positioning system in the embodiment of the present application. As shown in the figure, assuming that a satellite positioning system receiver is disposed at the ground point to be measured, the time Δt taken for a signal of the BeiDou satellite positioning system to arrive at the receiver can be measured, and in combination with other data such as the satellite ephemeris received by the receiver, the following four equations can be determined:
[(x1 − x)² + (y1 − y)² + (z1 − z)²]^(1/2) + c·(V_t1 − V_t0) = d1;
[(x2 − x)² + (y2 − y)² + (z2 − z)²]^(1/2) + c·(V_t2 − V_t0) = d2;
[(x3 − x)² + (y3 − y)² + (z3 − z)²]^(1/2) + c·(V_t3 − V_t0) = d3;
[(x4 − x)² + (y4 − y)² + (z4 − z)²]^(1/2) + c·(V_t4 − V_t0) = d4;
wherein x1, y1 and z1 denote the coordinate values of the first satellite on the X, Y and Z axes, x2, y2 and z2 denote the coordinate values of the second satellite on the X, Y and Z axes, x3, y3 and z3 denote the coordinate values of the third satellite on the X, Y and Z axes, x4, y4 and z4 denote the coordinate values of the fourth satellite on the X, Y and Z axes, x, y and z denote the coordinate values of the device position on the X, Y and Z axes, c denotes the propagation velocity of the GPS signal (i.e., the speed of light), d1 denotes the distance from the first satellite to the satellite positioning system receiver, d2 denotes the distance from the second satellite to the satellite positioning system receiver, d3 denotes the distance from the third satellite to the satellite positioning system receiver, d4 denotes the distance from the fourth satellite to the satellite positioning system receiver, V_t1 denotes the clock error of the first satellite, V_t2 denotes the clock error of the second satellite, V_t3 denotes the clock error of the third satellite, V_t4 denotes the clock error of the fourth satellite, and V_t0 denotes the clock error of the satellite positioning system receiver.
After the above four equations are combined, the corresponding device location at the target time (i.e., (x, y, z)) can be solved, and based on this, the device location (i.e., (x, y, z)) can be expressed as the corresponding device location under ECEF, that is:
r_u = (x_u, y_u, z_u)^T;
wherein r_u indicates the device position corresponding to the terminal device under ECEF, x_u represents the X-axis coordinate value corresponding to the terminal device under ECEF, y_u represents the Y-axis coordinate value corresponding to the terminal device under ECEF, and z_u represents the Z-axis coordinate value corresponding to the terminal device under ECEF.
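A compact way to solve the four range equations numerically is an iterative linearization (Gauss-Newton) over the unknowns (x, y, z, V_t0). The sketch below illustrates that approach under this assumption; it is one possible reading of "combining the four equations", not the only one, and the names are illustrative.

```python
import numpy as np

C = 299792458.0  # propagation speed of the satellite signal (speed of light), m/s

def solve_position(sats, d, v_t, x0=(0.0, 0.0, 0.0, 0.0), iters=10):
    """Solve the four range equations for (x, y, z) and the receiver clock error V_t0.

    sats : four satellite ECEF coordinates [(x1, y1, z1), ...]
    d    : four measured distances d1..d4 (m)
    v_t  : four satellite clock errors V_t1..V_t4 (s)
    """
    state = np.array(x0, dtype=float)          # (x, y, z, V_t0)
    for _ in range(iters):
        pos, v_t0 = state[:3], state[3]
        residuals, jac = [], []
        for (sx, sy, sz), di, vti in zip(sats, d, v_t):
            rho = np.linalg.norm(np.array([sx, sy, sz]) - pos)
            residuals.append(rho + C * (vti - v_t0) - di)
            # Partial derivatives of the i-th equation w.r.t. (x, y, z, V_t0)
            jac.append([(pos[0] - sx) / rho,
                        (pos[1] - sy) / rho,
                        (pos[2] - sz) / rho,
                        -C])
        delta = np.linalg.solve(np.array(jac), -np.array(residuals))
        state = state + delta
    return state[:3], state[3]                  # device position and receiver clock error
```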
Secondly, in the embodiment of the present application, a method for implementing device positioning based on satellite positioning is provided, and by the above method, at least four satellites are used to position a terminal device, and each satellite has the characteristics of high precision and good reliability, so that the device position corresponding to the terminal device at a target moment can be calculated according to the signal information of the four satellites, thereby improving the feasibility of the scheme.
Optionally, on the basis of each embodiment corresponding to fig. 3, in another optional embodiment provided in this application embodiment, the obtaining, by the terminal device, a device position corresponding to the terminal device at the target time may include:
the method comprises the steps that terminal equipment acquires first position information, wherein the first position information comprises position information of a first base station and a first distance between the terminal equipment and the first base station;
the terminal equipment acquires second position information, wherein the second position information comprises position information of a second base station and a second distance between the terminal equipment and the second base station;
the terminal equipment acquires third position information, wherein the third position information comprises position information of a third base station and a third distance between the terminal equipment and the third base station;
and the terminal equipment determines the equipment position corresponding to the terminal equipment at the target moment according to the first position information, the second position information and the third position information.
In this embodiment, a method for implementing device location based on base station location is introduced. In the process of positioning the base station, at least the position information corresponding to the three base stations at the target time needs to be obtained, that is, the first position information, the second position information, and the third position information are obtained, where the first position information includes the position information of the first base station (e.g., base station a) and a first distance (e.g., d1) between the terminal device and the first base station. The second location information includes location information of a second base station (e.g., base station B) and a second distance (e.g., d2) between the terminal device and the second base station. The third location information includes location information of a third base station (e.g., base station C) and a third distance (e.g., d3) between the terminal device and the third base station.
Specifically, the device position of the terminal device is determined by a three-base-station positioning method based on the radio wave propagation time, such as Time of Arrival (TOA) or Time Difference of Arrival (TDOA). For convenience of introduction, please refer to fig. 9, which is a schematic diagram of determining a device position based on base station positioning in the embodiment of the present application. As shown in the figure, the position information of base station A (i.e., (x1, y1, z1)), the position information of base station B (i.e., (x2, y2, z2)) and the position information of base station C (i.e., (x3, y3, z3)) are obtained, and the target point, which is the device position corresponding to the terminal device at the target time, is determined by the intersection point of the three circles. The X-axis coordinate value and the Y-axis coordinate value of the device position are solved as follows:
2·(x_1 − x_3)·x + 2·(y_1 − y_3)·y = γ_1;
2·(x_2 − x_3)·x + 2·(y_2 − y_3)·y = γ_2;

wherein x represents the X-axis coordinate value of the device position, y represents the Y-axis coordinate value of the device position, x_1 and y_1 represent the X-axis and Y-axis coordinate values of base station A, x_2 and y_2 represent the X-axis and Y-axis coordinate values of base station B, x_3 and y_3 represent the X-axis and Y-axis coordinate values of base station C, γ_1 denotes a first intermediate quantity, and γ_2 denotes a second intermediate quantity, where:

γ_1 = x_1² − x_3² + y_1² − y_3² + d_3² − d_1²;
γ_2 = x_2² − x_3² + y_2² − y_3² + d_3² − d_2²;

wherein d_1 represents the first distance between the terminal device and base station A, d_2 represents the second distance between the terminal device and base station B, and d_3 represents the third distance between the terminal device and base station C.
After the above equations are solved simultaneously, the device position (i.e., (x, y)) corresponding to the target time can be obtained, and based on this, the device position (i.e., (x, y)) can be expressed as the corresponding device position under ECEF, that is:
r_u = (x_u, y_u)^T;
wherein r_u indicates the device position corresponding to the terminal device under ECEF, x_u represents the X-axis coordinate value corresponding to the terminal device under ECEF, and y_u represents the Y-axis coordinate value corresponding to the terminal device under ECEF.
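The three-circle intersection reduces to a 2x2 linear system as in the formulas above. The sketch below illustrates this; the numerical example at the end is invented purely as a consistency check and is not taken from the application.

```python
import numpy as np

def trilaterate(p1, d1, p2, d2, p3, d3):
    """Solve the planar device position from three base stations.

    p1, p2, p3 : (x, y) coordinates of base stations A, B and C
    d1, d2, d3 : measured distances from the device to each base station
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Intermediate quantities obtained by subtracting the third circle equation
    g1 = x1**2 - x3**2 + y1**2 - y3**2 + d3**2 - d1**2
    g2 = x2**2 - x3**2 + y2**2 - y3**2 + d3**2 - d2**2
    A = np.array([[2 * (x1 - x3), 2 * (y1 - y3)],
                  [2 * (x2 - x3), 2 * (y2 - y3)]])
    x, y = np.linalg.solve(A, np.array([g1, g2]))
    return x, y

# Hypothetical check: a device at (3, 4) with exact distances to three stations
print(trilaterate((0, 0), 5.0, (10, 0), np.hypot(7, 4), (0, 10), np.hypot(3, 6)))
```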
Secondly, in the embodiment of the application, a mode for realizing equipment positioning based on base station positioning is provided, through the above mode, the terminal equipment can be positioned by using three base stations, and the positioning mode has the advantages of high positioning speed, low cost (no need of adding extra hardware on the terminal equipment), low power consumption, indoor availability and the like, and belongs to a lightweight positioning method, so that the feasibility of the scheme is improved.
Optionally, on the basis of each embodiment corresponding to fig. 3, in another optional embodiment provided in this embodiment of the present application, the obtaining, by the terminal device, an included angle between each satellite position in the N satellite positions and the device position to obtain N included angles may include:
the terminal equipment takes the equipment position as a coordinate origin to construct a plane coordinate system;
the method comprises the steps that terminal equipment obtains longitude information and latitude information corresponding to the terminal equipment;
for the ith satellite position of the N satellite positions, the terminal equipment determines a coordinate value of the ith satellite in a plane coordinate system according to longitude information and latitude information corresponding to the terminal equipment, the equipment position and the ith satellite position, wherein i is an integer which is greater than or equal to 1 and less than or equal to N;
the terminal equipment determines an included angle between the ith satellite and the equipment position according to the coordinate value of the ith satellite in the plane coordinate system;
and the terminal equipment acquires N included angles according to the included angle between the ith satellite and the equipment position.
In this embodiment, a way of calculating the included angle between a satellite position and the device position is introduced. After the satellite positions and the device position are obtained, the X-axis coordinate values (e.g., x_u for the device position) and the Y-axis coordinate values (e.g., y_u for the device position) are extracted from them. Based on this, a plane coordinate system S is established with the device position as the coordinate origin, the north direction as the Y axis and the east direction as the X axis. The coordinate values of the N satellites are converted into the plane coordinate system S, namely:
r_Si = (x_Si, y_Si)^T, with
x_Si = −sin(λ)·(x_i − x_u) + cos(λ)·(y_i − y_u);
y_Si = −sin(φ)·cos(λ)·(x_i − x_u) − sin(φ)·sin(λ)·(y_i − y_u) + cos(φ)·(z_i − z_u);

wherein r_Si denotes the coordinate values of the ith satellite in the plane coordinate system S, x_Si represents the X-axis coordinate value of the ith satellite under the plane coordinate system S, y_Si represents the Y-axis coordinate value of the ith satellite under the plane coordinate system S, x_u, y_u and z_u represent the X-axis, Y-axis and Z-axis coordinate values of the device position under ECEF, x_i, y_i and z_i represent the X-axis, Y-axis and Z-axis coordinate values of the ith satellite position under ECEF, λ represents the longitude information of the terminal device, and φ represents the latitude information of the terminal device.
Based on this, the included angle between the Y axis and the line connecting the ith satellite with the device position can be calculated as:
θ_i = arctan2(x_Si, y_Si);
wherein θ_i represents the included angle between the ith satellite and the device position.
For convenience of introduction, please refer to fig. 10, which is a schematic diagram of N included angles in the plane coordinate system in the embodiment of the present application. As shown in the figure, five satellites are taken as an example, namely satellite 1, satellite 2, satellite 3, satellite 4 and satellite 5, where u represents the device position of the terminal device in the plane coordinate system S, that is, the coordinate origin. The included angle between the Y axis and the line connecting satellite 1 with the device position is θ_1, the included angle between the Y axis and the line connecting satellite 2 with the device position is θ_2, the included angle between the Y axis and the line connecting satellite 3 with the device position is θ_3, the included angle between the Y axis and the line connecting satellite 4 with the device position is θ_4, and the included angle between the Y axis and the line connecting satellite 5 with the device position is θ_5.
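The mapping from ECEF into the plane coordinate system S and the angle measured from the Y (north) axis can be sketched as follows. Since the exact rotation used in the embodiment is only partially legible, this assumes the standard east/north projection and should be read as an illustrative reconstruction; all names are assumptions.

```python
import math

def satellite_azimuth(sat_ecef, dev_ecef, lon, lat):
    """Angle between the device-to-satellite direction and the Y (north) axis.

    sat_ecef, dev_ecef : (x, y, z) ECEF coordinates of the satellite and the device
    lon, lat           : device longitude and latitude, in radians
    """
    dx = sat_ecef[0] - dev_ecef[0]
    dy = sat_ecef[1] - dev_ecef[1]
    dz = sat_ecef[2] - dev_ecef[2]
    # Project onto the local horizontal plane: X axis east, Y axis north
    x_s = -math.sin(lon) * dx + math.cos(lon) * dy
    y_s = (-math.sin(lat) * math.cos(lon) * dx
           - math.sin(lat) * math.sin(lon) * dy
           + math.cos(lat) * dz)
    # Angle measured from the Y axis, in degrees within [0, 360)
    return math.degrees(math.atan2(x_s, y_s)) % 360.0
```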
Secondly, in the embodiment of the application, a way of calculating an included angle between a satellite position and an equipment position is provided, and through the way, satellite information of a satellite can be mapped to a plane coordinate system based on a constructed plane coordinate system, so that the included angle between each satellite and the equipment position is obtained, and feasibility of a scheme is improved.
Optionally, on the basis of each embodiment corresponding to fig. 3, in another optional embodiment provided in this embodiment of the present application, the determining, by the terminal device, the maximum included angle according to the N included angles may include:
if N is an integer less than or equal to 3, the terminal equipment determines that the maximum included angle in the N included angles is 0;
and if N is an integer larger than 3, the terminal equipment determines the maximum included angle according to K included angle difference values in the N included angles, wherein the K included angle difference values are the difference values between every two included angles in the N included angles, and K is an integer larger than or equal to 6.
In this embodiment, a method for determining the maximum included angle according to the N included angles is described. After the included angles between the Y axis and the lines connecting the N satellites with the device position are obtained, the angle set θ = {θ_1, θ_2, ..., θ_N} is obtained, and the indoor/outdoor scene discrimination quantity is therefore constructed, that is, the maximum included angle is calculated as follows:
θ_max = max{|θ_i − θ_j|}, i = 1, 2, ..., N, j = 1, 2, ..., N, i ≠ j, N > 3;

wherein θ_max represents the maximum included angle between the satellites visible to the user at the target time, θ_i represents the included angle between the ith satellite and the device position, θ_j represents the included angle between the jth satellite and the device position, and |θ_i − θ_j| represents the included angle difference between θ_i and θ_j.

Since the difference between every two included angles needs to be calculated when N is greater than 3 (that is, when N is greater than or equal to 4), if N is equal to 4 there are the included angles θ_1, θ_2, θ_3 and θ_4, and thus there are at least six cases, namely the included angle differences |θ_1 − θ_2|, |θ_1 − θ_3|, |θ_1 − θ_4|, |θ_2 − θ_3|, |θ_2 − θ_4| and |θ_3 − θ_4|.
Secondly, in the embodiment of the application, a mode of determining the maximum included angle according to the N included angles is provided, and by the above mode, the maximum included angle can be determined based on the size of the N value, and for the case that N is less than or equal to 3, the maximum included angle can be directly determined to be 0 degree, and for the case that N is greater than 3, the maximum included angle can be determined from the difference value of the K included angles, thereby improving the feasibility and operability of the scheme.
Optionally, on the basis of each embodiment corresponding to fig. 3, in another optional embodiment provided in this embodiment of the present application, the determining, by the terminal device, the maximum included angle according to K included angle differences among the N included angles may include:
for an ith included angle and a jth included angle in the N included angles, the terminal equipment determines an angle difference value between the ith included angle and the jth included angle, wherein i and j are integers which are greater than or equal to 1 and less than or equal to N, and i and j are unequal;
the terminal equipment obtains K included angle difference values according to the ith included angle and the jth included angle;
and the terminal equipment selects the maximum value from the K included angle difference values as the maximum included angle.
In this embodiment, a method for determining the maximum included angle based on two included angles is introduced. As described in the previous embodiment, the maximum included angle is calculated as follows:
θ_max = max{|θ_i − θ_j|}, i = 1, 2, ..., N, j = 1, 2, ..., N, i ≠ j, N > 3;

wherein θ_max represents the maximum included angle between the satellites visible to the user at the target time, θ_i represents the included angle between the ith satellite and the device position, θ_j represents the included angle between the jth satellite and the device position, and |θ_i − θ_j| represents the included angle difference between θ_i and θ_j.

Specifically, taking N equal to 4 as an example, it is assumed that the included angle difference |θ_1 − θ_2| is 50 degrees, |θ_1 − θ_3| is 70 degrees, |θ_1 − θ_4| is 85 degrees, |θ_2 − θ_3| is 120 degrees, |θ_2 − θ_4| is 100 degrees, and |θ_3 − θ_4| is 150 degrees. The maximum included angle θ_max obtained therefrom is 150 degrees.
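A minimal sketch of the maximum-included-angle computation, including the N <= 3 shortcut described in the previous embodiment, is shown below; the sample angles in the usage line are invented for illustration and are unrelated to the 150-degree example above.

```python
def max_included_angle(angles):
    """Maximum pairwise included-angle difference, in degrees.

    Mirrors the description above: returns 0 when fewer than four
    angles are available (N <= 3), otherwise the largest |theta_i - theta_j|.
    """
    if len(angles) <= 3:
        return 0.0
    return max(abs(a - b) for i, a in enumerate(angles) for b in angles[i + 1:])

# Illustrative values only: the largest pairwise difference here is |20 - 300| = 280 degrees
print(max_included_angle([20.0, 95.0, 210.0, 300.0]))
```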
In the embodiment of the application, a mode for determining the maximum included angle based on two included angles is provided. In the above mode, under the condition that N is greater than 3, the included angle difference between every two included angles needs to be calculated respectively, so that the difference between any two included angles is not omitted and a more accurate maximum included angle can be obtained.
Optionally, on the basis of each embodiment corresponding to fig. 3, in another optional embodiment provided by this embodiment of the present application, the determining, by the terminal device, a target scene corresponding to the terminal device at the target time according to the maximum included angle may include:
if the maximum included angle is larger than the included angle threshold value, the terminal equipment determines that a target scene corresponding to the terminal equipment at the target moment is an outdoor scene;
and if the maximum included angle is smaller than the included angle threshold value, the terminal equipment determines that the target scene corresponding to the terminal equipment at the target moment is an indoor scene.
In this embodiment, a method for determining the target scene according to the maximum included angle is described. After determining the maximum included angle, the terminal device determines whether the maximum included angle is greater than an included angle threshold. In this application, the included angle threshold is taken as 180 degrees as an example; it should be noted that the included angle threshold may also be 180 ± 1 degrees. If the maximum included angle is greater than the included angle threshold, it is determined that the target scene corresponding to the terminal device at the target time is an outdoor scene; if the maximum included angle is smaller than the included angle threshold, it is determined that the target scene corresponding to the terminal device at the target time is an indoor scene.
Exemplarily, in practical application, the scene type of the target scene may be determined by combining the WIFI signal strength and the maximum included angle, for example, if the maximum included angle is greater than the included angle threshold and the WIFI signal is less than the signal strength threshold, it is determined that the target scene corresponding to the terminal device at the target moment is the outdoor scene. And if the maximum included angle is larger than the included angle threshold value, but the WIFI signal is larger than the signal intensity threshold value, determining that the judgment is failed. And if the maximum included angle is smaller than the included angle threshold value and the WIFI signal is larger than the signal intensity threshold value, determining that the target scene corresponding to the terminal equipment at the target moment is an indoor scene.
Exemplarily, in practical application, the scene type of the target scene may be determined by combining WIFI signal distribution, base station distribution and the maximum included angle, for example, if the maximum included angle is greater than the included angle threshold and the number of WIFI signals is greater than the number of base station signals, it is determined that the target scene corresponding to the terminal device at the target moment is an indoor scene. And if the maximum included angle is smaller than the included angle threshold value and the number of the WIFI signals is smaller than that of the base station signals, determining that the target scene corresponding to the terminal equipment at the target moment is an outdoor scene.
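The decision rules above can be condensed into a small sketch. The 180-degree threshold follows the description, while the WIFI signal-strength threshold value, the return labels and all names are illustrative assumptions.

```python
def classify_scene(max_angle, angle_threshold=180.0,
                   wifi_rssi=None, rssi_threshold=-60.0):
    """Classify the target scene from the maximum included angle.

    When a WIFI measurement is supplied, it is used as a cross-check as in
    the variant described above; otherwise only the angle threshold is used.
    The rssi_threshold value is an illustrative assumption.
    """
    if wifi_rssi is None:
        return "outdoor" if max_angle > angle_threshold else "indoor"
    if max_angle > angle_threshold and wifi_rssi < rssi_threshold:
        return "outdoor"
    if max_angle < angle_threshold and wifi_rssi > rssi_threshold:
        return "indoor"
    return "undetermined"
```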
Specifically, for ease of understanding, an outdoor scene, an indoor-outdoor interfacing scene, and an indoor scene will be described separately below.
Firstly, an outdoor scene;
referring to fig. 11, fig. 11 is a schematic diagram illustrating satellite distribution in an outdoor scene according to an embodiment of the present disclosure, as shown in the figure, signals of satellite 1, satellite 2, satellite 3, satellite 4, satellite 5, satellite 6, satellite 7, satellite 8, and satellite 9 can be received in the outdoor scene, and in a general case, when a user is located in an open scene, the satellites are more uniformly distributed around the user, and a maximum included angle between the satellites is greater than 180 degrees. Further, referring to fig. 12, fig. 12 is a schematic diagram of a positioning interface based on an outdoor scene in the embodiment of the present application, and as shown in the figure, if it is determined that the user is currently located outdoors, a word "the user is located outdoors" may be displayed on the interface of the navigation application.
Secondly, an indoor and outdoor junction scene;
referring to fig. 13, fig. 13 is a schematic distribution diagram of satellites in an indoor and outdoor boundary scene according to an embodiment of the present disclosure, and as shown in the drawing, signals from an outdoor satellite 1, an outdoor satellite 2, an outdoor satellite 3, an outdoor satellite 4, and an outdoor satellite 5, and signals from an indoor WIFI signal 0, a WIFI signal 1, a WIFI signal 2, and a WIFI signal 3 may be received in the indoor and outdoor boundary scene (e.g., at a doorway).
Referring to fig. 14, fig. 14 is another schematic distribution diagram of satellites in an indoor and outdoor boundary scene according to the embodiment of the present disclosure, and as shown in the figure, signals from an outdoor satellite 1, an outdoor satellite 2, an outdoor satellite 3, an outdoor satellite 4, and an outdoor satellite 5, and signals from an indoor WIFI signal 0, a WIFI signal 1, a WIFI signal 2, and a WIFI signal 3 may be received in the indoor and outdoor boundary scene (e.g., at a window).
Please refer to fig. 15, which is another schematic distribution diagram of satellites in an indoor and outdoor boundary scene according to the embodiment of the present application. As shown in the figure, signals from outdoor satellite 1, satellite 2, satellite 3, satellite 4 and satellite 5, and WIFI signal 0, WIFI signal 1, WIFI signal 2 and WIFI signal 3 from indoors may be received in the indoor and outdoor boundary scene (e.g., on a balcony).
Typically, when the user is located at the edge of a building, at a doorway, at a window or on a balcony, only satellite signals from half of the sky can be received, that is, the maximum included angle between the satellites is less than the included angle threshold (e.g., 180 degrees). When the user is under an eave or enters a room from outdoors, the user may be considered to be indoors if the maximum included angle between the satellites is less than the included angle threshold (e.g., 180 degrees), and outdoors if the maximum included angle between the satellites is greater than the included angle threshold (e.g., 180 degrees).
Further, referring to fig. 16, fig. 16 is another schematic diagram of a positioning interface based on an indoor and outdoor junction scene in the embodiment of the present application, as shown in the figure, if it is determined that the user is currently located indoors but the maximum included angle is close to the included angle threshold (e.g., 180 degrees), a word "the user is located at the junction from indoor to outdoor" may be displayed on the interface of the navigation application.
Thirdly, an indoor scene;
Referring to fig. 17, fig. 17 is a schematic distribution diagram of satellites in an indoor scene in the embodiment of the present application. As shown in the figure, only WIFI signal 0, WIFI signal 1, WIFI signal 2 and WIFI signal 3 from indoors can be received in the indoor scene. When there is no satellite signal but there is a WIFI signal, it is determined that the user is located in an indoor scene.
Further, referring to fig. 18, fig. 18 is a schematic diagram of a positioning interface based on an indoor scene in the embodiment of the present application, and as shown in the figure, if it is determined that the user is currently located indoors but the maximum included angle is close to the included angle threshold (e.g., 180 degrees), a word "the user is located near an indoor window" may be displayed on the interface of the navigation application.
And the bottom layer positioning module of the navigation application judges the current target scene of the user according to the current satellite signal of the user, the WIFI signal, the space distribution condition of the satellite and other information, and returns the current target scene to the map navigation module. The navigation application switches the navigation scene according to the scene where the user is located, and switches to the indoor navigation scene when the user is indoors or switches to the outdoor navigation scene when the user is outdoors.
Secondly, in the embodiment of the application, a mode for determining a target scene according to a maximum included angle is provided, and through the mode, indoor and outdoor discrimination is assisted by satellite space geometric distribution, compared with the existing indoor and outdoor discrimination method, the method is not influenced by environmental factors such as temperature and humidity, illumination intensity and indoor light, and is different from discrimination based on the number of visible satellites and satellite signal intensity, indoor and outdoor scenes are simply and effectively discriminated by using the satellite space geometric distribution condition, and the misjudgment rate when a user is located in a window, a balcony or a doorway and can receive satellite signal scenes is reduced.
Optionally, on the basis of each embodiment corresponding to fig. 3, in another optional embodiment provided by this application embodiment, after the terminal device determines that the target scene corresponding to the terminal device at the target time is an outdoor scene, the method may further include:
the terminal equipment acquires a target position;
the method comprises the steps that terminal equipment sends a target position and an outdoor positioning identifier to a service server, so that the service server generates first navigation information according to an equipment position, the target position and the outdoor positioning identifier corresponding to the terminal equipment at a target moment, wherein the outdoor positioning identifier corresponds to an outdoor scene;
the terminal equipment receives first navigation information sent by a service server;
the terminal equipment displays the first navigation information.
In this embodiment, a navigation method in an outdoor scene is introduced. Taking the example that the terminal device is installed with a navigation application, a user can input a target location (i.e., a destination) through the navigation application, and then the terminal device sends the target location and an outdoor positioning identifier to a server (i.e., a navigation service server), where the outdoor positioning identifier corresponds to an outdoor scene, that is, if the terminal device determines that the target scene at the target time is the outdoor scene, the corresponding outdoor positioning identifier is generated. The server generates first navigation information according to the device position (namely the departure place) corresponding to the terminal device at the target moment, the target position and the outdoor positioning identification, wherein the first navigation information comprises route information and predicted time required for arrival. Then, the server sends the first navigation information to the terminal device, and the terminal device displays the first navigation information.
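A minimal sketch of what the terminal might send to the service server is shown below, assuming a JSON payload; the field names, values and transport are illustrative assumptions, since the description only requires that the device position, the target position and the outdoor/indoor positioning identifier reach the service server.

```python
import json

def build_navigation_request(device_position, target_position, is_outdoor):
    """Assemble the navigation request carrying the scene identifier.

    Field names are hypothetical; the description only requires that the
    device position, the target position and an indoor/outdoor positioning
    identifier be carried to the service server.
    """
    return json.dumps({
        "device_position": device_position,   # departure place of the terminal device
        "target_position": target_position,   # destination entered by the user
        "positioning_id": "outdoor" if is_outdoor else "indoor",
    })

# Hypothetical usage with made-up coordinates and destination
print(build_navigation_request((113.93, 22.53), "example destination", True))
```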
Specifically, for convenience of understanding, please refer to fig. 19, fig. 19 is an interface schematic diagram illustrating first navigation information in the embodiment of the present application, as shown in the figure, it is assumed that a user inputs a target location through an interface of a navigation application, for example, "happy cell is right", and after determination, it is determined that the terminal device is in an outdoor scene at the current time, that is, an outdoor positioning identifier is generated, based on which, the server generates the first navigation information according to the outdoor positioning identifier, a device location of the terminal device at the current time, and the target location, and displays the first navigation information by the terminal device, for example, a route map is displayed, an expected arrival time is 1 hour and 1 minute, a total of 4.1 kilometers, and 4 intersections are traveled.
The interface shown in fig. 19 is only an example, and should not be construed as limiting the present application.
In the embodiment of the application, a navigation method in an outdoor scene is provided, and in the method, when the terminal device held by the user is detected to be in the outdoor scene, the corresponding navigation information can be generated based on the outdoor scene, so that accurate navigation information is provided for the user.
Optionally, on the basis of each embodiment corresponding to fig. 3, in another optional embodiment provided in this application embodiment, after the terminal device determines that the target scene corresponding to the terminal device at the target time is an indoor scene, the method may further include:
the terminal equipment acquires a target position;
the terminal equipment sends a target position and an indoor positioning identifier to the service server so that the service server generates second navigation information according to the equipment position, the target position and the indoor positioning identifier corresponding to the terminal equipment at the target moment, wherein the indoor positioning identifier corresponds to an indoor scene;
the terminal equipment receives second navigation information sent by the service server;
and the terminal equipment displays the second navigation information.
In this embodiment, a method for performing navigation in an indoor scene is introduced. Taking the example that the terminal device is installed with the navigation application, the user may input a target location (i.e., a destination) through the navigation application, and then the terminal device sends the target location and the indoor positioning identifier to the server (i.e., the navigation service server), where the indoor positioning identifier corresponds to an indoor scene, that is, if the terminal device determines that the target scene at the target time is an indoor scene, the corresponding indoor positioning identifier is generated. And the server generates second navigation information according to the equipment position (namely the departure place) corresponding to the terminal equipment at the target moment, the target position and the indoor positioning identification, wherein the second navigation information comprises route information and predicted time required for arrival. Then, the server sends the second navigation information to the terminal device, and the terminal device displays the second navigation information.
Specifically, for convenience of understanding, please refer to fig. 20, fig. 20 is an interface schematic diagram illustrating second navigation information in the embodiment of the present application, and as shown in the figure, it is assumed that a user inputs a target location through an interface of a navigation application, for example, "happy convenience store (minus 1 building)", and after a determination, it is determined that the terminal device is in an indoor scene at the current time, that is, an indoor positioning identifier is generated, based on which, the server generates the second navigation information according to the indoor positioning identifier, a device location of the terminal device at the current time, and the target location, and the terminal device displays the second navigation information, for example, displays a route map, and the expected time to reach is 2 minutes and 107 meters in total.
The interface shown in fig. 20 is only an example, and should not be construed as limiting the present application.
In the embodiment of the application, a navigation method in an indoor scene is provided, and in the above manner, when it is detected that the terminal device held by the user is in the indoor scene, corresponding navigation information can be generated based on the indoor scene, so that accurate navigation information is provided for the user.
Referring to fig. 21, fig. 21 is a schematic view of an embodiment of an indoor and outdoor scene determining apparatus 20 according to the present invention, which includes:
an obtaining module 201, configured to obtain N satellite positions corresponding to a target time, where each satellite position corresponds to a satellite, and N is an integer greater than or equal to 1;
the obtaining module 201 is further configured to obtain a device location corresponding to the terminal device at the target time;
the obtaining module 201 is further configured to obtain an included angle between each satellite position of the N satellite positions and the device position, so as to obtain N included angles;
a determining module 202, configured to determine a maximum included angle according to the N included angles;
the determining module 202 is further configured to determine a target scene corresponding to the terminal device at the target time according to the maximum included angle, where the target scene is an indoor scene or an outdoor scene.
In the embodiment of the application, an indoor and outdoor scene determining device is provided. By adopting the device, indoor and outdoor scenes are determined by utilizing the spatial distribution of the satellites, and the influence of environmental factors such as temperature, illumination intensity and indoor light can be avoided, thereby improving the accuracy of determining indoor and outdoor scenes. In addition, when the terminal device is located at a window, on a balcony or in another scene where satellite signals can still be received indoors, the misjudgment rate can be effectively reduced, further improving the accuracy of judging indoor and outdoor scenes.
Alternatively, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the indoor and outdoor scene determining apparatus 20 provided in the embodiment of the present application,
the obtaining module 201 is further configured to, if a satellite signal is detected at a target time, perform a step of obtaining N satellite positions corresponding to the target time;
the determining module 202 is further configured to determine that a target scene corresponding to the terminal device at the target moment is an indoor scene if the satellite signal is not detected at the target moment and the WIFI signal is detected;
the determining module 202 is further configured to determine that the determination fails if the satellite signal is not detected at the target time and the WIFI signal is not detected.
In the embodiment of the application, an indoor and outdoor scene determining device is provided, and by adopting the device, different scene judgment results are obtained for different types of signals, so that the flexibility of the scheme is improved.
Alternatively, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the indoor and outdoor scene determining apparatus 20 provided in the embodiment of the present application,
an obtaining module 201, configured to send an ephemeris issue request to a server, so that the server responds to the ephemeris issue request to obtain broadcast ephemeris information corresponding to a target time, where the broadcast ephemeris information includes ephemeris parameters corresponding to N satellites;
receiving broadcast ephemeris information corresponding to a target moment sent by a server;
and determining N satellite positions corresponding to the target time according to the broadcast ephemeris information corresponding to the target time.
In the embodiment of the application, an indoor and outdoor scene determining device is provided, and by adopting the device, a server can issue real-time broadcast ephemeris information to a terminal device, so that the terminal device can determine the position of each satellite at the current moment according to the broadcast ephemeris information. Because the broadcast ephemeris information has higher real-time performance, the satellite position obtained by calculation is more accurate.
Alternatively, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the indoor and outdoor scene determining apparatus 20 provided in the embodiment of the present application,
an obtaining module 201, configured to specifically obtain, for an ith satellite of the N satellites, ephemeris parameters corresponding to the ith satellite according to broadcast ephemeris information corresponding to a target time, where i is an integer greater than or equal to 1 and less than or equal to N;
for an ith satellite in the N satellites, determining a satellite position corresponding to the ith satellite at a target moment according to ephemeris parameters corresponding to the ith satellite;
and acquiring N satellite positions according to the satellite position corresponding to the ith satellite at the target moment.
By adopting the device, the terminal equipment can calculate the satellite position corresponding to each satellite by utilizing the broadcast ephemeris information and the target time, so that the feasibility and operability of the scheme are improved.
Optionally, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the indoor and outdoor scene determining apparatus 20 provided in the embodiment of the present application, N is an integer greater than or equal to 4;
an obtaining module 201, configured to obtain first signal information, where the first signal information includes a coordinate value of a first satellite and a time when a signal of the first satellite reaches a terminal device;
acquiring second signal information, wherein the second signal information comprises coordinate values of a second satellite and time for a signal of the second satellite to reach the terminal equipment;
acquiring third signal information, wherein the third signal information comprises coordinate values of a third satellite and time for a signal of the third satellite to reach the terminal equipment;
acquiring fourth signal information, wherein the fourth signal information comprises coordinate values of a fourth satellite and time for a signal of the fourth satellite to reach the terminal equipment;
and determining the device position corresponding to the terminal device at the target moment according to the first signal information, the second signal information, the third signal information and the fourth signal information.
In the embodiment of the application, an indoor and outdoor scene determining device is provided, and by adopting the device, the terminal equipment can be positioned by utilizing at least four satellites, and each satellite has the characteristics of high precision and high reliability, so that the equipment position corresponding to the terminal equipment at the target moment can be calculated according to the signal information of the four satellites, and the feasibility of the scheme is improved.
Alternatively, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the indoor and outdoor scene determining apparatus 20 provided in the embodiment of the present application,
an obtaining module 201, configured to obtain first location information, where the first location information includes location information of a first base station and a first distance between a terminal device and the first base station;
acquiring second position information, wherein the second position information comprises position information of a second base station and a second distance between the terminal equipment and the second base station;
acquiring third position information, wherein the third position information comprises position information of a third base station and a third distance between the terminal equipment and the third base station;
and determining the device position corresponding to the terminal device at the target moment according to the first position information, the second position information and the third position information.
In the embodiment of the application, an indoor and outdoor scene determining device is provided, by adopting the device, the terminal equipment can be positioned by utilizing three base stations, and by adopting the positioning mode, the advantages of high positioning speed, low cost (no need of adding extra hardware on the terminal equipment), low power consumption, indoor availability and the like are achieved, and the positioning method belongs to a light-weight positioning method, so that the feasibility of the scheme is improved.
Alternatively, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the indoor and outdoor scene determining apparatus 20 provided in the embodiment of the present application,
an obtaining module 201, configured to specifically use the device position as a coordinate origin to construct a planar coordinate system;
acquiring longitude information and latitude information corresponding to terminal equipment;
for the ith satellite position of the N satellite positions, determining a coordinate value of the ith satellite in a plane coordinate system according to longitude information and latitude information corresponding to the terminal equipment, the equipment position and the ith satellite position, wherein i is an integer which is greater than or equal to 1 and less than or equal to N;
determining an included angle between the ith satellite and the equipment position according to a coordinate value of the ith satellite in a plane coordinate system;
and obtaining N included angles according to the included angle between the ith satellite and the equipment position.
In the embodiment of the application, an indoor and outdoor scene determining device is provided, and by adopting the device, satellite information of satellites can be mapped into a plane coordinate system based on the established plane coordinate system, so that an included angle between each satellite and the position of equipment is obtained, and the feasibility of the scheme is improved.
Alternatively, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the indoor and outdoor scene determining apparatus 20 provided in the embodiment of the present application,
a determining module 202, configured to determine that a maximum included angle of the N included angles is 0 if N is an integer less than or equal to 3;
and if N is an integer greater than 3, determining the maximum included angle according to K included angle differences in the N included angles, wherein the K included angle differences are the differences between every two included angles in the N included angles, and K is an integer greater than or equal to 6.
In the embodiment of the application, an indoor and outdoor scene determining device is provided. By adopting the device, the maximum included angle can be determined based on the value of N: for the case where N is less than or equal to 3, the maximum included angle can be directly determined to be 0 degrees, and for the case where N is greater than 3, the maximum included angle can be determined from the K included angle differences, thereby improving the feasibility and operability of the scheme.
Alternatively, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the indoor and outdoor scene determining apparatus 20 provided in the embodiment of the present application,
the determining module 202 is specifically configured to determine an angle difference between an ith included angle and a jth included angle in the N included angles, where i and j are integers greater than or equal to 1 and less than or equal to N, and i and j are not equal to each other;
acquiring K included angle difference values according to the ith included angle and the jth included angle;
and selecting the maximum value from the K included angle differences as the maximum included angle.
In the embodiment of the application, an indoor and outdoor scene determining device is provided. By adopting the device, under the condition that N is greater than 3, the included angle difference between every two included angles needs to be calculated respectively, so that the difference between any two included angles is not omitted and a more accurate maximum included angle can be obtained.
Alternatively, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the indoor and outdoor scene determining apparatus 20 provided in the embodiment of the present application,
the determining module 202 is specifically configured to determine that a target scene corresponding to the terminal device at the target moment is an outdoor scene if the maximum included angle is greater than the included angle threshold;
and if the maximum included angle is smaller than the included angle threshold value, determining that the target scene corresponding to the terminal equipment at the target moment is an indoor scene.
In the embodiment of the application, an indoor and outdoor scene determining device is provided, and by adopting the device, indoor and outdoor discrimination is assisted by satellite space geometric distribution, compared with the existing indoor and outdoor discrimination method, the device is not influenced by environmental factors such as temperature and humidity, illumination intensity and indoor light, and is different from discrimination based on the number of visible satellites and satellite signal intensity, indoor and outdoor scenes are simply and effectively discriminated by utilizing satellite space geometric distribution conditions, and the misjudgment rate when a user is located in a window, a balcony or a door and can receive satellite signal scenes is reduced.
Optionally, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the indoor and outdoor scene determining apparatus 20 provided in the embodiment of the present application, the indoor and outdoor scene determining apparatus 20 further includes a sending module 203, a receiving module 204, and a displaying module 205;
the obtaining module 201 is further configured to obtain a target position after the determining module 202 determines that the target scene corresponding to the terminal device at the target time is an outdoor scene;
a sending module 203, configured to send a target position and an outdoor positioning identifier to a service server, so that the service server generates first navigation information according to a device position, the target position, and the outdoor positioning identifier, which correspond to a terminal device at a target time, where the outdoor positioning identifier corresponds to an outdoor scene;
a receiving module 204, configured to receive first navigation information sent by a service server;
the displaying module 205 is configured to display the first navigation information.
In the embodiment of the application, an indoor and outdoor scene determining device is provided, and by adopting the device, when the terminal device held by the user is detected to be in the outdoor scene, the corresponding navigation information can be generated based on the outdoor scene, so that accurate navigation information is provided for the user.
Optionally, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the indoor and outdoor scene determining apparatus 20 provided in the embodiment of the present application, the indoor and outdoor scene determining apparatus 20 further includes a sending module 203, a receiving module 204, and a displaying module 205;
the obtaining module 201 is further configured to obtain a target position after the determining module 202 determines that the target scene corresponding to the terminal device at the target time is an indoor scene;
the sending module 203 is configured to send the target position and the indoor positioning identifier to the service server, so that the service server generates second navigation information according to the device position, the target position, and the indoor positioning identifier corresponding to the terminal device at the target time, where the indoor positioning identifier corresponds to an indoor scene;
a receiving module 204, configured to receive second navigation information sent by the service server;
and a display module 205, configured to display the second navigation information.
In the embodiment of the application, an indoor and outdoor scene determining device is provided, and by adopting the device, when it is detected that the terminal equipment held by the user is in an indoor scene, corresponding navigation information can be generated based on the indoor scene, so that accurate navigation information is provided for the user.
The embodiment of the present application further provides another indoor and outdoor scene determining apparatus, which is disposed in a terminal device. As shown in fig. 22, for convenience of description, only the parts related to the embodiment of the present application are shown; for specific technical details that are not disclosed, please refer to the method part of the embodiments of the present application. The terminal device may be any terminal device including a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), a Point of Sales (POS) terminal, a vehicle-mounted computer, and the like. The following description takes a mobile phone as an example:
fig. 22 is a block diagram illustrating a partial structure of a mobile phone related to a terminal device provided in an embodiment of the present application. Referring to fig. 22, the cellular phone includes: radio Frequency (RF) circuit 310, memory 320, input unit 330, display unit 340, sensor 350, audio circuit 360, wireless fidelity (WiFi) module 370, processor 380, and power supply 390. Those skilled in the art will appreciate that the handset configuration shown in fig. 22 is not intended to be limiting and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 22:
the RF circuit 310 may be used for receiving and transmitting signals during information transmission and reception or during a call. In particular, it receives downlink information from a base station and delivers it to the processor 380 for processing, and transmits uplink data to the base station. In general, the RF circuit 310 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 310 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 320 may be used to store software programs and modules, and the processor 380 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 320. The memory 320 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone, and the like. Further, the memory 320 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 330 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile phone. Specifically, the input unit 330 may include a touch panel 331 and other input devices 332. The touch panel 331, also referred to as a touch screen, can collect touch operations of a user on or near the touch panel 331 (e.g., operations performed on or near the touch panel 331 using any suitable object or accessory such as a finger or a stylus) and drive the corresponding connection device according to a preset program. Optionally, the touch panel 331 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position and orientation of the user's touch, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 380, and can receive and execute commands sent by the processor 380. In addition, the touch panel 331 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. The input unit 330 may include other input devices 332 in addition to the touch panel 331. In particular, the other input devices 332 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 340 may be used to display information input by the user or information provided to the user and various menus of the mobile phone. The display unit 340 may include a display panel 341; optionally, the display panel 341 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 331 can cover the display panel 341; when the touch panel 331 detects a touch operation on or near it, the touch operation is transmitted to the processor 380 to determine the type of the touch event, and the processor 380 then provides a corresponding visual output on the display panel 341 according to the type of the touch event. Although in fig. 22 the touch panel 331 and the display panel 341 are shown as two independent components implementing the input and output functions of the mobile phone, in some embodiments the touch panel 331 and the display panel 341 may be integrated to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 350, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 341 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 341 and/or the backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The audio circuit 360, speaker 361, and microphone 362 may provide an audio interface between the user and the mobile phone. The audio circuit 360 may transmit the electrical signal converted from the received audio data to the speaker 361, and the speaker 361 converts it into a sound signal for output; on the other hand, the microphone 362 converts the collected sound signal into an electrical signal, which is received by the audio circuit 360 and converted into audio data. The audio data is then output to the processor 380 for processing and, for example, transmitted to another mobile phone via the RF circuit 310, or output to the memory 320 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 370, the mobile phone can help the user to receive and send e-mails, browse web pages, access streaming media, and the like, providing wireless broadband internet access for the user. Although fig. 22 shows the WiFi module 370, it is understood that it is not an essential component of the mobile phone and may be omitted as needed without changing the essence of the invention.
The processor 380 is the control center of the mobile phone; it connects the various parts of the whole mobile phone through various interfaces and lines, and performs the various functions of the mobile phone and processes data by running or executing the software programs and/or modules stored in the memory 320 and calling the data stored in the memory 320, thereby monitoring the mobile phone as a whole. Optionally, the processor 380 may include one or more processing units; optionally, the processor 380 may integrate an application processor, which mainly handles the operating system, user interfaces, and application programs, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 380.
The mobile phone also includes a power supply 390 (e.g., a battery) for powering the various components. Optionally, the power supply may be logically connected to the processor 380 through a power management system, so that charging, discharging, and power consumption management are handled by the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which will not be described herein.
In this embodiment, the processor 380 included in the terminal device further has the following functions:
acquiring N satellite positions corresponding to a target moment, wherein each satellite position corresponds to a satellite, and N is an integer greater than or equal to 1;
acquiring a device position corresponding to the terminal device at a target moment;
acquiring an included angle between each satellite position in the N satellite positions and the equipment position to obtain N included angles;
determining a maximum included angle according to the N included angles;
and determining a target scene corresponding to the terminal equipment at the target moment according to the maximum included angle, wherein the target scene is an indoor scene or an outdoor scene.
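For illustration only, the following is a minimal, self-contained sketch of this decision logic starting from the N included angles (in degrees); the threshold value and the handling of the boundary case are assumptions made for the example and are not fixed by this application.

```python
# Minimal sketch of the classification step, assuming the N included angles are already known.
from itertools import combinations

def classify_scene(angles_deg, angle_threshold_deg=120.0):
    """Return 'outdoor' or 'indoor' from the N included angles (threshold is an assumed example value)."""
    if len(angles_deg) <= 3:
        max_angle = 0.0  # too few satellites: treat the maximum included angle as 0
    else:
        # maximum difference over every pair of included angles
        max_angle = max(abs(a - b) for a, b in combinations(angles_deg, 2))
    # strictly greater than the threshold -> outdoor; otherwise indoor in this sketch
    return "outdoor" if max_angle > angle_threshold_deg else "indoor"

# Example: satellites spread widely around the device suggest an open-sky view.
print(classify_scene([10.0, 95.0, 170.0, 260.0]))  # -> 'outdoor' (maximum difference 250 > 120)
```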
Optionally, the processor 380 is further configured to perform the following steps:
if the satellite signal is detected at the target time, executing a step of acquiring N satellite positions corresponding to the target time;
if the satellite signal is not detected at the target time and the wireless fidelity (WIFI) signal is detected, determining that a target scene corresponding to the terminal equipment at the target time is an indoor scene;
and if neither the satellite signal nor the WIFI signal is detected at the target moment, determining that the judgment fails.
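The signal pre-check described above can be sketched as follows, assuming two boolean flags that a real implementation would obtain from the GNSS and WIFI subsystems of the terminal device.

```python
# Sketch of the pre-check: which branch to take depending on the signals detected at the target moment.
def precheck(satellite_signal_detected: bool, wifi_signal_detected: bool) -> str:
    if satellite_signal_detected:
        return "use_satellite_geometry"  # proceed to acquire the N satellite positions
    if wifi_signal_detected:
        return "indoor"                  # no satellite signal but WIFI is visible
    return "undetermined"                # neither signal: the judgment fails
```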
Optionally, the processor 380 is specifically configured to perform the following steps:
sending an ephemeris issuing request to a server so that the server responds to the ephemeris issuing request and acquires broadcast ephemeris information corresponding to a target moment, wherein the broadcast ephemeris information comprises ephemeris parameters corresponding to N satellites;
receiving broadcast ephemeris information corresponding to a target moment sent by a server;
and determining N satellite positions corresponding to the target time according to the broadcast ephemeris information corresponding to the target time.
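A possible sketch of this ephemeris exchange is given below; the HTTP endpoint and JSON response format are assumptions for illustration, since the application does not specify the server interface.

```python
# Sketch of requesting broadcast ephemeris for the target moment from a (hypothetical) server endpoint.
import json
import urllib.request

def fetch_broadcast_ephemeris(server_url: str, target_time_iso: str):
    req = urllib.request.Request(
        f"{server_url}/ephemeris?time={target_time_iso}",  # hypothetical endpoint
        headers={"Accept": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        # assumed format: one entry of ephemeris parameters per visible satellite
        return json.loads(resp.read().decode("utf-8"))
```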
Optionally, the processor 380 is specifically configured to perform the following steps:
aiming at the ith satellite in the N satellites, acquiring ephemeris parameters corresponding to the ith satellite according to broadcast ephemeris information corresponding to a target time, wherein i is an integer which is greater than or equal to 1 and less than or equal to N;
for an ith satellite in the N satellites, determining a satellite position corresponding to the ith satellite at a target moment according to ephemeris parameters corresponding to the ith satellite;
and acquiring N satellite positions according to the satellite position corresponding to the ith satellite at the target moment.
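As one illustration, the satellite position at the target time can be computed from broadcast ephemeris parameters with the standard Keplerian procedure sketched below; the parameter names are assumptions, and the harmonic correction terms of a full broadcast-ephemeris computation are omitted for brevity.

```python
# Simplified sketch: ECEF satellite position from broadcast ephemeris parameters
# (field names are assumptions; harmonic corrections are omitted).
import math

MU = 3.986005e14            # Earth's gravitational constant used by GPS, m^3/s^2
OMEGA_E = 7.2921151467e-5   # Earth's rotation rate, rad/s

def satellite_position(eph: dict, t: float):
    a = eph["sqrt_a"] ** 2                          # semi-major axis
    tk = t - eph["toe"]                             # time from ephemeris reference epoch
    if tk > 302400: tk -= 604800                    # handle GPS week crossover
    if tk < -302400: tk += 604800
    n = math.sqrt(MU / a**3) + eph["delta_n"]       # corrected mean motion
    M = eph["m0"] + n * tk                          # mean anomaly
    E = M
    for _ in range(10):                             # solve Kepler's equation iteratively
        E = M + eph["e"] * math.sin(E)
    nu = math.atan2(math.sqrt(1 - eph["e"]**2) * math.sin(E),
                    math.cos(E) - eph["e"])         # true anomaly
    u = nu + eph["omega"]                           # argument of latitude (uncorrected)
    r = a * (1 - eph["e"] * math.cos(E))            # orbital radius (uncorrected)
    i = eph["i0"] + eph["idot"] * tk                # inclination
    x_p, y_p = r * math.cos(u), r * math.sin(u)     # position in the orbital plane
    # longitude of ascending node, corrected for Earth rotation
    Omega = eph["omega0"] + (eph["omega_dot"] - OMEGA_E) * tk - OMEGA_E * eph["toe"]
    x = x_p * math.cos(Omega) - y_p * math.cos(i) * math.sin(Omega)
    y = x_p * math.sin(Omega) + y_p * math.cos(i) * math.cos(Omega)
    z = y_p * math.sin(i)
    return x, y, z                                  # ECEF coordinates, metres
```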
Optionally, the processor 380 is specifically configured to perform the following steps:
acquiring first signal information, wherein the first signal information comprises coordinate values of a first satellite and time for a signal of the first satellite to reach terminal equipment;
acquiring second signal information, wherein the second signal information comprises coordinate values of a second satellite and time for a signal of the second satellite to reach the terminal equipment;
acquiring third signal information, wherein the third signal information comprises coordinate values of a third satellite and time for a signal of the third satellite to reach the terminal equipment;
acquiring fourth signal information, wherein the fourth signal information comprises coordinate values of a fourth satellite and time for a signal of the fourth satellite to reach the terminal equipment;
and determining the device position corresponding to the terminal device at the target moment according to the first signal information, the second signal information, the third signal information and the fourth signal information.
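The four-satellite case described above corresponds to the classical pseudorange positioning problem; a textbook-style sketch is given below, illustrating the principle rather than the exact computation performed by the terminal device.

```python
# Sketch of a four-satellite position solution: each input carries a satellite's ECEF
# coordinates and the signal travel time; the unknowns are the receiver position and its clock bias.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def solve_position(sat_xyz, travel_times, iterations=10):
    sat_xyz = np.asarray(sat_xyz, dtype=float)        # shape (4, 3)
    pseudoranges = C * np.asarray(travel_times, float)  # measured ranges including clock bias
    x = np.zeros(4)                                   # [x, y, z, c*dt], start at Earth's centre
    for _ in range(iterations):
        diffs = x[:3] - sat_xyz                       # vectors from satellites to receiver
        ranges = np.linalg.norm(diffs, axis=1)
        residuals = pseudoranges - (ranges + x[3])
        # Jacobian: unit line-of-sight vectors plus a column of ones for the clock-bias term
        H = np.hstack([diffs / ranges[:, None], np.ones((len(sat_xyz), 1))])
        x += np.linalg.lstsq(H, residuals, rcond=None)[0]
    return x[:3], x[3] / C                            # receiver ECEF position and clock bias (s)
```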
Optionally, the processor 380 is specifically configured to perform the following steps:
acquiring first position information, wherein the first position information comprises position information of a first base station and a first distance between terminal equipment and the first base station;
acquiring second position information, wherein the second position information comprises position information of a second base station and a second distance between the terminal equipment and the second base station;
acquiring third position information, wherein the third position information comprises position information of a third base station and a third distance between the terminal equipment and the third base station;
and determining the device position corresponding to the terminal device at the target moment according to the first position information, the second position information and the third position information.
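The three-base-station case corresponds to plain trilateration; a minimal 2D sketch in an assumed local planar coordinate frame is shown below.

```python
# Sketch of 2D trilateration from three base stations (position plus distance to the device).
import numpy as np

def trilaterate(stations, distances):
    (x1, y1), (x2, y2), (x3, y3) = stations
    d1, d2, d3 = distances
    # Subtracting the first circle equation from the other two yields a linear system in (x, y).
    A = np.array([[2 * (x2 - x1), 2 * (y2 - y1)],
                  [2 * (x3 - x1), 2 * (y3 - y1)]])
    b = np.array([d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2,
                  d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2])
    return np.linalg.solve(A, b)   # estimated device position (x, y)

# Example: stations at (0,0), (100,0), (0,100); true device position approximately (30, 40).
print(trilaterate([(0, 0), (100, 0), (0, 100)], [50.0, 80.62, 67.08]))
```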
Optionally, the processor 380 is specifically configured to perform the following steps:
constructing a plane coordinate system by taking the position of the equipment as a coordinate origin;
acquiring longitude information and latitude information corresponding to terminal equipment;
aiming at the ith satellite position of N satellite positions, determining a coordinate value of the ith satellite in a plane coordinate system according to longitude information and latitude information corresponding to terminal equipment, equipment position and the ith satellite position, wherein i is an integer which is greater than or equal to 1 and less than or equal to N;
determining an included angle between the ith satellite and the position of the equipment according to the coordinate value of the ith satellite in the plane coordinate system;
and obtaining N included angles according to the included angle between the ith satellite and the equipment position.
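One plausible realization of these steps is sketched below: the device position serves as the origin of a local east-north plane, each satellite position is projected into that plane, and the included angle is read as the satellite's azimuth within it. This concrete construction is an assumption for illustration.

```python
# Sketch: project a satellite into a local east-north plane centred on the device and
# read off its azimuth (assumed interpretation of the "included angle").
import math
import numpy as np

def ecef_to_enu(sat_ecef, device_ecef, lat_deg, lon_deg):
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    # Rotation from ECEF to the local east-north-up frame at the device position
    R = np.array([
        [-math.sin(lon),                  math.cos(lon),                  0.0],
        [-math.sin(lat) * math.cos(lon), -math.sin(lat) * math.sin(lon),  math.cos(lat)],
        [ math.cos(lat) * math.cos(lon),  math.cos(lat) * math.sin(lon),  math.sin(lat)],
    ])
    return R @ (np.asarray(sat_ecef, float) - np.asarray(device_ecef, float))

def included_angle_deg(sat_ecef, device_ecef, lat_deg, lon_deg):
    east, north, _ = ecef_to_enu(sat_ecef, device_ecef, lat_deg, lon_deg)
    return math.degrees(math.atan2(east, north)) % 360.0   # azimuth in the plane, 0..360 degrees
```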
Optionally, the processor 380 is specifically configured to perform the following steps:
if N is an integer less than or equal to 3, determining that the maximum included angle in the N included angles is 0;
and if N is an integer greater than 3, determining the maximum included angle according to K included angle differences in the N included angles, wherein the K included angle differences are the differences between every two included angles in the N included angles, and K is an integer greater than or equal to 6.
Optionally, the processor 380 is specifically configured to perform the following steps:
determining an angle difference value between an ith included angle and a jth included angle in the N included angles, wherein i and j are integers which are greater than or equal to 1 and less than or equal to N, and i and j are not equal;
acquiring K included angle difference values according to the ith included angle and the jth included angle;
and selecting the maximum value from the K included angle differences as the maximum included angle.
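For completeness, the pairwise computation can be sketched as follows: for N included angles there are K = N*(N-1)/2 difference values, so K is at least 6 once N is greater than 3, and the maximum included angle is the largest of them.

```python
# Sketch of the pairwise maximum used when N > 3 (returns 0 otherwise).
def max_included_angle(angles_deg):
    n = len(angles_deg)
    if n <= 3:
        return 0.0
    diffs = [abs(angles_deg[i] - angles_deg[j])         # the K = n*(n-1)/2 difference values
             for i in range(n) for j in range(i + 1, n)]
    return max(diffs)
```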
Optionally, the processor 380 is specifically configured to perform the following steps:
if the maximum included angle is larger than the included angle threshold value, determining that a target scene corresponding to the terminal equipment at the target moment is an outdoor scene;
and if the maximum included angle is smaller than the included angle threshold value, determining that the target scene corresponding to the terminal equipment at the target moment is an indoor scene.
Optionally, the processor 380 is further configured to perform the following steps:
acquiring a target position;
sending a target position and an outdoor positioning identifier to a service server so that the service server generates first navigation information according to an equipment position, the target position and the outdoor positioning identifier corresponding to the terminal equipment at a target moment, wherein the outdoor positioning identifier corresponds to an outdoor scene;
receiving first navigation information sent by a service server;
and displaying the first navigation information.
Optionally, the processor 380 is further configured to perform the following steps:
acquiring a target position;
sending a target position and an indoor positioning identifier to a service server so that the service server generates second navigation information according to the equipment position, the target position and the indoor positioning identifier corresponding to the terminal equipment at the target moment, wherein the indoor positioning identifier corresponds to an indoor scene;
receiving second navigation information sent by the service server;
and displaying the second navigation information.
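The navigation exchange for either scene can be sketched as below; the JSON payload and the endpoint are assumptions for illustration, with the indoor or outdoor positioning identifier carried as a field of the request.

```python
# Sketch of sending the target position plus the positioning identifier to a (hypothetical)
# service-server endpoint and receiving the corresponding navigation information.
import json
import urllib.request

def request_navigation(server_url: str, device_pos, target_pos, scene: str):
    payload = {
        "device_position": device_pos,                  # device position at the target moment
        "target_position": target_pos,
        "positioning_id": "outdoor" if scene == "outdoor" else "indoor",
    }
    req = urllib.request.Request(
        f"{server_url}/navigation",                     # hypothetical endpoint
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))  # first or second navigation information
```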
Embodiments of the present application also provide a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute the method described in the foregoing embodiments.
Embodiments of the present application also provide a computer program product including a program, which, when run on a computer, causes the computer to perform the method described in the foregoing embodiments.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the present application that in essence contributes over the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a portable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (13)

1. A method for determining indoor and outdoor scenes, comprising:
acquiring N satellite positions corresponding to a target moment, wherein each satellite position corresponds to a satellite, and N is an integer greater than or equal to 1;
acquiring the device position corresponding to the terminal device at the target moment;
constructing a plane coordinate system by taking the equipment position as a coordinate origin, and mapping the satellite position of each satellite based on the plane coordinate system;
acquiring longitude information and latitude information corresponding to the terminal equipment;
for an ith satellite position of the N satellite positions, determining a coordinate value of the ith satellite in the plane coordinate system according to longitude information and latitude information corresponding to the terminal equipment, the equipment position and the ith satellite position, wherein i is an integer which is greater than or equal to 1 and less than or equal to N;
determining an included angle between the ith satellite and the position of the equipment according to a coordinate value of the ith satellite in the plane coordinate system;
acquiring N included angles according to the included angle between the ith satellite and the equipment position;
determining a maximum included angle according to the N included angles;
if the maximum included angle is larger than the included angle threshold value, determining that a target scene corresponding to the terminal equipment at the target moment is an outdoor scene;
if the maximum included angle is smaller than the included angle threshold, determining that a target scene corresponding to the terminal equipment at the target moment is an indoor scene;
and if the maximum included angle is equal to the included angle threshold, determining that the target scene corresponding to the terminal equipment at the target moment is the indoor scene or the outdoor scene.
2. The method of determining according to claim 1, further comprising:
if a satellite signal is detected at the target time, executing the step of acquiring N satellite positions corresponding to the target time;
if the satellite signal is not detected at the target moment and the wireless fidelity (WIFI) signal is detected, determining that a target scene corresponding to the terminal equipment at the target moment is the indoor scene;
and if the satellite signal is not detected and the WIFI signal is not detected at the target moment, determining that the judgment is failed.
3. The method of claim 1, wherein the obtaining N satellite positions corresponding to the target time comprises:
sending an ephemeris issuing request to a server so that the server can respond to the ephemeris issuing request to acquire broadcast ephemeris information corresponding to the target moment, wherein the broadcast ephemeris information comprises ephemeris parameters corresponding to N satellites;
receiving broadcast ephemeris information corresponding to the target time sent by the server;
and determining the N satellite positions corresponding to the target time according to the broadcast ephemeris information corresponding to the target time.
4. The method according to claim 3, wherein the determining the N satellite positions corresponding to the target time according to the broadcast ephemeris information corresponding to the target time comprises:
for an ith satellite in the N satellites, acquiring ephemeris parameters corresponding to the ith satellite according to the broadcast ephemeris information corresponding to the target time, wherein i is an integer greater than or equal to 1 and less than or equal to N;
for the ith satellite in the N satellites, determining a satellite position corresponding to the ith satellite at the target time according to ephemeris parameters corresponding to the ith satellite;
and acquiring the N satellite positions according to the satellite position corresponding to the ith satellite at the target moment.
5. The determination method according to any one of claims 1 to 4, wherein N is an integer greater than or equal to 4;
the acquiring the device position corresponding to the terminal device at the target time includes:
acquiring first signal information, wherein the first signal information comprises coordinate values of a first satellite and time for a signal of the first satellite to reach the terminal equipment;
acquiring second signal information, wherein the second signal information comprises coordinate values of a second satellite and time when a signal of the second satellite reaches the terminal equipment;
acquiring third signal information, wherein the third signal information comprises coordinate values of a third satellite and time for a signal of the third satellite to reach the terminal device;
acquiring fourth signal information, wherein the fourth signal information comprises coordinate values of a fourth satellite and time for a signal of the fourth satellite to reach the terminal device;
and determining the device position corresponding to the terminal device at the target moment according to the first signal information, the second signal information, the third signal information and the fourth signal information.
6. The method according to claim 1, wherein the obtaining of the device location corresponding to the terminal device at the target time includes:
acquiring first position information, wherein the first position information comprises position information of a first base station and a first distance between the terminal equipment and the first base station;
acquiring second position information, wherein the second position information comprises position information of a second base station and a second distance between the terminal equipment and the second base station;
acquiring third position information, wherein the third position information comprises position information of a third base station and a third distance between the terminal equipment and the third base station;
and determining the device position corresponding to the terminal device at the target moment according to the first position information, the second position information and the third position information.
7. The method of claim 1, wherein determining the maximum angle from the N angles comprises:
if the N is an integer less than or equal to 3, determining that the maximum included angle in the N included angles is 0;
and if the N is an integer larger than 3, determining the maximum included angle according to K included angle difference values in the N included angles, wherein the K included angle difference values are the difference values between every two included angles in the N included angles, and the K is an integer larger than or equal to 6.
8. The method according to claim 7, wherein determining the maximum angle according to the K difference values of the N angles comprises:
determining an angle difference value between an ith included angle and a jth included angle in the N included angles, wherein i and j are integers which are greater than or equal to 1 and less than or equal to N, and i is not equal to j;
acquiring the K included angle difference values according to the ith included angle and the jth included angle;
and selecting the maximum value from the K included angle difference values as the maximum included angle.
9. The method according to claim 1, wherein after determining that the target scene corresponding to the terminal device at the target time is the outdoor scene, the method further includes:
acquiring a target position;
sending the target position and an outdoor positioning identifier to a service server so that the service server generates first navigation information according to a device position corresponding to the terminal device at the target moment, the target position and the outdoor positioning identifier, wherein the outdoor positioning identifier corresponds to the outdoor scene;
receiving the first navigation information sent by the service server;
and displaying the first navigation information.
10. The method according to claim 1, wherein after determining that the target scene corresponding to the terminal device at the target time is the indoor scene, the method further includes:
acquiring a target position;
sending the target position and an indoor positioning identifier to a service server so that the service server generates second navigation information according to a device position corresponding to the terminal device at the target moment, the target position and the indoor positioning identifier, wherein the indoor positioning identifier corresponds to the indoor scene;
receiving the second navigation information sent by the service server;
and displaying the second navigation information.
11. An indoor and outdoor scene determination apparatus, comprising:
the system comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is used for acquiring N satellite positions corresponding to a target moment, each satellite position corresponds to a satellite, and N is an integer greater than or equal to 1;
the acquisition module is further configured to acquire a device position corresponding to the terminal device at the target time;
the acquisition module is further used for constructing a plane coordinate system by taking the equipment position as a coordinate origin, and mapping the satellite position of each satellite based on the plane coordinate system; acquiring longitude information and latitude information corresponding to the terminal equipment; for an ith satellite position of the N satellite positions, determining a coordinate value of the ith satellite in the plane coordinate system according to longitude information and latitude information corresponding to the terminal equipment, the equipment position and the ith satellite position, wherein i is an integer which is greater than or equal to 1 and less than or equal to N; determining an included angle between the ith satellite and the position of the equipment according to a coordinate value of the ith satellite in the plane coordinate system; acquiring N included angles according to the included angle between the ith satellite and the equipment position;
the determining module is used for determining the maximum included angle according to the N included angles;
the determining module is further configured to determine that a target scene corresponding to the terminal device at the target moment is an outdoor scene if the maximum included angle is greater than an included angle threshold; if the maximum included angle is smaller than the included angle threshold, determining that a target scene corresponding to the terminal equipment at the target moment is an indoor scene; and if the maximum included angle is equal to the included angle threshold, determining that the target scene corresponding to the terminal equipment at the target moment is the indoor scene or the outdoor scene.
12. A terminal device, comprising: a memory, a processor, and a bus system;
wherein the memory is used for storing programs;
the processor is configured to execute a program in the memory, the processor is configured to perform the determination method of any one of claims 1 to 10 according to instructions in program code;
the bus system is used for connecting the memory and the processor so as to enable the memory and the processor to communicate.
13. A computer-readable storage medium comprising instructions which, when executed on a computer, cause the computer to perform the determination method of any one of claims 1 to 10.
CN202011406545.4A 2020-12-04 2020-12-04 Method for determining indoor and outdoor scenes, related device, equipment and storage medium Active CN112558129B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011406545.4A CN112558129B (en) 2020-12-04 2020-12-04 Method for determining indoor and outdoor scenes, related device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011406545.4A CN112558129B (en) 2020-12-04 2020-12-04 Method for determining indoor and outdoor scenes, related device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112558129A CN112558129A (en) 2021-03-26
CN112558129B true CN112558129B (en) 2022-09-02

Family

ID=75048261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011406545.4A Active CN112558129B (en) 2020-12-04 2020-12-04 Method for determining indoor and outdoor scenes, related device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112558129B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113340292A (en) * 2021-05-17 2021-09-03 国网江苏省电力有限公司苏州供电分公司 Positioning navigation device and system under no satellite signal
CN113419266B (en) * 2021-08-23 2021-12-10 腾讯科技(深圳)有限公司 Positioning method and device, electronic equipment and computer readable storage medium
CN114126042A (en) * 2021-11-22 2022-03-01 中大检测(湖南)股份有限公司 TDOA-based WLAN positioning method
CN113905438B (en) * 2021-12-10 2022-03-22 腾讯科技(深圳)有限公司 Scene identification generation method, positioning method and device and electronic equipment
CN115859158B (en) * 2023-02-16 2023-07-07 荣耀终端有限公司 Scene recognition method, system and terminal equipment

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102455427B (en) * 2010-10-22 2013-06-05 南京莱斯信息技术股份有限公司 Method for correcting global position system (GPS) offset
CN103856989B (en) * 2012-11-28 2017-12-15 中国电信股份有限公司 Method and system, terminal and the positioning application platform of indoor and outdoor positioning switching
CN104180810B (en) * 2014-08-26 2016-08-31 上海微小卫星工程中心 A kind of Satellite TT entry and exit decision method and device
WO2016064631A1 (en) * 2014-10-20 2016-04-28 Nextnav, Llc Mitigating effects of multipath during position computation
EP3290953B1 (en) * 2015-06-04 2020-04-22 Huawei Technologies Co., Ltd. Method of setting positioning mode and mobile terminal
JP6947168B2 (en) * 2016-03-30 2021-10-13 日本電気株式会社 Indoor / outdoor judgment program, indoor / outdoor judgment system, indoor / outdoor judgment method, mobile terminal, and indoor / outdoor environment classification judgment means
US9766349B1 (en) * 2016-09-14 2017-09-19 Uber Technologies, Inc. Localization and tracking using location, signal strength, and pseudorange data
CN111078805A (en) * 2019-09-26 2020-04-28 深圳市东深电子股份有限公司 River reach patrol track validity judgment method
CN111796313B (en) * 2020-06-28 2023-07-21 中国人民解放军63921部队 Satellite positioning method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN112558129A (en) 2021-03-26

Similar Documents

Publication Publication Date Title
CN112558129B (en) Method for determining indoor and outdoor scenes, related device, equipment and storage medium
CN112558125B (en) Vehicle positioning method, related device, equipment and storage medium
US9326105B2 (en) Systems and methods for using three-dimensional location information to improve location services
US8666432B2 (en) Method and system for indoor RF mapping
US7747259B2 (en) Method and system for sending location coded images over a wireless network
US6850844B1 (en) Portable navigation device with integrated GPS and dead reckoning capabilities
WO2019067360A1 (en) Three-dimensional city models and shadow mapping to improve altitude fixes in urban environments
US20200273204A1 (en) Accurate positioning system using attributes
CN109932686B (en) Positioning method, mobile terminal and indoor positioning system
CN108337368B (en) Method for updating positioning data and mobile terminal
CN108151748B (en) Flight device surveying and mapping operation route planning method and device and terminal
EP1903349A1 (en) Mobile communication terminal for receiving position information service and method thereof
CN110285809B (en) Indoor and outdoor integrated combined positioning device
CA2946686C (en) Location error radius determination
US10993204B2 (en) Systems and methods for determining if a receiver is inside or outside a building or area
CN111006650B (en) Ground observation whistle reconnaissance early warning system
WO2019120195A1 (en) Indoor navigation system using inertial sensors and short-wavelength low energy device
CN107655474A (en) A kind of air navigation aid and navigation system based on intelligent terminal
CN113295174B (en) Lane-level positioning method, related device, equipment and storage medium
CN109813300B (en) Positioning method and terminal equipment
US10054688B2 (en) Method and apparatus for saving power during synthetic positioning in GNSS receivers
CN113281796B (en) Position determining method, speed determining method, device, equipment and storage medium
CN112817337B (en) Method, device and equipment for acquiring path and readable storage medium
EP2569958B1 (en) Method, computer program and apparatus for determining an object in sight
Sultana et al. An innovative implementation of indoor positioning system using GPS

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40041022; Country of ref document: HK)
GR01 Patent grant