CN111242354A - Method and device for wearable device, electronic device and readable storage medium


Info

Publication number
CN111242354A
CN111242354A (application CN202010007095.5A)
Authority
CN
China
Prior art keywords
data
weather
wearable device
sky
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010007095.5A
Other languages
Chinese (zh)
Inventor
张健亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rajax Network Technology Co Ltd
Original Assignee
Rajax Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rajax Network Technology Co Ltd filed Critical Rajax Network Technology Co Ltd
Priority to CN202010007095.5A
Publication of CN111242354A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01WMETEOROLOGY
    • G01W1/00Meteorology
    • G01W1/10Devices for predicting weather conditions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

Embodiments of the present disclosure disclose a method, an apparatus, an electronic device and a readable storage medium for a wearable device. The method includes: acquiring weather-related data; sending the weather-related data and location data of the wearable device to a server; receiving weather prediction data from the server, the weather prediction data being derived based on the weather-related data and the location data of the wearable device; and displaying the weather prediction data. With this technical solution, the wearable device can capture sky images in real time to obtain weather-related data and send them to the server for analysis to obtain weather prediction data. This overcomes the limitations of obtaining weather forecast information from the Internet, allows weather prediction to be performed at different times and places, and effectively improves the real-time performance of weather prediction.

Description

Method and device for wearable device, electronic device and readable storage medium
Technical Field
The present disclosure relates to the field of computer application technologies, and in particular, to a method and an apparatus for a wearable device, an electronic device, and a readable storage medium.
Background
A wearable device is a portable device that can not only make calls, take pictures and play games, but can also provide functions such as positioning, information processing and fingerprint scanning. With the development of application programs and hardware, the functions of wearable devices are becoming ever richer, and they have become indispensable in people's daily lives.
Existing wearable devices generally obtain weather forecast information from the network via Internet access, which makes it convenient to obtain and update forecast information for a given area and time period. However, weather forecast information obtained in this way usually takes a city or region as the minimum spatial unit, has a large forecasting time span and low real-time performance, generally forecasts weather conditions from several hours to several days ahead, and is not highly accurate.
Disclosure of Invention
To solve the problems in the related art, embodiments of the present disclosure provide a method and apparatus for a wearable device, an electronic device, and a readable storage medium.
In a first aspect, a method for a wearable device is provided in embodiments of the present disclosure.
Specifically, the method for the wearable device comprises the following steps:
acquiring weather related data;
sending the weather-related data and location data of the wearable device to a server;
receiving weather prediction data from the server, the weather prediction data derived based on the weather-related data and location data of the wearable device;
and displaying the weather prediction data.
With reference to the first aspect, in a first implementation manner of the first aspect, the weather-related data includes: sky color data, wind speed data, wind direction data, air temperature data, and air pressure data.
With reference to the first implementation manner of the first aspect, the present disclosure provides in a second implementation manner of the first aspect, wherein the sky color data, the wind speed data, and the wind direction data are obtained based on a sky image captured by an imaging device of the wearable apparatus; the air temperature data and the air pressure data are obtained through a sensor of the wearable device.
With reference to the second implementation manner of the first aspect, in a third implementation manner of the first aspect, the acquiring sky color data based on a sky image includes:
inputting the sky image into a trained first model, the first model identifying a sky portion in the sky image and outputting the sky color data.
With reference to the second implementation manner of the first aspect, in a fourth implementation manner of the first aspect, the acquiring wind speed data based on the sky image includes:
acquiring displacement of a cloud image in the sky image based on two sky images shot by a camera of the wearable device at a specific time interval;
and determining the wind speed data according to the displacement of the cloud image, the height of the cloud from the ground, the elevation angle of the camera device during shooting, the focal length of the camera device and the specific time interval.
With reference to the second implementation manner of the first aspect, in a fifth implementation manner of the first aspect, the acquiring wind direction data based on the sky image includes:
acquiring displacement of a cloud image in the sky image based on two sky images shot by a camera of the wearable device at a specific time interval;
and determining the wind direction data according to the displacement of the cloud image, the height of the cloud from the ground, the elevation angle of the camera device during shooting, the focal length of the camera device and the specific time interval.
With reference to the first implementation manner of the first aspect, in a sixth implementation manner of the first aspect, the deriving the weather prediction data based on the weather-related data and the location data of the wearable device includes:
obtaining historical weather data of the position according to the position data of the wearable device;
inputting the sky color data, the wind speed data, the wind direction data, the air temperature data, the air pressure data and historical weather data into a trained second model;
and acquiring the weather prediction data output by the second model.
With reference to the sixth implementation manner of the first aspect, in a seventh implementation manner of the first aspect, the historical weather data includes local weather data for the same day in previous years and/or local weather data within a preset time period of that day.
With reference to the first aspect, the present disclosure provides in an eighth implementation manner of the first aspect, the weather forecast data includes any one or more of the following data at a specific time or time period in the future: air temperature, air pressure, precipitation probability, wind speed, wind direction, and air quality.
With reference to the first aspect, in a ninth implementation manner of the first aspect, the method further includes:
determining a garbage type and a garbage contour based on the garbage image;
displaying the garbage image and displaying a color contour line on the garbage image along the garbage contour in an overlapping mode, wherein the color of the color contour line corresponds to the garbage type.
With reference to the first aspect, in a tenth implementation manner of the first aspect, the method further includes:
receiving a signal transmitted by a signal transmitter worn by a user;
determining the movement direction and the movement distance of the signal transmitter relative to the signal receiver according to the signals;
and determining a user instruction according to the movement direction and the movement distance.
With reference to the tenth implementation manner of the first aspect, the present disclosure provides in an eleventh implementation manner of the first aspect, the signal is an infrared signal;
the determining a movement distance of the signal emitter relative to the signal receiver from the signal includes determining the movement distance from a wavelength change of the infrared signal.
With reference to the tenth implementation manner of the first aspect, in a twelfth implementation manner of the first aspect, the determining a user instruction according to the moving direction and the moving distance includes:
matching the motion direction and the motion distance with the motion direction and the motion distance of a preset gesture respectively, and determining a user gesture according to the matching degree;
determining the user instruction according to the user gesture.
In a second aspect, an apparatus for a wearable device is provided in embodiments of the present disclosure, including:
an acquisition module configured to acquire weather-related data;
a transmitting module configured to transmit the weather-related data and location data of the wearable device to a server;
a first receiving module configured to receive weather prediction data from the server, the weather prediction data derived based on the weather-related data and location data of the wearable device;
a display module configured to display the weather prediction data.
With reference to the second aspect, in a first implementation manner of the second aspect, the weather-related data includes: sky color data, wind speed data, wind direction data, air temperature data, and air pressure data.
With reference to the first implementation manner of the second aspect, in a second implementation manner of the second aspect, the sky color data, the wind speed data, and the wind direction data are obtained based on a sky image captured by an imaging device of the wearable apparatus; the air temperature data and the air pressure data are obtained based on a sensor of the wearable device.
With reference to the second implementation manner of the second aspect, the disclosure provides in a third implementation manner of the second aspect, a portion of the acquisition module, which acquires sky color data based on a sky image, is configured to input the sky image into a trained first model, and the first model identifies a sky portion in the sky image and outputs the sky color data.
With reference to the second implementation manner of the second aspect, in a fourth implementation manner of the second aspect, the portion of the acquiring module that acquires wind speed data based on a sky image is configured to acquire a displacement of a cloud image in the sky image based on two sky images captured by an imaging device of the wearable device at a specific time interval; and determining the wind speed data according to the displacement of the cloud image, the height of the cloud from the ground, the elevation angle of the camera device during shooting, the focal length of the camera device and the specific time interval.
With reference to the second implementation manner of the second aspect, in a fifth implementation manner of the second aspect, the acquiring module, where the portion that acquires the wind direction data based on the sky image, is configured to acquire a displacement of a cloud image in the sky image based on two sky images that are captured by an imaging device of the wearable device at a specific time interval; and determining the wind direction data according to the displacement of the cloud image, the height of the cloud from the ground, the elevation angle of the camera device during shooting, the focal length of the camera device and the specific time interval.
With reference to the first implementation manner of the second aspect, in a sixth implementation manner of the second aspect, the first receiving module includes:
a first obtaining unit configured to obtain historical weather data of the location according to location data of the wearable device;
an input unit configured to input the sky color data, the wind speed data, the wind direction data, the temperature data, the barometric pressure data, and historical weather data into a trained second model;
a second obtaining unit configured to obtain the weather prediction data output by the second model.
With reference to the sixth implementation manner of the second aspect, in a seventh implementation manner of the second aspect, the historical weather data includes local weather data for the same day in previous years and/or local weather data within a preset time period of that day.
With reference to the second aspect, the present disclosure provides in an eighth implementation manner of the second aspect, the weather forecast data includes any one or more of the following data at a specific time or time period in the future: air temperature, air pressure, precipitation probability, wind speed, wind direction, and air quality.
With reference to the second aspect, in a ninth implementation manner of the second aspect, the apparatus further includes: a first determination module configured to determine a garbage type and a garbage contour based on the garbage image;
and the overlapping module is configured to display the garbage image and display a color contour line on the garbage image along the garbage contour in an overlapping mode, wherein the color of the color contour line corresponds to the garbage type.
With reference to the second aspect, in a tenth implementation manner of the second aspect, the apparatus further includes:
a second receiving module configured to receive a signal transmitted by a signal transmitter worn by a user;
a second determination module configured to determine a direction and distance of movement of the signal transmitter relative to the signal receiver from the signal;
a third determination module configured to determine a user instruction from the movement direction and the movement distance.
With reference to the tenth implementation manner of the second aspect, in an eleventh implementation manner of the second aspect, the signal is an infrared signal;
the second determining module includes: a first determination unit configured to determine the movement distance according to a wavelength variation of the infrared signal.
With reference to the tenth implementation manner of the second aspect, in a twelfth implementation manner of the second aspect, the third determining module includes:
the second determination unit is configured to match the motion direction and the motion distance with the motion direction and the motion distance of a preset gesture respectively, and determine a user gesture according to the matching degree;
a third determination unit configured to determine the user instruction according to the user gesture.
In a third aspect, the disclosed embodiments provide an electronic device comprising a memory and a processor, wherein the memory is configured to store one or more computer instructions, wherein the one or more computer instructions are executed by the processor to implement the method steps for a wearable device in the first aspect.
In a fourth aspect, an embodiment of the present disclosure provides a readable storage medium, on which computer instructions are stored, and when executed by a processor, the computer instructions implement the method according to the first aspect, or any one of the first to twelfth implementation manners of the first aspect.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
according to the technical scheme, the wearable equipment is used for obtaining weather related data, then the weather related data and the position data of the wearable equipment are sent to the server, the server predicts weather according to the received weather related data and the position data of the wearable equipment, then the weather prediction data are sent to the wearable equipment, and the wearable equipment displays the weather prediction data. According to the technical scheme, the wearable equipment can be used for shooting the sky image in real time to obtain weather related data and sending the weather related data to the server for data analysis to obtain weather prediction data, so that the limitation of obtaining weather prediction information from the Internet is overcome, weather prediction can be carried out at different times and places, and the real-time performance of weather prediction is effectively improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
Other features, objects, and advantages of the present disclosure will become more apparent from the following detailed description of non-limiting embodiments when taken in conjunction with the accompanying drawings. In the drawings:
fig. 1 shows an application scenario diagram of a method for a wearable device according to an embodiment of the present disclosure;
fig. 2 shows a flow diagram of a method for a wearable device according to an embodiment of the present disclosure;
FIG. 3 illustrates a flow chart for acquiring wind speed data based on a sky image in accordance with an embodiment of the present disclosure;
figs. 4a to 4d show schematic diagrams of the acquisition of wind speed data;
FIG. 5 illustrates a flow chart for acquiring wind direction data based on a sky image in accordance with an embodiment of the present disclosure;
figs. 6a to 6b show schematic diagrams of the acquisition of wind direction data;
fig. 7 illustrates a flow diagram for deriving weather prediction data based on weather-related data and location data of a wearable device, in accordance with an embodiment of the disclosure;
fig. 8 shows a flow diagram of a method of a wearable device according to an embodiment of the present disclosure;
fig. 9 shows a flow diagram of a method of a wearable device according to an embodiment of the present disclosure;
fig. 10 shows a block diagram of an apparatus for a wearable device according to an embodiment of the present disclosure;
fig. 11 shows a block diagram of a first receiving module according to an embodiment of the present disclosure;
fig. 12 shows a block diagram of an apparatus for a wearable device according to an embodiment of the present disclosure;
fig. 13 shows a block diagram of an apparatus for a wearable device according to an embodiment of the present disclosure;
fig. 14 shows a block diagram of an electronic device according to an embodiment of the present disclosure;
fig. 15 shows a schematic structural diagram of a computer system suitable for implementing a method for a wearable device according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, exemplary embodiments of the present disclosure will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily implement them. Also, for the sake of clarity, parts not relevant to the description of the exemplary embodiments are omitted in the drawings.
In the present disclosure, it is to be understood that terms such as "including" or "having," etc., are intended to indicate the presence of the disclosed features, numbers, steps, behaviors, components, parts, or combinations thereof, and are not intended to preclude the possibility that one or more other features, numbers, steps, behaviors, components, parts, or combinations thereof may be present or added.
It should be further noted that the embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
As mentioned above, the wearable device is a portable device, which not only can be used for communication, photographing and playing games, but also can realize functions including positioning, information processing, fingerprint scanning and the like. With the development of various application programs and hardware devices, functions of wearable devices are becoming more abundant, and the wearable devices become indispensable devices in daily life of people.
Existing wearable devices generally obtain weather forecast information from the network via Internet access, which makes it convenient to obtain and update forecast information for a given area and time period. However, weather forecast information obtained in this way usually takes a city or region as the minimum spatial unit, has a large forecasting time span and low real-time performance, generally forecasts weather conditions from several hours to several days ahead, and is not highly accurate.
In view of the foregoing drawbacks, in the technical solution provided by the embodiments of the present disclosure, a wearable device obtains weather-related data and sends the weather-related data and the location data of the wearable device to a server; the server predicts the weather from the received weather-related data and location data and sends the weather prediction data back to the wearable device, which displays it. In this way, the wearable device can capture sky images in real time to obtain weather-related data and send them to the server for analysis to obtain weather prediction data, which overcomes the limitations of obtaining weather forecast information from the Internet, allows weather prediction to be performed at different times and places, and effectively improves the real-time performance of weather prediction.
Fig. 1 shows an application scenario diagram of a method for a wearable device according to an embodiment of the present disclosure.
As shown in fig. 1, the wearable device 100 according to the embodiment of the present disclosure is illustrated by taking smart glasses as an example; it is understood that the wearable device 100 may also be another smart device, such as a smart bracelet, a smart watch, a smart helmet, a smart arm band or a smart phone. Fig. 1 shows an application scenario of the method of the present disclosure for smart glasses. The smart glasses may capture an image of the sky, for example through a camera disposed in the frame of the glasses, recognize the image to obtain preliminary analysis information, and then send the preliminary analysis information to a server 200 for data processing. The server 200 has fast data-processing capability and, by combining the preliminary analysis information obtained from the smart glasses in real time, can produce a more accurate weather prediction result. After processing, the server 200 sends the weather prediction result to the smart glasses so that the user of the smart glasses can make the next decision. It is understood that the method for a wearable device of the present disclosure can be applied in fields that require real-time data analysis, such as weather prediction and traffic prediction; the above exemplary description does not constitute a limitation of the present disclosure.
Fig. 2 shows a flow diagram of a method for a wearable device according to an embodiment of the present disclosure.
As shown in fig. 2, the method for a wearable device comprises the following steps S101-S104:
in step S101, weather-related data is acquired;
in step S102, transmitting the weather-related data and the location data of the wearable device to a server;
in step S103, receiving weather prediction data from the server, the weather prediction data being derived based on the weather-related data and the location data of the wearable device;
in step S104, the weather prediction data is displayed;
according to an embodiment of the present disclosure, the weather-related data includes: sky color data, wind speed data, wind direction data, air temperature data, and air pressure data.
According to an embodiment of the present disclosure, the sky color data, the wind speed data, and the wind direction data are obtained based on a sky image captured by an imaging device of the wearable apparatus; the air temperature data and the air pressure data are obtained based on a sensor of the wearable device.
According to embodiments of the present disclosure, a wearable device may include one or more cameras through which capturing of images of the sky may be enabled. In the disclosed embodiments, the wearable device is further provided with an air pressure sensor, a temperature sensor, and the like. The air pressure sensor is used for acquiring air pressure data, and the temperature sensor is used for acquiring air temperature data. The specific positions of the air pressure sensor and the temperature sensor can be flexibly set according to the needs, and are not limited herein.
According to the embodiment of the disclosure, the sky color data, the wind speed data, the wind direction data, the air temperature data, the air pressure data and the location data of the wearable device can be packaged in JSON format to facilitate data exchange with the server.
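As an illustration of the JSON packaging step, the following Python sketch shows how the collected values and the device location might be serialized; the field names (sky_color_rgb, wind_speed_mps, etc.) and units are assumptions made for illustration, not taken from the patent.

```python
import json

def build_weather_payload(sky_rgb, wind_speed, wind_direction,
                          temperature, pressure, latitude, longitude):
    """Package weather-related data and device location as a JSON string.

    Field names are illustrative; the patent only requires that the data
    be serialized to JSON for exchange with the server.
    """
    payload = {
        "sky_color_rgb": sky_rgb,          # e.g. [135, 206, 235]
        "wind_speed_mps": wind_speed,      # metres per second
        "wind_direction_deg": wind_direction,
        "temperature_c": temperature,
        "pressure_hpa": pressure,
        "location": {"lat": latitude, "lon": longitude},
    }
    return json.dumps(payload)

# Example:
# build_weather_payload([135, 206, 235], 3.2, 270.0, 18.5, 1012.3, 31.23, 121.47)
```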
According to the embodiment of the disclosure, when the server performs weather prediction, it may first use the location data of the wearable device to determine whether weather prediction data already exist for the location of the wearable device; if so, it directly sends those weather prediction data to the communication device. Otherwise, it performs weather prediction based on the sky color data, the wind speed data, the wind direction data, the air temperature data, the air pressure data and the location data of the wearable device, and sends the resulting weather prediction data to the communication device.
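The server-side branching described above can be sketched as follows; prediction_cache and predict_fn are assumed interfaces standing in for the server's stored predictions and its prediction model, and the coarse rounding of coordinates into a cache key is an illustrative choice.

```python
def handle_request(payload, prediction_cache, predict_fn):
    """Server-side sketch: reuse an existing prediction for the device's
    location if one is available, otherwise run the prediction model.

    `prediction_cache` and `predict_fn` are assumed interfaces, not part
    of the patent text.
    """
    location_key = (round(payload["location"]["lat"], 2),
                    round(payload["location"]["lon"], 2))   # coarse spatial key
    cached = prediction_cache.get(location_key)
    if cached is not None:
        return cached                       # send the existing prediction back
    prediction = predict_fn(payload)        # predict from sky colour, wind, etc.
    prediction_cache[location_key] = prediction
    return prediction
```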
According to an embodiment of the present disclosure, acquiring sky color data based on a sky image includes:
inputting the sky image into a trained first model, the first model identifying a sky portion in the sky image and outputting the sky color data.
In the present disclosure, the first model may be a neural network model under the TensorFlow.js framework. TensorFlow.js is a front-end deep learning framework based on TensorFlow; it can train a neural network model directly in the browser of the wearable device without downloading or installing other applications, and can also run the trained model directly for data processing.
In the method, the trained first model is used for identifying the sky part in the sky image, and sky color data are output. The sky color data may be an RGB value or an HSB value of a color, and different sky color data may be obtained according to different training data selected during the training of the first model, which is not limited by the present disclosure.
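A minimal sketch of how sky color data could be derived once the first model has identified the sky portion: here the model is assumed to output a binary sky mask, and the sky color is taken as the mean RGB of the masked pixels. The mask-based interface and the NumPy implementation are assumptions; the patent does not prescribe the model's output format.

```python
import numpy as np

def sky_color_from_mask(image_rgb, sky_mask):
    """Given an RGB sky image (H x W x 3, uint8) and a binary mask marking
    the sky pixels (H x W), as assumed to be produced by a trained
    segmentation model, return the mean RGB value of the sky region.

    The segmentation model itself is not shown; this only illustrates how
    sky colour data could be derived from its output.
    """
    sky_pixels = image_rgb[sky_mask.astype(bool)]
    if sky_pixels.size == 0:
        raise ValueError("no sky pixels found in the image")
    return sky_pixels.mean(axis=0)  # array([R, G, B]) as floats
```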
FIG. 3 illustrates a flow chart for acquiring wind speed data based on a sky image in accordance with an embodiment of the present disclosure.
As shown in fig. 3, the acquiring wind speed data based on the sky image includes the following steps S201 to S202:
in step S201, based on two sky images captured by an imaging device of the wearable device at a specific time interval, obtaining a displacement of a cloud image in the sky images;
in step S202, the wind speed data is determined according to the displacement of the cloud image, the height of the cloud from the ground, the elevation angle of the camera during shooting, the focal length of the camera, and the specific time interval.
According to the embodiment of the disclosure, the wind speed data can be determined by calculating the displacement of a cloud after the wind has blown it across the sky. In the present disclosure, two sky images are captured at a specific time interval, and the displacement of the cloud image in the sky images is analyzed to determine the displacement of the cloud in the actual sky, thereby determining the wind speed data. According to an embodiment of the present disclosure, the two sky images may be extracted from a captured video segment.
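If the two sky images are cut from a recorded video, frame extraction might look like the following OpenCV sketch; the use of OpenCV and the fixed-fps seek are assumptions for illustration only.

```python
import cv2

def two_frames_at_interval(video_path, interval_s):
    """Grab two frames separated by roughly `interval_s` seconds from a
    recorded sky video (an assumed workflow; the patent only says the two
    images may be cut from a captured video).
    """
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    ok, first = cap.read()
    if not ok:
        raise IOError("could not read first frame")
    # skip ahead by interval_s seconds (the final read supplies the last frame)
    for _ in range(int(round(fps * interval_s)) - 1):
        cap.grab()
    ok, second = cap.read()
    cap.release()
    if not ok:
        raise IOError("could not read second frame")
    return first, second
```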
In the present disclosure, figs. 4a to 4d are schematic diagrams illustrating the principle of acquiring wind speed data. Referring to figs. 4a to 4d, the wind speed data is determined from the displacement of the cloud in the sky over the specific time interval, specifically by the following formulas:
v = S/t
S = √(S_tan² + S_rad²)
S_tan = (h/sinθ)·sinα, S_rad = h·(cotθ − cotθ′),
θ′ = θ + Δθ, Δθ = sin⁻¹(l_v·sinθ/h), α = sin⁻¹(l_h·sinθ/h),
l_v = p·l_v(screen), l_h = p·l_h(screen), p = h/(sinθ·f)
where v denotes the wind speed; t denotes the specific time interval; S denotes the displacement of the cloud in the actual sky; S_tan is the tangential displacement of the actual cloud, perpendicular to the line from a point on the cloud to the shooting position (the direction of that line being called the radial direction); S_rad is the radial displacement of the actual cloud along the radial direction; h denotes the actual height of the cloud, which can be calculated from the local dew-point temperature and the surface temperature; θ denotes the elevation angle of the device when shooting the cloud image, obtained from a gyroscope; l_v(screen) denotes the vertical displacement of the cloud image in the sky image; l_h(screen) denotes the horizontal displacement of the cloud image in the sky image; p denotes the size scale between the actual cloud and the cloud image; and f denotes the imaging focal length.
As shown in fig. 4a, D is the actual distance from the wearable device to the cloud, i.e. the length AC in fig. 4c; from the similarity of the triangles T1 and T2, the scale p between the actual cloud and the imaged cloud can be calculated.
As shown in fig. 4b, the cloud image moves from point C to point C′, and the displacement CC′ can be decomposed into two directions, namely l_v(screen) and l_h(screen).
Since l_v(screen) and l_h(screen) can be measured from the image, l_v and l_h can be determined using the calculated scale p. The height h of the cloud above the ground can be calculated from the local dew-point temperature and the surface temperature, and the length AC = h/sinθ (i.e. D in fig. 4a) can be determined from h and θ. From l_v, l_h and AC, the angles α and Δθ can be determined, i.e. from sinΔθ = l_v/AC and sinα = l_h/AC it follows that:
Δθ = sin⁻¹(l_v·sinθ/h), α = sin⁻¹(l_h·sinθ/h).
As shown in fig. 4c, ∠CAB is the elevation angle θ measured by the gyroscope. Then
S_rad = CC′_rad = AB − AB′ = BC·cotθ − B′C′·cotθ′ = h·(cotθ − cotθ′).
As shown in fig. 4d, since the radial displacement is negligible relative to the distances AC and AB from the wearable device to the cloud,
S_tan = CC′_tan = AC·sinα = (BC/sinθ)·sinα = (h/sinθ)·sinα.
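The formulas above can be collected into a single routine. The following sketch assumes that S is the root-sum-square of the tangential and radial components (the combined-displacement formula appears only as an image in the source) and that all angles are given in radians.

```python
import math

def wind_speed(l_v_screen, l_h_screen, h, theta, f, t):
    """Wind speed from the cloud-image displacement, following the formulas
    in the description.

    l_v_screen, l_h_screen: vertical / horizontal displacement of the cloud
        image between the two sky images (same length unit as f).
    h: cloud height above ground; theta: camera elevation angle in radians;
    f: imaging focal length; t: time interval between the two images (s).
    The unit of h determines the unit of the returned speed (per second).
    """
    p = h / (math.sin(theta) * f)            # scale between real cloud and image
    l_v = p * l_v_screen                     # real-world vertical displacement
    l_h = p * l_h_screen                     # real-world horizontal displacement
    delta_theta = math.asin(l_v * math.sin(theta) / h)
    alpha = math.asin(l_h * math.sin(theta) / h)
    theta_prime = theta + delta_theta
    s_tan = (h / math.sin(theta)) * math.sin(alpha)                    # tangential
    s_rad = h * (1.0 / math.tan(theta) - 1.0 / math.tan(theta_prime))  # radial
    s = math.hypot(s_tan, s_rad)             # total cloud displacement
    return s / t
```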
Fig. 5 illustrates a flowchart for acquiring wind direction data based on a sky image according to an embodiment of the present disclosure.
As shown in fig. 5, the acquiring of wind direction data based on sky images includes the following steps S301 to S302:
in step S301, based on two sky images captured by an imaging device of the wearable device at a specific time interval, obtaining a displacement of a cloud image in the sky image;
in step S302, the wind direction data is determined according to the displacement of the cloud image, the height of the cloud from the ground, the elevation angle of the camera during shooting, the focal length of the camera, and the specific time interval.
Figs. 6a to 6b show schematic diagrams of acquiring wind direction data according to an embodiment of the present disclosure. Referring to figs. 6a to 6b, the wind direction data is determined according to the displacement of the cloud image, the height of the cloud above the ground, the elevation angle of the camera during shooting, the focal length of the camera and the specific time interval, specifically by the following formulas:
β = tan⁻¹(S_tan/S_rad)
γ = tan⁻¹(l_h(screen)/l_v(screen))
As shown in fig. 6a, ∠ACC′ is the cone half-angle β. A conical surface can be constructed with the line from the observation point A to the cloud start point C as the central line of the cone; the cloud end point C′ is a point on this conical surface.
As shown in fig. 6b, from tanγ = l_h(screen)/l_v(screen) it can be deduced that:
γ = tan⁻¹(l_h(screen)/l_v(screen)).
After the elevation angle θ, the cone half-angle β and γ have been obtained, the displacement CC′ = √(S_tan² + S_rad²) can be calculated
and the coordinates of C and C′ determined.
Referring to fig. 4c, take the ground projection of AC as the X-axis, the direction in the ground plane perpendicular to the projection of AC as the Y-axis, and the direction perpendicular to the ground as the Z-axis, with point A at the origin (0, 0, 0). Then
the coordinates of C are (AC·sinθ, 0, AC·cosθ);
the coordinates of C′ are (AC·sinθ + CC′·cosθ·cosβ, CC′·sinβ·sinγ, AC·cosθ − CC′·sinθ·cosβ).
Thus, the wind direction data can be expressed as the direction of the displacement vector from C to C′, i.e. (C′ − C)/|CC′|.
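A companion sketch for the wind direction: β and γ follow the formulas above, and the conversion of β into a compass bearing by adding the camera's azimuth is an assumption made for illustration, since the patent expresses the direction via the coordinates of C and C′.

```python
import math

def wind_direction(l_v_screen, l_h_screen, h, theta, f, camera_azimuth=0.0):
    """Sketch of the wind-direction estimate: beta is the angle of the
    cloud's displacement relative to the radial direction, gamma the
    corresponding angle in the image plane. Adding the camera azimuth to
    obtain a compass bearing is an assumption, not stated in the patent.
    All angles are in radians.
    """
    p = h / (math.sin(theta) * f)
    l_v = p * l_v_screen
    l_h = p * l_h_screen
    delta_theta = math.asin(l_v * math.sin(theta) / h)
    alpha = math.asin(l_h * math.sin(theta) / h)
    theta_prime = theta + delta_theta
    s_tan = (h / math.sin(theta)) * math.sin(alpha)
    s_rad = h * (1.0 / math.tan(theta) - 1.0 / math.tan(theta_prime))
    beta = math.atan2(s_tan, s_rad)             # cone half-angle, tan(beta) = S_tan/S_rad
    gamma = math.atan2(l_h_screen, l_v_screen)  # angle in the image plane
    bearing = (camera_azimuth + beta) % (2 * math.pi)
    return beta, gamma, bearing
```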
fig. 7 shows a flow chart for deriving weather prediction data based on weather-related data and location data of a wearable device according to an embodiment of the disclosure.
As shown in fig. 7, the obtaining weather prediction data based on the weather-related data and the location data of the wearable device includes the following steps S401 to S403:
in step S401, obtaining historical weather data of the location according to the location data of the wearable device;
in step S402, inputting the sky color data, the wind speed data, the wind direction data, the air temperature data, the air pressure data, and historical weather data into a trained second model;
in step S403, the weather prediction data output by the second model is acquired.
According to the embodiment of the present disclosure, the location data of the wearable device may be GPS positioning data of the wearable device at the current time, specifically, may be coordinates of a location point, or may be a predetermined area range centered on the coordinates of the location point. For example, a range of 500 meters near a location point may be considered as the location of the wearable device.
According to an embodiment of the present disclosure, the historical weather data may be historical weather data of the city, region or street where the wearable device is located. In the present disclosure, if the location of the wearable device obtained from its location information is accurate to a street or a neighborhood, it may first be determined whether historical weather data exist for that street or neighborhood; if not, the search may be expanded to the region or city in which the street or neighborhood is located, and the historical weather data of that region or city may be used as the historical weather data for the location of the wearable device. It can be understood that, if the location of the wearable device is a predetermined area range centered on the coordinates of a location point, the historical weather data are first looked up for the location point and then the search is gradually expanded to the predetermined area range.
For example, if the location of the wearable device determined from its location data is location A, the historical weather data may be the historical weather data of location A, the historical weather data of the city in which location A is located, the historical weather data of an area within 5 km of location A, or the like.
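The widening search described above (street, then region, then city) might be sketched as follows; history_db and the granularity labels are assumed data structures, not part of the patent.

```python
def historical_weather(location, history_db):
    """Look up historical weather data for the device location, widening
    the search area step by step (street -> district -> city), as the
    description suggests. `history_db` is an assumed mapping from an
    (area name, granularity) key to stored weather records; `location`
    is an assumed dict with "street", "district" and "city" entries.
    """
    for granularity in ("street", "district", "city"):
        key = (location.get(granularity), granularity)
        records = history_db.get(key)
        if records:
            return records
    return None  # no historical data available for this location
```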
According to the embodiment of the disclosure, the historical weather data include local weather data for the same day in previous years and/or local weather data within a preset time period of that day.
The preset time period may be a period of time for which weather data are retained. The preset time period can be chosen flexibly when obtaining the historical weather data; if historical weather data for the entire day are retained locally, the historical weather data for the preset time period matching the current time of the wearable device are preferentially selected.
For example, if the location of the wearable device determined from its location data is location A and the current date is November 5, the historical weather data may be the weather data of location A on November 5 of last year or of the year before, or the weather data of location A on November 5 of last year or of the year before between 6 and 7 am.
According to the embodiment of the present disclosure, similarly to the first model, the second model may also be a neural network model under the TensorFlow.js framework.
According to an embodiment of the present disclosure, the weather forecast data includes any one or more of the following data at a particular time or time period in the future: air temperature, air pressure, precipitation probability, wind speed, wind direction, and air quality.
According to the embodiment of the disclosure, because historical weather data are combined with current, local data such as the sky color data, wind speed data, wind direction data, air temperature data and air pressure data, the weather prediction data obtained with the second model are more reliable.
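A hedged sketch of running the second model on the combined features follows. The patent describes a TensorFlow.js model, so this Python/Keras call is only an analogous illustration, and the feature ordering and normalization are assumptions.

```python
import numpy as np
import tensorflow as tf

def predict_weather(model_path, sky_rgb, wind_speed, wind_direction,
                    temperature, pressure, history_features):
    """Run a trained second model on the combined current and historical
    features. The feature layout and the Keras interface are assumptions;
    the patent only requires feeding these inputs to a trained model.
    """
    model = tf.keras.models.load_model(model_path)
    features = np.concatenate([
        np.asarray(sky_rgb, dtype=np.float32) / 255.0,   # normalised colour
        [wind_speed, wind_direction, temperature, pressure],
        np.asarray(history_features, dtype=np.float32),
    ]).reshape(1, -1)
    prediction = model.predict(features)   # e.g. temperature, pressure,
    return prediction[0]                   # precipitation probability, ...
```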
Fig. 8 shows a flow diagram of a method of a wearable device according to an embodiment of the present disclosure.
As shown in fig. 8, the method for a wearable device further comprises the following steps S105-S106:
in step S105, a garbage type and a garbage contour are determined based on the garbage image;
in step S106, the garbage image is displayed, and a color contour line is superimposed on the garbage image along the garbage contour, where the color of the color contour line corresponds to the garbage type.
According to an embodiment of the present disclosure, the garbage types may include the following types: plastic waste, waste paper, household garbage, electronic waste, metal waste and the like.
According to the embodiment of the disclosure, the garbage contour can be determined according to how completely the garbage appears in the image; for example, it may be the contour of a whole piece of garbage or of only part of it. When the color contour line is superimposed, the superimposition mode is chosen according to the determined garbage contour; for example, when only a partial garbage contour has been determined, the color contour line of the corresponding part is superimposed.
Since the processor may determine several garbage types, when there are several different types of garbage to be processed, a different color contour line may be preset for each garbage type. For example, a red line is superimposed on the outline of metal waste and a green line on the outline of plastic waste.
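Overlaying the colored contour lines could be done with OpenCV as below; the color map, the detections format and the line thickness are illustrative assumptions.

```python
import cv2

# Illustrative colour map; the patent only requires that each garbage type
# correspond to a distinct contour colour.
TYPE_COLORS_BGR = {
    "metal": (0, 0, 255),     # red outline for metal waste
    "plastic": (0, 255, 0),   # green outline for plastic waste
    "paper": (255, 0, 0),
}

def overlay_garbage_contours(image_bgr, detections):
    """Draw a coloured contour line over each detected piece of garbage.

    `detections` is an assumed list of (garbage_type, contour) pairs, where
    each contour is an OpenCV point array such as those returned by
    cv2.findContours.
    """
    out = image_bgr.copy()
    for garbage_type, contour in detections:
        color = TYPE_COLORS_BGR.get(garbage_type, (255, 255, 255))
        cv2.drawContours(out, [contour], -1, color, thickness=2)
    return out
```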
Fig. 9 shows a flow diagram of a method of a wearable device according to an embodiment of the present disclosure.
As shown in fig. 9, the method for a wearable device further comprises the following steps S107-S109:
in step S107, receiving a signal transmitted by a signal transmitter worn by a user;
in step S108, determining the movement direction and the movement distance of the signal transmitter relative to the signal receiver according to the signal;
in step S109, a user instruction is determined according to the movement direction and the movement distance.
According to embodiments of the present disclosure, the signal receiver and the signal transmitter may each be a wearable device worn by the user. For example, the signal receiver may be a smart bracelet and the signal transmitter may be a smart helmet.
According to the embodiment of the present disclosure, the signal transmitter is used for transmitting signals, which may be infrared signals or radio wave signals. If the signal is an infrared signal, the signal receiver may be an infrared receiver for receiving the infrared signal. If the signal is a radio wave signal, the signal receiver can be a smart device with a WIFI or Bluetooth function, and the radio wave signal is transmitted between the signal receiver and the signal transmitter through WIFI, Bluetooth or another short-range communication network.
According to an embodiment of the present disclosure, when the signal is an infrared signal, determining the movement distance of the signal transmitter relative to the signal receiver from the signal includes determining the movement distance from a change in wavelength of the infrared signal.
According to an embodiment of the present disclosure, the relative displacement between the signal receiver and the signal transmitter is determined by a processor from the wavelength change of the infrared signal according to the Doppler effect. According to an embodiment of the present disclosure, the signal receiver can sense from which direction the signal transmitter transmits the signal, so the processor can determine the relative movement direction of the signal transmitter and the signal receiver. The processor can therefore obtain the relative movement direction and movement distance of the signal transmitter and the signal receiver; by acquiring and processing these in a polling manner, the movement track of the user's limb can be obtained, and the user's operation instruction for the wearable device can be determined from this movement track.
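The Doppler-based distance estimate might be sketched as follows, using the non-relativistic relation v ≈ c·Δλ/λ integrated over one sampling interval; the patent invokes the Doppler effect but gives no formula, so this is a simplified illustration rather than the patent's method.

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_displacement(lambda_emitted, lambda_received, dt):
    """Estimate the radial movement of the signal transmitter relative to
    the receiver over one sampling interval from the wavelength shift of
    the infrared signal, using the non-relativistic Doppler relation
    v ~= c * (lambda_received - lambda_emitted) / lambda_emitted.
    This is an illustrative simplification; positive values mean the
    transmitter is moving away from the receiver.
    """
    relative_speed = C * (lambda_received - lambda_emitted) / lambda_emitted
    return relative_speed * dt
```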
According to an embodiment of the present disclosure, the determining a user instruction according to the moving direction and the moving distance includes: matching the motion direction and the motion distance with the motion direction and the motion distance of a preset gesture respectively, and determining a user gesture according to the matching degree; determining the user instruction according to the user gesture.
In the present disclosure, the preset gestures may be gesture options preset on the wearable device, each associated with a predetermined operation; when the movement track determined from the movement direction and the movement distance has the highest matching degree with one of the gesture options, that option is selected as the user gesture, and the wearable device is operated according to the user instruction corresponding to the user gesture.
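Gesture matching against the preset options could look like the sketch below; the preset gestures, the command mapping and the error-weighting used as the "matching degree" are all assumptions made for illustration.

```python
import math

# Illustrative preset gestures: each is a (direction_rad, distance_m) pair.
PRESET_GESTURES = {
    "swipe_right": (0.0, 0.15),
    "swipe_up": (math.pi / 2, 0.15),
    "push": (0.0, 0.05),
}

GESTURE_COMMANDS = {          # assumed mapping from gesture to user instruction
    "swipe_right": "next_page",
    "swipe_up": "scroll_up",
    "push": "confirm",
}

def match_gesture(direction, distance):
    """Match the measured motion direction/distance against the preset
    gestures and return the command of the best match. The scoring rule
    (weighted direction + distance error) is an assumption; the patent only
    requires selecting the preset gesture with the highest matching degree.
    """
    best_name, best_score = None, float("inf")
    for name, (ref_dir, ref_dist) in PRESET_GESTURES.items():
        angle_err = abs(math.atan2(math.sin(direction - ref_dir),
                                   math.cos(direction - ref_dir)))  # wrap to [0, pi]
        dist_err = abs(distance - ref_dist)
        score = angle_err + 10.0 * dist_err   # lower is better
        if score < best_score:
            best_name, best_score = name, score
    return GESTURE_COMMANDS[best_name]
```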
Fig. 10 shows a block diagram of an apparatus for a wearable device according to an embodiment of the present disclosure. The apparatus may be implemented as part or all of an electronic device through software, hardware, or a combination of both. As shown in fig. 10, the apparatus for a wearable device includes an obtaining module 110, a transmitting module 120, a first receiving module 130, and a display module 140.
The acquisition module 110 is configured to acquire weather-related data;
the transmitting module 120 is configured to transmit the weather-related data and the location data of the wearable device to a server;
the first receiving module 130 is configured to receive weather prediction data from the server, the weather prediction data being derived based on the weather-related data and the location data of the wearable device;
the display module 140 is configured to display the weather projection data.
According to an embodiment of the present disclosure, the weather-related data includes: sky color data, wind speed data, wind direction data, air temperature data, and air pressure data.
According to an embodiment of the present disclosure, the sky color data, the wind speed data, and the wind direction data are obtained based on a sky image captured by an imaging device of the wearable apparatus; the air temperature data and the air pressure data are obtained based on a sensor of the wearable device.
According to an embodiment of the present disclosure, a portion of the acquisition module that acquires sky color data based on a sky image is configured to input the sky image into a trained first model that identifies a sky portion in the sky image and outputs the sky color data.
According to an embodiment of the disclosure, the portion of the acquisition module, which acquires wind speed data based on a sky image, is configured to acquire a displacement of a cloud image in the sky image based on two sky images captured by a camera of the wearable device at a specific time interval; and determining the wind speed data according to the displacement of the cloud image, the height of the cloud from the ground, the elevation angle of the camera device during shooting, the focal length of the camera device and the specific time interval.
According to an embodiment of the disclosure, the acquiring module acquires wind direction data based on a sky image, and is configured to acquire a displacement of a cloud image in the sky image based on two sky images captured by an image capturing device of the wearable device at a specific time interval; and determining the wind direction data according to the displacement of the cloud image, the height of the cloud from the ground, the elevation angle of the camera device during shooting, the focal length of the camera device and the specific time interval.
Fig. 11 illustrates a block diagram of a first receiving module according to an embodiment of the present disclosure.
As shown in fig. 11, the first receiving module 130 includes a first acquiring unit 131, an input unit 132, and a second acquiring unit 133.
The first obtaining unit 131 is configured to obtain historical weather data of the location according to location data of the wearable device;
the input unit 132 is configured to input the sky color data, the wind speed data, the wind direction data, the air temperature data, the air pressure data, and historical weather data into a trained second model;
the second obtaining unit 133 is configured to obtain the weather prediction data output by the second model.
According to the embodiment of the disclosure, the historical weather data include local weather data for the same day in previous years and/or local weather data within a preset time period of that day.
According to an embodiment of the present disclosure, the weather forecast data includes any one or more of the following data at a particular time or time period in the future: air temperature, air pressure, precipitation probability, wind speed, wind direction, and air quality.
Fig. 12 shows a block diagram of an apparatus for a wearable device according to an embodiment of the present disclosure. As shown in fig. 12, the apparatus for a wearable device further includes a first determining module 210 and a superimposing module 220.
The first determination module 210 is configured to determine a garbage type and a garbage contour based on a garbage image;
the overlay module 220 is configured to display the garbage image and to superimpose a color contour line on the garbage image along the garbage contour, the color of the color contour line corresponding to the garbage type.
Fig. 13 shows a block diagram of an apparatus for a wearable device according to an embodiment of the present disclosure. As shown in fig. 13, the apparatus for a wearable device further includes a second receiving module 310, a second determining module 320, and a third determining module 330.
The second receiving module 310 is configured to receive a signal transmitted by a signal transmitter worn by a user;
the second determining module 320 is configured to determine a moving direction and a moving distance of the signal transmitter relative to the signal receiver according to the signal;
the third determination module 330 is configured to determine a user instruction according to the movement direction and the movement distance.
According to an embodiment of the present disclosure, the signal is an infrared signal;
the second determining module 320 includes: a first determination unit configured to determine the movement distance according to a wavelength variation of the infrared signal.
According to an embodiment of the present disclosure, the third determining module 330 includes:
the second determination unit is configured to match the motion direction and the motion distance with the motion direction and the motion distance of a preset gesture respectively, and determine a user gesture according to the matching degree;
a third determination unit configured to determine the user instruction according to the user gesture.
The present disclosure also discloses an electronic device, and fig. 14 shows a block diagram of the electronic device according to an embodiment of the present disclosure.
As shown in fig. 14, the electronic device 1400 includes a memory 1401 and a processor 1402, wherein:
the memory 1401 is used to store one or more computer instructions, which are executed by the processor 1402 to implement the method steps of:
acquiring weather related data;
sending the weather-related data and location data of the wearable device to a server;
receiving weather prediction data from the server, the weather prediction data derived based on the weather-related data and location data of the wearable device;
and displaying the weather forecast data.
Fig. 15 shows a schematic structural diagram of a computer system suitable for implementing a method for a wearable device according to an embodiment of the present disclosure.
As shown in fig. 15, the computer system 1500 includes a central processing unit (CPU) 1501, which can execute various processes in the above-described embodiments according to a program stored in a read-only memory (ROM) 1502 or a program loaded from a storage section 1508 into a random access memory (RAM) 1503. In the RAM 1503, various programs and data necessary for the operation of the system 1500 are also stored. The CPU 1501, the ROM 1502 and the RAM 1503 are connected to each other by a bus 1504. An input/output (I/O) interface 1505 is also connected to the bus 1504.
The following components are connected to the I/O interface 1505: an input portion 1506 including a keyboard, a mouse, and the like; an output portion 1507 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage portion 1508 including a hard disk and the like; and a communication section 1509 including a network interface card such as a LAN card, a modem, or the like. The communication section 1509 performs communication processing via a network such as the internet. A drive 1510 is also connected to the I/O interface 1505 as needed. A removable medium 1511 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 1510 as necessary, so that a computer program read out therefrom is mounted into the storage section 1508 as necessary.
In particular, the above-described methods may be implemented as computer software programs according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program comprising program code for performing the above-described method. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 1509, and/or installed from the removable medium 1511.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units or modules described in the embodiments of the present disclosure may be implemented by software or by programmable hardware. The units or modules described may also be provided in a processor, and the names of the units or modules do not in some cases constitute a limitation of the units or modules themselves.
As another aspect, the present disclosure also provides a computer-readable storage medium, which may be the computer-readable storage medium included in the electronic device or the computer system of the above embodiments, or a separate computer-readable storage medium that is not incorporated into the device. The computer-readable storage medium stores one or more programs that are used by one or more processors to perform the methods described in the present disclosure.
The foregoing description covers only preferred embodiments of the present disclosure and illustrates the principles of the technology employed. Those skilled in the art will appreciate that the scope of the invention in the present disclosure is not limited to technical solutions formed by the specific combination of the above features, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the inventive concept, for example, technical solutions formed by replacing the above features with (but not limited to) features having similar functions disclosed in the present disclosure.

Claims (10)

1. A method for a wearable device, comprising:
acquiring weather related data;
sending the weather-related data and location data of the wearable device to a server;
receiving weather prediction data from the server, the weather prediction data derived based on the weather-related data and location data of the wearable device;
and displaying the weather prediction data.
2. The method of claim 1, wherein the weather-related data comprises: sky color data, wind speed data, wind direction data, air temperature data, and air pressure data.
3. The method of claim 2, wherein the sky color data, wind speed data, and wind direction data are derived based on a sky image captured by a camera of the wearable device; the air temperature data and the air pressure data are obtained through a sensor of the wearable device.
4. The method of claim 3, wherein obtaining the sky color data based on the sky image comprises:
inputting the sky image into a trained first model, the first model identifying a sky portion in the sky image and outputting the sky color data.
5. An apparatus for a wearable device, comprising:
an acquisition module configured to acquire weather-related data;
a transmitting module configured to transmit the weather-related data and location data of the wearable device to a server;
a first receiving module configured to receive weather prediction data from the server, the weather prediction data derived based on the weather-related data and location data of the wearable device;
a display module configured to display the weather prediction data.
6. The apparatus of claim 5, wherein the weather-related data comprises: sky color data, wind speed data, wind direction data, air temperature data, and air pressure data.
7. The apparatus of claim 6, wherein the sky color data, wind speed data, and wind direction data are derived based on a sky image captured by a camera of the wearable device; the air temperature data and the air pressure data are obtained through a sensor of the wearable device.
8. The apparatus of claim 7, wherein the portion of the acquisition module that acquires the sky color data based on the sky image is configured to input the sky image into a trained first model, the first model identifying a sky portion in the sky image and outputting the sky color data.
9. An electronic device comprising a memory and a processor, wherein the memory is configured to store one or more computer instructions, and wherein the one or more computer instructions, when executed by the processor, implement the method steps of any of claims 1-4.
10. A readable storage medium having stored thereon computer instructions, characterized in that the computer instructions, when executed by a processor, carry out the method steps of any of claims 1-4.
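As a rough illustration of the sky-color extraction recited in claims 4 and 8 above, the sketch below approximates the role of the "trained first model" with a simple blue-dominance threshold segmentation followed by color averaging. This stand-in is chosen only to keep the example self-contained without a model dependency; the thresholds, function names, and use of NumPy are assumptions and not part of the disclosure.

```python
# Illustrative sketch only. A trained segmentation model is approximated here by a
# naive blue-dominance threshold; the thresholds and function names are assumptions,
# not part of the disclosure.
import numpy as np


def segment_sky(image_rgb: np.ndarray) -> np.ndarray:
    """Return a boolean mask of pixels treated as 'sky' (stand-in for the first model)."""
    r = image_rgb[..., 0].astype(np.float32)
    g = image_rgb[..., 1].astype(np.float32)
    b = image_rgb[..., 2].astype(np.float32)
    brightness = (r + g + b) / 3.0
    # Sky pixels tend to be bright and blue-dominant; clouds are bright and near-gray.
    return (brightness > 100) & (b >= r) & (b >= g)


def extract_sky_color(image_rgb: np.ndarray) -> np.ndarray:
    """Average RGB over the sky mask, i.e. one possible form of the 'sky color data'."""
    mask = segment_sky(image_rgb)
    if not mask.any():
        # No sky found (e.g. camera pointed indoors); fall back to the whole-image mean.
        return image_rgb.reshape(-1, 3).mean(axis=0)
    return image_rgb[mask].mean(axis=0)


if __name__ == "__main__":
    # Synthetic 4x4 test image: top half bluish "sky", bottom half dark "ground".
    img = np.zeros((4, 4, 3), dtype=np.uint8)
    img[:2] = (120, 160, 220)
    img[2:] = (40, 50, 30)
    print(extract_sky_color(img))  # ~[120. 160. 220.]
```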
CN202010007095.5A 2020-01-03 2020-01-03 Method and device for wearable device, electronic device and readable storage medium Pending CN111242354A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010007095.5A CN111242354A (en) 2020-01-03 2020-01-03 Method and device for wearable device, electronic device and readable storage medium

Publications (1)

Publication Number Publication Date
CN111242354A (en) 2020-06-05

Family

ID=70879641

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010007095.5A Pending CN111242354A (en) 2020-01-03 2020-01-03 Method and device for wearable device, electronic device and readable storage medium

Country Status (1)

Country Link
CN (1) CN111242354A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112489387A (en) * 2020-12-04 2021-03-12 广东电网有限责任公司江门供电局 Power distribution construction site safety early warning method based on weather monitoring
CN115032817A (en) * 2022-05-27 2022-09-09 北京理工大学 Real-time video defogging intelligent glasses for use in severe weather and control method
TWI795027B (en) * 2020-10-13 2023-03-01 美商谷歌有限責任公司 Distributed sensor data processing using multiple classifiers on multiple devices

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150050921A1 (en) * 2013-08-12 2015-02-19 Yahoo! Inc. Displaying location-based images that match the weather conditions
CN108181990A (en) * 2018-01-19 2018-06-19 昆山国显光电有限公司 Control method and smartwatch based on infrared gesture identification
CN109239808A (en) * 2018-08-01 2019-01-18 平安科技(深圳)有限公司 Weather forecast method, device, computer equipment and storage medium
CN110569874A (en) * 2019-08-05 2019-12-13 深圳大学 Garbage classification method and device, intelligent terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2020-06-05