CN109799838B - Training method and system - Google Patents


Info

Publication number: CN109799838B
Application number: CN201811569630.5A
Authority: CN (China)
Inventor: 金季春
Original assignee: Individual
Other versions: CN109799838A (Chinese)
Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications: Rehabilitation Tools (AREA); Toys (AREA)

Abstract

The application provides a training method and system. The method comprises: controlling an unmanned aerial vehicle to fly according to a configured flight trajectory; controlling the unmanned aerial vehicle to generate a tracked signal according to configured space and time information, so that a user tracks the signal; when a confirmation instruction of having tracked the signal, returned by the user through a remote controller, is received, recording the time at which the confirmation instruction is received; when the unmanned aerial vehicle has finished flying along the flight trajectory, matching the recorded reception times of the confirmation instructions against the configured time information to determine a time matching degree; and when the time matching degree is greater than a first preset threshold, determining that the user's eyes have been trained. The method can improve user experience and training efficiency.

Description

Training method and system
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a training method and system.
Background
The main cause of visual deterioration and visual fatigue is overuse or improper use of the eyes, which puts the extraocular and ciliary muscles under long-term tension or spasm and thereby reduces the function of the eyeball.
In view of the growing importance of eye health care, most people have their vision screened and trained by ophthalmologists.
Most existing training instruments resemble therapeutic instruments: the user sits in front of the instrument and performs eye movements and related operations according to prompts from medical staff, thereby training the eyes and preventing visual deterioration.
However, this approach is tedious and inconvenient, the user experience is poor, and children in particular are often unwilling to cooperate.
The number of people with myopia keeps rising rather than falling, and new preventive measures are needed.
Disclosure of Invention
In view of this, the present application provides a training method and system that can improve both user experience and training efficiency.
To solve this technical problem, the technical solution of the application is implemented as follows:
A training method, the method comprising:
controlling an unmanned aerial vehicle to fly according to a configured flight trajectory;
controlling the unmanned aerial vehicle to generate a tracked signal according to configured space and time information, so that a user tracks the signal;
when a confirmation instruction of having tracked the signal, returned by the user through a remote controller, is received, recording the time at which the confirmation instruction is received;
when the unmanned aerial vehicle has finished flying along the flight trajectory, matching the recorded reception times of the confirmation instructions against the configured time information to determine a time matching degree;
and when the time matching degree is determined to be greater than a first preset threshold, determining that the user's eyes have been trained.
A training system comprising a training device, an unmanned aerial vehicle, and a remote controller.
The training device is configured to control the unmanned aerial vehicle to fly according to a configured flight trajectory; control the unmanned aerial vehicle to generate a tracked signal according to configured space and time information, so that a user tracks the signal; when a confirmation instruction of having tracked the signal, returned by the user through the remote controller, is received, record the time at which the confirmation instruction is received; when the unmanned aerial vehicle has finished flying along the flight trajectory, match the recorded reception times of the confirmation instructions against the configured time information to determine a time matching degree; and when the time matching degree is greater than a first preset threshold, determine that the user's eyes have been trained.
The unmanned aerial vehicle is configured to fly along the flight trajectory specified by the training device and to generate the tracked signal at the times specified by the training device.
The remote controller is configured to send the confirmation instruction to the training device upon receiving the confirmation instruction input by the user when tracking the signal generated by the unmanned aerial vehicle.
According to the above technical solution, the unmanned aerial vehicle is controlled to generate, multiple times during flight, a signal that the user can track; the user inputs a confirmation instruction through the remote controller each time the signal is tracked, and the time of each confirmation instruction is recorded. The confirmation times are then matched against the times at which the unmanned aerial vehicle generated the tracked signal, and the matching degree over all time points determines whether the user's eyes have been trained. The scheme trains the user's eyes in a game-like, entertaining way, thereby protecting eyesight while improving user experience and training efficiency.
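The claimed flow can be sketched end to end as follows; the class and method names are illustrative assumptions (not patent text), and times are expressed in seconds from the start of the flight.

```python
class TrainingSession:
    """Hypothetical sketch of the claimed method: record confirmation
    times during the flight, then match them against the configured
    signal times and compare the ratio to the first preset threshold."""

    def __init__(self, signal_times, preset_diff=1.0, threshold=0.8):
        self.signal_times = signal_times    # configured time information
        self.preset_diff = preset_diff      # preset time difference
        self.threshold = threshold          # first preset threshold
        self.confirm_times = []             # recorded reception times

    def record_confirmation(self, t):
        # Step: record the time at which a confirmation instruction arrives
        self.confirm_times.append(t)

    def eyes_trained(self):
        # M: confirmations within preset_diff of some signal time;
        # N: total configured signal time points
        m = sum(1 for c in self.confirm_times
                if any(abs(c - s) < self.preset_diff
                       for s in self.signal_times))
        return m / len(self.signal_times) > self.threshold

session = TrainingSession(signal_times=[0.0, 30.0, 60.0])
for t in (0.3, 30.4, 60.2):     # the user confirms each signal promptly
    session.record_confirmation(t)
print(session.eyes_trained())   # True: 3/3 > 0.8
```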
Drawings
FIG. 1 is a schematic diagram of a training system according to an embodiment of the present application;
FIG. 2 is a schematic flowchart of a training method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a training system according to a second embodiment of the present application;
FIG. 4 is a schematic flowchart of a training method according to the second embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present invention clearer, the technical solutions of the present invention are described in detail below with reference to the accompanying drawings and embodiments.
The embodiment of the application provides a training system in which an unmanned aerial vehicle is controlled to generate, multiple times during flight, a signal that the user can track. The user inputs a confirmation instruction through the remote controller each time the signal is tracked, and the time of each confirmation instruction is recorded. The confirmation times are then matched against the times at which the unmanned aerial vehicle generated the tracked signal, and the matching degree over all time points determines whether the user's eyes have been trained. The scheme trains the user's eyes in a game-like, entertaining way, thereby protecting eyesight while improving user experience and training efficiency.
The embodiments of the present application provide two training systems, described in the following two embodiments:
the first embodiment is as follows:
the training system comprises: trainer, unmanned aerial vehicle and remote controller. Referring to fig. 1, fig. 1 is a schematic diagram of a training system according to a first embodiment of the present application.
The training device can be a training instrument, and can also be a module integrated on certain equipment, such as notebook computers, mobile phones and other equipment; can the flight of wireless control unmanned aerial vehicle to and produce the signal that is tracked, and receive the confirmation instruction that the remote controller sent, the time of the confirmation instruction of record receipt also can be through wireless connection with the remote controller, and the convenience of customers removes the signal that comes better tracking unmanned aerial vehicle to produce like this.
The training device is provided with one or more flight tracks in advance, and the aircraft tracks can be configured in advance according to the use object, the suitable environment, the training effect on eyes and the like.
For each flight trajectory, spatial position and time information can be configured for generating a spatial position and a time point of a tracked signal, for example, how long an interval is generated to generate the tracked signal, the interval time can be the same or different, for example, the time for controlling the unmanned aerial vehicle to generate the tracked signal for the first time is 13:05:00, and the time for generating the tracked signal for the second time is 13:05:30 if the time for generating the tracked signal for the second time is 30 seconds.
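As a concrete illustration of the interval-based configuration just described, the following sketch expands a first signal time and a list of intervals into the configured time points. The helper name and its parameters are assumptions for illustration, not part of the patent.

```python
from datetime import datetime, timedelta

def signal_times(first_time: str, intervals_s: list) -> list:
    """Expand a first signal time plus per-signal intervals (which may
    be equal or different) into a list of "HH:MM:SS" time points."""
    t = datetime.strptime(first_time, "%H:%M:%S")
    times = [t]
    for gap in intervals_s:
        t += timedelta(seconds=gap)
        times.append(t)
    return [x.strftime("%H:%M:%S") for x in times]

# The example from the text: first signal at 13:05:00, interval 30 s.
print(signal_times("13:05:00", [30]))   # ['13:05:00', '13:05:30']
```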
The above is only one way of implementing the time information; the implementation is not limited to this form.
The remote controller may receive user input, such as a key press, to generate a confirmation instruction, and sends the generated confirmation instruction to the training device.
The drone can be controlled by a training device and generate a tracked signal.
The training device controls the unmanned aerial vehicle to fly according to the configured flight track;
and the unmanned aerial vehicle is used for flying according to the flight track specified by the training device.
And the training device controls the unmanned aerial vehicle to generate a tracked signal according to the configured time information, so that the user tracks the signal.
The unmanned aerial vehicle generates a tracked signal according to the time designated by the training device;
the duration of each time the drone is controlled to generate the tracked signal is a first preset time, wherein the configured time information comprises a plurality of time points. The first preset time in the implementation of this step may be set according to actual needs, for example, the reaction time for a person to see the tracked signal.
In the embodiment of the present application, the tracked signal is a signal that can be seen by the user, such as a flashing signal light, or a colored smoke band, but is not limited to the signal display shown.
The remote controller may be handheld: the user tracks the unmanned aerial vehicle while holding it and, upon observing the tracked signal generated by the unmanned aerial vehicle, presses the relevant key on the remote controller to input a confirmation instruction for the tracked signal.
When the remote controller receives a confirmation instruction input while the user is tracking the signal generated by the unmanned aerial vehicle, it sends the confirmation instruction to the training device.
When the training device receives a confirmation instruction of having tracked the signal, returned by the user through the remote controller, it records the time at which the confirmation instruction is received. When the unmanned aerial vehicle has finished flying along the flight trajectory, the training device matches the recorded reception times of the confirmation instructions against the configured time information to determine the matching degree; when the time matching degree is greater than a first preset threshold, it determines that the user's eyes have been trained.
the training device matches the recorded time when the confirmation instruction is received with the configured time information, and determines the matching degree, specifically comprising:
when the difference between the time of any confirmation instruction and the time of generating the tracked signal is smaller than a preset difference, determining that the time of the confirmation instruction is matched with the time of the tracked signal; counting the number M of the time of the confirmation instruction matched with the time of the tracked signal; and determining the ratio of M to N as the matching degree, wherein N is the total number of time points in the time information for generating the tracked signal.
The difference value of the two times is determined to be matched with the two times within the preset difference value, and the time for controlling the unmanned aerial vehicle, the time for inputting the determination instruction by the user and the like are considered to have certain time errors, and the related time is considered to be matched within a certain error range.
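The matching rule above can be sketched as follows; the function and parameter names are illustrative assumptions, and times are expressed in seconds from the start of the flight.

```python
def time_match_degree(confirm_times, signal_times, preset_diff):
    """Ratio M/N: M counts confirmation times lying within preset_diff
    of some configured signal time; N is the total number of configured
    signal time points."""
    m = sum(
        1 for c in confirm_times
        if any(abs(c - s) < preset_diff for s in signal_times)
    )
    n = len(signal_times)
    return m / n

# Four configured signals; the user misses the last one by more than the
# preset difference of 1 second.
degree = time_match_degree([0.4, 30.2, 60.7, 91.5],
                           [0.0, 30.0, 60.0, 90.0],
                           preset_diff=1.0)
print(degree)   # 0.75, to be compared against the first preset threshold
```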
When determining that the matching degree is not greater than the preset threshold, the training device controls the unmanned aerial vehicle to fly along the configured flight trajectory again for another round of training and determines the matching degree again. When the matching degree determined over K rounds of training is still not greater than the preset threshold, the flight trajectory of the unmanned aerial vehicle is updated, or the spatial positions and times at which the tracked signal is generated are updated, and training is repeated until it is determined that the user's eyes have been trained.
If the matching degree does not reach the preset threshold in one round of training, it is determined that the user's eyes have not been trained, that is, the eyesight-protection effect has not been achieved or the user (a child, for example) did not cooperate, and another round is needed. If K rounds of training still fail to yield a sufficiently high matching degree, the flight trajectory or the configured time information may be unsuitable; the spatial positions and times at which the tracked signal is generated can then be updated and training repeated until it is determined that the user's eyes have been trained.
The flight trajectory or the time information may be updated by selecting one of the preconfigured alternatives, or by manual input; the present application does not limit this.
Based on the same inventive concept as the above, the embodiment of the present application also provides a training method. Referring to fig. 2, fig. 2 is a schematic flowchart of a training method in a first embodiment of the present application. The method comprises the following specific steps:
step 201, the training device controls the unmanned aerial vehicle to fly according to the configured flight trajectory.
Step 202, the training device controls the unmanned aerial vehicle to generate a tracked signal according to the configured time and space information, so that the user can track the signal.
The duration of each time the drone is controlled to generate the tracked signal is a first preset time, wherein the configured time information comprises a plurality of time points.
The signal being tracked is a flashing signal light, or a colored smoke band.
Step 203, when the training device receives the confirmation instruction of the tracking signal returned by the user through the remote controller, recording the time of receiving the confirmation instruction.
Step 204: when the training device has controlled the unmanned aerial vehicle through the full flight trajectory, it matches the recorded reception times of the confirmation instructions against the configured time information and determines the time matching degree.
In this step, matching the recorded time when the confirmation instruction is received with the configured time information, and determining the time matching degree, includes:
when the difference between the time of any confirmation instruction and the time of generating the tracked signal is smaller than a preset difference, determining that the time of the confirmation instruction is matched with the time of the tracked signal;
counting the number M of the time of the confirmation instruction matched with the time of the tracked signal;
and determining the ratio of M to N as the time matching degree, wherein N is the total number of time points in the time information for generating the tracked signal.
Step 205: when the time matching degree is greater than the first preset threshold, the training device determines that the user's eyes have been trained.
When the time matching degree is not greater than the first preset threshold, the unmanned aerial vehicle is controlled to fly along the configured flight trajectory again for another round of training, and the time matching degree is determined again.
When the time matching degree determined over K rounds of training is still not greater than the preset threshold, the flight trajectory of the unmanned aerial vehicle is updated, or the times at which the tracked signal is generated are updated, and training is repeated until it is determined that the user's eyes have been trained.
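A minimal sketch of this retry logic, assuming a configuration is a trajectory-plus-times object and that one flight yields one matching degree (all names below are illustrative):

```python
def run_training(fly_once, configs, k_rounds, threshold):
    """Try each configuration up to k_rounds times; move to an updated
    configuration only after k_rounds failures, as described above.
    fly_once(config) returns the time matching degree of one flight."""
    for config in configs:
        for _ in range(k_rounds):
            if fly_once(config) > threshold:
                return True     # the user's eyes are trained
        # k_rounds failures: fall through to the next (updated) config
    return False

# Toy stand-in: the degree only clears the threshold after the
# trajectory/time configuration is updated.
degrees = iter([0.5, 0.6, 0.9])
trained = run_training(lambda cfg: next(degrees),
                       configs=["config_A", "config_B"],
                       k_rounds=2, threshold=0.8)
print(trained)   # True
```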
The second embodiment is as follows:
The training system comprises a training device, an unmanned aerial vehicle, a positioning device, and a remote controller. Referring to FIG. 3, FIG. 3 is a schematic diagram of the training system in the second embodiment of the present application.
The training device may be a dedicated training instrument, or a module integrated into a device such as a notebook computer or mobile phone. It wirelessly controls the flight of the unmanned aerial vehicle and the generation of the tracked signal, receives the confirmation instructions sent by the remote controller, and records the time at which each confirmation instruction is received. The remote controller may likewise be connected wirelessly, which lets the user move around and better track the signal generated by the unmanned aerial vehicle. The training device also receives the spatial information of the unmanned aerial vehicle sent by the positioning device and records that spatial information together with the time at which it is received.
One or more flight trajectories are configured on the training device in advance; they can be configured according to the intended user, the environment in which the system is used, the desired training effect on the eyes, and so on.
Spatial position and time information can be configured for each flight trajectory, setting the spatial positions and time points at which the tracked signal is generated, for example how long an interval separates successive signals. The intervals may be equal or different: for example, if the unmanned aerial vehicle is controlled to generate the tracked signal for the first time at 13:05:00 and the interval is 30 seconds, the second tracked signal is generated at 13:05:30. The corresponding spatial position may be, for example, a two-dimensional coordinate, and the configured time information and spatial positions correspond to the preconfigured flight trajectory.
The above is only one way of implementing the time information; the implementation is not limited to this form.
The remote controller may receive user input, such as a key press, to generate a confirmation instruction, and sends the generated confirmation instruction to the training device.
The training device controls the unmanned aerial vehicle to fly according to the configured flight track;
the drone can be controlled by the training device and generate a tracked signal; for flying according to the flight path appointed by the training device.
And the training device controls the unmanned aerial vehicle to generate a tracked signal according to the configured time information and the space information, so that the user can track the signal.
The unmanned aerial vehicle generates a tracked signal according to the time designated by the training device; when the signal is specifically generated, the unmanned aerial vehicle does not need to record time according to the received instruction generation of the training device.
Each time the unmanned aerial vehicle is controlled to generate the tracked signal, it does so for a first preset time, and the configured time information comprises a plurality of time points. The first preset time may be set according to actual needs, for example the reaction time a person needs to notice the tracked signal.
In the embodiment of the present application, the tracked signal is a signal the user can see, such as a flashing signal light or a colored smoke band, but the signal is not limited to these forms.
The remote controller may be handheld: the user tracks the unmanned aerial vehicle while holding it and, upon observing the tracked signal generated by the unmanned aerial vehicle, presses the relevant key on the remote controller to input a confirmation instruction for the tracked signal.
When the remote controller receives a confirmation instruction input while the user is tracking the signal generated by the unmanned aerial vehicle, it sends the confirmation instruction to the training device.
The positioning device may be an available base station, or a positioning device carried by the user; it locates the spatial position of the unmanned aerial vehicle in real time and feeds it back to the training device.
When the training device receives a confirmation instruction of having tracked the signal, returned by the user through the remote controller, it records the time at which the confirmation instruction is received.
The training device also receives the spatial information fed back by the positioning device and records the time at which each piece of spatial information is received. Ideally, the recorded spatial information, taken together, traces out the configured flight trajectory.
When the unmanned aerial vehicle has finished flying along the flight trajectory, the training device matches the recorded reception times of the confirmation instructions against the configured time information to determine the time matching degree. When the time matching degree is greater than a first preset threshold, it determines whether the spatial information recorded by the positioning device at each matched time matches the spatial information configured for that time, and thereby determines the spatial matching degree.
When the determined spatial matching degree is greater than a second preset threshold, it determines that the user's eyes have been trained.
When the time matching degree is not greater than the first preset threshold, or the spatial matching degree is not greater than the second preset threshold, it determines that the user's eyes have not been trained.
The training device matches the recorded reception times of the confirmation instructions against the configured time information and determines the time matching degree as follows:
when the difference between the time of any confirmation instruction and a time at which the tracked signal was generated is smaller than a preset difference, the time of that confirmation instruction is determined to match the time of the tracked signal; the number M of confirmation-instruction times that match a tracked-signal time is counted; and the ratio of M to N is determined as the matching degree, where N is the total number of time points in the time information for generating the tracked signal.
Two times whose difference is within the preset difference are considered matched because there are inevitable timing errors in controlling the unmanned aerial vehicle, in the user's input of the confirmation instruction, and so on; within a certain error range the relevant times are regarded as matching.
Determining whether the spatial information recorded by the positioning device at each matched time matches the spatial information configured for that time, and determining the spatial matching degree, comprises:
when the difference between any piece of recorded spatial information and the corresponding configured spatial information is smaller than a second preset difference, determining that the recorded spatial information matches the corresponding configured spatial information;
counting the number K of matched pieces of recorded spatial information;
and determining the ratio of K to N as the spatial matching degree, where N is the total number of time points in the time information for generating the tracked signal.
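The spatial matching rule can be sketched in the same way; positions are paired with the matched time points index by index, and the distance measure and all names below are assumptions for illustration.

```python
def spatial_match_degree(recorded_pos, configured_pos, preset_diff):
    """Ratio K/N: K counts recorded positions within preset_diff of the
    configured position for the same time point; N is the total number
    of configured signal time points."""
    k = sum(
        1 for (rx, ry), (cx, cy) in zip(recorded_pos, configured_pos)
        if abs(rx - cx) + abs(ry - cy) < preset_diff  # a simple distance
    )
    n = len(configured_pos)
    return k / n

# Two of the three recorded positions fall within the preset difference.
recorded   = [(0.0, 0.1), (5.2, 5.0), (9.0, 12.0)]
configured = [(0.0, 0.0), (5.0, 5.0), (10.0, 10.0)]
degree = spatial_match_degree(recorded, configured, preset_diff=1.0)
print(round(degree, 3))   # 0.667, compared against the second threshold
```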
When determining that the user's eyes have not been trained, the training device controls the unmanned aerial vehicle to fly along the configured flight trajectory again for another round of training and determines the matching degrees again. When the matching degrees determined over K rounds of training are still not greater than the preset thresholds, the flight trajectory of the unmanned aerial vehicle is updated, or the spatial positions and times at which the tracked signal is generated are updated, and training is repeated until it is determined that the user's eyes have been trained.
If the matching degrees do not reach the preset thresholds in one round of training, it is determined that the user's eyes have not been trained, that is, the eyesight-protection effect has not been achieved or the user (a child, for example) did not cooperate, and another round is needed. If K rounds of training still fail to yield sufficiently high matching degrees, the flight trajectory or the configured time information may be unsuitable; the spatial positions and times at which the tracked signal is generated can then be updated and training repeated until it is determined that the user's eyes have been trained.
The flight trajectory or the time information may be updated by selecting one of the preconfigured alternatives, or by manual input; the present application does not limit this.
Based on the same inventive concept as the above, the embodiment of the present application also provides a training method. Referring to fig. 4, fig. 4 is a schematic flowchart of a training method in the second embodiment of the present application. The method comprises the following specific steps:
step 401, the training device controls the unmanned aerial vehicle to fly according to the configured flight trajectory.
Step 402, the training device controls the drone to generate a tracked signal according to the configured time and space information, so that the user can track the signal.
The duration of each time the drone is controlled to generate the tracked signal is a first preset time, wherein the configured time information comprises a plurality of time points.
The signal being tracked is a flashing signal light, or a colored smoke band.
Step 403: the training device records the time at which it receives a confirmation instruction of having tracked the signal, returned by the user through the remote controller.
Step 404: when the training device receives spatial information sent by the positioning device locating the unmanned aerial vehicle, it records the spatial information and the time at which it was received.
Steps 403 and 404 may be executed in any order.
Step 405: when the training device has controlled the unmanned aerial vehicle through the full flight trajectory, it matches the recorded reception times of the confirmation instructions against the configured time information and determines the time matching degree.
In this step, matching the recorded time when the confirmation instruction is received with the configured time information, and determining the time matching degree, includes:
when the difference between the time of any confirmation instruction and the time of generating the tracked signal is smaller than a preset difference, determining that the time of the confirmation instruction is matched with the time of the tracked signal;
counting the number M of the time of the confirmation instruction matched with the time of the tracked signal;
and determining the ratio of M to N as the time matching degree, wherein N is the total number of time points in the time information for generating the tracked signal.
Step 406: when the time matching degree is greater than the first preset threshold, the training device determines whether the spatial information recorded by the positioning device at each matched time matches the spatial information configured for that time, and thereby determines the spatial matching degree.
Step 407: when the determined spatial matching degree is greater than the second preset threshold, the training device determines that the user's eyes have been trained.
When the time matching degree is not greater than the first preset threshold, or the spatial matching degree is not greater than the second preset threshold, the unmanned aerial vehicle is controlled to fly along the configured flight trajectory again for another round of training, and the space and time matching degrees are determined again.
When K rounds of training all fail to determine that the user's eyes have been trained, the flight trajectory of the unmanned aerial vehicle is updated, or the times at which the tracked signal is generated are updated, and training is repeated until it is determined that the user's eyes have been trained.
After light enters the eye, it passes through the cornea, aqueous humor, pupil, crystalline lens, vitreous body, and other media before reaching the retina. Depending on the refractive power of the crystalline lens, the focus of an object imaged in the eye may fall in front of the retina (myopia), on the retina (emmetropia), or behind it (hyperopia). The adjusting power of the eyeball's refractive structures and visual axis is therefore the decisive factor in myopia; the elasticity of the cornea, crystalline lens, and similar structures is a passive factor, while the ciliary muscle and the six extraocular muscles are active factors. The ciliary muscle is the primary factor in adjusting the optical power of the crystalline lens.
Unlike the extraocular muscles (voluntary, striated), the ciliary muscle is a smooth muscle: its cells are uninucleate and have no striations. Functionally, its contraction and relaxation are not under conscious control, and its contraction is slow. Ciliary muscle movement is triggered by light from an external object entering the eyeball. The ciliary muscle therefore cannot be trained deliberately, as skeletal muscle can, but only through the motor stimulation of external objects.
The ciliary muscle is the main power for adjusting the diopter of the crystalline lens; training it through the movement stimulation of an external object improves the eyeball's ability to adjust its diopter and visual axis length.
According to the technical scheme provided by the application, training the eyes also trains the ciliary muscle, which can prevent myopia and reduce its prevalence on a large scale; the eyesight-protection effect is particularly marked for preventing myopia in teenagers. Moreover, because the application trains the user by having them track an unmanned aerial vehicle in flight, the process is not tedious, the user cooperates willingly, and the user experience is good.
To sum up, the application controls the unmanned aerial vehicle to generate user-trackable signals multiple times during flight, has the user input a confirmation instruction through the remote controller each time a signal is tracked, records the time of each confirmation instruction, matches those times against the times at which the unmanned aerial vehicle generated the tracked signals, and determines from the matching degree across all time points whether the user's eyes have been trained. The scheme trains the user's eyes in a game-like, entertaining way, achieving the purpose of protecting eyesight while improving both user experience and training efficiency.
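The time-matching step summarized above can be illustrated with a short sketch: a configured signal time counts as matched when some recorded confirmation time falls within a preset difference of it, and the time matching degree is the ratio M/N. This is an illustrative reading of the scheme with hypothetical names; the patent specifies the M/N ratio, not this exact code, and the spatial matching degree K/N is computed analogously over recorded positions.

```python
def time_matching_degree(confirm_times, signal_times, max_diff):
    """Ratio M/N: M = configured signal times that have a confirmation
    within max_diff, N = total number of configured signal times."""
    matched = 0
    for s in signal_times:
        # A signal time is matched if any confirmation is close enough to it.
        if any(abs(c - s) < max_diff for c in confirm_times):
            matched += 1
    return matched / len(signal_times)

# Example: 3 of 4 signals were confirmed within a 2-second window.
degree = time_matching_degree(
    confirm_times=[1.5, 10.4, 30.9],
    signal_times=[1.0, 10.0, 20.0, 30.0],
    max_diff=2.0,
)
print(degree)  # 0.75
```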
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (10)

1. A method of training, the method comprising:
controlling the unmanned aerial vehicle to fly according to the configured flight track;
controlling the unmanned aerial vehicle to generate a tracked signal according to the configured space and time information, so that a user tracks the signal;
when a confirmation instruction of tracking the signal returned by the user through the remote controller is received, recording the time of receiving the confirmation instruction;
when the unmanned aerial vehicle is controlled to finish flying according to the flight track, matching the recorded time for receiving the confirmation instruction with the configured time information, and determining the time matching degree;
when the time matching degree is determined to be larger than a first preset threshold value, determining that the eyes of the user are trained.
2. The method of claim 1, further comprising:
receiving spatial information obtained by a positioning device positioning the unmanned aerial vehicle, and recording the spatial information and the time at which it was received;
when it is determined that the time matching degree is greater than the first preset threshold, before determining that the eyes of the user are trained, the method further comprises:
determining whether the spatial information recorded by the positioning device at each matched time matches the configured spatial information corresponding to that time, and determining a spatial matching degree;
and when the determined spatial matching degree is larger than a second preset threshold value, determining that the eyes of the user are trained.
3. The method of claim 1, wherein the duration for which the unmanned aerial vehicle is controlled to generate the tracked signal each time is a first preset time, and wherein the configured space and time information comprises a plurality of spatial positions and time points.
4. The method of claim 1, wherein the tracked signal is a blinking signal light or a colored smoke band.
5. The method of claim 1, wherein matching the recorded time when the confirmation instruction is received with the configured time information and determining the time matching degree comprises:
when the difference between the time of any confirmation instruction and the time at which the tracked signal is generated is smaller than a first preset difference, determining that the time of the confirmation instruction matches the time of the tracked signal;
counting the number M of confirmation-instruction times that match tracked-signal times;
and determining the ratio of M to N as the time matching degree, wherein N is the total number of time points in the time information for generating the tracked signal.
6. The method of claim 2, wherein determining whether the spatial information recorded by the positioning device at each matched time matches the configured spatial information corresponding to that time, and determining the spatial matching degree, comprises:
when the difference value between any recorded space information and the corresponding configured space information is smaller than a second preset difference value, determining that the recorded space information is matched with the corresponding configured space information;
counting the number K of the matched recorded spatial information;
determining the ratio of K to N as a space matching degree; where N is the total number of time points in the time information for generating the tracked signal.
7. The method according to any one of claims 1-6, wherein the method further comprises:
if the eyes of the user are not trained, controlling the unmanned aerial vehicle to fly according to the configured flying track again to train, and determining whether the eyes of the user are trained;
when it is determined for K training rounds that the user's eyes are not trained, the flight trajectory of the drone is updated, or the times at which the tracked signal is generated are updated, for training again until it is determined that the user's eyes are trained.
8. A training system, the system comprising: training device, unmanned aerial vehicle and remote controller;
the training device is used for controlling the unmanned aerial vehicle to fly according to the configured flight track; controlling the unmanned aerial vehicle to generate a tracked signal according to the configured space and time information, so that a user tracks the signal; when a confirmation instruction of tracking the signal returned by the user through the remote controller is received, recording the time of receiving the confirmation instruction; when the unmanned aerial vehicle is controlled to finish flying according to the flight track, matching the recorded time for receiving the confirmation instruction with the configured time information, and determining the time matching degree; when the time matching degree is larger than a first preset threshold value, determining that the eyes of the user are trained;
the unmanned aerial vehicle is used for flying according to the flight track specified by the training device and generating a tracked signal according to the time specified by the training device;
and the remote controller is used for sending the confirmation instruction to the training device when receiving the confirmation instruction input by the user when tracking the signal generated by the unmanned aerial vehicle.
9. Training system according to claim 8,
the training device is further used for receiving spatial information obtained by the positioning device positioning the unmanned aerial vehicle, and recording the spatial information and the time at which it was received; when it is determined that the time matching degree is greater than the first preset threshold, further determining whether the spatial information recorded by the positioning device at each matched time matches the configured spatial information corresponding to that time, and determining the spatial matching degree; and when the determined spatial matching degree is greater than a second preset threshold, determining that the eyes of the user are trained;
the positioning equipment is used for positioning the space position of the unmanned aerial vehicle and sending the space information of the unmanned aerial vehicle to the training device.
10. The system according to claim 8 or 9, wherein the tracked signal is a flashing signal light or a colored smoke band.
CN201811569630.5A 2018-12-21 2018-12-21 Training method and system Expired - Fee Related CN109799838B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811569630.5A CN109799838B (en) 2018-12-21 2018-12-21 Training method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811569630.5A CN109799838B (en) 2018-12-21 2018-12-21 Training method and system

Publications (2)

Publication Number Publication Date
CN109799838A CN109799838A (en) 2019-05-24
CN109799838B true CN109799838B (en) 2022-04-15

Family

ID=66557282

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811569630.5A Expired - Fee Related CN109799838B (en) 2018-12-21 2018-12-21 Training method and system

Country Status (1)

Country Link
CN (1) CN109799838B (en)

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2938774A1 (en) * 2008-11-27 2010-05-28 Parrot DEVICE FOR CONTROLLING A DRONE
US20150351655A1 (en) * 2013-01-08 2015-12-10 Interaxon Inc. Adaptive brain training computer system and method
CN103196430B (en) * 2013-04-27 2015-12-09 清华大学 Based on the flight path of unmanned plane and the mapping navigation method and system of visual information
US9987743B2 (en) * 2014-03-13 2018-06-05 Brain Corporation Trainable modular robotic apparatus and methods
US10564714B2 (en) * 2014-05-09 2020-02-18 Google Llc Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects
AU2015297036B2 (en) * 2014-05-09 2017-09-28 Google Llc Systems and methods for discerning eye signals and continuous biometric identification
KR102196975B1 (en) * 2015-08-15 2020-12-30 구글 엘엘씨 System and method for biomechanical-based eyeball signals to interact with real and virtual objects
AU2016314770A1 (en) * 2015-09-03 2018-03-29 Commonwealth Scientific And Industrial Research Organisation Unmanned aerial vehicle control techniques
TWI598143B (en) * 2016-06-03 2017-09-11 博泰科技有限公司 Following remote controlling method for aircraft
FR3052084B1 (en) * 2016-06-03 2018-07-13 Drobot X SYSTEM FOR CONTROLLING THE EVOLUTION OF REMOTE CONTROLS
CN105867407A (en) * 2016-06-12 2016-08-17 零度智控(北京)智能科技有限公司 Unmanned aerial vehicle as well as control device and control method thereof
CN107087435B (en) * 2016-09-27 2018-10-19 深圳市大疆创新科技有限公司 Control method, control device, electronic device and flight control system
CN106726389A (en) * 2017-03-01 2017-05-31 上海思明堂生物科技股份有限公司 A kind of training system for improving vision definition and acuity
CN107316315A (en) * 2017-05-04 2017-11-03 佛山市南海区广工大数控装备协同创新研究院 A kind of object recognition and detection method based on template matches
CN108475064B (en) * 2017-05-16 2021-11-05 深圳市大疆创新科技有限公司 Method, apparatus, and computer-readable storage medium for apparatus control
CN107741781A (en) * 2017-09-01 2018-02-27 中国科学院深圳先进技术研究院 Flight control method, device, unmanned plane and the storage medium of unmanned plane
CN107957733A (en) * 2017-12-05 2018-04-24 深圳市道通智能航空技术有限公司 Flight control method, device, terminal and unmanned plane

Also Published As

Publication number Publication date
CN109799838A (en) 2019-05-24

Similar Documents

Publication Publication Date Title
US10678335B2 (en) Methods, devices, and systems for creating haptic stimulations and tracking motion of a user
US10396905B2 (en) Method and system for direct communication
Friedman et al. Navigating virtual reality by thought: What is it like?
Franchak et al. Head-mounted eye-tracking of infants' natural interactions: a new method
US20170351326A1 (en) Eye training system and computer program product
CN110419018A (en) The automatic control of wearable display device based on external condition
CN109925678A (en) A kind of training method based on eye movement tracer technique, training device and equipment
US20090306741A1 (en) Systems and methods for altering brain and body functions and for treating conditions and diseases of the same
US20030214630A1 (en) Interactive occlusion system
US11992678B2 (en) System and method for individualizing neuromodulation
WO2005051329A2 (en) Systems and methods for altering vestibular biology
CN108721070A (en) A kind of intelligent vision functional training system and its training method based on eyeball tracking
CN110604675A (en) Method for realizing vision correction based on VR interaction
CN112315754A (en) Vision training method based on VR glasses and VR glasses
CN109799838B (en) Training method and system
US20230149248A1 (en) Methods and systems for dynamic ocular training using therapeutic games
WO2023141460A1 (en) Methods, apparatus, and articles to enhance brain function via presentation of visual effects in far and/or ultra-far peripheral field
Friedman et al. Navigating virtual reality by thought: First steps
CN114053107A (en) Human-computer interaction forward and backward operation intelligent training method based on virtual reality prevention and control and eyesight improvement
CN114028181A (en) Visual training method and system combining accommodation training and convergence training
JPH1156942A (en) Visual acuity training device
Prahm et al. Extending Mirror Therapy into Mixed Reality—Design and Implementation of the Application PhantomAR to Alleviate Phantom Limb Pain in Upper Limb Amputees
Yan et al. P5G: A Patient-Centered Design Method of Virtual Reality Health Game System for Children's Amblyopia Rehabilitation
EP3158933A1 (en) Functional learning device, system, and method
CN108392380A (en) A kind of Internet technology autozoom formula vision energy state exercise instrument and application

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220415