CN112985372A - Path planning system and method thereof - Google Patents

Path planning system and method thereof

Info

Publication number
CN112985372A
CN112985372A (application CN201911285325.8A)
Authority
CN
China
Prior art keywords
moving
processing unit
virtual
augmented reality
unit
Prior art date
Legal status
Granted
Application number
CN201911285325.8A
Other languages
Chinese (zh)
Other versions
CN112985372B (en)
Inventor
陈奕男
Current Assignee
Nanning Fulian Fugui Precision Industrial Co Ltd
Original Assignee
Nanning Fugui Precision Industrial Co Ltd
Priority date
Filing date
Publication date
Application filed by Nanning Fugui Precision Industrial Co Ltd filed Critical Nanning Fugui Precision Industrial Co Ltd
Priority to CN201911285325.8A priority Critical patent/CN112985372B/en
Publication of CN112985372A publication Critical patent/CN112985372A/en
Application granted granted Critical
Publication of CN112985372B publication Critical patent/CN112985372B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A path planning method is provided for an augmented reality device that includes at least a processing unit, a display unit, and a camera unit. The method comprises the following steps: the processing unit establishes a plurality of virtual flags at different positions according to flag setting signals; the display unit displays a display frame containing the virtual flags; the camera unit captures a plurality of images containing an indicator; the processing unit obtains a movement track of the indicator from the images; and the processing unit selects the virtual flags that the movement track passes through and connects the selected virtual flags to generate a moving path or an activity boundary. A path planning system is also disclosed. By connecting virtual flags in augmented reality, the invention can define the moving path or activity boundary of a moving object.

Description

Path planning system and method thereof
Technical Field
The present invention relates to a path planning system and method, and more particularly to a path planning system and method that establish a moving path or an activity boundary through virtual flags in an augmented reality device.
Background
An augmented reality device can combine virtual and real scenes, and let them interact, by means of its camera and positioning elements such as a gyroscope and GPS. In some applications, a user can generate movement tracks for an aerial photography device, a sweeping robot, or the like through gestures on the display screen of the augmented reality device. However, because determining the movement track of a gesture through image recognition may cause spatial misjudgment, how to convert the gesture's movement track into a correct actual spatial movement path more accurately remains a problem to be solved.
Disclosure of Invention
Accordingly, a path planning system and method are needed to define the moving path or activity range of a moving object more precisely.
The invention provides a path planning method suitable for an augmented reality device that includes at least a processing unit, a display unit, and a camera unit. The method comprises the following steps: the processing unit establishes a plurality of virtual flags at different positions according to flag setting signals; the display unit displays a display frame containing the virtual flags; the camera unit captures a plurality of images containing an indicator; the processing unit obtains a movement track of the indicator from the images; and the processing unit selects the virtual flags that the movement track passes through and connects the selected virtual flags to generate a moving path or an activity boundary.
The invention also provides a path planning system comprising an augmented reality device and a moving object. The augmented reality device comprises a display unit, a camera unit, and a processing unit. The display unit displays a display frame containing a plurality of virtual flags. The camera unit captures a plurality of images containing an indicator. The processing unit establishes the virtual flags at different positions according to flag setting signals, obtains the movement track of the indicator from the images, selects the virtual flags that the movement track passes through, and connects the selected virtual flags to generate a moving path or an activity boundary. The moving object receives the moving path or the activity boundary from the augmented reality device and moves between the selected virtual flags according to the moving path, or moves within the range formed by the selected virtual flags according to the activity boundary.
According to an embodiment of the present invention, the augmented reality device further includes a positioning unit. The positioning unit captures the actual coordinates corresponding to the different positions according to the flag setting signals received by the processing unit, and the processing unit establishes the virtual flags according to the actual coordinates.
According to another embodiment of the present invention, the processing unit of the augmented reality device further generates the display frame according to the current coordinates obtained by the positioning unit of the augmented reality device and the actual coordinates corresponding to the virtual flags respectively.
According to another embodiment of the present invention, the augmented reality device further comprises a communication unit for outputting the moving path or the activity boundary to the moving object.
According to another embodiment of the present invention, the moving path further includes an action command, so that the moving object performs a corresponding action according to the action command when moving.
Drawings
Fig. 1 is a block diagram of a path planning system according to an embodiment of the invention;
FIG. 2 is a diagram illustrating a finger movement trace and a movement path generated according to the finger movement trace according to an embodiment of the present invention;
fig. 3 is a flowchart of a path planning method according to an embodiment of the invention.
Description of the main elements
Path planning system 100
Augmented reality device 110
First camera unit 111
First positioning unit 112
First processing unit 113
First display unit 114
First communication unit 115
Moving object 120
Second communication unit 121
Second positioning unit 122
Second processing unit 123
Second driving unit 124
Virtual flags 201 to 212
Finger movement trajectory 250
Moving path 260
Steps S301 to S306
Detailed Description
Further areas of applicability of the present system and method will become apparent from the detailed description provided hereinafter. It should be understood that the detailed description and specific examples, while indicating exemplary embodiments of the path planning system and method, are intended for purposes of illustration only and are not intended to limit the scope of the invention.
Fig. 1 is a block diagram of a path planning system according to an embodiment of the invention. The path planning system 100 includes at least an augmented reality device 110 and a moving object 120. The augmented reality device 110 includes at least a first camera unit 111, a first positioning unit 112, a first processing unit 113, a first display unit 114, and a first communication unit 115. The first camera unit 111 faces the same direction as the first display unit 114 and consists of at least one camera lens that captures images at predetermined time intervals. The first positioning unit 112 may be a positioning system for outdoor use (e.g., the Global Positioning System) or for indoor use (e.g., Bluetooth indoor positioning, Wi-Fi positioning, radio-frequency identification positioning, ZigBee positioning, UWB positioning, infrared positioning, or ultrasonic positioning) and determines the current position of the augmented reality device 110 according to a positioning instruction from the first processing unit 113. The first processing unit 113 combines the images captured by the first camera unit 111 with the virtual flags to generate display frames, and generates a moving path according to the movement track of the indicator in the images captured by the first camera unit 111. The first processing unit 113 can be implemented in various ways, such as with dedicated hardware circuits or general-purpose hardware (e.g., a single processor, multiple processors with parallel processing capability, a graphics processor, or another processor with computing capability), and provides the functions described hereinafter when executing program code or software. The first display unit 114 may be a liquid crystal display screen or the like, and displays the display frames output by the first processing unit 113.
The first communication unit 115 can output the moving path to the moving object 120 over a wired or wireless connection.
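The rendering step described above — placing each virtual flag in the display frame from the device's current coordinates and the flag's stored actual coordinates — can be sketched in a simplified planar model. The function name, the horizontal field-of-view parameter, and the facing angle below are illustrative assumptions, not details from the patent; a real AR device would also use gyroscope orientation and a full camera projection.

```python
import math

def place_flag_in_frame(current, flag, fov_deg=60.0, facing_deg=0.0):
    """Return (visible, relative_bearing_deg) for one virtual flag.

    current, flag: planar (x, y) coordinates, as a positioning unit
    might report them. fov_deg and facing_deg are hypothetical camera
    parameters."""
    bearing = math.degrees(math.atan2(flag[1] - current[1],
                                      flag[0] - current[0]))
    # Normalize the bearing relative to the camera's facing direction
    # into the range [-180, 180).
    rel = (bearing - facing_deg + 180.0) % 360.0 - 180.0
    return abs(rel) <= fov_deg / 2.0, rel
```

A flag whose relative bearing falls outside the field of view would simply be omitted from that display frame.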
The moving object 120 includes at least a second communication unit 121, a second positioning unit 122, a second processing unit 123, and a second driving unit 124. The second communication unit 121 connects to the first communication unit 115 and receives the moving path calculated by the first processing unit 113. The second positioning unit 122 may be a positioning system for outdoor use (e.g., the Global Positioning System) or for indoor use (e.g., Bluetooth indoor positioning, Wi-Fi positioning, radio-frequency identification positioning, ZigBee positioning, UWB positioning, infrared positioning, or ultrasonic positioning) and determines the current position of the moving object 120. The second processing unit 123 determines a moving direction and a moving distance according to the current position of the moving object 120 and the moving path, and enables the second driving unit 124 accordingly. The second driving unit 124 may be a motor that drives the moving object 120.
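The second processing unit's computation of a moving direction and distance from the current position to a target point can be sketched as follows; the planar coordinate model and the function name are assumptions for illustration.

```python
import math

def heading_and_distance(current, target):
    """Direction (degrees counter-clockwise from the +x axis) and
    straight-line distance from `current` to `target`, both planar
    (x, y) points as a positioning unit might report them."""
    dx = target[0] - current[0]
    dy = target[1] - current[1]
    return math.degrees(math.atan2(dy, dx)), math.hypot(dx, dy)
```

The second driving unit would then be enabled with this direction and distance; error correction and repeated re-positioning while moving are omitted here.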
According to an embodiment of the present invention, to establish a virtual flag the user first moves to the point where the virtual flag is to be set and triggers a flag setting signal through a physical key or a virtual key displayed on the touch display screen. After receiving the flag setting signal, the first processing unit 113 enables the first positioning unit 112 to obtain the current coordinates of the augmented reality device 110, sets the current coordinates as the actual coordinates corresponding to the virtual flag, and displays the virtual flag on the current display frame. Each time the first processing unit 113 receives an image captured by the first camera unit 111 at the predetermined interval, it calculates the relative position between the current position obtained by the first positioning unit 112 and each virtual flag, and renders each virtual flag into the image accordingly to generate the display frame shown on the first display unit 114. The user can then place an indicator (e.g., a finger) within the shooting range of the first camera unit 111 and trace a line connecting the virtual flags in the display frame. The first processing unit 113 calculates the movement track of the front end of the indicator from the captured images and selects the virtual flags that the movement track passes through. The first processing unit 113 then connects the selected virtual flags to generate the moving path or the activity boundary.
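Selecting the virtual flags that the indicator's movement track passes through, in order, might look like the following sketch. The hit radius, the representation of fingertip positions as 2-D points, and all identifiers are assumptions, not details from the patent.

```python
import math

def select_flags(trajectory, flags, hit_radius=0.5):
    """Return flag ids in the order the fingertip track first reaches
    each flag.

    trajectory: list of (x, y) fingertip positions, one per image.
    flags: dict mapping flag_id -> (x, y) position in the same frame.
    hit_radius: hypothetical tolerance for "passing through" a flag."""
    selected = []
    for px, py in trajectory:
        for fid, (fx, fy) in flags.items():
            if fid not in selected and math.hypot(px - fx, py - fy) <= hit_radius:
                selected.append(fid)
    return selected
```

A flag is appended only the first time the track enters its radius, so retracing over an already-selected flag does not duplicate it.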
For example, as shown in fig. 2, the first display unit 114 displays virtual flags 201 to 212 on the display frame according to the user's settings. The user's finger then passes through virtual flags 204, 203, 207, 206, 210, 209, 205, and 201 in sequence along the track of the dotted line 250, and the first processing unit 113 connects these flags in that order to generate the moving path shown by the solid line 260. In addition, a corresponding action command can be attached to the moving path between any two virtual flags, so that the second processing unit 123 of the moving object 120 executes the designated action when moving between those two virtual flags. After generating the moving path, the first processing unit 113 outputs it through the first communication unit 115, over a wired or wireless connection, to the second communication unit 121 of the moving object 120. Upon receiving the moving path from the second communication unit 121, the second processing unit 123 determines the current position of the moving object 120 through the second positioning unit 122 and calculates the relative position between the current position and the starting point of the moving path, so that the second driving unit 124 can first move the moving object 120 to that starting point; the moving object 120 then moves along the moving path and executes the specified actions.
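Connecting the selected flags into a moving path, with an optional action command attached to the segment between two flags as described above, could be sketched like this; the segment layout and all names are hypothetical.

```python
def build_moving_path(selected, positions, actions=None):
    """selected: flag ids in the order the track passed them.
    positions: flag_id -> (x, y) actual coordinates.
    actions: optional {(from_id, to_id): command} attached per segment."""
    actions = actions or {}
    path = []
    for a, b in zip(selected, selected[1:]):
        path.append({
            "from": positions[a],
            "to": positions[b],
            # Action the moving object should perform on this segment,
            # or None when no command was attached.
            "action": actions.get((a, b)),
        })
    return path
```

The resulting segment list is what the communication unit would transmit to the moving object.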
According to another embodiment of the present invention, when the user defines the movement track of the finger as an activity boundary, the second processing unit 123 enables the second driving unit 124 to keep the moving object 120 within the activity boundary.
Fig. 3 is a flowchart of a path planning method according to an embodiment of the invention. First, in step S301, the first processing unit 113 receives flag setting signals triggered by the user at different positions. In step S302, the first processing unit 113 establishes a plurality of virtual flags at the different positions according to the actual coordinates obtained by the first positioning unit 112. In step S303, the first display unit 114 displays a display frame containing the virtual flags. In step S304, the first camera unit 111 captures a plurality of images containing the indicator. In step S305, the first processing unit 113 selects the virtual flags that the movement track of the indicator in the images passes through, and connects the selected virtual flags to generate a moving path or an activity boundary. In step S306, the first communication unit 115 outputs the moving path or the activity boundary to the moving object 120, so that the moving object 120 moves between the selected virtual flags according to the moving path, or moves within the range formed by the selected virtual flags according to the activity boundary.
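Steps S301 to S306 can be strung together in a minimal end-to-end sketch. The patent defines no programming interface, so every name and parameter below is hypothetical.

```python
import math

def plan_and_dispatch(flag_coords, fingertip_frames, send_to_object,
                      hit_radius=0.5):
    """flag_coords: flag_id -> (x, y) actual coordinates (S301-S302).
    fingertip_frames: fingertip position per captured image (S304).
    send_to_object: callable standing in for the communication unit (S306).
    Returns the generated moving path as a list of coordinate pairs."""
    selected = []
    for p in fingertip_frames:  # S305: flags the track passes through
        for fid, c in flag_coords.items():
            if fid not in selected and math.dist(p, c) <= hit_radius:
                selected.append(fid)
    # Connect the selected flags in order to form the moving path.
    path = [(flag_coords[a], flag_coords[b])
            for a, b in zip(selected, selected[1:])]
    send_to_object(path)
    return path
```

In a real system the display step (S303) and the camera capture (S304) would run continuously; here the captured fingertip positions are simply passed in as a list.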
It is to be noted that although the above method has been described using a flowchart with a series of steps, the present invention is not limited to this order: some steps may be performed in a different order, or simultaneously with other steps. Moreover, those skilled in the art will appreciate that the steps illustrated in the flowchart are not exclusive; other steps may be included, or one or more steps may be deleted, without affecting the scope of the invention.
In summary, in the path planning system and method provided by embodiments of the present invention, virtual flags are set in advance and connected to generate the moving path, avoiding the spatial misjudgment caused by image recognition of free-hand gestures. In addition, adding action commands to the moving path between two virtual flags enhances the interactivity of the augmented reality.
It should be noted that the above embodiments are intended to illustrate, not limit, the technical solutions of the present invention. Although the present invention is described in detail with reference to the preferred embodiments, those skilled in the art will understand that modifications or equivalent substitutions may be made without departing from the spirit and scope of the technical solutions of the present invention.

Claims (10)

1. A path planning method suitable for an augmented reality device, the augmented reality device comprising at least a processing unit, a display unit, and a camera unit, the method being characterized by comprising the following steps:
the processing unit establishes a plurality of virtual flags at different positions according to the flag setting signal;
the display unit displays a display frame containing the virtual flags;
the camera unit captures a plurality of images containing an indicator;
the processing unit obtains a movement track of the indicator from the plurality of images containing the indicator; and
the processing unit selects the virtual flags that the movement track passes through and connects the selected virtual flags to generate a moving path or an activity boundary.
2. The path planning method of claim 1, wherein the step of the processing unit establishing the plurality of virtual flags at the different positions according to the flag setting signal further comprises:
the processing unit receives the flag setting signals at the different positions;
a positioning unit of the augmented reality device captures actual coordinates corresponding to the different positions; and
the processing unit establishes the virtual flags according to the actual coordinates.
3. The path planning method of claim 2, wherein the step of the display unit displaying the display frame containing the virtual flags further comprises:
the positioning unit obtains current coordinates; and
the processing unit generates the display frame according to the current coordinates and the actual coordinates respectively corresponding to the virtual flags.
4. The path planning method of claim 1, further comprising:
the processing unit enables a communication unit of the augmented reality device to output the moving path or the activity boundary to a moving object; and
the moving object moves between the selected virtual flags according to the moving path, or moves within the range formed by the selected virtual flags according to the activity boundary.
5. The path planning method of claim 4, wherein the moving path further comprises an action command, and the moving object performs a corresponding action according to the action command while moving.
6. A path planning system, characterized in that the path planning system comprises:
an augmented reality device comprising:
a display unit for displaying a display frame containing a plurality of virtual flags;
a camera unit for capturing a plurality of images containing an indicator; and
a processing unit for establishing the virtual flags at different positions according to flag setting signals, obtaining a movement track of the indicator from the images containing the indicator, selecting the virtual flags that the movement track passes through, and connecting the selected virtual flags to generate a moving path or an activity boundary;
and a moving object for receiving the moving path or the activity boundary from the augmented reality device and moving between the selected virtual flags according to the moving path, or moving within a range formed by the selected virtual flags according to the activity boundary.
7. The path planning system of claim 6, wherein the augmented reality device further comprises:
a positioning unit for capturing actual coordinates corresponding to the different positions according to the flag setting signals received by the processing unit;
wherein the processing unit further establishes the virtual flags according to the actual coordinates.
8. The path planning system of claim 7, wherein the processing unit of the augmented reality device further generates the display frame according to the current coordinates obtained by the positioning unit and the actual coordinates respectively corresponding to the virtual flags.
9. The path planning system of claim 6, wherein the augmented reality device further comprises:
a communication unit for outputting the moving path or the activity boundary to the moving object.
10. The path planning system of claim 6, wherein the moving path further includes an action command, so that the moving object performs a corresponding action according to the action command while moving.
CN201911285325.8A 2019-12-13 2019-12-13 Path planning system and method thereof Active CN112985372B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911285325.8A CN112985372B (en) 2019-12-13 2019-12-13 Path planning system and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911285325.8A CN112985372B (en) 2019-12-13 2019-12-13 Path planning system and method thereof

Publications (2)

Publication Number Publication Date
CN112985372A true CN112985372A (en) 2021-06-18
CN112985372B CN112985372B (en) 2024-06-14

Family

ID=76342166

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911285325.8A Active CN112985372B (en) 2019-12-13 2019-12-13 Path planning system and method thereof

Country Status (1)

Country Link
CN (1) CN112985372B (en)

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201123077A (en) * 2009-12-29 2011-07-01 Ind Tech Res Inst Animation generation system and method
CN102915465A (en) * 2012-10-24 2013-02-06 河海大学常州校区 Multi-robot combined team-organizing method based on mobile biostimulation nerve network
CN103996322A (en) * 2014-05-21 2014-08-20 武汉湾流新技术有限公司 Welding operation training simulation method and system based on augmented reality
WO2016033717A1 (en) * 2014-09-01 2016-03-10 北京诺亦腾科技有限公司 Combined motion capturing system
CN106125932A (en) * 2016-06-28 2016-11-16 广东欧珀移动通信有限公司 The recognition methods of destination object, device and mobile terminal in a kind of augmented reality
WO2016193305A1 (en) * 2015-06-03 2016-12-08 Siemens Aktiengesellschaft Method for calculating an optimised trajectory
CN106371585A (en) * 2016-08-23 2017-02-01 塔普翊海(上海)智能科技有限公司 Augmented reality system and method
US20170039859A1 (en) * 2015-08-03 2017-02-09 Amber Garage, Inc. Planning a flight path by identifying key frames
CN107296650A (en) * 2017-06-01 2017-10-27 西安电子科技大学 Intelligent operation accessory system based on virtual reality and augmented reality
US20170312032A1 (en) * 2016-04-27 2017-11-02 Arthrology Consulting, Llc Method for augmenting a surgical field with virtual guidance content
CN107728792A (en) * 2017-11-17 2018-02-23 浙江大学 A kind of augmented reality three-dimensional drawing system and drawing practice based on gesture identification
KR20180045668A (en) * 2016-10-26 2018-05-04 (주)잼투고 User Terminal and Computer Implemented Method for Synchronizing Camera Movement Path and Camera Movement Timing Using Touch User Interface
US9984499B1 (en) * 2015-11-30 2018-05-29 Snap Inc. Image and point cloud based tracking and in augmented reality systems
JP2018128815A (en) * 2017-02-08 2018-08-16 ソフトバンク株式会社 Information presentation system, information presentation method and information presentation program
KR20180095400A (en) * 2017-02-17 2018-08-27 (주)지피트리 Method For Providing Time-Space Fusion Contents Using Virtual Reality/Augmented Reality Teaching Tools
CN108667764A (en) * 2017-03-28 2018-10-16 南宁富桂精密工业有限公司 Electronic device and communications protocol switching method
CN110142770A (en) * 2019-05-07 2019-08-20 中国地质大学(武汉) A kind of robot teaching system and method based on head-wearing display device
CN110168465A (en) * 2017-11-16 2019-08-23 南京德朔实业有限公司 Intelligent mowing system
CN110238831A (en) * 2019-07-23 2019-09-17 青岛理工大学 Robot teaching system and method based on RGB-D image and teaching device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
周华; 陈丽; 段登平: "Trajectory tracking control of an underactuated aerostat based on sliding-mode variable structure", Journal of Zhejiang University (Engineering Science), no. 07, 31 December 2017, pages 1412-1420 *
曾子懿; 王清贤; 朱俊虎: "Visualization of multi-party attack and defense states based on view synthesis", Journal of Information Engineering University, no. 06, 15 December 2016, pages 698-701 *
韩峰; 张衡; 朱镭; 刘虎: "Research and implementation of a CPS virtual assembly system in an augmented reality environment", Journal of Applied Optics, no. 03, 15 May 2019, pages 387-392 *

Also Published As

Publication number Publication date
CN112985372B (en) 2024-06-14

Similar Documents

Publication Publication Date Title
US11710322B2 (en) Surveillance information generation apparatus, imaging direction estimation apparatus, surveillance information generation method, imaging direction estimation method, and program
CN107820593B (en) Virtual reality interaction method, device and system
US10514708B2 (en) Method, apparatus and system for controlling unmanned aerial vehicle
KR20220028042A (en) Pose determination method, apparatus, electronic device, storage medium and program
CN109582122B (en) Augmented reality information providing method and device and electronic equipment
CN107710736B (en) Method and system for assisting user in capturing image or video
JP2011254289A (en) Moving body locus display device, and moving body locus display program
CN113910224B (en) Robot following method and device and electronic equipment
CN113610702A (en) Picture construction method and device, electronic equipment and storage medium
CN111445499B (en) Method and device for identifying target information
CN112985372B (en) Path planning system and method thereof
WO2022237071A1 (en) Locating method and apparatus, and electronic device, storage medium and computer program
CN115900713A (en) Auxiliary voice navigation method and device, electronic equipment and storage medium
US10459533B2 (en) Information processing method and electronic device
JP6839137B2 (en) Support devices and programs
KR101964227B1 (en) Apparatus and method for control military strategy
CN111540009A (en) Method, apparatus, electronic device, and medium for generating detection information
US10281294B2 (en) Navigation system and navigation method
CN111882675A (en) Model presentation method and device, electronic equipment and computer storage medium
KR102329785B1 (en) Method for estimating head direction of terminal for position measurement
CN111879331B (en) Navigation method and device and electronic equipment
CN117537820A (en) Navigation method, electronic device and readable storage medium
CN117806303A (en) Robot motion control method and device, robot and storage medium
JP2016126762A (en) Relative position determination method, display control method, and system applying the same method
JP2014209286A (en) Mobile terminal and control method therefor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 530033 Plant B, Foxconn Nanning Science and Technology Park, No. 51 Tongle Avenue, Jiangnan District, Nanning City, Guangxi Zhuang Autonomous Region

Applicant after: Nanning Fulian Fugui Precision Industry Co.,Ltd.

Address before: 530007 Plant 5, Phase III, China-ASEAN Enterprise Headquarters, No. 18 Headquarters Road, High-tech Zone, Nanning, Guangxi Zhuang Autonomous Region

Applicant before: NANNING FUGUI PRECISION INDUSTRIAL Co.,Ltd.

GR01 Patent grant