CN108680196B - Time delay correction method, system and computer readable medium - Google Patents


Info

Publication number
CN108680196B
Authority
CN
China
Prior art keywords: rotation angle; information; angle information; determining; modified
Prior art date
Legal status: Active
Application number
CN201810401369.1A
Other languages
Chinese (zh)
Other versions
CN108680196A
Inventor
谭明朗
Current Assignee
Hyperception Technology Beijing Co ltd
Original Assignee
Hyperception Technology Beijing Co ltd
Priority date
Filing date
Publication date
Application filed by Hyperception Technology Beijing Co ltd filed Critical Hyperception Technology Beijing Co ltd
Priority to CN201810401369.1A
Publication of CN108680196A
Application granted
Publication of CN108680196B

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01D: MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D18/00: Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01P: MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P3/00: Measuring linear or angular speed; Measuring differences of linear or angular speeds

Abstract

The application discloses a time delay correction method, which comprises the following steps: acquiring one or more pieces of angular velocity information; acquiring one or more reference image information; determining one or more first rotation angle information based on the one or more angular velocity information; determining one or more second rotation angle information based on the one or more reference image information; determining a delay value based on the one or more first rotation angle information and the one or more second rotation angle information; and correcting the time delay based on the time delay value.

Description

Time delay correction method, system and computer readable medium
Technical Field
The invention relates to a time delay correction method and system, and in particular to a method and system for correcting the time delay between sensors in an intelligent robot.
Background
Intelligent robots are increasingly widely used. Because the sensors in an intelligent robot differ in measurement principle and other respects, time delays can arise between them. Synchronization among the sensors in the intelligent robot has a great influence on the accuracy of data acquisition and processing. Therefore, there is a need for an accurate time delay correction method and system to achieve synchronization between sensors.
Disclosure of Invention
To address the large inter-sensor time delay correction errors of the prior art, a time delay correction method is provided that achieves synchronization among multiple sensors by comparing angle differences, thereby improving the accuracy of time delay correction.
One aspect of the present application relates to a delay correction method, including: acquiring one or more pieces of angular velocity information; acquiring one or more reference image information; determining one or more first rotation angle information based on the one or more angular velocity information; determining one or more second rotation angle information based on the one or more reference image information; determining a delay value based on the one or more first rotation angle information and the one or more second rotation angle information; and correcting the time delay based on the time delay value.
In some embodiments, said determining a delay value based on said one or more first rotation angle information and said one or more second rotation angle information comprises: modifying the one or more first rotation angle information and the one or more second rotation angle information; determining a time delay value based on the one or more modified first rotation angle information and the one or more modified second rotation angle information.
In some embodiments, said modifying said one or more first rotation angle information and said one or more second rotation angle information comprises: acquiring one or more reference time points; modifying the one or more first rotation angle information and the one or more second rotation angle information based on the one or more reference time points.
In some embodiments, said modifying said one or more first rotation angle information based on said one or more reference time points comprises: delaying the one or more reference time points based on a pending time variable; determining one or more modified first rotation angle information based on the one or more delayed reference time points.
In some embodiments, said modifying said one or more second rotation angle information based on said one or more reference time points comprises: delaying the one or more reference time points based on a pending time variable; determining one or more modified second rotation angle information based on the one or more delayed reference time points.
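The two embodiments above both evaluate a rotation-angle sequence at reference time points delayed by the pending time variable. A minimal sketch of that modification step, assuming linear interpolation between discrete samples (the function name, array arguments, and the interpolation scheme are illustrative; the patent does not fix them):

```python
import numpy as np

def modified_angles(t_ref, pending_delay, t_samples, theta_samples):
    """Delay the reference time points by the pending time variable,
    then evaluate the rotation-angle sequence at the shifted points
    by linear interpolation between the discrete samples."""
    t_delayed = np.asarray(t_ref, dtype=float) + pending_delay
    return np.interp(t_delayed, t_samples, theta_samples)
```

The same helper can produce both the modified first rotation angles (from gyroscope-derived samples) and the modified second rotation angles (from camera-derived samples).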
In some embodiments, said determining a delay value based on said one or more modified first rotation angle information and said one or more modified second rotation angle information comprises: determining one or more angle difference values based on the one or more modified first rotation angle information and the one or more modified second rotation angle information; determining a sum of squares of the one or more angular differences; determining a delay value based on a sum of squares of the one or more angular differences.
In some embodiments, said determining a delay value based on a sum of squares of said one or more angle difference values comprises: determining a minimum of the sum of squares of the one or more angle difference values with respect to the pending time variable; and determining the pending time variable corresponding to the minimum of the sum of squares of the one or more angle difference values as the delay value.
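Putting the embodiments above together: for each candidate value of the pending time variable, compute the angle differences, sum their squares, and keep the candidate minimizing the sum. A sketch assuming both rotation-angle curves can be evaluated as functions of time and using a simple grid search (all names and the grid-search strategy are illustrative; the patent does not prescribe a particular minimizer):

```python
import numpy as np

def estimate_delay(t_ref, theta_gyro_fn, theta_cam_fn, candidates):
    """For each candidate delay, delay the reference time points,
    evaluate the first (gyroscope) rotation angles there, and compare
    with the second (camera) rotation angles; return the candidate
    minimizing the sum of squared angle differences."""
    best_delay, best_cost = None, float("inf")
    for d in candidates:
        diff = theta_gyro_fn(t_ref + d) - theta_cam_fn(t_ref)
        cost = float(np.sum(diff ** 2))  # sum of squares of angle differences
        if cost < best_cost:
            best_cost, best_delay = cost, d
    return best_delay
```

A continuous optimizer over the pending time variable could replace the grid search once the cost is known to be well behaved near the true delay.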
Yet another aspect of the present application relates to a delay correction system, comprising: the angular velocity acquisition module is used for acquiring one or more pieces of angular velocity information; an image acquisition module for acquiring one or more reference image information; a processing module to determine a latency value based on the one or more angular velocity information and the one or more reference image information; a correction module to correct a time delay based on the time delay value.
In some embodiments, the angular velocity acquisition module is a gyroscope.
In some embodiments, the image acquisition module is a camera.
Another aspect of the present application relates to a latency correction apparatus, including a processor, which executes the latency correction method.
Another aspect of the application relates to a computer-readable storage medium storing computer instructions that, when read by a computer, perform the latency correction method.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments will be briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the application, and that it is also possible for a person skilled in the art to apply the application to other similar scenarios without inventive effort on the basis of these drawings. Unless otherwise apparent from the context of language or otherwise indicated, like reference numerals in the figures refer to like structures and operations.
FIG. 1 is a schematic diagram of an intelligent robotic system, shown in accordance with some embodiments of the present application;
FIG. 2 is a schematic diagram of a computing device configuration, shown in accordance with some embodiments of the present application;
FIG. 3 is a schematic diagram of a mobile device according to some embodiments of the present application;
FIG. 4 is a schematic diagram of a delay correction method according to some embodiments of the present application;
FIG. 5 is a schematic view of a camera coordinate system and an image coordinate system shown in accordance with some embodiments of the present application;
FIG. 6 is a schematic diagram of a method of determining a delay value according to some embodiments of the present application;
FIG. 7 is a schematic illustration of modifying one or more first angle of rotation information and one or more second angle of rotation information, shown in accordance with some embodiments of the present application;
FIG. 8 is a schematic illustration of modifying one or more first rotation angle information according to some embodiments of the present application;
FIG. 9 is a schematic diagram of a method of determining a delay value according to some embodiments of the present application;
FIG. 10 is a schematic diagram of a latency correction system according to some embodiments of the present application; and
fig. 11 is a schematic diagram of an embodiment of intelligent robot delay correction according to some embodiments of the present application.
Detailed Description
In the following detailed description, numerous specific details of the present application are set forth by way of examples in order to provide a thorough understanding of the relevant disclosure. It will be apparent, however, to one skilled in the art that the present application may be practiced without these specific details. It should be understood that the use of the terms "system," "apparatus," "unit" and/or "module" herein is a method for distinguishing between different components, elements, portions or assemblies at different levels of sequential arrangement. However, these terms may be replaced by other expressions if they can achieve the same purpose.
It will be understood that when a device, unit or module is referred to as being "on", "connected to" or "coupled to" another device, unit or module, it can be directly on, connected or coupled to, or in communication with, the other device, unit or module, or intervening devices, units or modules may be present, unless the context clearly dictates otherwise. For example, as used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the scope of the present application. As used in the specification and claims of this application, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" indicate the presence of the explicitly identified features, integers, steps, operations, elements, and/or components, but do not constitute an exclusive list; other features, integers, steps, operations, elements, and/or components may also be present.
These and other features and characteristics of the present application, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will be better understood upon consideration of the following description and the accompanying drawings, which form a part of this specification. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the application. It will be understood that the figures are not drawn to scale.
In addition, the present application only describes the method and system for time delay correction, and it is understood that the description in the present application is only one embodiment.
The term "user equipment" or "smart device" or "mobile terminal" in this application may refer to a tool that may be used to request a service, subscribe to a service, or facilitate the provision of a service.
Various block diagrams are used in this application to illustrate various variations of embodiments according to the application. It should be understood that the foregoing and following structures are not intended to limit the present application. The protection scope of this application is subject to the claims.
The present application describes an intelligent robot system 100. As shown in fig. 1, the intelligent robot system 100 may include an intelligent robot 110, a network 120, a user device 130, and a database 140. In some embodiments, the intelligent robot 110 includes a processing device 111.
In some embodiments, the intelligent robot 110 may collect data through sensors. For example, the intelligent robot 110 may capture image information through a camera; as another example, it may collect its own rotation information through a gyroscope. In some embodiments, the processing device 111 of the intelligent robot 110 may receive and process the collected data to determine the current state information of the intelligent robot 110. The state information may include a location, obstacle information, a travel speed, a turning speed, and the like, and can be used to guide the motion of the intelligent robot. For example, when the state information indicates an obstacle ahead, the processing device 111 generates a series of instructions for moving so as to avoid the obstacle. In some embodiments, there are time delays between the sensors (e.g., cameras, gyroscopes, etc.) in the intelligent robot 110. In particular, when the processing device 111 processes sensor data affected by such time delays, the determined state information of the intelligent robot 110 may contain errors, which may in turn affect functions of the intelligent robot 110 (e.g., travel, turning, etc.).
Network 120 may be a single network or a combination of multiple different networks. For example, the network 120 may be a local area network (LAN), wide area network (WAN), public network, private network, public switched telephone network (PSTN), the internet, a wireless network, a virtual network, a metropolitan area network, a telephone network, etc., or a combination thereof. Network 120 may include a plurality of network access points, such as wired access points, wireless access points, base stations, internet switching points, etc. Through these access points, data sources may access network 120 and transmit data information through network 120. In some embodiments, the network 120 may be a wireless network (Bluetooth, wireless local area network (WLAN, Wi-Fi, WiMax, etc.)), a mobile network (2G, 3G, 4G signals, etc.), or another connection (virtual private network (VPN), shared network, near field communication (NFC), ZigBee, etc.). In some embodiments, the network 120 may be used for communication within the intelligent robot system 100; for example, it may receive information from inside or outside the intelligent robot system 100 and send information to other parts inside or outside the system. In some embodiments, the intelligent robot 110, the user device 130, and the database 140 may access the network 120 through a wired connection, a wireless connection, or a combination thereof.
The user device 130 is a device of a user. The user device may be connected to the intelligent robot 110 or the database 140 through the network 120. In some embodiments, the user device 130 may be a smart device, such as one or a combination of a mobile phone 130-1, a tablet computer 130-2, or a notebook computer 130-3. The user device 130 may also include one or a combination of smart home devices, wearable devices, mobile devices, virtual reality devices, augmented reality devices, and the like. In some embodiments, the smart home device may include one or a combination of smart lighting devices, smart appliance control devices, smart monitoring devices, smart televisions, smart cameras, intercoms, and the like. In some embodiments, the wearable device may include one or a combination of a bracelet, footwear, glasses, a helmet, a watch, clothing, a backpack, a smart accessory, and the like. In some embodiments, the mobile device may include one or a combination of a mobile phone, a personal digital assistant (PDA), a gaming device, a navigation device, a point of sale (POS) device, a laptop, a tablet, a desktop, and the like. In some embodiments, the virtual reality device and/or augmented reality device may include one or more of a virtual reality helmet, virtual reality glasses, a virtual reality eye shield, an augmented reality helmet, augmented reality glasses, an augmented reality eye shield, and the like. For example, the virtual reality device and/or augmented reality device may comprise Google Glass™, Oculus Rift™, HoloLens™, Gear VR™, and the like.
Database 140 may generally refer to a device having a storage function. The database 140 may store data collected by the intelligent robot 110 (e.g., information, instructions, etc. of the user device 130 received by the intelligent robot 110). The database 140 may be local or remote. The database 140 may include a hierarchical database, a network database, a relational database, etc., or a combination of several. The database 140 may digitize the information and store it in a storage device using electrical, magnetic, or optical means. The database 140 may be used to store various information, such as programs, data, and the like. The database 140 may be a device that stores information using electric energy, such as various memories, random access memory (RAM), read-only memory (ROM), and the like. Random access memory may include a decimal counting tube, a selectron tube, delay line memory, a Williams tube, dynamic random access memory (DRAM), static random access memory (SRAM), thyristor random access memory (T-RAM), zero-capacitor random access memory (Z-RAM), and the like, or a combination thereof. Read-only memory may include bubble memory, twistor memory, thin film memory, plated wire memory, magnetic core memory, magnetic drum memory, optical disk drives, hard disks, magnetic tape, early non-volatile memory (NVRAM), phase change memory, magnetoresistive random access memory, ferroelectric random access memory, non-volatile SRAM, flash memory, EEPROM, erasable programmable ROM, mask ROM, floating-gate RAM, nano-RAM, racetrack memory, resistive random access memory, programmable metallization cells, and the like, or combinations thereof. The database 140 may be a device that stores information using magnetic energy, such as a hard disk, a floppy disk, magnetic tape, magnetic core memory, bubble memory, a USB flash disk, flash memory, etc.
The database 140 may be a device that stores information optically, such as a CD or DVD. The database 140 may be a device that stores information using magneto-optical means, such as a magneto-optical disk. The access mode of the database 140 may be random access, serial access, read-only, etc., or a combination of several. The database 140 may include persistent memory, non-persistent memory, or a combination of the two. In some embodiments, the database 140 may store an online map or an offline map. The online map may be updated in real time. For example, real-time maps collected by the intelligent robot 110 during travel may be stored in the database 140 via the network 120. As another example, the intelligent robot 110 may obtain the latest map data from the database 140 through the network 120.
FIG. 2 is a schematic diagram of a computing device configuration shown in accordance with some embodiments of the present application. Computing device 200 may be used to implement particular methods and apparatus disclosed herein. The specific apparatus in this embodiment is illustrated by a functional block diagram of a hardware platform that includes a display module. In some embodiments, computing device 200 may implement one or more modules and units of processing apparatus 111 described herein. In some embodiments, the processing device 111 may be implemented by the computing device 200 through hardware devices, software programs, firmware, and combinations thereof. In some embodiments, computing device 200 may be a general purpose computer, or a special purpose computer.
As shown in FIG. 2, computing device 200 may include an internal communication bus 210, a processor 220, a Read Only Memory (ROM) 230, a Random Access Memory (RAM) 240, a communication port 250, an input/output component 260, a hard disk 270, and a display 280. Internal communication bus 210 may enable data communication among the components of computing device 200. Processor 220 may make the determination and issue a prompt. In some embodiments, processor 220 may be comprised of one or more processors. The communication port 250 may enable data communication between the computing device 200 and other components in the intelligent robotic system 100 (e.g., the user device 130 and the database 140). In some embodiments, computing device 200 may send and receive information and data from network 120 through communication port 250. The input/output component 260 supports input/output data flow between the computing device 200 and other components of the intelligent robotic system 100 (e.g., the user device 130 and the database 140). The display 280 may be used to display information generated by the computing device 200. Computing device 200 may also include various forms of program storage units and data storage units such as a hard disk 270, Read Only Memory (ROM) 230, and Random Access Memory (RAM) 240, which are capable of storing various data files for processing and/or communication by computing device 200, as well as possible program instructions for execution by processor 220.
The data bus 210 may be used to transmit data information. In some embodiments, data may be transferred between hardware components within the computing device 200 via the data bus 210. For example, the processor 220 may send data over the data bus 210 to memory or other hardware such as the input/output component 260. It should be noted that the data may be actual data, instruction codes, status information, or control information. In some embodiments, the data bus 210 may be an Industry Standard Architecture (ISA) bus, an Extended ISA (EISA) bus, a Video Electronics Standards Association (VESA) bus, a Peripheral Component Interconnect (PCI) bus, or the like.
The processor 220 may be used for logical operations, data processing, and instruction generation. In some embodiments, processor 220 may fetch data/instructions from internal memory, which may include Read Only Memory (ROM), Random Access Memory (RAM), Cache memory (not shown in the figures), and the like. In some embodiments, the processor 220 may include multiple sub-processors that may be used to implement different functions of the system.
In some embodiments, the read-only memory may include programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), and the like. The random access memory 240 is used for storing an operating system, various application programs, data, and the like. In some embodiments, random access memory 240 may include static random access memory (SRAM), dynamic random access memory (DRAM), and the like.
The communication port 250 is used to connect the operating system with an external network and enable communication between them. In some embodiments, communication ports 250 may include FTP ports, HTTP ports, DNS ports, or the like. The input/output component 260 is used for exchanging and controlling data and information between an external device or circuit and the processor 220. In some embodiments, the input/output component 260 may include USB ports, PCI ports, IDE ports, and the like.
The hard disk 270 is used to store information and data generated by the smart robot 110 or received from the smart robot 110. For example, data collected by the sensors of the smart robot 110 may be stored in the hard disk 270. In some embodiments, the hard disk 270 may include a mechanical hard disk (HDD), a Solid State Disk (SSD), a Hybrid Hard Disk (HHD), or the like. The display 280 is used to display information, data, etc. generated by the intelligent robot system 100. In some embodiments, display 280 may include a physical display, such as a display with speakers, an LCD display, an LED display, an OLED display, an electronic Ink display (E-Ink), or the like. In some embodiments, the display 280 may be included in the smart robot 110, rather than in the computing device 200 or the processing apparatus 111.
FIG. 3 is a schematic diagram of a mobile device according to some embodiments of the present application. In some embodiments, the mobile device 300 may implement one or more modules and units of the user device 130 described herein. As shown in fig. 3, the mobile device 300 may include a communication platform 310, a display 320, a Graphics Processor (GPU)330, a Central Processing Unit (CPU)340, an input/output interface 350, a memory 360, and a storage 390. In some embodiments, an operating system 370 (e.g., iOS, Android, Windows Phone, etc.) and one or more application programs 380 may be loaded from storage 390 into memory 360 for execution by CPU 340.
Fig. 4 is a schematic diagram of a delay correction method according to some embodiments of the present application. In some embodiments, the latency correction method is performed by the smart robot 110 or the computing device 200.
At 402, one or more angular velocity information is obtained. The one or more angular velocity information is obtained by a gyroscope. In some embodiments, the angular velocity information may include first timestamp information and an angular velocity. The first timestamp information may be a timestamp of the gyroscope and includes an initial timestamp and an end timestamp.
In 404, one or more reference image information is acquired. The reference image information is obtained by a camera shooting a reference image. In some embodiments, the reference image information may include second timestamp information and a reference image. The second timestamp information may be a timestamp of the camera. In some embodiments, the reference image is placed directly opposite the camera. The reference image can be a calibratable image such as a checkerboard image or a two-dimensional code image, and the camera can capture the complete reference image. In some embodiments, the camera is located in an X-Y-Z coordinate system and the reference image is located in a u-v coordinate system; the X-Y plane is parallel to the u-v plane, and X-Y-Z coordinates can be converted to u-v coordinates.
In 406, one or more first rotation angle information is determined based on the one or more angular velocity information. As described in 402, the angular velocity information may include first timestamp information and an angular velocity. The first timestamp information t_gyro may include an initial timestamp t_s and an end timestamp t_e. The angular velocity may comprise the angular velocities w_x, w_y, and w_z along the different coordinate axes. In particular, the rotation angle information is related only to the angular velocity w_z about the Z axis; w_z may be a function that varies over time. In some embodiments, the first rotation angle may be obtained by integrating w_z, as shown in equation (1):

θ_gyro = ∫_{t_s}^{t_e} w_z(t) dt    (1)

where t_s and t_e are the initial timestamp and the end timestamp. In the process of acquiring angular velocity information from the gyroscope, each acquired item of angular velocity information includes a start time point and an end time point: the initial timestamp comprises the start time points of the gyroscope acquisitions, and the end timestamp comprises the end time points.
In some embodiments, the first rotation angle information includes the first rotation angle θ_gyro and the first timestamp information t_gyro. In some embodiments, the gyroscope may acquire one or more angular velocity information, and one or more first rotation angle information may then be determined by equation (1). Each of the one or more first rotation angle information may be kept as an expression of the form of equation (1), or evaluated to a specific value according to equation (1).
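In practice w_z is available only as discrete samples, so the integral in equation (1) is computed numerically. A sketch using the trapezoidal rule over one timestamp window (the array names and function name are illustrative):

```python
import numpy as np

def first_rotation_angle(t, w_z):
    """Approximate equation (1): integrate the sampled Z-axis angular
    velocity w_z over the sample times t (from t_s = t[0] to
    t_e = t[-1]) with the trapezoidal rule."""
    t = np.asarray(t, dtype=float)
    w_z = np.asarray(w_z, dtype=float)
    return float(np.sum(0.5 * (w_z[1:] + w_z[:-1]) * np.diff(t)))
```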
In 408, one or more second rotation angle information is determined based on the one or more reference image information. A method for determining one or more second rotation angle information based on the one or more reference image information is shown in fig. 5 and described herein.
In 410, a time delay value is determined based on the one or more first rotation angle information and the one or more second rotation angle information. The time delay value is a time delay value between the sensors. In particular, the time delay value is a time delay value between the gyroscope and the camera.
At 412, the delay is corrected based on the delay value. In some embodiments, the delay value may be used to correct for time delays between sensors. For example, based on the delay value, a first timestamp of the gyroscope is delayed by a corresponding length of time. The corresponding time duration is equal to the delay value. For another example, the second timestamp of the camera is advanced by a corresponding duration based on the time delay value. In some embodiments, the delay value may be a positive value or a negative value.
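The correction itself reduces to shifting one sensor's timestamps by the estimated delay value. A sketch (the sign convention is illustrative: here a positive delay value delays the gyroscope timestamps, matching the first example in the paragraph above):

```python
def correct_timestamps(timestamps, delay_value):
    """Shift each timestamp by the delay value; delay_value may be
    positive or negative, so the same helper can delay the gyroscope
    timestamps or advance the camera timestamps."""
    return [t + delay_value for t in timestamps]
```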
It is noted that many alternatives, modifications, and variations will be apparent to those skilled in the art. The features, structures, methods, and other characteristics of the exemplary embodiments described herein may be combined in various ways to obtain additional and/or alternative exemplary embodiments. For example, the order of acquisition of the angular velocity information and the reference image information, and the order of processing of the one or more first rotation angle information and the one or more second rotation angle information, may be arbitrary. That is, the steps of fig. 4 need not be executed strictly in the numbered order; for example, 404 may precede 402, and 408 may precede 406.
Fig. 5 is a schematic diagram of a camera coordinate system and an image coordinate system shown in accordance with some embodiments of the present application. As shown in fig. 5, the camera coordinate system is an X-Y-Z coordinate system, the image coordinate system is a u-v coordinate system, and the intersection point of the Z axis and the u-v plane is (u_0, v_0). In some embodiments, the internal parameters of the camera are the focal lengths f_x and f_y and the optical center (c_x, c_y).
At a sampling time point of the camera, the camera acquires one piece of reference image information. The one piece of reference image information includes one current time point and one reference image. The current time point is a sampling time point in the camera timestamp. Let the coordinates of a point on the reference image, in the camera coordinate system, be D(x, y, z). Then by the formula:

u = f_x · x / z + c_x,  v = f_y · y / z + c_y    (2)

the coordinate p_c(u, v) corresponding to the point D(x, y, z) in the image coordinate system is obtained.
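The projection of equation (2) can be sketched as follows, assuming the standard pinhole model; the helper name `project_point` and the numeric intrinsics are hypothetical:

```python
def project_point(D, fx, fy, cx, cy):
    """Project a 3-D point D = (x, y, z), given in the camera coordinate
    system, to pixel coordinates p_c = (u, v) using the pinhole model
    with focal lengths fx, fy and optical center (cx, cy)."""
    x, y, z = D
    u = fx * x / z + cx
    v = fy * y / z + cy
    return u, v

# illustrative intrinsics; u = 500*0.1/2 + 320, v = 500*(-0.2)/2 + 240
u, v = project_point((0.1, -0.2, 2.0), fx=500.0, fy=500.0, cx=320.0, cy=240.0)
```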
Further, according to the formula

s · p_c = K[R|T] · p_w    (3)

the rotation and translation of the reference image corresponding to the current time point, relative to the reference image corresponding to the starting time point, may be determined. The starting time point may be the time point when the camera takes the first reference image. Here p_c = [u v 1]^T is the homogeneous coordinate in the image coordinate system, s is the scale factor of the corresponding point p_c in the image coordinate system, K is the intrinsic parameter matrix of the camera, R and T are the 3D rotation and 3D translation of the camera, and p_w = [x y z 1]^T is the homogeneous coordinate in the world coordinate system.
In some embodiments, assuming the data is noise-free, equation (3) may be written equivalently in expanded form as:

s · [u, v, 1]^T = [[f_x, 0, c_x], [0, f_y, c_y], [0, 0, 1]] · [R | T] · [x, y, z, 1]^T    (4)
in some embodiments, a second angle corresponding to the current coordinate D(x, y, z) may be determined from R and T. The second angle is the angle by which the reference image at the current time point is rotated relative to the reference image at the starting time point. The starting time point may be the first time point of the camera timestamp at which a reference image is acquired. Further, the second angle of the time point previous to the current time point may be determined. The difference between the second angle corresponding to the current time point and the second angle corresponding to the previous time point is determined as the second rotation angle θ_cam of the current time point. The current time point and the second rotation angle θ_cam of the current time point are together determined as second rotation angle information.
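One common way to realize this step is sketched below, under the assumption that each pose is available as a 3×3 rotation matrix R recovered from equation (3); the helper names are hypothetical. The angle is extracted from the relative rotation between the previous and current poses, which for a single-axis (planar) rotation equals the difference of the two second angles described above:

```python
import numpy as np

def rotation_angle(R):
    """Rotation angle (radians) of a 3x3 rotation matrix,
    using trace(R) = 1 + 2*cos(theta)."""
    c = (np.trace(R) - 1.0) / 2.0
    return float(np.arccos(np.clip(c, -1.0, 1.0)))

def second_rotation_angle(R_prev, R_curr):
    """Angle theta_cam of the relative rotation between the previous
    time point's pose and the current time point's pose."""
    return rotation_angle(R_curr @ R_prev.T)

def Rz(a):
    """Rotation about the Z axis, used only for the example."""
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

# poses rotated 0.10 rad and 0.25 rad about Z: relative angle 0.15 rad
theta_cam = second_rotation_angle(Rz(0.10), Rz(0.25))
```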
As depicted in 404, the camera may acquire one or more reference image information. The one or more reference image information may include timestamp information of the camera (second timestamp information). Further, one or more second rotation angle information may be determined, based on the one or more reference image information, according to the method illustrated in fig. 5. The one or more second rotation angle information includes one or more second rotation angles and the second timestamp information.
Fig. 6 is a schematic diagram of a method of determining a delay value according to some embodiments of the present application. In some embodiments, the latency value determination method is performed by the processing device 111 or the computing apparatus 200.
In 602, the one or more first rotation angle information and the one or more second rotation angle information are modified. As illustrated in fig. 4, the one or more first rotation angle information includes one or more first rotation angles θ_gyro and first timestamp information t_gyro. Because the one or more first rotation angles θ_gyro are determined by integration based on the timestamps t_gyro, θ_gyro may be expressed as θ_gyro(t_gyro). The one or more second rotation angle information includes one or more second rotation angles θ_cam(t_cam) and second timestamp information t_cam. The modification process may be modifying the first timestamp information and the second timestamp information.
At 604, a time delay value is determined based on the one or more modified first rotation angle information and the one or more modified second rotation angle information. The method for determining the time delay value based on the one or more modified first rotation angle information and the one or more modified second rotation angle information is shown in fig. 9 and corresponding description.
Fig. 7 is a schematic illustration of modifying one or more first angle of rotation information and one or more second angle of rotation information, shown in accordance with some embodiments of the present application. In some embodiments, the method of modifying the one or more first rotation angle information and the one or more second rotation angle information is performed by the processing device 111 or the computing apparatus 200.
At 702, one or more reference time points are obtained. In some embodiments, the one or more reference time points may be a discrete time series t_i during the travel or rotation of the intelligent robot, where i is a sequence number. For example, the one or more reference time points are clock ticks of the system. In some embodiments, the discrete time series t_i is known. In particular, the one or more reference time points may be the same as the timestamps of a sensor. For example, the discrete time series t_i is the same as the gyroscope timestamps t_gyro. As another example, the discrete time series t_i is the same as the camera timestamps t_cam.
In 704, the one or more first rotation angle information and the one or more second rotation angle information are modified based on the one or more reference time points. Specifically, the correction process corrects the first timestamp information t_gyro and the second timestamp information t_cam based on the one or more reference time points t_i, and thereby modifies the one or more first rotation angle information and the one or more second rotation angle information.
FIG. 8 is a schematic illustration of modifying one or more first rotation angle information according to some embodiments of the present application. In some embodiments, the one or more first rotation angle information correction methods are performed by the processing device 111 or the computing apparatus 200.
At 802, the one or more reference time points are delayed based on a pending time variable. In some embodiments, the pending time variable Δt may be a parameter that characterizes the time delay value between sensors (e.g., between a gyroscope and a camera). After being delayed by the pending time variable Δt, the one or more reference time points t_i become t_i + Δt.
In 804, one or more modified first rotation angle information is determined based on the one or more delayed reference time points. As illustrated in 406 of fig. 4, the first rotation angle information includes a first rotation angle θ_gyro(t_gyro) and first timestamp information t_gyro. The first timestamp information t_gyro may include an initial timestamp t_s and an end timestamp t_e. In some embodiments, the one or more modified first rotation angles are determined to be θ_gyro(t_i + Δt) based on the one or more delayed reference time points. Similar to equation (1), the one or more modified first rotation angles θ_gyro(t_i + Δt) may also be obtained by integration. The one or more modified first rotation angle information includes one or more modified first rotation angles θ_gyro(t_i + Δt) and a modified first timestamp t_i + Δt.
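Since the gyroscope is sampled at t_gyro rather than exactly at t_i + Δt, the value θ_gyro(t_i + Δt) is commonly evaluated by interpolating the integrated angle. A minimal sketch assuming NumPy; the helper name is hypothetical:

```python
import numpy as np

def modified_first_angles(t_gyro, theta_gyro, t_ref, dt):
    """Evaluate the integrated gyroscope angle at the delayed reference
    time points t_i + dt by linear interpolation, i.e. theta_gyro(t_i + dt).

    t_gyro     : gyroscope timestamps
    theta_gyro : integrated first rotation angles at t_gyro
    t_ref      : reference time points t_i
    dt         : pending time variable (delta t)
    """
    return np.interp(np.asarray(t_ref, dtype=float) + dt, t_gyro, theta_gyro)

t_gyro = np.array([0.0, 1.0, 2.0, 3.0])
theta_gyro = np.array([0.0, 0.5, 1.0, 1.5])  # e.g. a constant 0.5 rad/s
# angles at t = 0.5 + 0.5 and 1.5 + 0.5, i.e. at t = 1.0 and t = 2.0
vals = modified_first_angles(t_gyro, theta_gyro, [0.5, 1.5], dt=0.5)
```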
As described in 704, one or more modified second rotation angles are determined to be θ_cam(t_i) based on the discrete time series t_i. The one or more modified second rotation angle information includes one or more modified second rotation angles θ_cam(t_i) and a modified second timestamp t_i. The modified second timestamp t_i is not delayed by the pending time variable Δt.
Fig. 9 is a schematic diagram of a method of determining a delay value according to some embodiments of the present application. In some embodiments, the latency value determination method is performed by the processing device 111 or the computing apparatus 200.
At 902, one or more angular difference values are determined based on the one or more modified first rotation angle information and the one or more modified second rotation angle information. In some embodiments, the one or more angular difference values may be the differences between θ_gyro(t_i + Δt) and θ_cam(t_i). For example, the one or more angular difference values may be θ_gyro(t_i + Δt) − θ_cam(t_i) or θ_cam(t_i) − θ_gyro(t_i + Δt).
In some embodiments, the one or more angular difference values may be determined by a matching process. The matching process may match the one or more modified first rotation angles and the one or more modified second rotation angles based on the one or more reference time points t_i. Specifically, a first rotation angle and a second rotation angle having the same reference time point sequence number are matched as a group. For example, θ_gyro(t_1 + Δt) and θ_cam(t_1) are matched into a group. As another example, θ_gyro(t_4 + Δt) and θ_cam(t_4) are matched into a group. Further, the differences of the matched groups of first and second rotation angles are calculated and determined as the one or more angular difference values.
In some embodiments, the one or more angular difference values θ_gyro(t_i + Δt) − θ_cam(t_i) may be determined by instructions or code based on the sequence number i. In this case, the matching process described above is not necessary.
At 904, a sum of squares of the one or more angular differences is determined. In some embodiments, the sum of squares of the one or more angular differences may be expressed as

Σ_i [θ_gyro(t_i + Δt) − θ_cam(t_i)]²

or, equivalently,

Σ_i [θ_cam(t_i) − θ_gyro(t_i + Δt)]²
At 906, a delay value is determined based on the sum of squares of the one or more angular differences. In some embodiments, when the sum of squares of the one or more angular differences

Σ_i [θ_gyro(t_i + Δt) − θ_cam(t_i)]²

reaches its minimum, the Δt at that time is determined as the delay value.
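Steps 902-906 can be sketched as a grid search over the pending time variable Δt. This is an illustrative implementation assuming NumPy, with a hypothetical helper name and synthetic data; the patent does not prescribe a particular minimization method:

```python
import numpy as np

def estimate_delay(t_gyro, theta_gyro, t_ref, theta_cam,
                   search=np.linspace(-0.2, 0.2, 4001)):
    """For each candidate dt, interpolate the integrated gyro angle at
    t_i + dt, form the angular differences against theta_cam(t_i), and
    return the dt that minimizes the sum of squares (steps 902-906)."""
    best_dt, best_cost = 0.0, np.inf
    for dt in search:
        diff = np.interp(np.asarray(t_ref) + dt, t_gyro, theta_gyro) - theta_cam
        cost = float(np.sum(diff ** 2))
        if cost < best_cost:
            best_dt, best_cost = dt, cost
    return best_dt

# synthetic example: the camera angles lag the gyro angles by 0.05 s
t_gyro = np.linspace(0.0, 4.0, 401)
theta_gyro = np.sin(t_gyro)               # integrated gyro angle
t_ref = np.linspace(0.5, 3.5, 31)         # reference time points t_i
theta_cam = np.sin(t_ref + 0.05)          # camera observes angle at t_i + 0.05
dt_hat = estimate_delay(t_gyro, theta_gyro, t_ref, theta_cam)
```

With finer grids or a continuous optimizer the estimate sharpens; the grid bounds here simply have to bracket the true delay.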
It should be noted that the delay value determination methods shown in figs. 6 to 9 are only one or several embodiments of the delay determination method of the present application and are not intended to limit the present application; many variations will be obvious to those skilled in the art without inventive labor. For example, one or more modified second rotation angles θ_cam(t_i + Δt) may be determined based on the one or more delayed reference time points t_i + Δt, and one or more modified first rotation angles θ_gyro(t_i) may be determined based on the reference time points t_i. As another example, the sum of squares of one or more angular differences

Σ_i [θ_cam(t_i + Δt) − θ_gyro(t_i)]²

may be determined based on the one or more modified first rotation angles θ_gyro(t_i) and the one or more modified second rotation angles θ_cam(t_i + Δt). Further, a delay value is determined based on the sum of squares of the one or more angular differences.
In some embodiments, the latency correction methods described in figs. 4-9 may be embodied as instructions in a computer-readable medium. The intelligent robot 110, the processing device 111, or the computing device 200 may implement the latency correction method by reading and executing the instructions.
Fig. 10 is a schematic diagram of a latency correction system according to some embodiments of the present application. As shown in fig. 10, the latency correction system 1000 includes an angular velocity acquisition module 1010, an image acquisition module 1020, a processing module 1030, and a correction module 1040. In some embodiments, the functions of the latency correction system 1000 may be implemented by the intelligent robot 110.
The angular velocity acquisition module 1010 may be configured to obtain one or more pieces of angular velocity information. The angular velocity acquisition module 1010 may be a gyroscope. In some embodiments, the angular velocity information may include first timestamp information and an angular velocity. The first timestamp information includes an initial timestamp and an end timestamp.
Image acquisition module 1020 may be used to acquire one or more reference image information. The image acquisition module 1020 may be a camera. In some embodiments, the reference image information may include second timestamp information and a reference image. The second timestamp information may be a timestamp of the camera. In some embodiments, the reference image is placed directly opposite the camera. The reference image may be any image suitable for calibration, such as a checkerboard image or a two-dimensional code image. In particular, the camera may capture a complete reference image.
The processing module 1030 may be configured to determine a latency value based on the one or more angular velocity information and the one or more reference image information. The functions of the processing module 1030 may be implemented by the processing device 111 or the computing apparatus 200 in the intelligent robot. Specifically, the processing module 1030 determines one or more first rotation angle information based on the one or more angular velocity information; the processing module 1030 determines one or more second rotation angle information based on the one or more reference image information; the processing module 1030 determines a delay value based on the one or more first rotation angle information and the one or more second rotation angle information. The specific process of the processing module 1030 for determining the delay value is shown in fig. 4-9 and the related description.
The correction module 1040 is configured to correct the delay based on the delay value. For example, based on the delay value, the correction module 1040 delays the first timestamp of the gyroscope by a corresponding length of time. The corresponding time duration is equal to the delay value. For another example, based on the time delay value, the correction module 1040 advances the second timestamp of the camera by a corresponding time length. In some embodiments, the delay value may be a positive value or a negative value.
Fig. 11 is a schematic diagram of an embodiment of intelligent robot delay correction according to some embodiments of the present application. In particular, the robot is a sweeping robot. As shown in fig. 11, the sweeping robot is located on a delay correction auxiliary device. The delay correction auxiliary device consists of a bottom plate, a top plate, and a fixing support connecting the bottom plate and the top plate. A reference image is placed on the top plate; the reference image is a checkerboard. A base plate is placed on the bottom plate and connected to it through adjustable support columns, and the sweeping robot is located on the base plate. A camera is mounted on the upper side of the sweeping robot, and a gyroscope is mounted on the periphery of the robot. During the rotation of the sweeping robot, the gyroscope may acquire one or more pieces of angular velocity information, and the camera may acquire one or more pieces of reference image information of the checkerboard. A processing module within the sweeping robot obtains the one or more angular velocity information and the one or more reference image information, and the sweeping robot determines the time delay value using the methods described in figs. 4-9 based on this acquired information. In some embodiments, a time delay value between another sensor and the camera may also be determined based on the methods described in figs. 4-9. For example, the camera acquires reference image information, another sensor acquires sensor information, and a delay value is determined based on the reference image information and the sensor information. Furthermore, the time delays among multiple sensors may be corrected by correcting the time delay between the camera and each of the sensors.
Compared with the prior art, the beneficial effects of the present application are as follows:
First, the method corrects based on data measured by the gyroscope and the camera built into the intelligent robot, and is easy to operate.
Second, correction based on data measured by the built-in gyroscope and camera of the intelligent robot is more accurate than correction based directly on the timestamps of the gyroscope and the camera.
Various aspects of the methods outlined above, and/or other steps of those methods, may be implemented by a program. Program portions of the technology may be thought of as "products" or "articles of manufacture" in the form of executable code and/or associated data embodied in or carried by a computer-readable medium. Tangible, non-transitory storage media include memory or storage used by any computer, processor, or similar device or associated module, such as various semiconductor memories, tape drives, disk drives, or similar devices capable of providing storage functions for software at any time.
All or a portion of the software may sometimes communicate over a network, such as the Internet or another communication network. Such communication enables loading of the software from one computer device or processor to another, for example, from a management server or host computer of the intelligent robot system to the hardware platform of a computing environment or other environment implementing the system or similar functionality related to the information required by the intelligent robot system. Thus, another kind of medium capable of carrying software elements is the physical connection between local devices, such as optical, electrical, and electromagnetic waves propagating through cables, optical fibers, or the air. The physical media carrying such waves, such as electric, wireless, or optical cables, may also be considered media carrying the software. As used herein, unless limited to a tangible "storage" medium, other terms referring to a computer or machine "readable medium" refer to media that participate in the execution of any instructions by a processor.
Thus, a computer-readable medium may take many forms, including but not limited to a tangible storage medium, a carrier wave medium, or a physical transmission medium. Stable storage media include optical or magnetic disks and other storage in computers or similar devices that can implement the system components described in the figures. Volatile storage media include dynamic memory, such as the main memory of a computer platform. Tangible transmission media include coaxial cables, copper cables, and fiber optics, including the wires that form a bus within a computer system. Carrier-wave transmission media may convey electrical, electromagnetic, acoustic, or light-wave signals, which may be generated by radio frequency or infrared data communication methods. Common computer-readable media include hard disks, floppy disks, magnetic tape, and any other magnetic medium; CD-ROM, DVD-ROM, and any other optical medium; punch cards and any other physical storage medium containing a pattern of holes; RAM, PROM, EPROM, FLASH-EPROM, and any other memory chip or cartridge; a carrier wave transmitting data or instructions, a cable or connection transmitting a carrier wave, and any other program code and/or data that can be read by a computer. These computer-readable media may take many forms and may carry any type of program code for causing a processor to perform instructions, communicate one or more results, and the like.
Computer program code required for the operation of various portions of the present application may be written in any one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python, and the like; a conventional programming language such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, or ABAP; a dynamic programming language such as Python, Ruby, or Groovy; or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service, such as software as a service (SaaS).
Those skilled in the art will appreciate that various modifications and improvements may be made to the disclosure herein. For example, the different system components described above are implemented by hardware devices, but may also be implemented by software solutions only. For example: the system is installed on an existing server. Further, the location information disclosed herein may be provided via a firmware, firmware/software combination, firmware/hardware combination, or hardware/firmware/software combination.
The foregoing describes the present application and/or some other examples. The present application is susceptible to various modifications in light of the above teachings. The subject matter disclosed herein can be implemented in various forms and examples, and the present application can be applied to a wide variety of applications. All applications, modifications and variations that are claimed in the following claims are within the scope of this application.
Also, this application uses specific language to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the present application is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Additionally, the order in which elements and sequences of the processes described herein are processed, the use of alphanumeric characters, or the use of other designations, is not intended to limit the order of the processes and methods described herein, unless explicitly claimed. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not intended to require more features than are expressly recited in the claims. Indeed, the claimed embodiments may have fewer than all of the features of a single embodiment disclosed above.
Each patent, patent application publication, and other material, such as articles, books, specifications, publications, and documents, cited in this application is hereby incorporated by reference in its entirety, except for any material that is inconsistent with or contrary to the present disclosure, and except for any material that may limit the broadest scope of the claims now or later associated with this application. It is noted that if the descriptions, definitions, and/or use of terms in the material accompanying this application are inconsistent or in conflict with those of this application, the descriptions, definitions, and/or use of terms in this application shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of the present application. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the present application can be viewed as being consistent with the teachings of the present application. Accordingly, embodiments of the present application are not limited to those explicitly described and depicted herein.

Claims (9)

1. A method for delay correction, comprising:
acquiring one or more pieces of angular velocity information;
acquiring one or more reference image information;
determining one or more first rotation angle information based on the one or more angular velocity information, the first rotation angle information including a first rotation angle and first timestamp information;
determining one or more second rotation angle information based on the one or more reference image information, the second rotation angle information including a second rotation angle and second time stamp information;
acquiring one or more reference time points;
modifying the one or more first rotation angle information and the one or more second rotation angle information based on the one or more reference time points;
determining one or more angle difference values based on the one or more modified first rotation angle information and the one or more modified second rotation angle information; determining a delay value based on a sum of squares of the one or more angular differences;
and correcting the time delay based on the time delay value.
2. The method of claim 1, wherein said modifying the one or more first rotation angle information and the one or more second rotation angle information based on the one or more reference time points comprises:
delaying the one or more reference time points based on a pending time variable;
determining one or more modified first rotation angle information based on the one or more delayed reference time points, the modified first rotation angle information including a modified first rotation angle and modified first timestamp information;
correcting the one or more second rotation angle information based on the one or more reference time points, the corrected second rotation angle information including a corrected second rotation angle and corrected second time stamp information.
3. The method of claim 1, wherein said modifying the one or more first rotation angle information and the one or more second rotation angle information based on the one or more reference time points comprises:
delaying the one or more reference time points based on a pending time variable;
determining one or more modified second rotation angle information based on the one or more delayed reference time points, the modified second rotation angle information including a modified second rotation angle and modified second timestamp information;
correcting the one or more first rotation angle information based on the one or more reference time points, the corrected first rotation angle information including a corrected first rotation angle and corrected first time stamp information.
4. The method of claim 2 or 3, wherein the determining a delay value based on a sum of squares of the one or more angular differences comprises:
determining a minimum of a sum of squares of the one or more angular differences based on the pending time variable;
and determining the undetermined time variable corresponding to the minimum value of the square sum of the one or more angle difference values as a time delay value.
5. A delay correction system, comprising:
the angular velocity acquisition module is used for acquiring one or more pieces of angular velocity information;
an image acquisition module for acquiring one or more reference image information;
a processing module to:
determining one or more first rotation angle information based on the one or more angular velocity information, the first rotation angle information including a first rotation angle and first timestamp information;
determining one or more second rotation angle information based on the one or more reference image information, the second rotation angle information including a second rotation angle and second time stamp information;
acquiring one or more reference time points;
modifying the one or more first rotation angle information and the one or more second rotation angle information based on the one or more reference time points;
determining one or more angle difference values based on the one or more modified first rotation angle information and the one or more modified second rotation angle information; determining a delay value based on a sum of squares of the one or more angular differences;
a correction module to correct a time delay based on the time delay value.
6. The system of claim 5, wherein the angular rate acquisition module is a gyroscope.
7. The system of claim 5, wherein the image acquisition module is a camera.
8. A latency correction apparatus comprising a processor which performs the latency correction method of any one of claims 1 to 4.
9. A computer-readable storage medium storing computer instructions, which when read by a computer perform the latency correction method of any one of claims 1 to 4.
CN201810401369.1A 2018-04-28 2018-04-28 Time delay correction method, system and computer readable medium Active CN108680196B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810401369.1A CN108680196B (en) 2018-04-28 2018-04-28 Time delay correction method, system and computer readable medium

Publications (2)

Publication Number Publication Date
CN108680196A CN108680196A (en) 2018-10-19
CN108680196B true CN108680196B (en) 2021-01-19

Family

ID=63802706

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810401369.1A Active CN108680196B (en) 2018-04-28 2018-04-28 Time delay correction method, system and computer readable medium

Country Status (1)

Country Link
CN (1) CN108680196B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3884468A4 (en) * 2018-12-29 2021-12-15 Zhejiang Dahua Technology Co., Ltd. Methods and systems for camera calibration
CN109917440B (en) * 2019-04-09 2021-07-13 广州小鹏汽车科技有限公司 Combined navigation method, system and vehicle
CN112113582A (en) * 2019-06-21 2020-12-22 上海商汤临港智能科技有限公司 Time synchronization processing method, electronic device, and storage medium
EP4094050A4 (en) * 2020-01-22 2023-10-25 Abb Schweiz Ag Method and electronic device, system and computer readable medium for time calibration
CN111351487A (en) * 2020-02-20 2020-06-30 深圳前海达闼云端智能科技有限公司 Clock synchronization method and device of multiple sensors and computing equipment
CN114286012B (en) * 2022-01-29 2023-11-24 青岛海信智慧生活科技股份有限公司 Monitoring equipment, control method thereof and terminal

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8583392B2 (en) * 2010-06-04 2013-11-12 Apple Inc. Inertial measurement unit calibration system
CN104718431A (en) * 2012-10-12 2015-06-17 高通股份有限公司 Gyroscope conditioning and gyro-camera alignment
CN105031935A (en) * 2014-04-16 2015-11-11 鹦鹉股份有限公司 Rotary-wing drone provided with a video camera supplying stabilised image sequences
CN106101566A (en) * 2011-05-31 2016-11-09 斯凯普公司 Video stabilization
CN205718958U (en) * 2016-04-18 2016-11-23 中国科学院遥感与数字地球研究所 Camera inertia measurement monitor in high precision
CN107687932A (en) * 2016-08-05 2018-02-13 成都理想境界科技有限公司 The detection method and device of head-mounted display apparatus delay
CN107852462A (en) * 2015-07-22 2018-03-27 索尼公司 Camera model, solid-state imager, electronic equipment and image capture method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9300880B2 (en) * 2013-12-31 2016-03-29 Google Technology Holdings LLC Methods and systems for providing sensor data and image data to an application processor in a digital image format

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Spatio-Temporal Initialization for IMU to Camera Registration; Elmar Mair et al.; 2011 IEEE International Conference on Robotics and Biomimetics; 2011-12-11; pp. 557-564 (abstract, Section IV, and Figures 1-4) *

Also Published As

Publication number Publication date
CN108680196A (en) 2018-10-19

Similar Documents

Publication Publication Date Title
CN108680196B (en) Time delay correction method, system and computer readable medium
JP6273400B2 (en) High-speed template matching using peripheral information
US11625841B2 (en) Localization and tracking method and platform, head-mounted display system, and computer-readable storage medium
US20180288386A1 (en) Coarse relocalization using signal fingerprints
US20150286279A1 (en) Systems and methods for guiding a user during calibration of a sensor
CN103983262A (en) Determination method and apparatus for advancing route based on optical communication
US20230037922A1 (en) Image display method and apparatus, computer device, and storage medium
WO2021088498A1 (en) Virtual object display method and electronic device
US20150331569A1 (en) Device for controlling user interface, and method of controlling user interface thereof
CN110567491B (en) Initial alignment method and device of inertial navigation system and electronic equipment
TWI555994B (en) Dynamically calibrating magnetic sensors
CN110969159B (en) Image recognition method and device and electronic equipment
JP2022531186A (en) Information processing methods, devices, electronic devices, storage media and programs
KR20200011209A (en) Method for correcting a sensor and direction information obtained via the sensor based on another direction information obtained via the satellite positioning circuit and electronic device thereof
EP3905661A1 (en) Electronic device and method for recommending image capturing place
US11543485B2 (en) Determining location or orientation based on environment information
CN113407045B (en) Cursor control method and device, electronic equipment and storage medium
JP6250708B2 (en) Traveling direction information output device, map matching device, traveling direction information output method, and program
CN116079697A (en) Monocular vision servo method, device, equipment and medium based on image
US11620846B2 (en) Data processing method for multi-sensor fusion, positioning apparatus and virtual reality device
CN103269400A (en) Method for turning over system screen of mobile communication equipment
CN110675445B (en) Visual positioning method, device and storage medium
US10386185B2 (en) Information processing method and electronic device
CN116718196B (en) Navigation method, device, equipment and computer readable storage medium
CN104995584A (en) Computing a magnetic heading

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant