WO2023146092A1 - Image-based inertial sensor correction method and electronic device for implementing same - Google Patents


Info

Publication number
WO2023146092A1
WO2023146092A1 (PCT/KR2022/018601)
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
value
posture
coordinate system
inertial sensor
Prior art date
Application number
PCT/KR2022/018601
Other languages
English (en)
Korean (ko)
Inventor
서영준
Original Assignee
삼성전자주식회사 (Samsung Electronics Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020220014644A (published as KR20230114656A)
Application filed by 삼성전자주식회사 (Samsung Electronics Co., Ltd.)
Publication of WO2023146092A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/10 - Navigation by using measurements of speed or acceleration
    • G01C21/12 - Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/18 - Stabilised platforms, e.g. by gyroscope
    • G01C25/00 - Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass

Definitions

  • The embodiments below relate to image-based inertial sensor calibration techniques.
  • An inertial sensor including at least one of an accelerometer, a gyroscope, and a geomagnetic sensor may be used together with a location sensor, such as a global positioning system (GPS) sensor, to estimate the position, posture, and orientation information of the electronic device.
  • Sensors included in the electronic device may produce errors depending on the surrounding environment, and an inertial sensor may accumulate error over time while tracking the posture of the electronic device.
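  • The accumulation mentioned above can be illustrated with a small, hypothetical sketch (not part of the patent): a constant gyroscope rate bias, integrated to track orientation, yields a heading error that grows linearly with time.

```python
# Hypothetical illustration: a constant gyroscope bias, integrated to track
# orientation, produces a heading error that accumulates with every sample.

def integrated_heading_error(bias_deg_per_s: float, dt: float, steps: int) -> float:
    """Integrate a constant rate bias over `steps` samples of `dt` seconds each."""
    error = 0.0
    for _ in range(steps):
        error += bias_deg_per_s * dt  # each integration step adds the bias
    return error

# A modest 0.01 deg/s bias sampled at 100 Hz for 10 minutes:
drift = integrated_heading_error(0.01, 0.01, 60_000)
print(round(drift, 2))  # prints 6.0 (degrees of accumulated heading error)
```

This is why a drift-free external reference, such as the image-based correction described in this document, is needed to bound the error.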
  • An image-based inertial sensor calibration method, and an electronic device performing the method, according to an embodiment may use information included in an image to correct both the posture determined from the output of an inertial sensor and the inertial sensor itself.
  • An electronic device according to an embodiment may include a camera that acquires an image, an inertial sensor that detects motion of the electronic device, a position sensor that detects the position of the electronic device when the image is acquired, and a processor that controls the camera, the inertial sensor, and the position sensor.
  • The processor may detect a first position value of a target object in a first image obtained at a first time, and may estimate a first position value of the target object based on the position value measured by the position sensor at the first time.
  • The processor may determine posture correction data for correcting the posture of the electronic device, as determined through the inertial sensor, based on a first difference value between the detected first position value and the estimated first position value.
  • An electronic device according to another embodiment may include a camera that acquires an image, an inertial sensor that detects motion of the electronic device, a position sensor that detects the position of the electronic device when the image is acquired, and a processor that controls the camera, the inertial sensor, and the position sensor.
  • The processor may detect a first position value of a target object from a first image obtained at a first time, and may estimate a first position value of the target object based on the position value measured by the position sensor at the first time.
  • The processor may detect a second position value of the target object from a second image obtained at a second time, and may estimate a second position value of the target object based on the position value measured by the position sensor at the second time.
  • The processor may determine inertial sensor calibration data for calibrating the inertial sensor based on a first difference value between the detected first position value and the estimated first position value, and a second difference value between the detected second position value and the estimated second position value.
  • A method for calibrating an inertial sensor of an electronic device according to an embodiment may include: an operation of detecting a first position value of a target object from a first image obtained at a first time using a camera of the electronic device; an operation of estimating a first position value of the target object based on a position value measured by a position sensor of the electronic device at the first time; and an operation of determining posture correction data for correcting the posture of the electronic device, as determined through the inertial sensor, based on a first difference value between the detected first position value and the estimated first position value.
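  • The claimed flow can be sketched as follows. This is a minimal illustration with hypothetical values, not the patent's implementation: the posture correction data is simply the (altitude, azimuth) difference between the detected and estimated position values of the target object.

```python
# Minimal sketch of the claimed correction flow (values are hypothetical):
# compare the target's position detected in the image with the position
# estimated from GPS time/place, and treat the difference as correction data.

def posture_correction(detected, estimated):
    """Difference between detected and estimated (altitude, azimuth) in degrees."""
    d_alt = detected[0] - estimated[0]
    d_az = detected[1] - estimated[1]
    return (d_alt, d_az)

# E.g. the camera sees the target at (5.2, 20.3) degrees but the GPS-based
# estimate predicts (5.0, 20.0): the posture is off by about (0.2, 0.3) degrees.
correction = posture_correction((5.2, 20.3), (5.0, 20.0))
print(correction)
```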
  • FIG. 1 is a block diagram of an electronic device in a network environment according to various embodiments.
  • FIG. 2 is a diagram for explaining an outline of a method of correcting a posture of an electronic device according to an exemplary embodiment.
  • FIG. 3 is a flowchart of a method of correcting a posture of an electronic device according to an exemplary embodiment.
  • FIG. 4 is a diagram for explaining a position value of a target object detected from an image acquired through a camera of an electronic device according to an embodiment.
  • FIG. 5 is a diagram for explaining an overview of a method of calibrating an inertial sensor of an electronic device according to an exemplary embodiment.
  • FIG. 6 is a flowchart of a method of calibrating an inertial sensor of an electronic device according to an embodiment.
  • FIG. 7 is a diagram for explaining position values of a target object detected from images obtained through a camera of an electronic device according to an exemplary embodiment.
  • FIG. 8 is a diagram for explaining a data flow of an electronic device according to an exemplary embodiment.
  • FIG. 1 is a block diagram of an electronic device 101 within a network environment 100, according to various embodiments.
  • Referring to FIG. 1, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with at least one of the electronic device 104 or the server 108 through a second network 199 (e.g., a long-distance wireless communication network). According to one embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • The electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added.
  • In some embodiments, some of these components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into a single component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., the program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computations. According to one embodiment, as at least part of the data processing or computation, the processor 120 may load instructions or data received from other components (e.g., the sensor module 176 or the communication module 190) into the volatile memory 132, process the instructions or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • According to one embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor).
  • The auxiliary processor 123 may be implemented separately from, or as part of, the main processor 121. The auxiliary processor 123 may control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), in place of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-running) state.
  • According to one embodiment, the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • An artificial intelligence model may be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself where the artificial intelligence model is executed, or may be performed through a separate server (e.g., the server 108).
  • The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • The artificial intelligence model may include a plurality of artificial neural network layers.
  • The artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), or a deep Q-network, or a combination of two or more of the foregoing, but is not limited to the above examples.
  • The artificial intelligence model may additionally or alternatively include a software structure in addition to the hardware structure.
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176) of the electronic device 101 .
  • the data may include, for example, input data or output data for software (eg, program 140) and commands related thereto.
  • the memory 130 may include volatile memory 132 or non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input module 150 may receive a command or data to be used by a component (eg, the processor 120) of the electronic device 101 from the outside of the electronic device 101 (eg, a user).
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output sound signals to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • a receiver may be used to receive an incoming call. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display module 160 may visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display module 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the device.
  • the display module 160 may include a touch sensor set to detect a touch or a pressure sensor set to measure the intensity of force generated by the touch.
  • The audio module 170 may convert sound into an electrical signal, or vice versa. According to one embodiment, the audio module 170 may obtain sound through the input module 150, or may output sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • The sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that may be used to directly or wirelessly connect the electronic device 101 to an external electronic device (eg, the electronic device 102).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • connection terminal 178 may include a connector through which the electronic device 101 may be physically connected to an external electronic device (eg, the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert electrical signals into mechanical stimuli (eg, vibration or motion) or electrical stimuli that a user may perceive through tactile or kinesthetic senses.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to one embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as at least part of a power management integrated circuit (PMIC), for example.
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • the battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • According to one embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module). Among these communication modules, a corresponding communication module may communicate with an external electronic device through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network).
  • The wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an international mobile subscriber identity (IMSI)) stored in the subscriber identification module 196.
  • The wireless communication module 192 may support a 5G network after a 4G network, and a next-generation communication technology, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high-reliability and low-latency communications (ultra-reliable and low-latency communications (URLLC)).
  • the wireless communication module 192 may support a high frequency band (eg, mmWave band) to achieve a high data rate, for example.
  • The wireless communication module 192 may support various technologies for securing performance in a high-frequency band, such as beamforming, massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), array antennas, analog beamforming, or large-scale antennas.
  • the wireless communication module 192 may support various requirements defined for the electronic device 101, an external electronic device (eg, the electronic device 104), or a network system (eg, the second network 199).
  • According to one embodiment, the wireless communication module 192 may support a peak data rate for eMBB realization (e.g., 20 Gbps or more), loss coverage for mMTC realization (e.g., 164 dB or less), or U-plane latency for URLLC realization (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less).
  • the antenna module 197 may transmit or receive signals or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (eg, PCB).
  • The antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • Other components (e.g., a radio frequency integrated circuit (RFIC)) may be additionally formed as part of the antenna module 197, in addition to the radiator.
  • the antenna module 197 may form a mmWave antenna module.
  • The mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first surface (e.g., a bottom surface) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second surface (e.g., a top or side surface) of the printed circuit board and capable of transmitting or receiving signals of the designated high-frequency band.
  • At least some of the components may be connected to each other and may exchange signals (e.g., commands or data) through an inter-peripheral communication scheme (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)).
  • According to one embodiment, commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • all or part of operations executed in the electronic device 101 may be executed in one or more external electronic devices among the external electronic devices 102 , 104 , or 108 .
  • For example, when the electronic device 101 needs to perform a function or service automatically, or in response to a request from a user or another device, the electronic device 101 may request one or more external electronic devices to perform the function or at least part of the service, instead of executing the function or service by itself.
  • One or more external electronic devices receiving the request may execute at least a part of the requested function or service or an additional function or service related to the request, and deliver the execution result to the electronic device 101 .
  • The electronic device 101 may provide the result, as it is or after additional processing, as at least part of a response to the request.
  • Cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used, for example.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an internet of things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks. According to one embodiment, the external electronic device 104 or server 108 may be included in the second network 199 .
  • the electronic device 101 may be applied to intelligent services (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • FIG. 2 is a diagram for explaining an outline of a method of correcting a posture of an electronic device according to an exemplary embodiment.
  • An inertial sensor, such as an inertial measurement unit (IMU) including at least one of an accelerometer, a gyroscope, and a geomagnetic sensor, may be used together with a location sensor, such as a global positioning system (GPS) sensor, to estimate the position, posture, and orientation information of the electronic device 101.
  • inertial sensors and position sensors may be included in sensor module 176 of FIG. 1 .
  • The inertial sensor may determine the posture of the electronic device 101 using at least one of an accelerometer, a gyroscope, and a geomagnetic sensor.
  • the attitude of the electronic device 101 may include information about the roll, pitch, and yaw of the electronic device 101 .
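  • One common way to work with such a roll/pitch/yaw posture (a standard convention, not something the patent specifies) is as a rotation matrix R = Rz(yaw) · Ry(pitch) · Rx(roll), sketched below with angles in radians.

```python
import math

# Standard-convention sketch: build a rotation matrix from roll, pitch, and yaw
# (Z-Y-X order, angles in radians). This is a common representation of posture,
# not a formula taken from the patent.

def rotation_matrix(roll, pitch, yaw):
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

# With all angles zero, the matrix is the identity (no rotation).
print(rotation_matrix(0.0, 0.0, 0.0))
```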
  • An error may occur in the inertial sensor included in the electronic device 101 depending on the surrounding environment. Accumulated errors may occur over time when the inertial sensor tracks the posture of the electronic device 101 .
  • When the electronic device 101 estimates its posture using the inertial sensor of the electronic device 101, there may be an error in the estimated posture due to the error of the inertial sensor, and correction may be required to obtain more accurate results.
  • The electronic device 101 may correct the posture of the electronic device 101 using an image 205 acquired through a camera (e.g., the camera module 180 of FIG. 1) of the electronic device 101.
  • In FIG. 2, a process of correcting the posture of the electronic device 101 using the image 205 acquired through the camera is briefly illustrated.
  • the electronic device 101 may detect the target object 220 in the image 205 and correct the posture of the electronic device 101 based on the detected location of the target object 220 .
  • The electronic device 101 may include a camera for acquiring the image 205, an inertial sensor for detecting the motion of the electronic device 101, and a position sensor for detecting the position of the electronic device 101 when the image 205 is acquired.
  • the electronic device 101 may obtain an image 205 using a camera.
  • the electronic device 101 may determine the posture of the electronic device 101 using a sensor of the electronic device 101 .
  • On the image 205 obtained through the camera, a direction 210 toward which the electronic device 101 is determined to face using a sensor of the electronic device 101 and an actual direction 215 toward which the electronic device 101 faces are shown.
  • The direction 210 may be a direction corresponding to the posture of the electronic device 101 determined using the sensor, and the actual direction 215 may be a direction corresponding to the actual posture of the electronic device 101. Due to sensor error, there may be an error in the posture of the electronic device 101 determined using the sensor.
  • the electronic device 101 may detect the target object 220 in the image 205 .
  • the target object 220 may be, for example, a celestial body.
  • The electronic device 101 may detect the position value of the target object 220 from the image 205, and may estimate the position value of the target object 220 at the time the image 205 was acquired using a position sensor such as GPS.
  • the detected position value and the estimated position value of the target object 220 may be a position value on the horizontal coordinate system 225 including an altitude value and an azimuth value.
  • The electronic device 101 may determine posture correction data for correcting the posture of the electronic device 101 based on a difference value between the detected position value and the estimated position value of the target object 220.
  • the electronic device 101 may update the posture correction table based on the determined posture correction data.
  • the posture correction table may be a comparison table capable of correcting a difference between the posture of the electronic device 101 determined through the sensor and the actual posture of the electronic device 101 .
  • the electronic device 101 may correct the posture of the electronic device 101 using the posture correction table, and may determine a posture corresponding to the actual direction 215 toward which the electronic device 101 faces.
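  • The posture correction table described above might be sketched as follows. The class name, binning scheme, and key choice are assumptions for illustration only; the patent describes a comparison table between sensor-determined and actual postures without specifying its structure.

```python
# Hypothetical sketch of a posture correction table: store the observed
# (altitude, azimuth) offset per coarse azimuth bin, and subtract it from
# later sensor-derived postures. The 10-degree bin size is an assumption.

class PostureCorrectionTable:
    def __init__(self, bin_deg: float = 10.0):
        self.bin_deg = bin_deg
        self.offsets = {}  # azimuth bin -> (altitude offset, azimuth offset)

    def update(self, azimuth: float, d_alt: float, d_az: float) -> None:
        """Record the posture correction data for the bin containing `azimuth`."""
        self.offsets[int(azimuth // self.bin_deg)] = (d_alt, d_az)

    def correct(self, altitude: float, azimuth: float):
        """Apply the stored offset (zero if no data for this bin yet)."""
        d_alt, d_az = self.offsets.get(int(azimuth // self.bin_deg), (0.0, 0.0))
        return (altitude - d_alt, azimuth - d_az)

table = PostureCorrectionTable()
table.update(20.3, 0.2, 0.3)      # offset observed near 20 degrees azimuth
print(table.correct(5.2, 20.3))   # sensor posture corrected toward the truth
```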
  • By correcting the posture of the electronic device 101 using the image 205 obtained through the camera of the electronic device 101, the electronic device 101 can obtain an accurate posture at low cost, and the accuracy of the posture can be increased by repeatedly performing the correction using several images 205 acquired at different times.
  • the electronic device 101 may automatically acquire the image 205 without user input to determine posture correction data, thereby providing the user with an accurate posture even if the user is not aware of the posture correction process.
  • FIG. 3 is a flowchart of a method of correcting a posture of an electronic device according to an exemplary embodiment.
  • FIG. 3 shows a flowchart of a method by which an electronic device 101, including a camera that acquires an image, an inertial sensor that detects motion of the electronic device 101, and a position sensor that detects the position of the electronic device 101 when the image is acquired, corrects the posture of the electronic device 101 determined using the inertial sensor of the electronic device 101.
  • the electronic device 101 may detect a target object from a first image obtained through a camera at a first time.
  • the target object may be a celestial body such as the sun, moon, or satellite.
  • the electronic device 101 may detect a first position value of the target object in the first image.
  • the detected first position value of the target object may be determined based on the angle of view of the camera and the central point of the image.
  • the electronic device 101 may detect the first position value of the target object from the center point of the image by using information about the angle of view of the camera.
  • The first position value may include an altitude value and an azimuth value.
  • a target object 405 detected in an image 400 obtained through a camera of the electronic device 101 is illustrated.
  • the electronic device 101 may detect the first position value of the target object 405 from the center of the image 400 using the angle of view of the camera. For example, in the example of FIG. 4, the electronic device 101 may use the angle of view of the camera to detect the first position value for the target object 405 as an altitude value 410 of 5.2 degrees and an azimuth value 415 of 20.3 degrees from the center of the image 400.
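The detection of an altitude value and an azimuth value from the center of the image using the camera's angle of view can be sketched with a pinhole-camera model. The function name and parameters below are hypothetical, and the pinhole projection is an assumption for illustration; the patent does not specify the projection model.

```python
import math

def pixel_to_angles(px, py, width, height, hfov_deg, vfov_deg):
    """Convert the pixel position of a detected target object into
    (altitude, azimuth) offsets, in degrees, from the image center.

    px, py             -- pixel coordinates of the detected object
    width, height      -- image size in pixels
    hfov_deg, vfov_deg -- horizontal / vertical angle of view of the camera
    """
    # Focal lengths in pixel units, derived from the angle of view.
    fx = (width / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    fy = (height / 2.0) / math.tan(math.radians(vfov_deg) / 2.0)
    # Offsets from the image center (+x to the right, +y upward;
    # image rows grow downward, hence the flipped dy).
    dx = px - width / 2.0
    dy = height / 2.0 - py
    azimuth = math.degrees(math.atan2(dx, fx))
    altitude = math.degrees(math.atan2(dy, fy))
    return altitude, azimuth
```

With a 70-degree horizontal angle of view, for example, an object at the right edge of the frame maps to an azimuth offset of 35 degrees, half the angle of view, as expected.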
  • the electronic device 101 may estimate a first position value of the target object at the first time based on the position of the electronic device 101 measured at the first time using the position sensor of the electronic device 101.
  • the estimated first position value may be a position value on a horizontal coordinate system having the position of the electronic device 101 as its origin. For example, when the electronic device 101 knows the time at which the first image was captured and the location at which it was captured, the electronic device 101 may estimate a first position value corresponding to the actual location of the target object based on that time and location.
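Estimating where the target object should appear from the capture time and the position reported by the position sensor can be sketched, for the case where the target object is the sun, with standard low-precision ephemeris formulas (accurate to a fraction of a degree). The function name is hypothetical and this is an illustrative algorithm, not the one specified in the patent.

```python
import math
from datetime import datetime, timezone

def solar_position(when, lat_deg, lon_deg):
    """Estimate the sun's (altitude, azimuth) in degrees on a horizontal
    coordinate system whose origin is the observer's position.

    `when` must be a timezone-aware datetime; lat/lon are in degrees
    (east longitude positive).
    """
    # Days since the J2000.0 epoch (2000-01-01 12:00 UTC), fractional.
    j2000 = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
    d = (when - j2000).total_seconds() / 86400.0
    # Mean longitude and mean anomaly of the sun.
    mean_long = (280.460 + 0.9856474 * d) % 360.0
    mean_anom = math.radians((357.528 + 0.9856003 * d) % 360.0)
    # Ecliptic longitude of the sun and obliquity of the ecliptic.
    ecl_long = math.radians(mean_long
                            + 1.915 * math.sin(mean_anom)
                            + 0.020 * math.sin(2.0 * mean_anom))
    obliq = math.radians(23.439 - 0.0000004 * d)
    # Equatorial coordinates: right ascension and declination.
    ra = math.atan2(math.cos(obliq) * math.sin(ecl_long), math.cos(ecl_long))
    dec = math.asin(math.sin(obliq) * math.sin(ecl_long))
    # Greenwich mean sidereal time (hours) -> local hour angle (radians).
    gmst = (18.697374558 + 24.06570982441908 * d) % 24.0
    hour_angle = math.radians((gmst * 15.0 + lon_deg) % 360.0) - ra
    lat = math.radians(lat_deg)
    altitude = math.asin(math.sin(dec) * math.sin(lat)
                         + math.cos(dec) * math.cos(lat) * math.cos(hour_angle))
    # Azimuth measured clockwise from true north.
    azimuth = math.atan2(math.sin(hour_angle),
                         math.cos(hour_angle) * math.sin(lat)
                         - math.tan(dec) * math.cos(lat))
    return math.degrees(altitude), (math.degrees(azimuth) + 180.0) % 360.0
```

Sub-degree accuracy is ample here, since the estimated position value is only compared against the position value detected in the image.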
  • the electronic device 101 may determine a first difference value between the first position value of the target object detected in operation 310 and the first position value estimated in operation 315. Based on the first difference value, the electronic device 101 may determine attitude correction data for correcting the attitude of the electronic device 101 determined using the inertial sensor.
  • the first difference value may include an altitude value and an azimuth value.
  • the electronic device 101 may convert the first difference value into a value on the Cartesian coordinate system.
  • the electronic device 101 may convert the first difference value converted into a value on the Cartesian coordinate system into a value on the world coordinate system based on the attitude of the electronic device 101 determined through the inertial sensor at the first time.
  • the world coordinate system is a single geocentric coordinate system and may be an absolute coordinate system for displaying a location with the center of mass (geometric center) of the earth as the origin.
  • the world coordinate system may be a Cartesian coordinate system.
  • the attitude of the electronic device 101 determined based on the output of the inertial sensor may be determined on a world coordinate system.
  • the attitude of the electronic device 101 determined based on the output of the inertial sensor may be information indicating the degree of rotation of the electronic device 101 with respect to each axis of the world coordinate system.
  • Since the posture of the electronic device 101 indicates the degree of rotation of the electronic device 101 with respect to each axis of the world coordinate system, the electronic device 101 may use the posture to convert the first difference value, converted to a value on the Cartesian coordinate system, into a value on the world coordinate system.
  • the electronic device 101 may convert the first difference value converted into a value on the Cartesian coordinate system into a value on the world coordinate system using Equations 1, 2, and 3, for example.
  • [Equation 1] x_w = cosψ·cosθ·x + (cosψ·sinθ·sinφ − sinψ·cosφ)·y + (cosψ·sinθ·cosφ + sinψ·sinφ)·z
  • [Equation 2] y_w = sinψ·cosθ·x + (sinψ·sinθ·sinφ + cosψ·cosφ)·y + (sinψ·sinθ·cosφ − cosψ·sinφ)·z
  • [Equation 3] z_w = −sinθ·x + cosθ·sinφ·y + cosθ·cosφ·z
  • Here, x, y, and z are the x-axis, y-axis, and z-axis values of the first difference value converted to the Cartesian coordinate system; x_w, y_w, and z_w are the x-axis, y-axis, and z-axis values on the world coordinate system; and φ, θ, and ψ are the roll, pitch, and yaw of the electronic device 101 determined using the inertial sensor at the first time, respectively.
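The conversion chain described above (horizontal coordinate system to Cartesian coordinate system, then to the world coordinate system using the inertial-sensor posture) can be sketched as follows. This is a minimal illustration, not the patent's exact implementation: the function names are hypothetical, and the Z-Y-X (yaw-pitch-roll) rotation order assumed here for Equations 1, 2, and 3 is one common convention.

```python
import math

def horizontal_to_cartesian(alt_deg, az_deg):
    """Map a value on the horizontal coordinate system (altitude, azimuth)
    to a unit vector on the Cartesian coordinate system."""
    alt, az = math.radians(alt_deg), math.radians(az_deg)
    return (math.cos(alt) * math.cos(az),
            math.cos(alt) * math.sin(az),
            math.sin(alt))

def to_world(v, roll, pitch, yaw):
    """Rotate a Cartesian vector into the world coordinate system using the
    roll, pitch, and yaw (in radians) reported by the inertial sensor,
    i.e. the component form of R_z(yaw) * R_y(pitch) * R_x(roll)."""
    x, y, z = v
    sr, cr = math.sin(roll), math.cos(roll)
    sp, cp = math.sin(pitch), math.cos(pitch)
    sy, cy = math.sin(yaw), math.cos(yaw)
    xw = cy * cp * x + (cy * sp * sr - sy * cr) * y + (cy * sp * cr + sy * sr) * z
    yw = sy * cp * x + (sy * sp * sr + cy * cr) * y + (sy * sp * cr - cy * sr) * z
    zw = -sp * x + cp * sr * y + cp * cr * z
    return xw, yw, zw
```

For instance, a pure 90-degree yaw maps the x-axis unit vector onto the y-axis, which is a quick sanity check that the rotation order behaves as intended.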
  • the electronic device 101 may convert the first difference value converted into a value on the world coordinate system into a first reference posture, which is the posture of the electronic device 101 on the world coordinate system. In an embodiment, the electronic device 101 may determine the reference posture of the electronic device 101 on the world coordinate system using Equations 4, 5, and 6.
  • [Equation 4] φ₁ = φ + x_w
  • [Equation 5] θ₁ = θ + y_w
  • [Equation 6] ψ₁ = ψ + z_w
  • Here, x_w, y_w, and z_w are the x-axis, y-axis, and z-axis values on the world coordinate system determined in Equations 1, 2, and 3, respectively; φ, θ, and ψ are the roll, pitch, and yaw determined using the inertial sensor at the first time; and φ₁, θ₁, and ψ₁ are the roll, pitch, and yaw of the first reference posture of the electronic device 101 on the world coordinate system.
  • the difference between the posture of the electronic device 101 determined at the first time using the inertial sensor and the first reference posture may mean an error included in the posture determined using the inertial sensor.
  • the electronic device 101 may determine posture correction data for correcting the posture according to a difference between the posture of the electronic device 101 determined using the inertial sensor and the reference posture.
  • the electronic device 101 may update the posture correction table based on the determined posture correction data.
  • the electronic device 101 may correct the posture by applying the posture correction table to the posture determined through the inertial sensor of the electronic device 101 .
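The correction-data step above can be illustrated with a small sketch. The function names are hypothetical, and the exponential-moving-average table update is an assumption for illustration; the patent only states that the table is updated from the correction data.

```python
def posture_correction(measured, reference):
    """Posture correction data: the per-axis difference between the
    image-derived reference posture and the inertial-sensor posture.
    Postures are (roll, pitch, yaw) tuples in degrees."""
    return tuple(r - m for m, r in zip(measured, reference))

def update_correction_table(table, correction, alpha=0.5):
    """Blend new correction data into the posture correction table with an
    exponential moving average, so corrections from images acquired at
    different times refine the estimate over repeated runs."""
    return tuple((1.0 - alpha) * t + alpha * c for t, c in zip(table, correction))

def corrected_posture(measured, table):
    """Apply the posture correction table to a posture determined
    through the inertial sensor."""
    return tuple(m + t for m, t in zip(measured, table))
```

Applying the correction data back to the measured posture reproduces the reference posture, which is the invariant the correction table is meant to maintain.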
  • Operations 305, 310, 315, 320 and 325 may be periodically and/or iteratively performed using images acquired at different times.
  • FIG. 5 is a diagram for explaining an overview of a method of calibrating an inertial sensor of an electronic device according to an exemplary embodiment.
  • How the electronic device 101 calibrates the inertial sensor 515 included in the electronic device 101 using images 505 obtained through a camera is briefly illustrated.
  • the electronic device 101 may detect the target object 520 from the images 505, determine the amount of change in the posture of the electronic device 101 based on the positions of the target object 520 detected from the images 505, and calibrate the inertial sensor 515 based on the determined amount of change.
  • the inertial sensor 515 may include at least one of an accelerometer, a gyroscope, and a magnetometer.
  • the electronic device 101 calibrates the inertial sensor 515 using the images 505 acquired through the camera of the electronic device 101, so that an accurate posture can be obtained at low cost, and the accuracy of the posture can be increased by repeatedly calibrating the inertial sensor 515 using several images 505 acquired at different times.
  • the electronic device 101 may automatically acquire images 505 without user input to determine the calibration data of the inertial sensor 515, thereby providing the user with an accurate posture even if the user is not aware of the calibration process of the inertial sensor 515.
  • the electronic device 101 may detect the target object 520 from the images 505 .
  • the target object 520 may be, for example, a celestial body.
  • the electronic device 101 may detect the position values of the target object 520 from the images 505, and may estimate the position values of the target object 520 at the times at which the corresponding images 505 were acquired by using a position sensor such as a GPS.
  • the detected location values and the estimated location values of the target object 520 may be location values on a horizontal coordinate system 525 including an altitude value and an azimuth value.
  • the electronic device 101 may determine calibration data of the inertial sensor 515 for calibrating the inertial sensor 515 based on the difference values between the detected position values and the estimated position values of the target object 520. The electronic device 101 may update the calibration table of the inertial sensor 515 based on the calibration data, and may calibrate the inertial sensor 515 using the calibration table to accurately determine the attitude of the electronic device 101.
  • FIG. 6 is a flowchart of a method of calibrating an inertial sensor of an electronic device according to an embodiment.
  • a flowchart is shown of a method in which an electronic device 101, including a camera for obtaining an image, an inertial sensor for detecting motion of the electronic device 101, and a position sensor for detecting the position of the electronic device 101 when acquiring an image, calibrates the inertial sensor of the electronic device 101.
  • the electronic device 101 may detect a target object from a first image acquired through a camera at a first time, and may detect a target object from a second image acquired through a camera at a second time.
  • the target object may be a celestial body such as the sun, moon, or satellite.
  • the electronic device 101 may detect a first position value of the target object in the first image and a second position value of the target object in the second image.
  • the detected first position value and the detected second position value of the target object may be determined based on the angle of view of the camera and the central point of the image.
  • the electronic device 101 may detect the first position value and the second position value of the target object from the center point of each image by using the angle-of-view information of the camera.
  • the first position value and the second position value may each include an altitude value and an azimuth value.
  • a target object 710 detected from images acquired through a camera of the electronic device 101 is illustrated.
  • the electronic device 101 may detect the first position value and the second position value of the target object 710 from the center of the image using the angle of view of the camera. For example, in the first image of FIG. 7(A), the electronic device 101 may use the angle of view of the camera to detect the first position value for the target object 710 as an altitude value 715 of 5.2 degrees and an azimuth value 720 of 20.3 degrees from the center of the image. In the second image of FIG. 7(B), the electronic device 101 may use the angle of view of the camera to detect the second position value for the target object 710 as an altitude value 725 of 7.5 degrees and an azimuth value 730 of 9.8 degrees from the center of the image.
  • the electronic device 101 may estimate a first position value of the target object at the first time based on the position of the electronic device 101 measured at the first time using the position sensor of the electronic device 101.
  • the electronic device 101 may estimate the second position value of the target object at the second time based on the measured position of the electronic device 101 at the second time using the position sensor of the electronic device 101 .
  • the estimated first and second position values may be position values on a horizontal coordinate system having the position of the electronic device 101 as an origin.
  • For example, when the electronic device 101 knows the time and location at which the first image was captured, it may estimate a first position value corresponding to the actual location of the target object based on that time and location.
  • Similarly, the electronic device 101 may estimate a second position value corresponding to the actual location of the target object based on the time and location at which the second image was captured.
  • the electronic device 101 may determine a first difference value between the first position value of the target object detected in operation 610 and the first position value estimated in operation 615, and a second difference value between the second position value detected in operation 610 and the second position value estimated in operation 615.
  • the first difference value and the second difference value may each include an altitude value and an azimuth value.
  • the electronic device 101 may convert the first difference value and the second difference value into values on the Cartesian coordinate system.
  • the electronic device 101 may convert the first difference value converted into a value on the Cartesian coordinate system into a value on the world coordinate system based on the posture of the electronic device 101 at the first time.
  • the electronic device 101 may convert the second difference value converted into a value on the Cartesian coordinate system into a value on the world coordinate system based on the posture of the electronic device 101 at the second time.
  • the world coordinate system is a single geocentric coordinate system and may be an absolute coordinate system for displaying a location with the center of mass (geometric center) of the earth as the origin.
  • the world coordinate system may be a Cartesian coordinate system.
  • the attitude of the electronic device 101 determined through the inertial sensor may be determined on a world coordinate system.
  • the posture of the electronic device 101 determined through the inertial sensor may be information indicating the degree of rotation of the electronic device 101 with respect to each axis of the world coordinate system.
  • the electronic device 101 may convert the first difference value and the second difference value, converted to values on the Cartesian coordinate system, into values on the world coordinate system by using the postures of the electronic device 101 determined through the inertial sensor at the first time and the second time.
  • the electronic device 101 may convert the first difference value converted into a value on the Cartesian coordinate system into a value on the world coordinate system using Equations 1, 2, and 3, for example.
  • the electronic device 101 may convert the second difference value converted into a value on the Cartesian coordinate system into a value on the world coordinate system.
  • the electronic device 101 may convert the first difference value and the second difference value converted into values on the world coordinate system into a reference posture, which is the posture of the electronic device 101 on the world coordinate system. In an embodiment, the electronic device 101 may determine the first reference posture of the electronic device 101 on the world coordinate system using Equations 4, 5, and 6. Similarly, the electronic device 101 may convert the second difference value converted into a value on the world coordinate system into a second reference posture, which is the posture of the electronic device 101 on the world coordinate system.
  • the electronic device 101 may determine a change rate of the posture of the electronic device 101 between the first time and the second time based on the amount of change between the first reference posture and the second reference posture and the difference between the first time and the second time.
  • the electronic device 101 may determine inertial sensor correction data based on the output value of the inertial sensor between the first time and the second time and the determined attitude change speed.
  • the attitude change speed may include information about accelerations and angular velocities of the x-axis, y-axis, and z-axis of the electronic device 101 on the world coordinate system between the first time and the second time.
  • the electronic device 101 may determine the inertial sensor calibration data by comparing the angular velocity of the attitude change rate with the angular velocities about the x-axis, y-axis, and z-axis of the electronic device 101 on the world coordinate system output by the gyroscope of the inertial sensor between the first time and the second time.
  • Similarly, the electronic device 101 may determine the inertial sensor calibration data by comparing the acceleration of the attitude change rate with the accelerations along the x-axis, y-axis, and z-axis of the electronic device 101 on the world coordinate system output by the accelerometer of the inertial sensor between the first time and the second time.
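As a sketch of how the attitude change rate derived from the two reference postures could be compared with the raw gyroscope output to produce calibration data: the names below are hypothetical, and a simple constant-bias model is assumed for illustration (the patent does not specify the error model).

```python
def gyro_calibration_data(ref1, ref2, t1, t2, gyro_samples):
    """Estimate per-axis gyroscope bias in deg/s.

    ref1, ref2   -- first/second reference postures (roll, pitch, yaw), degrees
    t1, t2       -- times at which the first/second images were acquired, seconds
    gyro_samples -- raw gyroscope samples between t1 and t2, each (wx, wy, wz) deg/s
    """
    dt = t2 - t1
    # Attitude change rate implied by the image-derived reference postures.
    true_rate = [(b - a) / dt for a, b in zip(ref1, ref2)]
    # Mean angular velocity actually reported by the gyroscope over the interval.
    n = len(gyro_samples)
    mean_rate = [sum(s[i] for s in gyro_samples) / n for i in range(3)]
    # Calibration data: the offset to subtract from the raw gyroscope output.
    return tuple(m - t for m, t in zip(mean_rate, true_rate))
```

The same comparison could be made for the accelerometer by replacing angular velocities with accelerations on the world coordinate system.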
  • the electronic device 101 may update the inertial sensor calibration table based on the determined inertial sensor calibration data.
  • the inertial sensor calibration table may be a lookup table for correcting the difference between the raw value output by the inertial sensor before calibration and the true value.
  • the electronic device 101 may calibrate the inertial sensor by applying the inertial sensor calibration table to the inertial sensor of the electronic device 101 .
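A calibration table of the kind described, mapping raw values toward true values, might be represented per axis as a (bias, scale) pair; the representation and function name below are illustrative assumptions, not the patent's specified format.

```python
def apply_calibration(raw, table):
    """Apply one inertial sensor calibration table entry per axis.

    raw   -- raw (x, y, z) sample from the sensor before calibration
    table -- per-axis (bias, scale) pairs mapping raw values to true values
    """
    return tuple((r - bias) * scale for r, (bias, scale) in zip(raw, table))
```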
  • Operations 605, 610, 615, 620 and 625 may be periodically and/or iteratively performed using images acquired at different times.
  • the electronic device 101 may perform all of operations 305, 310, 315, 320, and 325 of FIG. 3 and operations 605, 610, 615, 620, and 625 of FIG. 6.
  • FIG. 8 is a diagram for explaining a data flow of an electronic device according to an exemplary embodiment.
  • inertial sensor 805 and position sensor 830 may be included in sensor module 176 of FIG. 1 .
  • the inertial sensor 805 may include an accelerometer 810, a gyroscope 815, and a magnetometer 820, and the accelerometer 810, the gyroscope 815, and the magnetometer 820 may output raw values 835, 840, and 845 before calibration.
  • the electronic device 101 may apply the inertial sensor calibration table 850 to the raw values 835, 840, and 845 of the accelerometer 810, the gyroscope 815, and the magnetometer 820, and may determine the posture 860 of the electronic device 101 using the corrected values.
  • the inertial sensor calibration table 850 may include an accelerometer calibration table 851 applied to the accelerometer 810, a gyroscope calibration table 853 applied to the gyroscope 815, and a magnetometer calibration table 855 applied to the magnetometer 820.
  • the camera 825 may obtain images 865 and the position sensor 830 may output altitude, longitude, latitude, and time as the position 875 of the electronic device 101 .
  • the camera 825 may acquire a first image at a first time and a second image at a second time.
  • the electronic device 101 may detect a target object (eg, the target object 710 of FIG. 7) using the first image and the second image acquired by the camera 825, and may detect the first position value 870 and the second position value 870 of the target object.
  • the electronic device 101 may estimate the first position value 880 and the second position value 880 of the target object at the first and second times using the output of the position sensor 830 .
  • the electronic device 101 may determine a first difference value 885 between the detected first position value 870 and the estimated first position value 880, and a second difference value 885 between the detected second position value 870 and the estimated second position value 880.
  • the electronic device 101 may convert the first difference value 885 into a value 890 on the world coordinate system using the posture 860 of the electronic device 101 determined using the inertial sensor 805 at the first time, and may convert the second difference value 885 into a value 890 on the world coordinate system using the posture 860 of the electronic device 101 determined using the inertial sensor 805 at the second time.
  • the electronic device 101 may determine a first reference posture and a second reference posture based on the first difference value 890 and the second difference value 890 converted into values on the world coordinate system.
  • the electronic device 101 may determine a change rate of the posture of the electronic device 101 between the first time and the second time based on the amount of change between the first reference posture and the second reference posture and the difference between the first time and the second time.
  • the electronic device 101 may determine the inertial sensor calibration data 895 by comparing the change in the posture 860 of the electronic device 101 measured using the inertial sensor 805 between the first time and the second time with the attitude change rate determined based on the first difference value 890 and the second difference value 890 converted into values on the world coordinate system.
  • the electronic device 101 may update the inertial sensor calibration table 850 based on the determined inertial sensor calibration data 895 .
  • the electronic device 101 may calibrate the inertial sensor 805 by applying the inertial sensor calibration table 850 to the inertial sensor 805 of the electronic device 101 .
  • the electronic device 101 may include a camera 825 for obtaining an image, an inertial sensor 805 for detecting motion of the electronic device 101, a position sensor 830 for detecting the position of the electronic device 101 when acquiring an image, and a processor 120 that controls the camera 825, the inertial sensor 805, and the position sensor 830. The processor 120 may detect a first position value of a target object in a first image obtained at a first time, estimate a first position value of the target object based on the position value measured by the position sensor 830 at the first time, and determine, based on a first difference value between the detected first position value and the estimated first position value, posture correction data for correcting the posture of the electronic device 101 determined through the inertial sensor 805.
  • the processor 120 may detect the first position value based on information on the angle of view of the camera 825 and the center of the first image.
  • the processor 120 may convert the first difference value on the horizontal coordinate system into a value on the world coordinate system based on the posture of the electronic device 101 measured at the first time through the inertial sensor 805, determine a first reference posture based on the first difference value converted into the value on the world coordinate system, and determine the posture correction data based on a difference between the first reference posture and the posture of the electronic device 101 measured at the first time.
  • the processor 120 may convert the first difference value on the horizontal coordinate system into a value on the Cartesian coordinate system, and may convert the first difference value converted into the value on the Cartesian coordinate system into a value on the world coordinate system based on the posture of the electronic device 101 measured at the first time.
  • the processor 120 may update the posture correction table based on the determined posture correction data.
  • the processor 120 may correct the posture of the electronic device 101 determined through the inertial sensor 805 based on the updated posture correction table.
  • the processor 120 may detect a second position value of the target object from a second image obtained at a second time, estimate a second position value of the target object based on the position value measured by the position sensor 830 at the second time, and determine calibration data of the inertial sensor 805 for calibrating the inertial sensor 805 based on the first difference value between the detected first position value and the estimated first position value and a second difference value between the detected second position value and the estimated second position value.
  • the processor 120 may convert the first difference value on the horizontal coordinate system into a value on the world coordinate system based on the posture of the electronic device 101 at the first time, determine a first reference posture based on the first difference value converted into the value on the world coordinate system, convert the second difference value on the horizontal coordinate system into a value on the world coordinate system based on the posture of the electronic device 101 at the second time, determine a second reference posture based on the second difference value converted into the value on the world coordinate system, determine an attitude change rate between the first time and the second time based on the first and second reference postures, and determine the calibration data of the inertial sensor 805 based on the output value of the inertial sensor 805 between the first time and the second time and the attitude change rate.
  • the processor 120 may update the inertial sensor calibration table 850 based on the determined inertial sensor 805 calibration data.
  • the processor 120 may calibrate the inertial sensor 805 based on the updated inertial sensor calibration table 850 .
  • the electronic device 101 may include a camera 825 for obtaining an image, an inertial sensor 805 for detecting motion of the electronic device 101, a position sensor 830 for detecting the position of the electronic device 101 when acquiring an image, and a processor 120 that controls the camera 825, the inertial sensor 805, and the position sensor 830. The processor 120 may detect a first position value of a target object in a first image obtained at a first time, estimate a first position value of the target object based on the position value measured by the position sensor 830 at the first time, detect a second position value of the target object in a second image acquired at a second time, estimate a second position value of the target object based on the position value measured by the position sensor 830 at the second time, and determine calibration data of the inertial sensor 805 for calibrating the inertial sensor 805 based on a first difference value between the detected first position value and the estimated first position value and a second difference value between the detected second position value and the estimated second position value.
  • the processor 120 may detect the first position value based on the angle-of-view information of the camera 825 and the center of the first image, and may detect the second position value based on the angle-of-view information of the camera 825 and the center of the second image.
  • the processor 120 may convert the first difference value on the horizontal coordinate system into a value on the world coordinate system based on the posture of the electronic device 101 at the first time, determine a first reference posture based on the first difference value converted into the value on the world coordinate system, convert the second difference value on the horizontal coordinate system into a value on the world coordinate system based on the posture of the electronic device 101 at the second time, determine a second reference posture based on the second difference value converted into the value on the world coordinate system, determine an attitude change rate between the first time and the second time based on the first and second reference postures, and determine the calibration data of the inertial sensor 805 based on the output value of the inertial sensor 805 between the first time and the second time and the attitude change rate.
  • the processor 120 may convert the first difference value on the horizontal coordinate system into a value on the Cartesian coordinate system, convert the first difference value converted into the value on the Cartesian coordinate system into a value on the world coordinate system based on the posture of the electronic device 101 at the first time, convert the second difference value on the horizontal coordinate system into a value on the Cartesian coordinate system, and convert the second difference value converted into the value on the Cartesian coordinate system into a value on the world coordinate system based on the posture of the electronic device 101 at the second time.
  • the processor 120 may update the inertial sensor calibration table 850 based on the determined calibration data of the inertial sensor 805, and may calibrate the inertial sensor 805 based on the updated inertial sensor calibration table 850.
  • a method for calibrating the inertial sensor 805 may include an operation of detecting a first position value of a target object from a first image acquired at a first time using a camera of the electronic device 101, an operation of estimating a first position value of the target object based on the position value measured by the position sensor 830 of the electronic device 101 at the first time, and an operation of determining, based on a first difference value between the detected first position value and the estimated first position value, posture correction data for correcting the posture of the electronic device 101 determined through the inertial sensor 805.
  • the operation of determining the posture correction data may include an operation of converting the first difference value on the horizontal coordinate system into a value on the world coordinate system based on the posture of the electronic device 101 at the first time, and an operation of determining the posture correction data based on the first difference value converted into the value on the world coordinate system.
  • the method for calibrating the inertial sensor 805 may further include an operation of updating the posture correction table based on the determined posture correction data, and an operation of correcting the posture of the electronic device 101 determined through the inertial sensor 805 based on the updated posture correction table.
  • the method for calibrating the inertial sensor 805 may further include an operation of detecting a second position value of the target object from a second image obtained at a second time using the camera, an operation of estimating a second position value of the target object based on the position value measured by the position sensor 830 at the second time, and an operation of determining calibration data of the inertial sensor 805 for calibrating the inertial sensor 805 based on the first difference value between the detected first position value and the estimated first position value and a second difference value between the detected second position value and the estimated second position value.
  • Electronic devices may be devices of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
  • Terms such as "first" and "second" may simply be used to distinguish a component from other corresponding components, and do not limit the components in other respects (eg, importance or order).
  • When a (eg, first) component is referred to as being "coupled" or "connected" to another (eg, second) component, with or without the terms "functionally" or "communicatively", it means that the component may be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • The term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, part, or circuit.
  • a module may be an integrally constructed component or a minimal unit of components or a portion thereof that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • various embodiments of this document may be implemented as software including one or more instructions stored in a storage medium (eg, internal memory 136 or external memory 138) readable by a machine (eg, electronic device 101). For example, a processor (eg, the processor 120) of the device (eg, the electronic device 101) may call at least one of the one or more instructions stored in the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • the storage medium is a tangible device and does not contain a signal (eg, an electromagnetic wave); this term does not distinguish between a case where data is stored semi-permanently in the storage medium and a case where it is stored temporarily.
  • the method according to various embodiments disclosed in this document may be included and provided in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • a computer program product may be distributed in the form of a device-readable storage medium (eg, compact disc read only memory (CD-ROM)), or distributed (eg, downloaded or uploaded) online through an application store (eg, Play Store™) or directly between two user devices (eg, smart phones).
  • in the case of online distribution, at least part of the computer program product may be temporarily stored or temporarily created in a device-readable storage medium, such as the memory of a manufacturer's server, an application store server, or a relay server.
  • each of the above-described components (eg, a module or program) may include a singular entity or a plurality of entities, and some of the plurality of entities may be separately disposed in other components.
  • one or more components or operations among the aforementioned corresponding components may be omitted, or one or more other components or operations may be added.
  • according to various embodiments, a plurality of components (eg, modules or programs) may be integrated into one component.
  • in this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to those performed by the corresponding component of the plurality of components prior to the integration.
  • according to various embodiments, the actions performed by a module, program, or other component may be executed sequentially, in parallel, iteratively, or heuristically, one or more of the actions may be executed in a different order or omitted, or one or more other actions may be added.
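The two-observation correction described in the claims above (a first and a second difference between detected and estimated target positions) can be sketched as follows. This is not code from the patent: it is a hypothetical illustration in which the correction data is simply taken to be the mean of the per-observation difference vectors; the function name and the 2-D positions are assumptions made for the example.

```python
import numpy as np

def correction_data(detected, estimated):
    """Correction data for the inertial sensor, sketched as the mean of the
    differences between target positions detected from camera images and
    target positions estimated from the position sensor."""
    detected = np.asarray(detected, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    diffs = detected - estimated   # first difference value, second difference value, ...
    return diffs.mean(axis=0)      # averaged into one correction vector

# Two observations (first time and second time), 2-D positions for illustration.
d = [(10.0, 5.0), (12.0, 7.0)]   # detected from the first and second images
e = [(9.0, 5.5), (11.0, 7.5)]    # estimated from the position sensor
corr = correction_data(d, e)     # a NumPy array close to [1.0, -0.5]
```

Averaging over several observations is one plausible way to suppress per-frame detection noise before calibrating the sensor; the patent itself does not commit to a specific combination rule in this excerpt.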

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Navigation (AREA)

Abstract

An image-based inertial sensor correction method and an electronic device for performing it are disclosed. The electronic device comprises: a camera for acquiring an image; an inertial sensor for detecting movements of the electronic device; a position sensor for detecting the position of the electronic device at the time the image is acquired; and a processor for controlling the camera, the inertial sensor, and the position sensor. The processor may: detect a first position value of a target object from a first image acquired at a first time; estimate the first position value of the target object based on the position value measured by the position sensor at the first time; and determine posture correction data for correcting the posture of the electronic device established by the inertial sensor, the data being determined based on a first difference value between the detected first position value and the estimated first position value.
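As a hypothetical illustration of the posture correction the abstract describes (none of this code appears in the patent, and the function names are assumptions): if the target's position detected from the image and its position estimated from the position sensor are both expressed in a horizontal coordinate system centered on the device, the yaw component of the posture error can be taken as the angle between the two bearings, and subtracted from the inertially derived yaw.

```python
import math

def heading_error(device_pos, target_detected, target_estimated):
    """Yaw error (radians) between the bearing to the target as detected
    from the image and the bearing implied by the position sensor."""
    def bearing(p, q):
        return math.atan2(q[1] - p[1], q[0] - p[0])
    err = bearing(device_pos, target_detected) - bearing(device_pos, target_estimated)
    # Wrap the difference into (-pi, pi] so small errors stay small.
    return math.atan2(math.sin(err), math.cos(err))

def corrected_yaw(yaw_from_imu, err):
    """Apply the bearing difference as a posture (yaw) correction."""
    return yaw_from_imu - err

# Target detected 45 degrees off from where the position sensor places it.
err = heading_error((0.0, 0.0), (1.0, 1.0), (1.0, 0.0))  # about pi/4
```

This sketch only handles the yaw axis; a full implementation would treat the posture as a 3-D rotation, which this excerpt of the patent does not spell out.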
PCT/KR2022/018601 2022-01-25 2022-11-23 Image-based inertial sensor correction method and electronic device for performing same WO2023146092A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20220010760 2022-01-25
KR10-2022-0010760 2022-01-25
KR10-2022-0014644 2022-02-04
KR1020220014644A KR20230114656A (ko) 2022-01-25 2022-02-04 Image-based inertial sensor correction method and electronic device for performing same

Publications (1)

Publication Number Publication Date
WO2023146092A1 true WO2023146092A1 (fr) 2023-08-03

Family

ID=87472250

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/018601 WO2023146092A1 (fr) 2022-01-25 2022-11-23 Image-based inertial sensor correction method and electronic device for performing same

Country Status (1)

Country Link
WO (1) WO2023146092A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100761011B1 (ko) * 2006-05-30 2007-09-21 Inha University Foundation Posture correction apparatus and method for an inertial navigation system using a camera-type sun sensor
JP2012167940A (ja) * 2011-02-10 2012-09-06 Kokusai Kogyo Co Ltd Method, program, and device for determining the attitude of a moving body
KR101183866B1 (ko) * 2011-04-20 2012-09-19 University of Seoul Industry-Academic Cooperation Foundation Real-time position/attitude determination apparatus and method integrating GPS/INS/image AT
KR101211703B1 (ko) * 2010-10-15 2012-12-12 Inha University Industry-Academic Cooperation Foundation Magnetometer error correction method using line-of-sight vectors, and integrated navigation system using same
KR20130031280A (ko) * 2010-05-07 2013-03-28 Qualcomm Incorporated Orientation sensor calibration

Similar Documents

Publication Publication Date Title
WO2021025420A1 Method for determining position in a vehicle using vehicle movement, and apparatus therefor
WO2021025272A1 Foldable electronic device for detecting folding angle, and operating method thereof
WO2022010156A1 Method for calibrating geomagnetic sensor of electronic device, and electronic device therefor
WO2023146092A1 Image-based inertial sensor correction method and electronic device for performing same
WO2023043083A1 Antenna performance measurement chamber, system comprising same, and operating method thereof
WO2023008677A1 Electronic device and method for predicting input coordinates
WO2022177299A1 Call function control method and electronic device supporting same
WO2022239976A1 Electronic device comprising temperature sensor, and method
WO2022169085A1 Electronic device and method for maintaining imaging composition in electronic device
WO2022005231A1 Electronic device and method for correcting geomagnetic data
WO2022191471A1 Electronic device for identifying movement direction of electronic device, and operating method for electronic device
WO2022260290A1 Device and method for correcting magnetic field data
WO2023146091A1 Electronic device having flexible display unit
WO2023075250A1 Electronic device and method for providing three-dimensional map
WO2023018014A1 Method for tracking position of target device, and electronic device and position-tracking server performing same
WO2022119366A1 Electronic device comprising antenna
KR20230114656A (ko) Image-based inertial sensor correction method and electronic device for performing same
WO2022103047A1 Electronic device having coupling groove
WO2022145926A1 Electronic device and method for correcting sensor data of electronic device
WO2023085724A1 Electronic device and method for adjusting luminance of display on basis of angle formed with earbud, and non-transitory computer-readable storage medium
WO2024080549A1 Electronic device enabling location detection, and operating method thereof
WO2023140656A1 Electronic device for detecting location using geomagnetic data, and control method therefor
WO2022173164A1 Method and electronic device for displaying augmented reality object
WO2022196996A1 Electronic device for detecting location using geomagnetic data, and control method therefor
WO2022191444A1 Method for determining stride length of user using motion sensor and GPS module of electronic device, and electronic device therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22924345

Country of ref document: EP

Kind code of ref document: A1