WO2020075825A1 - Motion estimation device, electronic device, control program, and motion estimation method


Info

Publication number
WO2020075825A1
Authority
WO
WIPO (PCT)
Prior art keywords
estimation
estimated
timing
electronic device
value
Prior art date
Application number
PCT/JP2019/040097
Other languages
English (en)
Japanese (ja)
Inventor
洋紀 山本
太郎 飯尾
Original Assignee
洋紀 山本
Priority date
Filing date
Publication date
Application filed by 洋紀 山本
Priority to JP2019556289A (patent JP6621167B1)
Publication of WO2020075825A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01P MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P13/00 Indicating or recording presence, absence, or direction, of movement
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Definitions

  • The present disclosure relates to estimating the motion of a device.
  • Patent Document 1 describes a technology related to electronic devices. Patent Document 2 and Non-Patent Document 1 describe techniques related to Bayesian estimation.
  • The present invention has been made in view of the above points, and an object of the present invention is to provide a technique capable of appropriately estimating the movement of a device.
  • The motion estimation device is a motion estimation device that estimates the motion of a device during vibration.
  • The motion estimation device includes an estimation unit.
  • The estimation unit estimates the movement of the device during vibration based on Bayesian estimation using a prediction model representing the movement of the device, a detection value of a sensor that detects acceleration or rotational angular velocity as a first physical quantity of the device, and an observation model that uses the detection value and represents statistical prior knowledge about the vibration of the device.
  • the electronic device includes the above-described motion estimation device, a display unit, and a display control unit.
  • The display control unit controls the position of an image with respect to the earth by controlling the display position of the image on the display unit based on the estimated value of the displacement or the rotation angle generated by the estimation unit of the motion estimation device.
  • The control program is a control program for controlling a computer.
  • The control program causes the computer to execute a process of estimating the movement of the device during vibration based on Bayesian estimation using a prediction model that represents the movement of the device, a detection value of a sensor that detects the acceleration or rotational angular velocity of the device, and an observation model that uses the detection value and represents statistical prior knowledge about the vibration of the device.
  • The motion estimation method is a motion estimation method that estimates the movement of the device during vibration.
  • In the motion estimation method, the motion of the device during vibration is estimated based on Bayesian estimation using a prediction model that represents the motion of the device, a detection value of a sensor that detects the acceleration or rotational angular velocity of the device, and an observation model that uses the detection value and represents statistical prior knowledge about the vibration of the device.
  • An electronic device 1, which is a type of device, includes a plate-shaped device case 11 that is substantially rectangular in plan view.
  • the device case 11 constitutes the exterior of the electronic device 1.
  • a display surface 121 on which various information such as characters, symbols, and figures is displayed is located on the front surface 11a of the device case 11.
  • the display surface 121 is formed of a transparent portion included in the device case 11.
  • a touch panel 130 described later is located on the back side of the display surface 121.
  • a receiver hole 12 is located at the upper end of the front surface 11a of the device case 11.
  • the speaker hole 13 is located at the lower end of the front surface 11a.
  • a microphone hole 14 is located on the lower side surface 11c of the device case 11.
  • a lens 191 of a first camera 190 which will be described later, can be seen from the upper end of the front surface 11a of the device case 11.
  • a lens 201 included in a second camera 200 which will be described later, is visible from the upper end of the back surface 11b of the device case 11.
  • the electronic device 1 includes an operation button group 140 including a plurality of operation buttons.
  • Each of the plurality of operation buttons is a hardware button.
  • each of the plurality of operation buttons is a push button.
  • at least one operation button included in the operation button group 140 may be a software button displayed on the display surface 121.
  • The operation button group 140 includes operation buttons 141, 142, 143 located at the lower end of the front surface 11a of the device case 11.
  • the operation button group 140 may include a power button and a volume button located on the surface of the device case 11.
  • the operation button 141 is, for example, a back button.
  • the back button is an operation button for switching the display on the display surface 121 to the previous display.
  • the operation button 142 is, for example, a home button.
  • the home button is an operation button for displaying the home screen on the display surface 121.
  • the operation button 143 is, for example, a history button.
  • The history button is an operation button for displaying, on the display surface 121, the history of the applications executed by the electronic device 1 when the user operates the operation button 143.
  • Hereinafter, the electronic device 1 may be described using the XYZ Cartesian coordinate system shown in FIGS. 1 and 2.
  • the X-axis direction, the Y-axis direction, and the Z-axis direction are set in the lateral direction, the longitudinal direction, and the thickness direction of the electronic device 1, respectively.
  • FIG. 3 is a block diagram mainly showing an example of the electrical configuration of the electronic device 1.
  • the electronic device 1 includes a control unit 100, a wireless communication unit 110, a display unit 120, a touch panel 130, an operation button group 140, and an acceleration sensor 150.
  • the electronic device 1 includes a receiver 160, a speaker 170, a microphone 180, a first camera 190, a second camera 200, and a battery 210.
  • These components of the electronic device 1 are housed in the device case 11. It can be said that the electronic device 1 is a kind of computer.
  • the control unit 100 can control the operation of the electronic device 1 by controlling other components of the electronic device 1.
  • the control unit 100 can also be called a control device or a control circuit.
  • The control unit 100 includes at least one processor to provide control and processing capability for performing various functions, as described in further detail below.
  • The at least one processor may be implemented as a single integrated circuit (IC) or as a plurality of communicatively connected integrated circuits and/or discrete circuits. The at least one processor may be implemented according to various known techniques.
  • The processor includes one or more circuits or units configured to perform one or more data calculation procedures or processes, for example by executing instructions stored in an associated memory.
  • The processor may be firmware (for example, a discrete logic component) configured to perform one or more data calculation procedures or processes.
  • The processor may include one or more processors, controllers, microprocessors, microcontrollers, application-specific integrated circuits (ASICs), digital signal processors, programmable logic devices, field programmable gate arrays, any combination of these devices or configurations, or combinations of other known devices and configurations, to perform the functions described below.
  • the control unit 100 includes a CPU (Central Processing Unit) 101, a DSP (Digital Signal Processor) 102, and a storage unit 103.
  • The storage unit 103 includes a non-transitory recording medium, such as a ROM (Read Only Memory) and a RAM (Random Access Memory), that can be read by the CPU 101 and the DSP 102.
  • the ROM included in the storage unit 103 is, for example, a flash ROM (flash memory) that is a non-volatile memory.
  • the storage unit 103 stores a plurality of control programs 103a and the like for controlling the electronic device 1.
  • Various functions of the control unit 100 are realized by the CPU 101 and the DSP 102 executing various control programs 103a in the storage unit 103.
  • the control unit 100 may include a plurality of CPUs 101.
  • the control unit 100 may include a main CPU that performs relatively complicated processing and has high processing capacity, and a sub CPU that performs relatively simple processing and that has low processing capacity.
  • the control unit 100 may not include the DSP 102 or may include a plurality of DSPs 102. Further, all the functions of the control unit 100 or a part of the functions of the control unit 100 may be realized by a hardware circuit that does not require software for realizing the function.
  • the storage unit 103 may include a computer-readable non-transitory recording medium other than the ROM and the RAM.
  • the storage unit 103 may include, for example, a small hard disk drive and an SSD (Solid State Drive).
  • the various control programs 103a in the storage unit 103 include various applications (application programs).
  • The storage unit 103 stores, for example, a call application for making voice calls and video calls, a browser for displaying websites, and a mail application for creating, browsing, and transmitting/receiving e-mail.
  • The storage unit 103 also stores a camera application for photographing a subject using the first camera 190 and the second camera 200, a recorded-image display application for displaying still images and moving images recorded in the storage unit 103, and a music reproduction control application for controlling reproduction of music data stored in the storage unit 103.
  • At least one application in the storage unit 103 may be previously stored in the storage unit 103. Further, at least one application in the storage unit 103 may be downloaded by the electronic device 1 from another device and stored in the storage unit 103.
  • the wireless communication unit 110 has an antenna 111.
  • The wireless communication unit 110 can perform wireless communication using the antenna 111 with, for example, a plurality of types of communication methods.
  • the wireless communication of the wireless communication unit 110 is controlled by the control unit 100.
  • the wireless communication unit 110 can also be called a communication circuit or a wireless communication circuit.
  • the wireless communication unit 110 can wirelessly communicate with a base station of a mobile phone system.
  • the wireless communication unit 110 can communicate with a mobile phone, a web server, or the like different from the electronic device 1 through the base station and a network such as the Internet.
  • Thus, the electronic device 1 can perform data communication, voice calls, video calls, and the like with another mobile phone or the like.
  • The wireless communication unit 110 can also perform wireless communication using a wireless LAN (Local Area Network) such as WiFi.
  • the wireless communication unit 110 is also capable of short-range wireless communication.
  • the wireless communication unit 110 can perform wireless communication based on Bluetooth (registered trademark).
  • the wireless communication unit 110 may be capable of wireless communication in compliance with at least one of ZigBee (registered trademark) and NFC (Near Field Communication).
  • the wireless communication unit 110 performs various processing such as amplification processing on the signal received by the antenna 111, and outputs the processed received signal to the control unit 100.
  • the control unit 100 performs various processes on the received signal that is input, and acquires the information included in the received signal.
  • the control unit 100 also outputs a transmission signal including information to the wireless communication unit 110.
  • the wireless communication unit 110 performs various processing such as amplification processing on the input transmission signal, and wirelessly transmits the processed transmission signal from the antenna 111.
  • the display unit 120 includes a display surface 121 located on the front surface of the electronic device 1 and a display panel 122.
  • the display panel 122 is, for example, a liquid crystal display panel and includes a liquid crystal, a glass substrate, a polarizing plate, a backlight, and the like.
  • the display panel 122 can display various kinds of information.
  • the display panel 122 faces the display surface 121 in the device case 11. As a result, the information displayed on the display panel 122 is displayed on the display surface 121.
  • the display unit 120 can also be said to be a screen display unit.
  • the touch panel 130 can detect an operation on the display surface 121 by an operator such as a finger.
  • the touch panel 130 is, for example, a projected capacitive touch panel.
  • the touch panel 130 is located on the back side of the display surface 121, for example.
  • the control unit 100 can specify the content of the operation performed on the display surface 121 based on the electric signal (output signal) from the touch panel 130.
  • The control unit 100 can perform processing according to the specified operation content.
  • the touch panel 130 can also be called a touch sensor. Instead of the display panel 122 and the touch panel 130, an in-cell type display panel incorporating a touch panel may be adopted.
  • The control unit 100 can determine, for each operation button, whether or not the operation button has been operated.
  • When an operation signal is input, the control unit 100 controls the other components so that the function assigned to the operated operation button is executed.
  • the microphone 180 can convert a sound input from the outside of the electronic device 1 into an electric sound signal and output the electric sound signal to the control unit 100. Sound from the outside of the electronic device 1 is taken into the inside of the electronic device 1 through the microphone hole 14 and input to the microphone 180.
  • the speaker 170 is, for example, a dynamic speaker.
  • the speaker 170 can convert an electrical sound signal from the control unit 100 into a sound and output the sound.
  • the sound output from the speaker 170 is output to the outside from the speaker hole 13. The user can hear the sound output from the speaker hole 13 even at a place away from the electronic device 1.
  • the receiver 160 can output the reception sound.
  • the receiver 160 is, for example, a dynamic speaker.
  • the receiver 160 can convert an electric sound signal from the control unit 100 into a sound and output the sound.
  • the sound output from the receiver 160 is output to the outside through the receiver hole 12.
  • the volume of the sound output from the receiver hole 12 is lower than the volume of the sound output from the speaker hole 13.
  • the user can hear the sound output from the receiver hole 12 by bringing the ear close to the receiver hole 12.
  • a vibrating element such as a piezoelectric vibrating element that vibrates the front surface of the device case 11 may be provided. In this case, the sound is transmitted to the user by vibrating the front portion.
  • the first camera 190 includes a lens 191 and an image sensor.
  • The second camera 200 includes a lens 201 and an image sensor. Each of the first camera 190 and the second camera 200 can photograph a subject under the control of the control unit 100, generate a still image or a moving image of the photographed subject, and output it to the control unit 100.
  • the lens 191 of the first camera 190 is visible from the front surface 11a of the device case 11. Therefore, the first camera 190 can capture an object existing on the front surface side (display surface 121 side) of the electronic device 1.
  • the first camera 190 is called an in-camera.
  • the lens 201 of the second camera 200 is visible from the back surface 11b of the device case 11. Therefore, the second camera 200 can take an image of a subject existing on the back side of the electronic device 1.
  • the second camera 200 is called an out camera.
  • the acceleration sensor 150 can detect the acceleration of the electronic device 1.
  • the acceleration sensor 150 is, for example, a triaxial acceleration sensor.
  • the acceleration sensor 150 can detect the acceleration of the electronic device 1 in the X-axis direction, the Y-axis direction, and the Z-axis direction (see FIGS. 1 and 2).
  • the acceleration sensor 150 detects the acceleration at predetermined intervals and outputs the detection result.
  • the predetermined interval is 5 ms, but other values may be used.
  • the predetermined interval may be referred to as a detection interval.
  • the length of the detection interval may be represented by T.
  • The battery 210 can output the power for the electronic device 1.
  • the battery 210 is, for example, a rechargeable battery.
  • the power output from the battery 210 is supplied to various components such as the control unit 100 and the wireless communication unit 110 included in the electronic device 1.
  • the electronic device 1 may not include the acceleration sensor 150.
  • the electronic device 1 may be wirelessly or wired connected to an acceleration sensor that is separate from the electronic device 1.
  • the electronic device 1 may include a sensor other than the acceleration sensor 150.
  • the electronic device 1 may include at least one of an atmospheric pressure sensor, a geomagnetic sensor, a temperature sensor, a proximity sensor, an illuminance sensor, a position sensor, and a gyro sensor.
  • The electronic device 1 may be wirelessly or wired connected to a sensor, other than the acceleration sensor 150, that is separate from the electronic device 1.
  • the control unit 100 includes a motion estimation device 400 as a functional block, as shown in FIG.
  • the motion estimation device 400 can estimate the motion of the electronic device 1 during vibration based on the detection value 151 of the acceleration sensor 150.
  • the vibration is a concept including both periodic vibration and aperiodic vibration. Vibration can be said to be shaking or rocking.
  • the term “movement” simply means the movement of the electronic device 1.
  • The detection value 151, that is, the acceleration detected by the acceleration sensor 150, may be referred to as an acceleration detection value.
  • the motion estimation device 400 includes an estimation unit 300, a first filter 310, and a second filter 320. At least one of the estimation unit 300, the first filter 310, and the second filter 320 may be implemented by a hardware circuit that does not require software to implement its function.
  • In view of the usage of the motion estimation result described later, the motion estimation apparatus 400 targets, for example, a small random vibration of the electronic device 1 whose frequency band is 2 Hz or more and 5 Hz or less and whose amplitude is several mm (hereinafter referred to as the target vibration).
  • The motion estimation device 400 can estimate the movement of the electronic device 1 that is subject to the target vibration.
  • As the target vibration satisfying such a condition, for example, there is the vibration of the electronic device 1 that occurs when a user holding the electronic device 1 is in a vehicle such as a bus or a train and the vehicle vibrates.
  • the target vibration may be vibration of the electronic device 1 when the hand of the user holding the electronic device 1 shakes due to aging or illness.
  • the target vibration is not limited to this example.
  • The detection value 151 of the acceleration sensor 150 is affected by factors other than the target vibration.
  • For example, the detection value 151 is affected by gravity.
  • When the user is in a vehicle, the detection value 151 is affected by the centrifugal force applied to the vehicle, and may be affected by the acceleration of the vehicle when the vehicle starts or stops.
  • The detection value 151 may also be affected when the user voluntarily moves his or her hand slowly. It is necessary to isolate and estimate the target motion to be corrected against these disturbances.
  • the first filter 310 filters the detected value 151 of the acceleration sensor 150 in order to remove effects other than the target vibration from the detected value 151.
  • the first filter 310 is, for example, a high-pass filter, and its pass band is set to, for example, 2 Hz or higher.
  • the first filter 310 may be a bandpass filter.
  • the pass band of this band pass filter is set to, for example, 2 Hz or more and 5 Hz or less.
  • the pass band of the first filter 310 is not limited to this example.
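  • As an illustration (not part of the original disclosure), the following is a minimal sketch of the kind of filtering the first filter 310 could perform, assuming Python with NumPy and SciPy; the 200 Hz sampling rate follows the 5 ms detection interval described above, and the filter order and exact cutoffs are illustrative assumptions.

```python
# Hedged sketch: high-pass / band-pass filtering of acceleration detection
# values, as one plausible realization of the first filter 310.
import numpy as np
from scipy.signal import butter, lfilter

FS = 200.0  # sampling rate in Hz (5 ms detection interval)

def make_first_filter(low_hz=2.0, high_hz=None, order=2):
    """High-pass above low_hz, or band-pass between low_hz and high_hz."""
    if high_hz is None:
        return butter(order, low_hz, btype="highpass", fs=FS)
    return butter(order, [low_hz, high_hz], btype="bandpass", fs=FS)

b, a = make_first_filter(2.0)              # pass band: 2 Hz or higher
# b, a = make_first_filter(2.0, 5.0)       # pass band: 2 Hz to 5 Hz

accel_raw = np.random.randn(1000)          # stand-in for detection values 151
accel_filtered = lfilter(b, a, accel_raw)  # gravity and slow hand motion attenuated
```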
  • the estimating unit 300 estimates the movement of the electronic device 1 during the target vibration based on Bayesian estimation using the detection value 151 filtered by the first filter 310, the observation model 350, and the prediction model 360.
  • the estimation unit 300 estimates the movement of the electronic device 1 when the target vibration occurs, for example, by a method called a sequential Bayes filter.
  • The sequential Bayes filter is described in Non-Patent Document 1 above.
  • As the estimation result of the movement of the electronic device 1, the estimation unit 300 generates, for example, an estimated value of the acceleration of the electronic device 1, an estimated value of the speed of the electronic device 1, an estimated value of the displacement of the electronic device 1, and an estimated value of the total displacement of the electronic device 1.
  • the displacement of the electronic device 1 means the amount of change in the position of the electronic device 1 from the reference position. It can be said that the displacement of the electronic device 1 indicates a relative position with respect to the reference position.
  • the total displacement of the electronic device 1 is the sum of the displacements of the electronic device 1 at each time.
  • As the reference position, for example, the position of the electronic device 1 at the time when the motion estimation device 400 starts its operation is adopted.
  • the motion estimation apparatus 400 may start operation in response to the start of operation of the electronic device 1, or may start operation in response to a user's operation on the electronic device 1 in operation. Moreover, the motion estimation apparatus 400 may start operation when other conditions are satisfied in the electronic device 1 in operation.
  • the acceleration, velocity, displacement, and total displacement estimation values of the electronic device 1 obtained by Bayesian estimation may be referred to as the acceleration estimation value, speed estimation value, displacement estimation value, and total displacement estimation value, respectively.
  • Hereinafter, simple "acceleration" means the acceleration of the electronic device 1, simple "speed" means the speed of the electronic device 1, simple "displacement" means the displacement of the electronic device 1, and simple "total displacement" means the total displacement of the electronic device 1. It can be said that each of the acceleration, the velocity, the displacement, and the total displacement is a physical quantity of the electronic device 1.
  • the second filter 320 filters the estimation result of the estimation unit 300.
  • the second filter 320 individually filters the acceleration estimation value, the velocity estimation value, the displacement estimation value, and the total displacement estimation value output from the estimation unit 300.
  • the second filter 320 may be a high pass filter or a band pass filter.
  • the filter characteristic of the second filter 320 may be the same as or different from the filter characteristic of the first filter 310.
  • the pass band of the second filter 320 may be the same as or different from the pass band of the first filter 310.
  • Next, the Bayesian estimation in the estimation unit 300 will be described. The posterior distribution p(x_t | z_{1:t}) at time t can be expressed by the following equation (1), the standard recursion of the sequential Bayes filter:

    p(x_t | z_{1:t}) ∝ p(z_t | x_t) ∫ p(x_t | x_{t-1}) p(x_{t-1} | z_{1:t-1}) dx_{t-1}   …(1)

  • p(x_t | z_{1:t}) is a probability distribution and is also called the posterior probability.
  • x is a vector representing a state and is called a state vector.
  • The state vector x_t means the state vector x at time t.
  • z is a vector representing an observation and is called an observation vector.
  • The observation vector z_t means the observation vector at time t.
  • The observation vector z_{1:t} means the observation vectors from time 1 to time t.
  • In equation (1), the probability distribution p(z_t | x_t) is called the observation model, and the probability distribution p(x_t | x_{t-1}) is called the prediction model.
  • At least one variable is included in each of the state vector x and the observation vector z.
  • the state vector x includes acceleration, velocity, displacement, and total displacement as variables.
  • the observation vector z includes acceleration, velocity, displacement, and total displacement as variables. It can be said that the variables included in the state vector x and the observation vector z are random variables.
  • The estimation unit 300 performs Bayesian estimation by obtaining the posterior probability p(x_t | z_{1:t}) according to equation (1).
  • The estimation unit 300 uses the observation model 350 as p(z_t | x_t) and the prediction model 360 as p(x_t | x_{t-1}) in equation (1).
  • The estimation unit 300 sets the state vector that maximizes the calculated posterior probability p(x_t | z_{1:t}) as the estimation result.
  • the observation model 350 is an observation model representing the target vibration.
  • the observation model 350 is a probability distribution that represents statistical prior knowledge about the target vibration.
  • For example, the observation model 350 is a probability distribution of at most four dimensions representing the acceleration, velocity, displacement, and total displacement of the electronic device 1 during the target vibration.
  • Hereinafter, the marginal probability distributions of the respective dimensions of the observation model 350 may be referred to as a first probability distribution, a second probability distribution, a third probability distribution, and a fourth probability distribution.
  • the amplitude of the target vibration is small, as described above. Therefore, the position of the electronic device 1 does not change much during the target vibration.
  • The displacement of the electronic device 1 during the target vibration is represented by, for example, a fixed first probability distribution with the displacement as a random variable, a zero mean, and a small variance; the observation model 350 thereby expresses the knowledge that the position of the electronic device 1 does not change much. It can be said that the observation model 350 expresses the average displacement of the electronic device 1 during the target vibration as zero.
  • The acceleration of the electronic device 1 during the target vibration is represented by, for example, a second probability distribution with the acceleration as a random variable, whose mean is the acceleration detection value and whose variance is small.
  • As the mean of the second probability distribution, for example, the acceleration detection value filtered by the first filter 310 is adopted.
  • Therefore, the second probability distribution changes according to the acceleration detection value.
  • The velocity of the electronic device 1 during the target vibration is represented by, for example, a fixed third probability distribution with the velocity as a random variable, a zero mean, and a small variance; the observation model 350 thereby expresses the knowledge that the speed of the electronic device 1 does not increase too much.
  • Since the observation model 350 is an observation model representing the target vibration, it is necessary to express in the observation model 350 the knowledge that the electronic device 1 vibrates.
  • Therefore, the total displacement of the electronic device 1 during the target vibration is represented by, for example, a fixed fourth probability distribution with the total displacement as a random variable and a zero mean; the observation model 350 thereby expresses the knowledge that the electronic device 1 vibrates. It can be said that the observation model 350 expresses the average of the total displacement of the electronic device 1 during the target vibration as zero.
  • the observation model 350 is determined for the target vibration by, for example, machine learning.
  • the observation model 350 may be determined using, for example, the maximum likelihood estimation method. Further, the observation model 350 may be determined based on the statistical distribution of actual measurement values in the vibration environment of the target vibration. Further, the observation model 350 may be determined based on both the machine learning and the statistical distribution of actual measurement values. For example, a normal distribution may be adopted as the first to fourth probability distributions. In this case, a fast Kalman filter can be adopted as a filter in Bayesian estimation.
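  • As a hedged illustration of the above (not from the original text), the four marginal distributions might be represented as follows when normal distributions are adopted; the Gaussian class and all variance values are assumptions of this sketch.

```python
# Sketch of the observation model 350 for one axis, assuming the first to
# fourth probability distributions are normal distributions. Variances are
# illustrative placeholders, not values from the source.
from dataclasses import dataclass

@dataclass
class Gaussian:
    mean: float
    var: float

def observation_model_350(accel_detected: float) -> dict:
    """accel_detected: acceleration detection value after the first filter 310."""
    return {
        "displacement": Gaussian(0.0, 1e-6),             # 1st: fixed, zero mean, small variance
        "acceleration": Gaussian(accel_detected, 1e-2),  # 2nd: mean tracks the detection value
        "velocity": Gaussian(0.0, 1e-4),                 # 3rd: fixed, zero mean, small variance
        "total_displacement": Gaussian(0.0, 1e-4),       # 4th: fixed, zero mean (vibrates about zero)
    }
```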
  • Next, an example of the prediction model 360 will be described. The predicted value of acceleration at time t is represented by a(t), and the prediction error at time t regarding acceleration is represented by w_a(t).
  • the predicted value a (t) is represented by a probability distribution.
  • the predicted value a (t) may be referred to as a predicted probability distribution a (t).
  • As the prediction error, for example, a fixed probability distribution that does not change with time is used.
  • the prediction error is determined by machine learning, for example.
  • the prediction error may be determined using, for example, the maximum likelihood estimation method.
  • For example, a normal distribution may be adopted as the probability distribution representing the prediction error.
  • In this case, a fast Kalman filter can be adopted as the filter in Bayesian estimation.
  • The estimation unit 300 calculates the predicted value a(t) of the acceleration at time t using equation (3). At this time, the estimation unit 300 substitutes into a(t-1) the marginal probability distribution of acceleration of the posterior probability p(x_{t-1} | z_{1:t-1}) at time t-1, which can be said to be the estimated probability distribution of acceleration at time t-1.
  • Similarly, the estimation unit 300 substitutes into a(t-2) the marginal probability distribution of acceleration of the posterior probability p(x_{t-2} | z_{1:t-2}) at time t-2, which can be said to be the estimated probability distribution of acceleration at time t-2. Therefore, in the prediction model 360, the predicted probability distribution a(t) of acceleration at time t is represented by the estimated probability distribution of acceleration at time t-1, the estimated probability distribution of acceleration at time t-2, and the prediction error w_a(t) at time t.
  • In other words, the predicted probability distribution a(t) of acceleration at time t can be obtained by extrapolation using the estimated probability distribution of acceleration at time t-1 and the estimated probability distribution of acceleration at time t-2.
  • In the prediction model 360, it can be said that the predicted probability distribution of acceleration at a certain timing is represented by an extrapolation formula using the estimated probability distributions of acceleration at earlier timings.
  • If past estimated probability distributions of acceleration do not exist, the estimation unit 300 substitutes an initial probability distribution (for example, a normal distribution whose mean is zero and whose variance is a predetermined value) into a(t-1) and a(t-2).
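  • The body of equation (3) is not reproduced in this extraction. As a hedged reconstruction, a linear extrapolation consistent with the surrounding description would be:

    a(t) = 2 a(t-1) − a(t-2) + w_a(t)   …(3)

  • Here the first two terms extrapolate the acceleration linearly from the two most recent estimates, and w_a(t) is the prediction error; the published equation may differ in detail.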
  • The predicted value of the velocity at time t is represented by v(t), the predicted value of the displacement at time t is represented by r(t), and the predicted value of the total displacement at time t is represented by s(t).
  • the speed of the electronic device 1 can be obtained by integrating the acceleration of the electronic device 1.
  • the displacement of the electronic device 1 can be obtained by integrating the speed of the electronic device 1.
  • the total displacement of the electronic device 1 is obtained by integrating the displacement of the electronic device 1.
  • The predicted value v(t) of the velocity, the predicted value r(t) of the displacement, and the predicted value s(t) of the total displacement are represented, using the length T of the detection interval of the acceleration sensor 150, by the following equations (4), (5), and (6), respectively.
  • The prediction error at time t regarding the velocity is represented by w_v(t), the prediction error at time t regarding the displacement is represented by w_r(t), and the prediction error at time t regarding the total displacement is represented by w_s(t).
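  • The bodies of equations (4) to (6) are likewise not reproduced in this extraction. As a hedged reconstruction, discrete-time kinematics consistent with the prose (each quantity obtained by integrating the one above it over the detection interval T) would give:

    v(t) = v(t-1) + T a(t-1) + w_v(t)   …(4)
    r(t) = r(t-1) + T v(t-1) + (T²/2) a(t-1) + w_r(t)   …(5)
    s(t) = s(t-1) + r(t-1) + T v(t-1) + (T²/2) a(t-1) + w_s(t)   …(6)

  • Equation (6) amounts to s(t) = s(t-1) + r(t), matching the definition of the total displacement as the running sum of the displacements; the published equations may differ in detail.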
  • the predicted value v (t), the predicted value r (t), and the predicted value s (t) are each represented by a probability distribution.
  • Hereinafter, the predicted value v(t), the predicted value r(t), and the predicted value s(t) may be referred to as the predicted probability distribution v(t), the predicted probability distribution r(t), and the predicted probability distribution s(t), respectively.
  • The estimation unit 300 obtains the predicted value v(t) of the velocity at time t using equation (4). At this time, the estimation unit 300 substitutes into a(t-1) and v(t-1) in equation (4) the marginal probability distributions of acceleration and velocity of the posterior probability p(x_{t-1} | z_{1:t-1}) at time t-1.
  • the estimation unit 300 obtains the predicted value r (t) of the displacement at the time t by using the equation (5).
  • At this time, the estimation unit 300 substitutes into a(t-1), v(t-1), and r(t-1) in equation (5) the marginal probability distributions of acceleration, velocity, and displacement of the posterior probability p(x_{t-1} | z_{1:t-1}) at time t-1.
  • It can be said that equation (5) is a prediction formula that predicts the probability distribution of displacement at time t using the estimated probability distribution of acceleration at time t-1, the estimated probability distribution of velocity at time t-1, and the estimated probability distribution of displacement at time t-1.
  • The estimation unit 300 obtains the predicted value s(t) of the total displacement at time t using equation (6). At this time, the estimation unit 300 substitutes into a(t-1), v(t-1), r(t-1), and s(t-1) in equation (6) the marginal probability distributions of acceleration, velocity, displacement, and total displacement of the posterior probability p(x_{t-1} | z_{1:t-1}) at time t-1.
  • It can be said that equation (6) is a prediction formula that predicts the probability distribution of the total displacement at time t using the estimated probability distribution of acceleration at time t-1, the estimated probability distribution of velocity at time t-1, the estimated probability distribution of displacement at time t-1, and the estimated probability distribution of the total displacement at time t-1.
  • The estimation unit 300 performs a motion estimation process of estimating the current motion of the electronic device 1 based on Bayesian estimation using the acceleration detection value output from the acceleration sensor 150 (in other words, the detection value 151), equations (3) to (6) of the prediction model 360, and the observation model 350.
  • FIG. 5 is a flowchart showing an example of motion estimation processing in the estimation unit 300.
  • the motion estimation process shown in FIG. 5 is executed every time new acceleration is detected by the acceleration sensor 150.
  • the estimation unit 300 obtains an acceleration estimation value, a velocity estimation value, a displacement estimation value, and a total displacement estimation value in the X-axis direction in the motion estimation process.
  • the estimation unit 300 obtains, for example, an acceleration estimation value, a velocity estimation value, a displacement estimation value, and a total displacement estimation value in the Y axis direction.
  • As shown in FIG. 5, in step s1 the estimation unit 300 uses the prediction model 360 to obtain the predicted value of the current acceleration.
  • Specifically, the estimation unit 300 uses the prediction model 360 to separately obtain the predicted value of the acceleration in the X-axis direction and the predicted value of the acceleration in the Y-axis direction.
  • In step s1, the estimation unit 300 substitutes into a(t-1) in equation (3) the marginal probability distribution of the acceleration in the X-axis direction of the posterior probability p(x_{t-1} | z_{1:t-1}) obtained in the previous motion estimation process, and substitutes into a(t-2) the marginal probability distribution of the acceleration in the X-axis direction of the posterior probability p(x_{t-2} | z_{1:t-2}) obtained in the motion estimation process before that.
  • The estimation unit 300 uses the predicted value a(t) thus obtained, that is, the predicted probability distribution a(t), as the predicted value (in other words, the predicted probability distribution) of the current acceleration in the X-axis direction.
  • The estimation unit 300 similarly obtains the predicted value of the current acceleration in the Y-axis direction using equation (3).
  • In step s2, the estimation unit 300 uses the prediction model 360 to obtain the predicted value of the current speed.
  • Specifically, the estimation unit 300 uses the prediction model 360 to separately obtain the predicted value of the velocity in the X-axis direction and the predicted value of the velocity in the Y-axis direction.
  • In step s2, the estimation unit 300 substitutes into a(t-1) and v(t-1) in equation (4) the marginal probability distributions of the acceleration and velocity in the X-axis direction of the posterior probability p(x_{t-1} | z_{1:t-1}) obtained in the previous motion estimation process, and uses the predicted value v(t) thus obtained as the predicted value of the current velocity in the X-axis direction. The predicted value of the current velocity in the Y-axis direction is obtained similarly.
  • In step s3, the estimation unit 300 uses the prediction model 360 to obtain the predicted value of the current displacement.
  • Specifically, the estimation unit 300 uses the prediction model 360 to separately obtain the predicted value of the displacement in the X-axis direction and the predicted value of the displacement in the Y-axis direction.
  • In step s3, the estimation unit 300 substitutes into a(t-1) and v(t-1) in equation (5) the marginal probability distributions of the acceleration and velocity in the X-axis direction of the posterior probability p(x_{t-1} | z_{1:t-1}) obtained in the previous motion estimation process.
  • The estimation unit 300 also substitutes into r(t-1) in equation (5) the marginal probability distribution of the displacement in the X-axis direction of the same posterior probability.
  • The estimation unit 300 uses the predicted value r(t) thus obtained, that is, the predicted probability distribution r(t), as the predicted value of the current displacement in the X-axis direction.
  • The estimation unit 300 similarly obtains the predicted value of the current displacement in the Y-axis direction using equation (5).
  • In step s4, the estimation unit 300 uses the prediction model 360 to obtain the predicted value of the current total displacement.
  • Specifically, the estimation unit 300 uses the prediction model 360 to separately obtain the predicted value of the total displacement in the X-axis direction and the predicted value of the total displacement in the Y-axis direction.
  • In step s4, the estimation unit 300 substitutes into a(t-1), v(t-1), r(t-1), and s(t-1) in equation (6) the marginal probability distributions of the acceleration, velocity, displacement, and total displacement in the X-axis direction of the posterior probability p(x_{t-1} | z_{1:t-1}) obtained in the previous motion estimation process.
  • The estimation unit 300 uses the predicted value s(t) thus obtained as the predicted value of the current total displacement in the X-axis direction. The predicted value of the current total displacement in the Y-axis direction is obtained similarly.
  • In step s5, the estimation unit 300 generates the acceleration estimation value, the velocity estimation value, the displacement estimation value, and the total displacement estimation value based on Bayesian estimation using the predicted values obtained in steps s1 to s4 and the observation model 350.
  • Specifically, the estimation unit 300 generates the estimated values of the acceleration, velocity, displacement, and total displacement in the X-axis direction based on Bayesian estimation using the predicted values in the X-axis direction obtained in steps s1 to s4 and the observation model 350.
  • The estimation unit 300 likewise generates the estimated values of the acceleration, velocity, displacement, and total displacement in the Y-axis direction based on Bayesian estimation using the predicted values in the Y-axis direction obtained in steps s1 to s4 and the observation model 350.
  • Specifically, in step s5 the estimation unit 300 substitutes the predicted values (in other words, the predicted probability distributions) of the acceleration, velocity, displacement, and total displacement in the X-axis direction obtained in steps s1 to s4 into equation (1) in place of the integral term, since those predicted values were calculated from the posterior probability p(x_{t-1} | z_{1:t-1}) obtained in step s5 of the previous motion estimation process.
  • The posterior probability p(x_t | z_{1:t}) at the current time t is thereby estimated. The estimated posterior probability p(x_t | z_{1:t}) is also used to calculate the predicted values in the next motion estimation process.
  • The estimation unit 300 specifies the state vector that maximizes the estimated posterior probability p(x_t | z_{1:t}).
  • The estimation unit 300 sets the acceleration, velocity, displacement, and total displacement components of the specified state vector as the current acceleration estimation value, velocity estimation value, displacement estimation value, and total displacement estimation value in the X-axis direction, respectively.
  • the estimation unit 300 uses Equation (1) to generate estimated values of acceleration, velocity, displacement, and total displacement in the Y-axis direction.
  • At this time, the estimation unit 300 uses the latest acceleration detection value in the Y-axis direction (specifically, the latest acceleration detection value in the Y-axis direction filtered by the first filter 310) as the mean of the second probability distribution.
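  • The following is a runnable sketch of steps s1 to s5 for one axis under the Gaussian assumption mentioned earlier (the case where a fast Kalman filter can be adopted). It is an illustration, not the original implementation: the state-transition matrix encodes the hedged reconstructions of equations (3) to (6) given above, and all noise variances are placeholders.

```python
# Minimal Kalman-filter sketch of one motion estimation process for one axis.
# The previous acceleration is kept in the state so the two-step
# extrapolation of equation (3) fits a first-order recursion.
import numpy as np

T = 0.005  # detection interval of the acceleration sensor (5 ms)

# State: [a(t), a(t-1), v(t), r(t), s(t)]
F = np.array([
    [2.0,   -1.0, 0.0, 0.0, 0.0],  # eq. (3): a(t) = 2 a(t-1) - a(t-2)
    [1.0,    0.0, 0.0, 0.0, 0.0],  # shift: remember previous acceleration
    [T,      0.0, 1.0, 0.0, 0.0],  # eq. (4): v(t) = v(t-1) + T a(t-1)
    [T*T/2,  0.0, T,   1.0, 0.0],  # eq. (5): r(t) = r(t-1) + T v(t-1) + (T^2/2) a(t-1)
    [T*T/2,  0.0, T,   1.0, 1.0],  # eq. (6): s(t) = s(t-1) + r(t)
])
Q = np.diag([1e-2, 0.0, 1e-4, 1e-6, 1e-6])  # prediction errors w_a, -, w_v, w_r, w_s

# Observation model 350: acceleration is observed as the filtered detection
# value (2nd distribution); velocity, displacement, and total displacement
# are "observed" as zero (3rd, 1st, and 4th distributions).
H = np.array([
    [1.0, 0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 0.0, 1.0],
])
R = np.diag([1e-2, 1e-4, 1e-6, 1e-4])  # variances of those four distributions

def motion_estimation_step(x, P, accel_detected):
    """One motion estimation process: predict (steps s1-s4), update (step s5)."""
    x_pred = F @ x                       # predicted probability distribution (mean)
    P_pred = F @ P @ F.T + Q             # predicted probability distribution (covariance)
    z = np.array([accel_detected, 0.0, 0.0, 0.0])
    y = z - H @ x_pred                   # innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y               # posterior mean (MAP estimate for Gaussians)
    P_new = (np.eye(5) - K @ H) @ P_pred
    return x_new, P_new                  # estimates of [a, a_prev, v, r, s]

# Usage: run once per filtered acceleration detection value.
x, P = np.zeros(5), np.eye(5) * 1e-3
for a_obs in 0.01 * np.random.randn(100):  # stand-in filtered detection values
    x, P = motion_estimation_step(x, P, a_obs)
```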
  • The second filter 320 individually filters the estimated values of the acceleration, velocity, displacement, and total displacement in the X-axis direction generated by the estimation unit 300, and the estimated values of the acceleration, velocity, displacement, and total displacement in the Y-axis direction generated by the estimation unit 300.
  • The marginal probability distribution of the filtered acceleration may be input to a(t-1) and to a(t-2); the marginal probability distribution of the filtered velocity may be input to v(t-1); the marginal probability distribution of the filtered displacement may be input to r(t-1); and the marginal probability distribution of the filtered total displacement may be input to s(t-1).
  • In the motion estimation process, at least one of the estimated values of acceleration, velocity, displacement, and total displacement in the Z-axis direction may be generated. Further, in the motion estimation process, at least one of the estimated values of the acceleration, velocity, displacement, and total displacement in the X-axis direction may not be generated. Likewise, at least one of the estimated values of acceleration, velocity, displacement, and total displacement in the Y-axis direction may not be generated.
  • Bayesian estimation may be performed on the rotational vibration in the same framework.
  • step s1 may be executed after any one of steps s2 to s4, and step s4 may be executed before any one of steps s1 to s3.
  • The estimation unit 300 can also estimate the movement of the electronic device 1 at a timing ahead of the present.
  • FIG. 6 is a flowchart showing an example of the operation of the estimation unit 300 in this case.
  • In the motion estimation process of FIG. 6, a unit estimation process is repeated a required number of times M (where M is an integer of 1 or more), and the movement of the electronic device 1 at the timing (T × (M − 1)) ahead of the present is estimated.
  • When the estimation unit 300 acquires a new acceleration detection value from the acceleration sensor 150, the estimation unit 300 executes a unit estimation process in step s11, as shown in FIG. 6. After that, in step s12, the estimation unit 300 determines whether or not the unit estimation process has been executed the required number of times M.
  • If YES is determined in step s12, the estimation unit 300 ends the motion estimation process. On the other hand, if NO is determined in step s12, the estimation unit 300 executes step s11 again, and executes the unit estimation process again. After that, the estimation unit 300 operates similarly.
  • In step s1 of the second unit estimation process in the motion estimation process, the marginal probability distribution of the acceleration in the X-axis direction of the posterior probability p(x_t | z_{1:t}) estimated in the first unit estimation process is substituted into a(t-1) in equation (3).
  • The estimation unit 300 uses the predicted value a(t) (in other words, the predicted probability distribution a(t)) thus obtained as the predicted value of the acceleration in the X-axis direction at the timing T ahead of the present.
  • In step s1 of the second unit estimation process, the predicted value of the acceleration in the Y-axis direction at the timing T ahead of the present is obtained similarly.
  • In step s1 of the Na-th (3 ≤ Na ≤ M) unit estimation process in the motion estimation process, the marginal probability distribution of the acceleration in the X-axis direction of the posterior probability p(x_t | z_{1:t}) estimated in the (Na − 1)-th unit estimation process is substituted into a(t-1) in equation (3).
  • The marginal probability distribution of the acceleration in the X-axis direction of the posterior probability p(x_t | z_{1:t}) estimated in the (Na − 2)-th unit estimation process is substituted into a(t-2).
  • The predicted value a(t) thus obtained is used as the predicted value of the acceleration in the X-axis direction at the timing (T × (Na − 1)) ahead of the present.
  • In step s1 of the Na-th unit estimation process, the predicted value of the acceleration in the Y-axis direction at the timing (T × (Na − 1)) ahead of the present is obtained similarly.
  • In step s2 of the Nb-th (2 ≤ Nb ≤ M) unit estimation process of the motion estimation process, the marginal probability distribution of the acceleration in the X-axis direction of the posterior probability p(x_t | z_{1:t}) estimated in the (Nb − 1)-th unit estimation process is substituted into a(t-1) in equation (4).
  • The marginal probability distribution of the velocity in the X-axis direction of the posterior probability p(x_t | z_{1:t}) estimated in the (Nb − 1)-th unit estimation process is substituted into v(t-1) in equation (4).
  • The estimation unit 300 uses the predicted value v(t) thus obtained as the predicted value of the velocity in the X-axis direction at the timing (T × (Nb − 1)) ahead of the present.
  • In step s2 of the Nb-th unit estimation process, the predicted value of the velocity in the Y-axis direction at the timing (T × (Nb − 1)) ahead of the present is obtained similarly.
  • In this way, it can be said that the estimation unit 300 repeatedly executes a process in which the estimated probability distribution at a certain timing is used as the past estimated probability distribution in the extrapolation formula of equation (3) to obtain the predicted probability distribution at a timing ahead of that certain timing, and the estimated probability distribution at that ahead timing is then generated based on Bayesian estimation using the obtained predicted probability distribution.
  • Similarly, in step s3 of the Nb-th unit estimation process of the motion estimation process, the marginal probability distribution of the acceleration in the X-axis direction of the posterior probability p(x_t | z_{1:t}) estimated in the (Nb − 1)-th unit estimation process is substituted into a(t-1) in equation (5).
  • The marginal probability distribution of the velocity in the X-axis direction of the posterior probability p(x_t | z_{1:t}) estimated in the (Nb − 1)-th unit estimation process is substituted into v(t-1) in equation (5).
  • The marginal probability distribution of the displacement in the X-axis direction of the posterior probability p(x_t | z_{1:t}) estimated in the (Nb − 1)-th unit estimation process is substituted into r(t-1) in equation (5).
  • The predicted value r(t) thus obtained is used as the predicted value of the displacement in the X-axis direction at the timing (T × (Nb − 1)) ahead of the present.
  • In step s3 of the Nb-th unit estimation process, the predicted value of the displacement in the Y-axis direction at the timing (T × (Nb − 1)) ahead of the present is obtained similarly.
  • In step s4 of the Nb-th unit estimation process of the motion estimation process, the marginal probability distribution of the acceleration in the X-axis direction of the posterior probability p(x_t | z_{1:t}) estimated in the (Nb − 1)-th unit estimation process is substituted into a(t-1) in equation (6).
  • The marginal probability distribution of the velocity in the X-axis direction of the posterior probability p(x_t | z_{1:t}) estimated in the (Nb − 1)-th unit estimation process is substituted into v(t-1) in equation (6).
  • The marginal probability distribution of the displacement in the X-axis direction of the posterior probability p(x_t | z_{1:t}) estimated in the (Nb − 1)-th unit estimation process is substituted into r(t-1) in equation (6). Then, the marginal probability distribution of the total displacement in the X-axis direction of the same posterior probability is substituted into s(t-1), and the predicted value s(t) thus obtained is used as the predicted value of the total displacement in the X-axis direction at the timing (T × (Nb − 1)) ahead of the present.
  • In step s5 of the Nb-th unit estimation process of the motion estimation process, the estimated values of the acceleration, velocity, displacement, and total displacement in the X-axis direction at the timing (T × (Nb − 1)) ahead of the present are generated.
  • Similarly, the estimated values of the acceleration, velocity, displacement, and total displacement in the Y-axis direction at the timing (T × (Nb − 1)) ahead of the present are generated.
  • By executing the unit estimation process the required number of times M as described above, the movement of the electronic device 1 at the timing (T × (M − 1)) ahead of the present is estimated.
  • In other words, by executing the unit estimation process the required number of times M, the movement of the electronic device 1 at the timing (T × (M − 1)) ahead of the timing at which the latest acceleration detection value is obtained is estimated.
  • That is, the motion of the electronic device 1 during the target vibration at the timing (T × (M − 1)) ahead of the present is estimated.
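  • As a hedged sketch of this repetition (reusing motion_estimation_step from the Kalman sketch above, itself an illustration rather than the original implementation):

```python
# Repeat the unit estimation process M times to estimate the state at the
# timing (T * (M - 1)) ahead of the latest detection value. Per the text,
# the latest filtered detection value keeps serving as the mean of the
# second probability distribution in every unit estimation.
def estimate_ahead(x, P, accel_detected, M):
    for _ in range(M):
        x, P = motion_estimation_step(x, P, accel_detected)
    return x, P
```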
  • the estimation unit 300 estimates the motion of the electronic device 1 at a timing ahead of the current time by an integer multiple of T.
  • The estimation unit 300 can also estimate the movement of the electronic device 1 at a desired timing ahead of the present, not limited to a timing ahead of the present by an integer multiple of T.
  • FIG. 7 is a flowchart showing an example of motion estimation processing in this case.
  • Hereinafter, the desired timing ahead of the present is called the estimation timing.
  • In step s21, the estimation unit 300 determines the required number of times M based on the estimation timing.
  • Specifically, in step s21 the estimation unit 300 determines the value of M such that the timing (T × (M − 1)) ahead of the present does not exceed the estimation timing and is closest to the estimation timing.
  • After step s21, the estimation unit 300 executes steps s11 and s12 described above. If NO is determined in step s12, step s11 is executed again, and thereafter the estimation unit 300 operates similarly. On the other hand, if YES is determined in step s12, step s22 is executed. In step s22, the estimation unit 300 estimates the movement of the electronic device 1 at the estimation timing based on the motion estimation result in step s5 of the (M − 1)-th unit estimation process and the motion estimation result in step s5 of the M-th unit estimation process.
  • Specifically, the estimation unit 300 generates the estimated value of the acceleration in the X-axis direction at the estimation timing based on the estimated value of the acceleration in the X-axis direction at the timing (T × (M − 2)) ahead of the present obtained in the (M − 1)-th unit estimation process and the estimated value of the acceleration in the X-axis direction at the timing (T × (M − 1)) ahead of the present obtained in the M-th unit estimation process.
  • For example, the estimation unit 300 generates the estimated value of the acceleration in the X-axis direction at the estimation timing by linear interpolation using the estimated value of the acceleration in the X-axis direction at the timing (T × (M − 2)) ahead of the present and the estimated value of the acceleration in the X-axis direction at the timing (T × (M − 1)) ahead of the present.
  • Let the estimated value of the acceleration in the X-axis direction at the estimation timing be A0, the estimated value of the acceleration in the X-axis direction at the timing (T × (M − 2)) ahead of the present be A1, and the estimated value of the acceleration in the X-axis direction at the timing (T × (M − 1)) ahead of the present be A2. Further, let (T × (M − 2)) be T1, (T × (M − 1)) be T2, and the time from the present to the estimation timing be T0.
  • the estimation unit 300 obtains the estimated value A0 of the acceleration in the X-axis direction at the estimation timing using, for example, the following equation (7).
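  • The body of equation (7) is not reproduced in this extraction. As a hedged reconstruction, the standard linear interpolation consistent with the definitions above would be:

    A0 = A1 + (A2 − A1) (T0 − T1) / (T2 − T1)   …(7)

  • Since T1 ≤ T0 ≤ T2, the estimated value A0 lies between A1 and A2.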
  • In step s22, the estimation unit 300 similarly obtains the estimated values of the velocity, displacement, and total displacement in the X-axis direction at the estimation timing. Further, in step s22, the estimation unit 300 similarly obtains the estimated values of the acceleration, velocity, displacement, and total displacement in the Y-axis direction at the estimation timing.
  • Note that the estimation unit 300 may use the motion estimation result of the M-th unit estimation process as the motion estimation result at the estimation timing.
  • That is, the estimation unit 300 does not have to execute step s22, in other words, the linear interpolation. In this case, even when T0 does not match (T × (M − 1)), the estimation unit 300 may regard the motion estimation result at the timing (T × (M − 1)) ahead of the present as the motion estimation result at the estimation timing.
  • As described above, the movement of the electronic device 1 during vibration is estimated based on Bayesian estimation using the acceleration detection value of the acceleration sensor 150, the prediction model 360 representing the movement of the electronic device 1, and the observation model 350 that uses the acceleration detection value and represents the vibration of the electronic device 1. This makes it possible to appropriately estimate the movement of the electronic device 1 during vibration.
  • In the prediction model 360, the predicted probability distribution of acceleration at a certain timing is expressed by an extrapolation formula using the estimated probability distributions of acceleration at timings before that certain timing. By using the estimated probability distribution at a certain timing as a past estimated probability distribution in the extrapolation formula, the predicted probability distribution at a timing ahead of that certain timing can be obtained, and the estimated probability distribution at that ahead timing can be generated based on Bayesian estimation using the obtained predicted probability distribution. By repeatedly executing this process, the estimated probability distribution of acceleration at a desired timing can be obtained. Therefore, the extrapolation formula makes it easy to obtain the estimated value of the acceleration at a desired timing.
• The observation model 350 expresses that the average displacement of the electronic device 1 during the target vibration is zero; in other words, the average of the marginal probability distribution of the displacement in the observation model 350 is zero. This reduces the possibility that the estimated value of the displacement of the electronic device 1 during the target vibration deviates greatly from zero, so an estimated value of the displacement appropriate to the small amplitude of the target vibration can be obtained.
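• A minimal sketch of such an observation model follows. Per the text, the displacement marginal (the first probability distribution) and the total displacement marginal (the fourth probability distribution) have zero mean; treating the acceleration marginal as centred on the latest filtered detection value mirrors the sixth probability distribution described later for the rotational case and is an assumption here, as are all of the variance values.

```python
from dataclasses import dataclass

@dataclass
class GaussianMarginal:
    mean: float
    var: float

def make_observation_model(accel_detected: float) -> dict[str, GaussianMarginal]:
    """Illustrative observation model with independent Gaussian marginals.
    The variances are placeholders; in practice they would come from machine
    learning or from the statistical distribution of measured vibrations."""
    return {
        "acceleration": GaussianMarginal(mean=accel_detected, var=5e-2),
        "velocity": GaussianMarginal(mean=0.0, var=1e-2),            # speed stays small
        "displacement": GaussianMarginal(mean=0.0, var=1e-4),        # small amplitude
        "total_displacement": GaussianMarginal(mean=0.0, var=1e-3),  # vibrates about a point
    }
```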
• For example, consider a first situation in which, due to an error in the acceleration detection of the acceleration sensor 150 or the like, the prediction model 360, that is, the integration of the detection values alone, always predicts the sign of the displacement of the electronic device 1 in the X-axis direction to be positive.
• When the observation model 350 does not express that the average of the total displacement of the electronic device 1 during the target vibration is zero, the estimated value of the displacement of the electronic device 1 in the X-axis direction obtained by Bayesian estimation tends, through the action of the first probability distribution of the displacement, to lie near zero, but is unlikely to become a negative value. Therefore, in the first situation, when the average of the marginal probability distribution of the total displacement in the observation model 350 is not set to zero, the sign of the estimated value of the displacement in the X-axis direction does not change from plus to minus, and the estimated value may not follow the vibration of the electronic device 1 along the X-axis direction.
• Similarly, in a second situation in which the sign of the displacement in the X-axis direction is always predicted to be negative, the sign of the estimated value of the displacement in the X-axis direction does not change from minus to plus, and the estimated value may not follow the vibration of the electronic device 1 along the X-axis direction.
• In the first situation, the total displacement in the X-axis direction continues to increase while the sign of the displacement in the X-axis direction remains positive. For the estimated value of the total displacement to approach zero, the sign of the displacement in the X-axis direction must become negative. Accordingly, when the average of the fourth probability distribution of the total displacement in the observation model 350 is set to zero, the sign of the estimated displacement value becomes negative in the first situation so that the estimated value of the total displacement approaches zero. Likewise, in the second situation, the sign of the estimated displacement value is likely to become positive so that the estimated value of the total displacement approaches zero. Therefore, when the average of the fourth probability distribution of the total displacement in the observation model 350 is set to zero as in this example, an estimated value of the displacement that changes in accordance with the vibration of the electronic device 1 can be appropriately obtained.
• In the observation model 350, the knowledge that the speed of the electronic device 1 does not become very large is also expressed, but such an expression regarding the speed may be omitted. Further, the estimation unit 300 does not have to generate all of the estimated values of the acceleration, velocity, displacement, and total displacement; that is, the estimation unit 300 may generate at least one of them as the motion estimation result.
  • the motion estimation result of the motion estimation device 400 can be used in various situations.
• An example in which the electronic device 1 uses the motion estimation result of the motion estimation device 400 in image display control will be described below.
  • the target vibration includes vibration of the electronic device 1 caused by vibration of the vehicle when a user holding the electronic device 1 is inside a vehicle such as a bus or a train.
  • the target vibration may be vibration of the electronic device 1 when the hand of the user holding the electronic device 1 shakes due to aging or illness.
  • the position of the electronic device 1 with respect to the earth may move when the electronic device 1 vibrates due to the vibration of the vehicle. As a result, the user may have difficulty visually recognizing the image displayed on the electronic device 1.
  • the electronic device 1 vibrates due to the hand of the user holding the electronic device 1 shaking due to circumstances such as aging or illness, the position of the electronic device 1 with respect to the earth may move. As a result, the user may have difficulty visually recognizing the image displayed on the electronic device 1.
• Therefore, the electronic device 1 controls the position of the image with respect to the earth by controlling the display position of the image displayed on the display unit 120 based on the motion estimation result of the motion estimation device 400. Specifically, the electronic device 1 controls the display position of the image displayed on the display unit 120 based on the estimated value of the displacement generated by the motion estimation device 400, thereby controlling the position of the image with respect to the earth. This makes it difficult for the position, with respect to the earth, of the image displayed on the electronic device 1 to change during the target vibration, and improves the visibility of the image. Even when a user holding the electronic device 1 is in a vehicle, the position of the displayed image with respect to the earth is unlikely to change, and the visibility of the image can be improved.
  • FIG. 8 is a diagram showing an example of the configuration of the control unit 100 according to this example.
  • the control unit 100 includes a display control unit 500 that controls the display of the display unit 120 as a functional block.
• The display control unit 500 performs image position control, that is, controls the position of the image with respect to the earth, by controlling the display position of the image displayed on the display unit 120 at a predetermined frame rate based on the displacement estimated value output from the second filter 320 of the motion estimation device 400. The display position of the image is, in other words, the position of the image on the display surface 121.
  • the image displayed on the display unit 120 at a predetermined frame rate may be a moving image or a still image.
  • the predetermined frame rate is, for example, 60 fps.
  • the predetermined frame rate may be other than 60 fps.
  • the predetermined frame rate may be 120 fps.
• Hereinafter, the displacement of the electronic device 1 in the X-axis direction may be referred to as the X displacement, and the displacement of the electronic device 1 in the Y-axis direction as the Y displacement.
• The plus direction and the minus direction of the X-axis direction may be referred to as the +X direction and the −X direction, respectively.
• The plus direction and the minus direction of the Y-axis direction may be referred to as the +Y direction and the −Y direction, respectively.
• In the image position control, the display control unit 500 controls the display position in the X-axis direction of the image displayed on the display unit 120 based on the estimated value of the X displacement output from the second filter 320 of the motion estimation device 400. Likewise, in the image position control, the display control unit 500 controls the display position in the Y-axis direction of the image displayed on the display unit 120 based on the estimated value of the Y displacement output from the second filter 320 of the motion estimation device 400.
  • FIG. 9 is a diagram for explaining image position control.
• In the upper part of FIG. 9, the electronic device 1 in which the X displacement and the Y displacement are both zero, that is, the electronic device 1 at the reference position, is shown.
• In the lower part of FIG. 9, the electronic device 1 having an X displacement of +L and a Y displacement of zero, that is, an electronic device 1 that has not moved from the reference position in the Y-axis direction but has moved by L in the +X direction, is shown.
• In FIG. 9, the receiver hole 12, the lens 191, the operation buttons 141 to 143, and the speaker hole 13 are omitted.
  • an image (still image) 600 including a character string 601 “ABC” is displayed on the display surface 121 of the electronic device 1.
  • the size of the image 600 matches the size of the display surface 121.
  • the image 600 is displayed on the display surface 121 so that the center position of the image 600 matches the center position of the display surface 121.
  • the central position of the image 600 displayed on the electronic device 1 in which the X displacement and the Y displacement are zero is indicated by reference numeral 701.
  • the central position of the image 600 displayed on the electronic device 1 in which the X displacement is + L and the Y displacement is zero is indicated by reference numeral 702.
• The display position of the image displayed by the electronic device 1 at the reference position, that is, the electronic device 1 having zero X displacement and zero Y displacement, may be referred to as the reference display position.
  • the display position in the X-axis direction of the image displayed by the electronic device 1 with zero X displacement may be referred to as the X reference display position.
  • the display position in the Y-axis direction of the image displayed by the electronic device 1 with zero Y displacement may be referred to as the Y reference display position.
  • the display positions in the X-axis direction and the Y-axis direction of the image displayed by the electronic device 1 at the reference position are the X reference display position and the Y reference display position, respectively.
• At the X reference display position, for example, a center line along the Y-axis direction indicating the center of the display surface 121 in the X-axis direction and a center line along the Y-axis direction indicating the center of the image in the X-axis direction overlap each other; the display positions of the images in the X-axis direction thus coincide.
• The center line along the Y-axis direction indicating the center of the image 600 in the X-axis direction is the line along the Y-axis direction that passes through the center position 702 of the image 600. In other words, the center line indicating the center of the image in the X-axis direction and the center line of the display surface 121 in the X-axis direction overlap.
• Likewise, at the Y reference display position, a center line along the X-axis direction indicating the center of the display surface 121 in the Y-axis direction and a center line along the X-axis direction indicating the center of the image in the Y-axis direction overlap each other; the display positions of the images in the Y-axis direction thus coincide.
• The center line along the X-axis direction indicating the center of the image 600 in the Y-axis direction is the line along the X-axis direction that passes through the center position 702 of the image 600. In other words, the center line indicating the center of the image in the Y-axis direction and the center line of the display surface 121 in the Y-axis direction overlap.
• Consider a case where the electronic device 1 having an X displacement of +L displays the image 600 such that its display position is displaced by L in the −X direction as compared with the image 600 displayed by the electronic device 1 having an X displacement of zero.
• In this case, the positions in the X-axis direction, with respect to the earth, of the image 600 displayed on the display surface 121 are the same between the two devices.
• That is, when the electronic device 1 having an X displacement of +L displays the image such that its center in the X-axis direction is displaced by L in the −X direction as compared with the image 600 displayed by the electronic device 1 having an X displacement of zero, the position in the X-axis direction with respect to the earth of the center of the image 600 displayed on the display surface 121 is the same between the electronic device 1 having an X displacement of zero and the electronic device 1 having an X displacement of +L.
• Therefore, when the electronic device 1 displays an image such that its display position is displaced from the X reference display position by the absolute value of the X displacement of the electronic device 1 in the direction opposite to the direction indicated by the sign of the X displacement, the position in the X-axis direction with respect to the earth of the image displayed on the display surface 121 becomes difficult to change. When the sign of the X displacement is positive, the direction opposite to the direction indicated by the sign is the −X direction.
• Similarly, when the image is displayed such that its display position is displaced from the Y reference display position by the absolute value of the Y displacement of the electronic device 1 in the direction opposite to the direction indicated by the sign of the Y displacement, the position in the Y-axis direction with respect to the earth of the image displayed on the display surface 121 is unlikely to change.
• Therefore, the display control unit 500 executes the image position control so that the display position in the X-axis direction of the image displayed on the display unit 120 is shifted from the X reference display position by the absolute value of the estimated value of the X displacement of the electronic device 1, in the direction opposite to the direction indicated by the sign of the estimated value. Likewise, the display control unit 500 executes the image position control so that the display position in the Y-axis direction of the image displayed on the display unit 120 is shifted from the Y reference display position by the absolute value of the estimated value of the Y displacement of the electronic device 1, in the direction opposite to the direction indicated by the sign of the estimated value. This makes it difficult for the position of the image displayed by the electronic device 1 with respect to the earth to change during the target vibration, and as a result the visibility of the image displayed by the electronic device 1 during the target vibration is improved. A small numerical sketch of this correction follows.
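• The following sketch shows the shift computation; the pixels-per-metre conversion factor is an assumed parameter, since the patent does not specify the units of the displacement estimate or of the display coordinates.

```python
def corrected_display_position(x_ref_px: float, y_ref_px: float,
                               x_disp_m: float, y_disp_m: float,
                               px_per_m: float) -> tuple[float, float]:
    """Shift the image from the reference display position by the estimated
    displacement, in the direction opposite to the displacement's sign."""
    return (x_ref_px - x_disp_m * px_per_m,
            y_ref_px - y_disp_m * px_per_m)

# Example: the device moved +2 mm along X, so the image shifts in the -X
# direction by the corresponding number of pixels (assumed 16000 px/m).
print(corrected_display_position(540.0, 960.0, x_disp_m=0.002, y_disp_m=0.0,
                                 px_per_m=16000.0))  # (508.0, 960.0)
```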
• When the display position of the image 600 is shifted in this way, as shown in the lower part of FIG. 9, only a part of the image 600 is displayed on the display surface 121, and an area 121a where the image 600 is not displayed occurs on the display surface 121.
  • a predetermined image different from the image 600 may be displayed in the area 121a.
  • an image in which the color of each pixel is a predetermined color (for example, black) may be displayed in the area 121a.
• An extrapolation of the image 600 may also be displayed in the area 121a.
  • the size of the image 600 may be smaller or larger than the size of the display surface 121.
• The display control unit 500 determines, for each frame period, the display position of the image displayed in, for example, the frame period one frame after that frame period, based on the estimated value of the displacement obtained in that frame period.
  • the frame period is a period in which one frame image is displayed, and if the frame rate is 60 fps, the length of the frame period is 1/60 second (about 16.7 ms).
  • the frame period of interest in the description of the electronic device 1 may be referred to as the target frame period.
  • a frame period that is one behind the target frame period may be referred to as a next frame period.
  • An image displayed in the next frame period (that is, a frame image) may be referred to as a target image.
  • the displacement estimated value used in determining the display position of the target image may be referred to as a target displacement estimated value.
  • the acceleration detection value used as the latest acceleration detection value when generating the target displacement estimation value may be referred to as a target acceleration detection value.
  • FIG. 10 is a flowchart showing an example of the operation of the electronic device 1 during the target frame period.
  • the electronic device 1 executes the process shown in FIG. 10 for each frame period.
  • the estimation unit 300 executes the motion estimation processing shown in FIG. 7 described above.
• In step s31, the estimation unit 300 acquires the target acceleration detection value.
• As the target acceleration detection value, for example, the acceleration detection value output from the acceleration sensor 150 immediately after the vertical synchronization signal in the target frame period is generated is adopted.
• In step s32, the estimation unit 300 determines the estimation timing of the motion estimation process. Then, in step s33, the estimation unit 300 uses the target acceleration detection value as the latest acceleration detection value and executes the motion estimation process of estimating the movement of the electronic device 1 at the estimation timing determined in step s32.
  • FIG. 11 is a diagram for explaining an example of a method of determining the estimated timing.
• FIG. 11 shows a generation start timing tm1, which is the timing at which the generation of the vertical synchronization signal in the target frame period starts; a generation start timing tm2, which is the timing at which the generation of the vertical synchronization signal in the next frame period starts; an output timing tm3, which is the timing at which the acceleration sensor 150 outputs the target acceleration detection value; and a display timing tm4, which is the timing at which the target image is actually displayed on the display surface 121. The output timing tm3 can also be said to be the timing at which the estimation unit 300 acquires the target acceleration detection value from the acceleration sensor 150.
• In step s32, the estimation unit 300 obtains the time T11 from the output timing tm3 of the target acceleration detection value to the generation start timing tm2 of the vertical synchronization signal in the next frame period.
  • the generation start timing tm2 is known to the estimation unit 300.
  • the estimation unit 300 adds the obtained time T11 to the time T12 from the generation start timing tm2 to the display timing tm4 to obtain the time T13 from the output timing tm3 to the display timing tm4. Then, the estimation unit 300 sets the obtained time T13 as T0 of the estimation timing. As a result, the estimated timing becomes the timing at which the target image is actually displayed on the display unit 120.
• Suppose, for example, that T0 of the estimation timing is 28 ms.
• In this case, the target image is displayed 28 ms after the present, that is, 28 ms after the target acceleration detection value is obtained.
  • the time T12 is stored in the storage unit 103 in advance.
• The time T12 includes the time from when the generation of the vertical synchronization signal starts until the display panel 122 receives a drive signal from the display control unit 500, and the reaction time of the display panel 122.
• The reaction time of the display panel 122 is the time from when a drive signal is given to the display panel 122 until the display panel 122 actually displays an image.
  • the time T12 is obtained in advance by an actual machine experiment or simulation.
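• The timing computation of steps s31 to s33 can be sketched as follows; the numerical values are illustrative only and are chosen to reproduce the 28 ms example above.

```python
def estimation_timing_ms(output_tm3_ms: float, vsync_tm2_ms: float,
                         t12_ms: float) -> float:
    """T0 of the estimation timing: the time from the output of the target
    acceleration detection value (tm3) to the display timing (tm4).
    T11 = tm2 - tm3 is measured at run time; T12 (the vsync-to-display time,
    including the panel reaction time) is measured in advance."""
    t11_ms = vsync_tm2_ms - output_tm3_ms
    return t11_ms + t12_ms

# tm3 at 2 ms and tm2 at 18 ms give T11 = 16 ms; with T12 = 12 ms, T0 = 28 ms.
print(estimation_timing_ms(output_tm3_ms=2.0, vsync_tm2_ms=18.0, t12_ms=12.0))
```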
• In step s33, the estimation unit 300 uses the target acceleration detection value as the latest acceleration detection value and performs the motion estimation process shown in FIG. 7 described above.
• As a result, the estimated values of the acceleration, velocity, displacement, and total displacement at the estimation timing, that is, at the timing when the target image is displayed, are obtained.
• In other words, it can be said that the estimated values of the acceleration, velocity, displacement, and total displacement at the timing when the target image is displayed on the display unit 120 are generated based on Bayesian estimation using the predicted values of the acceleration, velocity, displacement, and total displacement at that timing.
• In step s34, the display control unit 500 executes the image position control. Specifically, the display control unit 500 determines the display position of the target image by using the estimated value of the displacement obtained in step s33 as the target displacement estimated value.
• In step s34, the display control unit 500 sets the estimated value of the displacement in the X-axis direction at the estimation timing, output from the second filter 320, as the target displacement estimated value in the X-axis direction, and sets the estimated value of the displacement in the Y-axis direction at the estimation timing, output from the second filter 320, as the target displacement estimated value in the Y-axis direction. The display control unit 500 then determines the display position of the target image based on the target displacement estimated values in the X-axis direction and the Y-axis direction.
• Specifically, in step s34 the display control unit 500 determines the display position in the X-axis direction of the target image to be a position shifted from the X reference display position by the absolute value of the target displacement estimated value in the X-axis direction, in the direction opposite to the direction indicated by the sign of that estimated value. Likewise, the display control unit 500 determines the display position in the Y-axis direction of the target image to be a position shifted from the Y reference display position by the absolute value of the target displacement estimated value in the Y-axis direction, in the direction opposite to the direction indicated by the sign of that estimated value. This makes it difficult for the position with respect to the earth of the target image displayed in the next frame period to deviate from the position with respect to the earth of the image displayed by the electronic device 1 at the reference position.
• In step s34, the display control unit 500 further generates a drive signal for displaying the target image at the determined display position and gives the drive signal to the display unit 120.
• In the next frame period, the display unit 120 displays the target image at the display position determined in step s34.
• The control unit 100 executes the above-described processing of steps s31 to s34 with each frame period as the target frame period, so that the position with respect to the earth of the image displayed on the display surface 121 at the predetermined frame rate becomes difficult to move.
• In other words, during execution of the image position control, the position of the image with respect to the earth is unlikely to deviate from the position with respect to the earth of the image displayed by the electronic device 1 at the reference position. As a result, the visibility of the image is improved.
• In particular, the position with respect to the earth of an object such as a character, symbol, or figure included in the displayed image (for example, the character string 601 in FIG. 9) becomes difficult to move, so that the visibility of the object improves.
• The electronic device 1 of this example thus reduces the possibility that the user has difficulty visually recognizing the displayed image.
• Since the estimation unit 300 performs the motion estimation process every 5 ms as described above, not only the motion estimation process of step s33 but also motion estimation processes unrelated to the image position control are executed in the target frame period. The estimation unit 300 does not have to execute any motion estimation process other than that of step s33 in the target frame period.
• Execution of the image position control may be started when a predetermined condition is satisfied in the electronic device 1. For example, when the control unit 100 determines that an operation instructing execution of the image position control has been performed on the display surface 121, the series of processes shown in FIG. 10 may be started.
• As described above, in the prediction model 360, the predicted probability distribution a(t) of the acceleration at a certain timing is represented, as shown in equation (3), by an extrapolation formula using the estimated probability distributions a(t−1) and a(t−2) of the acceleration at the timings before that timing. Further, in the prediction model 360, as shown in equation (5), the predicted probability distribution r(t) of the displacement at a certain timing is represented by a prediction formula using the estimated probability distribution a(t−1) of the acceleration at the timing before that timing.
• In this example, a specific process is performed in which the estimated probability distribution of the acceleration at a certain timing is substituted, as the estimated probability distribution of the acceleration at the preceding timing, into the extrapolation formula (3) to obtain the predicted probability distribution of the acceleration at the next timing, and the estimated probability distribution of the acceleration at that next timing is generated based on Bayesian estimation using the obtained predicted probability distribution.
• By substituting the estimated probability distribution of the acceleration obtained by repeatedly executing this specific process into the prediction formula (5) as the estimated probability distribution of the acceleration at the preceding timing, the predicted probability distribution of the displacement at the timing when the target image is displayed is obtained.
• Then, the estimated value of the displacement at the timing when the target image is displayed is generated based on Bayesian estimation using the predicted probability distribution of the displacement at that timing.
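• The last step of this chain can be sketched as follows (means only). The patent's equation (5) is not reproduced in this text, so the constant-acceleration kinematic form below is an assumed stand-in that merely illustrates how the rolled-forward acceleration estimate feeds the displacement prediction at the display timing.

```python
def predict_displacement(r_prev: float, v_prev: float, a_prev: float,
                         dt: float) -> float:
    """Predicted displacement mean at the display timing from the estimates one
    detection interval earlier (assumed form; not the patent's equation (5))."""
    return r_prev + dt * v_prev + 0.5 * dt * dt * a_prev

# Example: after rolling the acceleration estimate forward to one step before
# the display timing, predict the displacement at the display timing itself.
print(predict_displacement(r_prev=0.0010, v_prev=0.02, a_prev=0.5, dt=0.005))
```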
• As a result, the position with respect to the earth of the image displayed on the display unit 120 becomes even less likely to change.
• In the acceleration sensor 150, it may take some time from the detection of the acceleration until the detection result is output, for example because of an internal filter function.
• The time from when the acceleration sensor 150 detects the acceleration until it outputs the detection result may be referred to as the sensor delay time.
• Strictly speaking, the estimated value of the displacement at the timing T0 time ahead of the present obtained in step s33 (that is, the estimated value of the displacement at the estimation timing) is the estimated value at the timing T0 time after the timing at which the acceleration sensor 150 detected the latest acceleration.
• If the sensor delay time is zero, then setting T0 to the time T13 from the output timing tm3 to the display timing tm4, as described above, makes the estimation timing match the timing at which the target image is displayed. If the time T13 is set as T0 when the sensor delay time is not zero, however, the estimation timing is slightly before the timing at which the target image is displayed, and the estimated value of the displacement at the estimation timing is therefore the estimated value of the displacement slightly before the timing at which the target image is displayed.
• Therefore, the estimation unit 300 may determine the estimation timing in consideration of the sensor delay time. Specifically, as shown in FIG. 12, the estimation unit 300 may set, as T0 of the estimation timing, the time T15 obtained by adding the time T14, from the timing tm5 at which the acceleration sensor 150 detects the acceleration to the output timing tm3 of the detection result (that is, of the target acceleration detection value), to the time T13. This allows the estimation unit 300 to estimate the displacement at the display timing of the target image more accurately.
• The time T14 is obtained in advance by an actual-machine experiment or simulation.
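• In code, the delay-compensated timing of FIG. 12 is a one-line extension of the earlier timing sketch; the 3 ms sensor delay below is an assumed value.

```python
def estimation_timing_with_delay_ms(t13_ms: float, t14_ms: float) -> float:
    """T0 including the sensor delay: T15 = T13 + T14, where T14 is the time
    from the acceleration being detected (tm5) to the detection value being
    output (tm3), measured in advance by experiment or simulation."""
    return t13_ms + t14_ms

print(estimation_timing_with_delay_ms(t13_ms=28.0, t14_ms=3.0))  # 31.0
```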
• Note that, instead of performing the linear interpolation, the estimation unit 300 may use the estimated value of the displacement at the timing (T×(M−1)) time ahead of the present as the estimated value of the displacement at the estimation timing, and may use that value as the estimated value of the displacement at the display timing of the target image.
• In the above description, the electronic device 1 was a mobile phone such as a smartphone, but it may be another type of electronic device.
  • the electronic device 1 may be, for example, a tablet terminal, a laptop personal computer, a wearable device, or the like.
• The wearable device adopted as the electronic device 1 may be of a wrist-worn type such as a wristband type or wristwatch type, of a head-worn type such as a headband type or eyeglass type, or of a type worn on the body such as a clothing type.
• The above motion estimation technology can be used for screen correction in mobile terminals such as smartphones, smartwatches, and personal computers, as well as in projectors, head-mounted displays, machine tools, and electronic mirrors. The above motion estimation technology can also be used to correct the detection position of a touch panel.
  • the motion estimation technique described above can also be used for detecting motion of a game controller.
  • the game controller may have a fishing rod shape, for example.
• The motion estimation techniques described above can also be used in motion detection for sports goods such as running shoes and rackets. Further, the above-described motion estimation technique can be used for detecting equipment abnormality, detecting the motion of a fishing float, controlling the posture of a robot, and the like.
  • the motion of the device during vibration estimated by the estimation unit 300 is the motion of the device during translational vibration. Therefore, it can be said that the estimated values of the acceleration, the velocity, the displacement, and the total displacement generated by the estimation unit 300 are the estimated values of the translational acceleration, the translational velocity, the translational displacement, and the total translational displacement.
  • the motion estimation device 400 may estimate the motion of the device during rotational vibration in the motion estimation process. The motion estimation process in this case will be described below.
  • the electronic device 1 according to the present example will be described by taking the case where the electronic device 1 is used as a head-up display mounted in a vehicle as an example.
• The electronic device 1 that functions as a head-up display can realize augmented reality (AR) by, for example, displaying an image superimposed on the real landscape in front of the vehicle in which the electronic device 1 is mounted.
  • the electronic device 1 can display, for example, an image for supporting the driver in an overlapping manner with respect to the actual landscape.
  • the vehicle in which the electronic device 1 is mounted may be referred to as a target vehicle.
  • the electronic device 1 includes a gyro sensor 250.
  • the gyro sensor 250 can detect the rotational angular velocity of the electronic device 1.
  • the rotational angular velocity may be simply referred to as the angular velocity.
  • the gyro sensor 250 is, for example, a 3-axis gyro sensor.
  • the gyro sensor 250 is capable of detecting the angular velocity of rotation about the X1 axis, the angular velocity of rotation about the Y1 axis, and the angular velocity of rotation about the Z1 axis for the electronic device 1.
  • the gyro sensor 250 detects the angular velocity at predetermined intervals and outputs the detection result.
  • the predetermined interval is 5 ms, but other values may be used.
  • the predetermined interval may be referred to as a detection interval.
  • the length of the detection interval of the gyro sensor 250 may also be represented by T.
  • FIG. 14 is a diagram showing an example of the X1, Y1, and Z1 axes.
  • the X1 axis is set along the front-rear direction of the target vehicle 800 in which the electronic device 1 is mounted.
  • the Y1 axis is set so as to extend in the left-right direction of the target vehicle 800.
  • the Z1 axis is set along the height direction of the target vehicle 800.
  • the gyro sensor 250 can detect the angular velocity of rotation about the X1 axis, the angular velocity of rotation about the Y1 axis, and the angular velocity of rotation about the Z1 axis for the target vehicle 800.
  • the rotation around the X1 axis, the rotation around the Y1 axis, and the rotation around the Z1 axis may be referred to as rolling, pitching, and yawing, respectively.
  • the motion estimation device 400 of the electronic device 1 can estimate the motion of the electronic device 1 during rotational vibration based on the detection value 251 of the gyro sensor 250.
  • the motion estimation device 400 can estimate the motion of the target vehicle 800 during rotational vibration based on the detected value 251.
  • the detection value 251, that is, the angular velocity detected by the gyro sensor 250 may be referred to as an angular velocity detection value.
• The motion estimation device 400 of this example takes, for example, rotational random vibration of the electronic device 1 at 0.5 Hz or more and 2 Hz or less, with a rotation angle of a few degrees, as the target rotational vibration. Then, the motion estimation device 400 estimates the motion of the electronic device 1 that is undergoing the target rotational vibration.
  • the target rotational vibration is not limited to this example.
• The detection value 251 of the gyro sensor 250 is affected by factors other than the target rotational vibration.
  • the detected value 251 may be affected by a gradual change in posture of the target vehicle 800 caused by a gradual change in the terrain in which the target vehicle 800 travels due to a slope or the like.
  • the detected value 251 may be affected by a gradual posture change of the target vehicle 800 due to acceleration / deceleration of the target vehicle 800.
• The detected value 251 also includes a drift error. It is necessary to isolate the target motion to be corrected from these disturbances and estimate it.
  • the first filter 310 of this example filters the detected value 251 of the gyro sensor 250 in order to remove influences other than the target rotational vibration.
• The estimation unit 300 estimates the movement of the electronic device 1 during the target rotational vibration based on Bayesian estimation using the detection value 251 filtered by the first filter 310, the observation model 350, and the prediction model 360.
• The estimation unit 300 of the present example generates, as the estimation result of the movement of the electronic device 1, for example, an estimated value of the angular velocity of the electronic device 1, an estimated value of the rotation angle of the electronic device 1, and an estimated value of the total rotation angle of the electronic device 1.
  • the rotation angle of the electronic device 1 means the rotation angle of the posture of the electronic device 1 from the reference posture.
  • the rotation angle can take a positive value or a negative value.
  • the total rotation angle of the electronic device 1 is the sum of the rotation angles of the electronic device 1 at each time.
• Like the rotation angle, the total rotation angle can take a positive value or a negative value.
• As the reference posture, for example, the posture of the electronic device 1 when the motion estimation device 400 starts operating is adopted.
  • the motion estimation apparatus 400 may start operation in response to the start of operation of the electronic device 1, or may start operation in response to a user's operation on the electronic device 1 in operation. Moreover, the motion estimation apparatus 400 may start operation when other conditions are satisfied in the electronic device 1 in operation.
  • the estimated values of the angular velocity, rotation angle, and total rotation angle of the electronic device 1 obtained by Bayesian estimation may be referred to as the angular velocity estimation value, the rotation angle estimation value, and the total rotation angle estimation value, respectively.
• In the following, the term "angular velocity" simply means the angular velocity of the electronic device 1, the term "rotation angle" simply means the rotation angle of the electronic device 1, and the term "total rotation angle" means the total rotation angle of the electronic device 1. Each of the angular velocity, the rotation angle, and the total rotation angle can be said to be a physical quantity of the electronic device 1.
  • the second filter 320 of this example performs filter processing on the estimation result of the estimation unit 300.
  • the second filter 320 individually filters the angular velocity estimated value, the rotation angle estimated value, and the total rotation angle estimated value output from the estimation unit 300.
  • the state vector x related to the above-mentioned equation (1) includes the angular velocity, the rotation angle, and the total rotation angle as variables.
  • the observation vector z includes, as variables, an angular velocity, a rotation angle, and a total rotation angle.
• The estimation unit 300 of the present example performs Bayesian estimation by obtaining the posterior probability p(x_t | z_1:t) using equation (1), in the same manner as described above.
• At this time, the estimation unit 300 uses the observation model 350 as the probability distribution p(z_t | x_t) in equation (1).
• The estimation unit 300 sets the state vector that maximizes the calculated posterior probability p(x_t | z_1:t) as the estimated state vector.
  • the estimation unit 300 sets the angular velocity, rotation angle, and total rotation angle components of the estimated state vector as the angular velocity estimation value, the rotation angle estimation value, and the total rotation angle estimation value, respectively.
  • the observation model 350 of this example is an observation model representing the target rotational vibration.
  • the observation model 350 is a probability distribution that represents statistical prior knowledge about the target rotational vibration.
• The observation model 350 is a probability distribution of at most three dimensions that represents the angular velocity, the rotation angle, and the total rotation angle of the electronic device 1 during the target rotational vibration.
• The marginal probability distributions of the rotation angle, the angular velocity, and the total rotation angle in the observation model 350 of this example may be referred to as the fifth probability distribution, the sixth probability distribution, and the seventh probability distribution, respectively.
• The rotation angle of the target rotational vibration is small, so the posture of the electronic device 1 does not change much during the target rotational vibration.
• Therefore, the rotation angle of the electronic device 1 during the target rotational vibration is represented by, for example, a fixed fifth probability distribution having a zero mean and a small variance, with the rotation angle as the random variable; in this fifth probability distribution, the knowledge that the posture of the electronic device 1 does not change much is expressed.
• The angular velocity of the electronic device 1 during the target rotational vibration is represented by, for example, a sixth probability distribution whose mean is the detected angular velocity and whose variance is small, with the angular velocity as the random variable.
• As the detected angular velocity, the angular velocity detection value filtered by the first filter 310 is adopted.
  • the sixth probability distribution changes according to the detected angular velocity value.
• The total rotation angle of the electronic device 1 during the target rotational vibration is represented by, for example, a fixed seventh probability distribution having a zero mean, with the total rotation angle as the random variable. In the seventh probability distribution, the knowledge that the electronic device 1 vibrates rotationally is expressed.
• The observation model 350 representing the target rotational vibration is determined by machine learning, for example. The observation model 350 may also be determined using, for example, the maximum likelihood estimation method, or based on the statistical distribution of actual measurement values in the vibration environment of the target rotational vibration, or based on both machine learning and the statistical distribution of actual measurement values. For example, a normal distribution may be adopted as each of the fifth to seventh probability distributions.
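• By analogy with the translational sketch given earlier, the rotational observation model can be sketched as follows; all variances, and the exact parametric form, are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Gaussian:
    mean: float
    var: float

def make_rotational_observation_model(omega_detected: float) -> dict[str, Gaussian]:
    """Illustrative observation model for the target rotational vibration,
    with independent Gaussian marginals."""
    return {
        # fifth distribution: the posture changes little, so the rotation
        # angle stays near zero
        "rotation_angle": Gaussian(mean=0.0, var=1e-4),
        # sixth distribution: the angular velocity follows the latest
        # filtered detection value
        "angular_velocity": Gaussian(mean=omega_detected, var=1e-3),
        # seventh distribution: the total rotation angle averages zero,
        # expressing that the device vibrates rather than rotating away
        "total_rotation_angle": Gaussian(mean=0.0, var=1e-3),
    }
```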
  • the predicted value of the angular velocity at time t is represented by b (t), and the prediction error of the angular velocity at time t is represented by w_b (t).
  • the predicted value b (t) is represented by a probability distribution.
  • the predicted value b (t) may be referred to as a predicted probability distribution b (t).
• As the prediction error, for example, a fixed probability distribution that does not change with time is adopted.
  • the prediction error is determined by machine learning, for example.
  • the prediction error may be determined using, for example, the maximum likelihood estimation method.
  • a normal distribution may be adopted as the probability distribution representing the prediction error.
• The predicted value of the rotation angle at time t is represented by c(t), and the predicted value of the total rotation angle at time t is represented by d(t).
• The predicted value c(t) of the rotation angle and the predicted value d(t) of the total rotation angle are represented by formulas similar to equations (4) and (5) above, specifically by the following equations (9) and (10), which use the length T of the detection interval of the gyro sensor 250.
• The prediction error at time t for the rotation angle is represented by w_c(t), and the prediction error at time t for the total rotation angle is represented by w_d(t).
  • each of the predicted value c (t) and the predicted value d (t) is represented by a probability distribution.
  • the predicted value c (t) and the predicted value d (t) may be referred to as the predicted probability distribution c (t) and the predicted probability distribution d (t), respectively.
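• Since equations (8) to (10) are not reproduced in this text, the following sketch (means only, prediction errors omitted) uses assumed forms by analogy with the translational model: linear extrapolation for the angular velocity and first-order accumulation for the rotation angle and total rotation angle.

```python
def predict_rotation_state(b1: float, b2: float, c1: float, d1: float,
                           T: float) -> tuple[float, float, float]:
    """One prediction step of the rotational model (assumed forms):
        b(t) = 2*b(t-1) - b(t-2)   # stand-in for equation (8)
        c(t) = c(t-1) + T*b(t-1)   # stand-in for equation (9)
        d(t) = d(t-1) + c(t-1)     # stand-in for equation (10)
    """
    return 2.0 * b1 - b2, c1 + T * b1, d1 + c1

# Example with the 5 ms detection interval of the gyro sensor (T = 0.005 s).
print(predict_rotation_state(b1=0.10, b2=0.08, c1=0.002, d1=0.010, T=0.005))
```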
• The estimation unit 300 of this example obtains the predicted value b(t) of the angular velocity at time t using equation (8). At this time, the estimation unit 300 substitutes the marginal probability distribution of the angular velocity of the posterior probability p(x_t−1 | z_1:t−1) into b(t−1) of equation (8), and the marginal probability distribution of the angular velocity of the posterior probability p(x_t−2 | z_1:t−2) into b(t−2).
• Likewise, the estimation unit 300 obtains the predicted value c(t) of the rotation angle at time t using equation (9), and the predicted value d(t) of the total rotation angle at time t using equation (10). At this time, the marginal probability distributions of the angular velocity, the rotation angle, and the total rotation angle of the posterior probability p(x_t−1 | z_1:t−1) are substituted into the corresponding terms of equations (9) and (10). It can be said that the marginal probability distributions of the angular velocity, the rotation angle, and the total rotation angle of the posterior probability p(x_t−1 | z_1:t−1) are the estimated probability distributions of the angular velocity, the rotation angle, and the total rotation angle, respectively.
• The estimation unit 300 of this example performs a motion estimation process of estimating the current motion of the electronic device 1 based on Bayesian estimation using the detected angular velocity output from the gyro sensor 250, equations (8) to (10) of the prediction model 360, and the observation model 350.
  • FIG. 15 is a flowchart showing an example of the motion estimation processing of this example.
  • the motion estimation process shown in FIG. 15 is executed every time a new angular velocity is detected by the gyro sensor 250.
• Through this motion estimation process, the estimation unit 300 of this example obtains, for rolling, for example, an estimated value of the angular velocity, an estimated value of the rotation angle, and an estimated value of the total rotation angle.
• Likewise, the estimation unit 300 obtains, for pitching, estimated values of the angular velocity, the rotation angle, and the total rotation angle.
• Likewise, the estimation unit 300 obtains, for yawing, estimated values of the angular velocity, the rotation angle, and the total rotation angle.
• As shown in FIG. 15, in step s51 the estimation unit 300 obtains the current predicted value of the angular velocity using the prediction model 360.
• Specifically, the estimation unit 300 uses the prediction model 360 to separately obtain the predicted value of the rolling angular velocity, the predicted value of the pitching angular velocity, and the predicted value of the yawing angular velocity.
• In step s51, the estimation unit 300 obtains, for example, the predicted value of the rolling angular velocity by substituting the marginal probability distribution of the rolling angular velocity of the posterior probability p(x_t−1 | z_1:t−1), obtained in the previous motion estimation process, into equation (8); the predicted values for pitching and yawing are obtained in the same manner.
• In step s52, the estimation unit 300 uses the prediction model 360 to separately obtain the current predicted value of the rolling rotation angle, the current predicted value of the pitching rotation angle, and the current predicted value of the yawing rotation angle.
• At this time, the estimation unit 300 uses, for example, the marginal probability distributions of the rolling angular velocity and the rolling rotation angle of the posterior probability p(x_t−1 | z_1:t−1) in equation (9) to obtain the predicted value of the rolling rotation angle, and similarly for pitching and yawing.
• In step s53, the estimation unit 300 uses the prediction model 360 to separately obtain the current predicted value of the rolling total rotation angle, the current predicted value of the pitching total rotation angle, and the current predicted value of the yawing total rotation angle.
• At this time, the estimation unit 300 uses, for example, the marginal probability distributions of the rolling rotation angle and the rolling total rotation angle of the posterior probability p(x_t−1 | z_1:t−1) in equation (10) to obtain the predicted value of the rolling total rotation angle, and similarly for pitching and yawing.
• In step s54, the estimation unit 300 generates the estimated values of the angular velocity, the rotation angle, and the total rotation angle based on Bayesian estimation using the predicted values obtained in steps s51 to s53 and the observation model 350.
• Specifically, in step s54 the estimation unit 300 substitutes the predicted values (in other words, the predicted probability distributions) of the rolling angular velocity, rotation angle, and total rotation angle obtained in steps s51 to s53 into equation (1) above.
• Further, the estimation unit 300 uses the latest rolling angular velocity detection value (specifically, the latest rolling angular velocity detection value filtered by the first filter 310) as the mean of the sixth probability distribution, and obtains the posterior probability p(x_t | z_1:t) for rolling using equation (1).
• The estimation unit 300 identifies the state vector that maximizes the obtained posterior probability p(x_t | z_1:t) as the estimated state vector.
• The estimation unit 300 then uses the angular velocity, rotation angle, and total rotation angle components of the estimated state vector as the current estimated values of the rolling angular velocity, rotation angle, and total rotation angle, respectively.
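• When the predicted probability distribution and the observation-model marginal are both Gaussian, the state that maximizes the posterior in step s54 has a closed form, sketched below for a single component; the Gaussian assumption is ours, although the text notes that normal distributions may be adopted.

```python
def map_estimate(pred_mean: float, pred_var: float,
                 obs_mean: float, obs_var: float) -> float:
    """MAP estimate of one state component: the posterior of a product of two
    Gaussians is Gaussian, and its maximum is the precision-weighted mean."""
    w_pred, w_obs = 1.0 / pred_var, 1.0 / obs_var
    return (w_pred * pred_mean + w_obs * obs_mean) / (w_pred + w_obs)

# Example: fuse the predicted rolling angular velocity with the sixth
# probability distribution centred on the latest filtered detection value.
print(map_estimate(pred_mean=0.12, pred_var=1e-3, obs_mean=0.10, obs_var=1e-3))
```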
• Since the electronic device 1 is undergoing the target rotational vibration, it can be said that the estimated values of the rolling angular velocity, rotation angle, and total rotation angle of the electronic device 1 during the target rotational vibration are obtained in step s54.
• In other words, in step s54, the estimated values of the rolling angular velocity, rotation angle, and total rotation angle of the target vehicle 800 during the target rotational vibration are obtained.
• Similarly, the estimation unit 300 uses equation (1) to generate the estimated values of the pitching angular velocity, rotation angle, and total rotation angle; at this time, it uses the latest pitching angular velocity detection value (specifically, the latest pitching angular velocity detection value filtered by the first filter 310) as the mean of the sixth probability distribution.
• Likewise, the estimation unit 300 uses equation (1) to generate the estimated values of the yawing angular velocity, rotation angle, and total rotation angle, using the latest yawing angular velocity detection value (specifically, the latest yawing angular velocity detection value filtered by the first filter 310) as the mean of the sixth probability distribution.
  • the second filter 320 individually performs filter processing on each of the estimated values of the rolling angular velocity, the rotation angle, and the total rotation angle generated by the estimation unit 300. In addition, the second filter 320 individually performs filter processing on each of the estimated values of the pitching angular velocity, the rotation angle, and the total rotation angle generated by the estimation unit 300. Then, the second filter 320 individually performs filter processing on each of the estimated values of the yaw angular velocity, the rotation angle, and the total rotation angle generated by the estimation unit 300.
• In this case, the marginal probability distribution of the filtered angular velocity may be input to b(t−1) and to b(t−2), the marginal probability distribution of the filtered rotation angle to c(t−1), and the marginal probability distribution of the filtered total rotation angle to d(t−1).
• Step s51 may be executed after either of steps s52 and s53, and step s53 may be executed before either of steps s51 and s52.
• The estimation unit 300 of this example may also estimate the movement of the electronic device 1 at a timing ahead of the present, in the same manner as in FIG. 6 described above.
• With the series of processes of steps s51 to s54 as the unit estimation process, repeatedly executing the unit estimation process estimates the movement of the electronic device 1 at the timing (T×(M−1)) time ahead of the present; that is, the estimated values of the angular velocity, the rotation angle, and the total rotation angle of the electronic device 1 undergoing the target rotational vibration at the timing (T×(M−1)) time ahead of the present can be obtained.
• The estimation unit 300 may also estimate the movement of the electronic device 1 at a desired estimation timing ahead of the present, not limited to timings an integral multiple of T ahead of the present, in the same manner as in FIG. 7 described above.
• As described above, in this example, the movement of the electronic device 1 during rotational vibration is estimated based on Bayesian estimation using the angular velocity detection value of the gyro sensor 250, the prediction model 360 representing the movement of the electronic device 1, and the observation model 350 representing the rotational vibration of the electronic device 1 using the angular velocity detection value. This makes it possible to appropriately estimate the movement of the electronic device 1 during rotational vibration.
• In the prediction model 360, the predicted probability distribution of the angular velocity at a certain timing is expressed by an extrapolation formula that uses the estimated probability distributions of the angular velocity at timings before that timing. By substituting the estimated probability distribution at a certain timing into the extrapolation formula as the estimated probability distribution at the preceding timing, the predicted probability distribution at the next timing is obtained, and the estimated probability distribution at that next timing is generated based on Bayesian estimation using the obtained predicted probability distribution; repeating this process yields the estimated probability distribution of the angular velocity at a desired timing. The extrapolation formula thus makes it easy to obtain the estimated value of the angular velocity at a desired timing.
• In the observation model 350, it is expressed that the average rotation angle of the electronic device 1 during the target rotational vibration is zero. This reduces the possibility that the estimated value of the rotation angle of the electronic device 1 during the target rotational vibration deviates greatly from zero, so an estimated value of the rotation angle appropriate to the small rotation angle of the target rotational vibration can be obtained.
• In the observation model 350, it is also expressed that the average of the total rotation angle of the electronic device 1 during the target rotational vibration is zero. Thereby, an estimated value of the rotation angle that changes in accordance with the rotational vibration of the electronic device 1 can be appropriately obtained, for the same reason that expressing a zero average of the total displacement during the target vibration makes it possible to appropriately obtain an estimated value of the displacement that changes in accordance with the vibration of the electronic device 1.
• The estimation unit 300 does not have to generate all of the estimated values of the angular velocity, the rotation angle, and the total rotation angle; that is, it may generate at least one of them as the motion estimation result. Likewise, for each of rolling, pitching, and yawing, the estimation unit 300 does not have to generate all of the estimated values of the angular velocity, the rotation angle, and the total rotation angle.
  • the electronic device 1 of this example that functions as a head-up display can correct the display position of the image displayed in an overlapping manner on the actual landscape based on the motion estimation result.
  • the display unit 120 included in the electronic device 1 of this example can display an image in a superimposed manner on a real landscape in front of the target vehicle 800.
  • the display unit 120 has, for example, a configuration similar to that of a projector.
  • the display control unit 500 included in the electronic device 1 of the present example can control the display position of the image displayed by the display unit 120 so as to be superimposed on the actual landscape.
  • an example of the operation of the display control unit 500 of this example will be described.
  • an image displayed by the display unit 120 so as to be superimposed on the actual landscape may be referred to as a superimposed image.
• While the target vehicle 800 is in use, rotational vibration about the Y1 axis, that is, pitching, may occur in the target vehicle 800. When rotational vibration about the Y1 axis occurs in the target vehicle 800, rotational vibration about the Y1 axis also occurs in the electronic device 1. In this case, the position with respect to the earth of the superimposed image displayed by the display unit 120 may shift in the vertical direction of the target vehicle 800. When the target vehicle 800, viewed from its front side, rotates downward, the position of the superimposed image with respect to the earth shifts downward; when the target vehicle 800, viewed from its front side, rotates upward, the position of the superimposed image with respect to the earth shifts upward. This may make it difficult for the user to visually recognize the superimposed image.
• Therefore, the display control unit 500 of the present example controls the position of the superimposed image with respect to the earth by controlling the display position of the superimposed image on the display unit 120 based on the estimated value of the pitching rotation angle of the electronic device 1 output from the second filter 320. As a result, the position with respect to the earth of the superimposed image superimposed on the actual landscape is unlikely to change, and the visibility of the superimposed image can be improved.
  • Specifically, the display control unit 500 of this example estimates, based on the estimated value of the pitching rotation angle of the electronic device 1, the amount and the direction of the displacement of the superimposed image with respect to the earth. Then, the display control unit 500 shifts the display position of the superimposed image by the estimated displacement amount in the direction opposite to the estimated displacement direction. This makes it less likely that the position with respect to the earth of the superimposed image superimposed on the actual landscape changes, and as a result the visibility of the superimposed image improves. A minimal sketch of this compensation is shown below.
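  • The sketch below illustrates, in Python, shifting the display position opposite to the estimated displacement. The function name and the constant pixels_per_rad, which maps pitch rotation to vertical on-screen displacement, are illustrative assumptions about the display geometry, not values taken from the description.

      def compensate_display_position(pitch_angle_rad, base_x, base_y,
                                      pixels_per_rad=2000.0):
          # Estimated vertical on-screen displacement caused by the pitching
          # rotation of the electronic device 1.
          displacement_px = pitch_angle_rad * pixels_per_rad
          # Shift the display position by the same amount in the opposite
          # direction, so the image stays fixed with respect to the earth.
          return base_x, base_y - displacement_px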
  • the operation of the display control unit 500 of this example will be described in detail below with reference to FIG. 10 described above.
  • Hereinafter, the pitching rotation angle of the electronic device 1 may be simply referred to as the pitching rotation angle.
  • the superimposed image is displayed at a predetermined frame rate, for example.
  • For example, for each frame period, the display control unit 500 determines the display position of the superimposed image to be displayed in the frame period one frame later, based on the estimated value of the pitching rotation angle obtained in that frame period.
  • the frame period of interest in the description of the electronic device 1 of this example may be referred to as the target frame period.
  • A frame period that is one frame after the target frame period may be referred to as the next frame period.
  • the superimposed image displayed in the next frame period may be referred to as a target superimposed image.
  • the estimated value of the pitching rotation angle used in the determination of the display position of the target superimposed image may be referred to as the target rotation angle estimated value.
  • the pitching angular velocity detection value used as the latest pitching angular velocity detection value when generating the target rotation angle estimation value may be referred to as a target angular velocity detection value.
  • In this example, the estimation unit 300 estimates the motion of the electronic device 1 at a desired estimation timing later than the current timing, instead of at a timing that is later than the current timing by an integral multiple of T as in FIG. 7 described above.
  • First, in step s31, the estimation unit 300 acquires the target angular velocity detection value.
  • As the target angular velocity detection value, for example, the pitching angular velocity detection value output from the gyro sensor 250 immediately after the vertical synchronization signal of the target frame period is generated is adopted.
  • In step s32, the estimation unit 300 determines the estimation timing of the motion estimation process. Then, in step s33, the estimation unit 300 executes the motion estimation process of estimating the motion of the electronic device 1 at the estimation timing determined in step s32, using the target angular velocity detection value as the latest pitching angular velocity detection value.
  • The estimation timing is determined in the same manner as in FIG. 11 described above.
  • the output timing tm3 is the timing at which the gyro sensor 250 outputs the target angular velocity detection value.
  • the estimation unit 300 obtains the time T11 from the output timing tm3 of the target angular velocity detection value to the vertical synchronization signal generation start timing tm2 in the next frame period.
  • the estimation unit 300 adds the obtained time T11 to the time T12 from the generation start timing tm2 to the display timing tm4 to obtain the time T13 from the output timing tm3 to the display timing tm4.
  • The estimation unit 300 sets the obtained time T13 as the time T0 that defines the estimation timing.
  • Thereby, the estimation timing becomes the timing at which the target superimposed image is actually displayed on the display unit 120. A minimal sketch of this timing computation is shown below.
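  • The sketch below expresses, in Python, the computation of the look-ahead time T0 from the timings described above. The function name is an illustrative assumption, and all timings are assumed to be in seconds.

      def estimation_horizon_T0(tm3, tm2, T12):
          # tm3: output timing of the target angular velocity detection value
          # tm2: generation start timing of the vertical synchronization
          #      signal of the next frame period
          # T12: time from tm2 to the display timing tm4
          T11 = tm2 - tm3   # time from the output timing tm3 to tm2
          T13 = T11 + T12   # time from the output timing tm3 to tm4
          return T13        # used as the time T0 of the estimation timing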
  • In step s33, the estimation unit 300 uses the target angular velocity detection value as the latest angular velocity detection value and performs a motion estimation process similar to that in FIG. 7 described above.
  • the estimated values of the angular velocity, the rotation angle, and the total rotation angle at the estimation timing are obtained. That is, the estimated values of the angular velocity, the rotation angle, and the total rotation angle at the timing when the target superimposed image is displayed can be obtained.
  • In step s34, the display control unit 500 determines the display position of the target superimposed image using the estimated value of the rotation angle obtained in step s33 as the target rotation angle estimated value.
  • Specifically, in step s34, the display control unit 500 sets the estimated value of the pitching rotation angle at the estimation timing, output from the second filter 320, as the target rotation angle estimated value. Then, the display control unit 500 determines the display position of the target superimposed image based on the target rotation angle estimated value.
  • More specifically, in step s34, the display control unit 500 estimates, based on the target rotation angle estimated value, the amount and the direction of the displacement of the position of the superimposed image with respect to the earth. Then, the display control unit 500 shifts the display position of the superimposed image displayed in the next frame period by the estimated displacement amount in the direction opposite to the estimated displacement direction. As a result, the position with respect to the earth of the target superimposed image displayed in the next frame period is less likely to deviate.
  • The control unit 100 executes the above-described processing of steps s31 to s34 with each frame period as the target frame period, so that the position with respect to the earth of the superimposed image displayed on the display unit 120 at the predetermined frame rate is less likely to move. As a result, the visibility of the superimposed image superimposed on the actual landscape improves. The per-frame control flow is sketched below.
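  • The sketch below strings steps s31 to s34 together, in Python, for one target frame period. Here gyro, estimator, and display are hypothetical interfaces standing in for the gyro sensor 250, the estimation unit 300, and the display control unit 500; only the control flow follows the description.

      def run_target_frame(gyro, estimator, display, T12):
          # s31: acquire the target angular velocity detection value.
          omega, tm3 = gyro.latest_pitch_rate()
          # s32: determine the estimation timing (look-ahead time T0 = T13).
          tm2 = display.next_vsync_start()
          T0 = (tm2 - tm3) + T12
          # s33: estimate the pitching rotation angle at the estimation timing.
          pitch_angle = estimator.estimate_rotation_angle(omega, horizon=T0)
          # s34: determine the display position of the target superimposed
          # image from the estimated angle, shifting it opposite to the
          # estimated displacement.
          display.shift_superimposed_image(pitch_angle)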
  • the electronic device 1 of this example reduces the possibility of such a situation occurring.
  • In the prediction model 360, the predicted probability distribution b(t) of the angular velocity at a certain timing is represented by the extrapolation formula of Expression (8), using the estimated probability distributions b(t-1) and b(t-2) of the angular velocity at timings earlier than that timing. Further, in the prediction model 360, the predicted probability distribution c(t) of the rotation angle at a certain timing is represented, as shown in Expression (9), by a prediction formula using the estimated probability distribution b(t-1) of the angular velocity at the timing earlier than that timing.
  • Therefore, a specific process is performed in which the estimated probability distribution of the angular velocity at a certain timing is used as the estimated probability distribution of the angular velocity at the preceding timing in the extrapolation formula of Expression (8) to obtain the predicted probability distribution of the angular velocity at the following timing, and the estimated probability distribution of the angular velocity at that timing is generated based on Bayesian estimation using the obtained predicted probability distribution.
  • The estimated probability distribution of the angular velocity obtained by repeatedly executing this specific process is used as the estimated probability distribution of the angular velocity at the preceding timing in the prediction formula of Expression (9). Thereby, the predicted probability distribution of the rotation angle at the timing when the target superimposed image is displayed is obtained.
  • Then, the estimated value of the rotation angle at the timing at which the target superimposed image is displayed is generated based on Bayesian estimation using the predicted probability distribution of the rotation angle at that timing. A minimal sketch of this repeated prediction is shown below.
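  • The sketch below expresses, in Python, the repeated predict-then-update loop. The callables extrapolate and bayes_update are hypothetical stand-ins for Expression (8) and the Bayesian estimation step, and the (mean, variance) pairs are an assumed Gaussian representation of the probability distributions.

      def roll_forward(b_prev2, b_prev1, n_steps, extrapolate, bayes_update):
          # Repeat the specific process: extrapolate the angular-velocity
          # distribution one step ahead (Expression (8)), then refine it by
          # Bayesian estimation, until the display timing is reached.
          for _ in range(n_steps):
              predicted = extrapolate(b_prev1, b_prev2)
              estimated = bayes_update(predicted)
              b_prev2, b_prev1 = b_prev1, estimated
          return b_prev1  # fed into the prediction formula of Expression (9)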
  • In the gyro sensor 250, it may take some time from the detection of the angular velocity to the output of the detection result, due to an internal filter function or the like.
  • the time from when the gyro sensor 250 detects the angular velocity to when the detection result is output may be referred to as a gyro sensor delay time.
  • Therefore, strictly speaking, the estimated value of the angular velocity at the timing T0 later than the present time obtained in step s33 (that is, the estimated value of the angular velocity at the estimation timing) is the estimated value of the angular velocity at the timing T0 after the timing at which the gyro sensor 250 detected the latest angular velocity.
  • If the gyro sensor delay time is zero, then, with the time T13 from the output timing tm3 to the display timing tm4 set to T0 as described above, the estimation timing matches the timing at which the target superimposed image is displayed.
  • If the time T13 is set to T0 when the gyro sensor delay time is not zero, however, the estimation timing falls slightly before the timing at which the target superimposed image is displayed. In that case, the estimated value of the rotation angle at the estimation timing is the estimated value of the rotation angle slightly before the timing at which the target superimposed image is displayed.
  • the estimation unit 300 may determine the estimation timing in consideration of the gyro sensor delay time, as in the case of FIG. 12 described above.
  • the timing tm5 is the timing at which the gyro sensor 250 detects the angular velocity.
  • Specifically, the estimation unit 300 may set, as the estimation timing T0, the time T15 obtained by adding, to the time T13, the time T14 from the timing tm5 to the output timing tm3 of the detection result (that is, the target angular velocity detection value). Thereby, the estimation unit 300 can more accurately estimate the rotation angle at the display timing of the target superimposed image. This delay compensation is sketched below.
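  • A trivial sketch of the delay-compensated look-ahead time, in Python; the function name is an illustrative assumption and the times are assumed to be in seconds.

      def estimation_horizon_with_delay(T13, T14):
          # Add the gyro sensor delay time T14 (from the detection timing tm5
          # to the output timing tm3) to T13, following the variant of FIG. 12.
          return T13 + T14  # T15, used as the estimation timing T0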
  • Alternatively, the estimation unit 300 may use the estimated value of the rotation angle at the timing (T × (M − 1)) after the present as the estimated value of the rotation angle at the estimation timing, and may use that estimated value as the estimated value of the rotation angle at the display timing of the target superimposed image.
  • the electronic device 1 was used as a head-up display, but it may be used as another device.
  • the electronic device 1 may be used as a head mounted display for virtual reality (VR) or augmented reality.
  • the electronic device 1 may estimate the movement during rotational vibration and control the display position of the image to be displayed based on the estimation result, as described above.
  • the head mounted display may be a goggle type, an eyeglass type, or another shape.
  • the motion estimation device 400 may estimate the motion of a device that is performing translational vibration and rotational vibration.
  • In this case, the estimation unit 300 may generate, in the motion estimation process, at least one of the acceleration estimated value, the velocity estimated value, the displacement estimated value, and the total displacement estimated value, and at least one of the angular velocity estimated value, the rotation angle estimated value, and the total rotation angle estimated value. For example, consider a case where the acceleration estimated value, the velocity estimated value, the displacement estimated value, the total displacement estimated value, the angular velocity estimated value, the rotation angle estimated value, and the total rotation angle estimated value are all generated in the motion estimation process.
  • the state vector x related to Expression (1) includes acceleration, velocity, displacement, total displacement, angular velocity, rotation angle, and total rotation angle as variables.
  • the observation vector z includes acceleration, velocity, displacement, total displacement, angular velocity, rotation angle, and total rotation angle as variables.
  • In this case, the observation model 350 is an at most seven-dimensional probability distribution that represents the acceleration, velocity, displacement, total displacement, angular velocity, rotation angle, and total rotation angle of the electronic device 1.
  • The estimation unit 300 sets the acceleration, velocity, displacement, total displacement, angular velocity, rotation angle, and total rotation angle components of the estimated state vector as the acceleration estimated value, the velocity estimated value, the displacement estimated value, the total displacement estimated value, the angular velocity estimated value, the rotation angle estimated value, and the total rotation angle estimated value, respectively. One possible layout of such a state vector is sketched below.
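  • The sketch below shows, in Python, one possible layout of the seven-component state vector x of Expression (1). The component ordering and the helper names are illustrative assumptions.

      import numpy as np

      # Assumed component ordering of the state vector x of Expression (1).
      STATE_NAMES = ("acceleration", "velocity", "displacement",
                     "total_displacement", "angular_velocity",
                     "rotation_angle", "total_rotation_angle")

      def unpack_estimates(x_est):
          # Split an estimated 7-component state vector into the named
          # estimated values generated by the estimation unit 300.
          x_est = np.asarray(x_est, dtype=float).ravel()
          assert x_est.size == len(STATE_NAMES)
          return dict(zip(STATE_NAMES, x_est))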
  • The motion estimation device 400 does not have to include at least one of the first filter 310 and the second filter 320. Further, the observation model 350 is not limited to the above example.
  • The prediction model 360 is also not limited to the above example. For example, at least one of Expressions (4) to (6), (9), and (10) may not include a prediction error. Further, the method of using the motion estimation result in the motion estimation device 400 is not limited to the above example.
  • the motion estimation device 400 may not be included in the electronic device 1. Further, the motion estimation device 400 may estimate the motion of a device other than the electronic device 1.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed is a motion estimation device for estimating the motion of a device during vibration, provided with an estimation unit. The estimation unit estimates the motion of the device during vibration based on Bayesian inference using a prediction model representing the motion of the device, a detection value from a sensor for detecting the acceleration or rotational angular velocity of the device, and an observation model that represents statistical prior knowledge relating to vibration of the device and in which the detection value is used.
PCT/JP2019/040097 2018-10-12 2019-10-10 Motion estimation device, electronic instrument, control program, and motion estimation method WO2020075825A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019556289A JP6621167B1 (ja) 2018-10-12 2019-10-10 Motion estimation device, electronic device, control program, and motion estimation method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-193363 2018-10-12
JP2018193363 2018-10-12

Publications (1)

Publication Number Publication Date
WO2020075825A1 true WO2020075825A1 (fr) 2020-04-16

Family

ID=70163822

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/040097 WO2020075825A1 (fr) 2018-10-12 2019-10-10 Motion estimation device, electronic instrument, control program, and motion estimation method

Country Status (1)

Country Link
WO (1) WO2020075825A1 (fr)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014515101A (ja) * 2011-03-31 2014-06-26 クアルコム,インコーポレイテッド Devices, methods, and apparatuses for inferring a position of a mobile device
JP5688326B2 (ja) * 2011-05-13 2015-03-25 Kddi株式会社 Portable device, program, and method for estimating vertical movement state using a barometric pressure sensor
US8892391B2 (en) * 2011-06-03 2014-11-18 Apple Inc. Activity detection
JP2016109608A (ja) * 2014-12-09 2016-06-20 ヤマハ株式会社 Attitude estimation device and control program for attitude estimation device
JP2018136400A (ja) * 2017-02-21 2018-08-30 京セラ株式会社 Display device, display method, control device, and vehicle

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7053087B1 (ja) * 2020-10-30 2022-04-12 株式会社スマートドライブ Moving body behavior information acquisition method, moving body behavior information acquisition device, and program
WO2022091650A1 (fr) * 2020-10-30 2022-05-05 株式会社スマートドライブ Moving body behavior information acquisition method, moving body behavior information acquisition device, and program
JP7318995B1 (ja) 2022-03-24 2023-08-01 株式会社スマートドライブ Moving body behavior information acquisition method, moving body behavior information acquisition device, and program
JP2023141437A (ja) 2022-03-24 2023-10-05 株式会社スマートドライブ Moving body behavior information acquisition method, moving body behavior information acquisition device, and program

Similar Documents

Publication Publication Date Title
WO2020075825A1 (fr) Motion estimation device, electronic instrument, control program, and motion estimation method
KR102208329B1 (ko) Image processing device, image processing method, computer program, and image display system
WO2019203189A1 (fr) Program, information processing device, and information processing method
US10466775B2 (en) Method and apparatus for changing a field of view without synchronization with movement of a head-mounted display
US10940384B2 (en) Inciting user action for motion sensor calibration
US20160077166A1 (en) Systems and methods for orientation prediction
CN107438812B (zh) Information processing device, information processing method, and program
US20150286279A1 (en) Systems and methods for guiding a user during calibration of a sensor
JP2017073753A (ja) Correction method, program, and electronic device
JP7182020B2 (ja) Information processing method, device, electronic device, storage medium, and program
WO2020049847A1 (fr) Estimation device, learning device, estimation method, learning method, and program
JP2017147682A (ja) Omnidirectional imaging system
JP2014182612A (ja) Information display device, method, and program
US9891446B2 (en) Imaging apparatus and image blur correction method
JP6621167B1 (ja) Motion estimation device, electronic device, control program, and motion estimation method
CN108196701B (zh) Attitude determination method and device, and VR equipment
CN118302806A (zh) Augmented reality using a split architecture
WO2018155127A1 (fr) Display device, display method, control device, and vehicle
JP2010205214A (ja) Control device and head-mounted display device
EP4075786A1 (fr) Image processing device, system, image processing method, and image processing program
WO2018155128A1 (fr) Display device, control device, and vehicle
JP2013140223A (ja) Information display device, control method for information display device, control program, and computer-readable recording medium recording the control program
WO2018155123A1 (fr) Display device, display method, control device, and vehicle
WO2018155122A1 (fr) Electronic device, control apparatus, vehicle, control program, and method for operating electronic device
WO2019167214A1 (fr) Estimation device, estimation method, and program

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019556289

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19870389

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19870389

Country of ref document: EP

Kind code of ref document: A1