EP3832434A1 - Behavior-based configuration method and behavior-based configuration system - Google Patents

Behavior-based configuration method and behavior-based configuration system

Info

Publication number
EP3832434A1
Authority
EP
European Patent Office
Prior art keywords
motion sensing
motion
body portion
sensing apparatus
human body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP19213155.5A
Other languages
English (en)
French (fr)
Inventor
Ching-Ning Huang
Hua-Lun Lu
Yi-Kang Hsieh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
XRspace Co Ltd
Original Assignee
XRspace Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by XRspace Co Ltd filed Critical XRspace Co Ltd
Priority to EP19213155.5A priority Critical patent/EP3832434A1/de
Publication of EP3832434A1 publication Critical patent/EP3832434A1/de
Withdrawn legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

Definitions

  • the present disclosure generally relates to a method for configuration, in particular, to a behavior-based configuration method and a behavior-based configuration system.
  • the motion of the user may be detected, to directly operate the electronic apparatus according to the motion of the user.
  • some electronic apparatuses may allow the human body portion (such as a hand, a leg, a head, etc.) of the user to control the operation of these electronic apparatuses.
  • a handheld controller or other wearable motion sensing apparatuses with a motion sensor may be provided for sensing the human body portion of the user.
  • these motion sensing apparatuses are designed for a specific human body portion.
  • a handheld controller is designed for the right hand of the user, and another handheld controller is designed for the left hand.
  • This limitation to a specific human body portion is not intuitive for the user, who first has to identify which handheld controller is adapted for his/her operating hand.
  • the present disclosure is directed to a behavior-based configuration method and a behavior-based configuration system, in which the operating mode can be configured based on the behavior of the user.
  • a behavior-based configuration method includes, but is not limited to, the following steps. Whether a motion sensing apparatus is activated is determined based on first motion sensing data from the motion sensing apparatus. Second motion sensing data is analyzed to determine which human body portion of a user is acted with the motion sensing apparatus in response to the motion sensing apparatus being activated. The second motion sensing data is related to the human body portion acted with the motion sensing apparatus. A first operating mode for a first human body portion acted with the motion sensing apparatus is configured based on the analyzed result of the second motion sensing data in a first time period. A second operating mode for a second human body portion acted with the motion sensing apparatus is configured based on the analyzed result of the second motion sensing data in a second time period.
  • a behavior-based configuration system includes, but is not limited to, a motion sensing apparatus and a processor.
  • the processor determines whether the motion sensing apparatus is activated based on first motion sensing data from the motion sensing apparatus, analyzes second motion sensing data to determine which human body portion of a user is acted with the motion sensing apparatus in response to the motion sensing apparatus being activated, configures a first operating mode for a first human body portion acted with the motion sensing apparatus based on the analyzed result of the second motion sensing data in a first time period, and configures a second operating mode for a second human body portion acted with the motion sensing apparatus based on the analyzed result of the second motion sensing data in a second time period.
  • the second motion sensing data is related to the human body portion acted with the motion sensing apparatus.
  • the motion of the human body portion of the user would be tracked, and the motion of the human body portion would be analyzed to identify which human body portion is acted with the motion sensing apparatus. Then, different operating modes could be configured for different human body portions according to the analyzed result, respectively. Accordingly, a flexible and convenient way to operate an electronic apparatus is provided.
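As a rough illustration of this flow, the following Python sketch shows one way the activation check, body-portion identification, and per-period mode configuration could be wired together; the class name, threshold, and string labels are illustrative assumptions and are not taken from the disclosure.

```python
# Hypothetical sketch of the disclosed flow; names, thresholds, and labels are
# illustrative assumptions, not part of the patent.
from dataclasses import dataclass, field


@dataclass
class BehaviorBasedConfigurator:
    activation_threshold: float = 0.5            # assumed variation threshold
    mode_by_period: dict = field(default_factory=dict)

    def is_activated(self, first_motion_data: list) -> bool:
        # The apparatus counts as activated when its sensed values vary enough
        # between time points (one possible reading of the activation check).
        variation = max(first_motion_data) - min(first_motion_data)
        return variation > self.activation_threshold

    def identify_body_portion(self, second_motion_data: dict) -> str:
        # Placeholder for the analysis of the second motion sensing data,
        # e.g. based on which side of the camera image the moving portion occupies.
        return "right hand" if second_motion_data.get("image_side") == "right" else "left hand"

    def configure(self, time_period: int, body_portion: str) -> str:
        # Bind an operating mode to the identified body portion for this time period.
        mode = f"{body_portion} operating mode"
        self.mode_by_period[time_period] = mode
        return mode
```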
  • FIG. 1 is a block diagram illustrating a behavior-based configuration system 100 according to one of the exemplary embodiments of the disclosure.
  • the behavior-based configuration system 100 includes, but is not limited to, one or more motion sensing apparatuses 110, memory 130, and processor 150.
  • the behavior-based configuration system 100 can be adapted for VR, AR, MR, XR or other reality-related technology.
  • the behavior-based configuration system 100 can be adapted for operating an external apparatus (such as a computer, a game player, a smartphone, an in-dash system, a smart appliance, etc.).
  • the motion sensing apparatus 110 could be a handheld controller or a wearable apparatus, such as a wearable controller, a smartwatch, an ankle sensor, a waist belt, or the like.
  • each motion sensing apparatus 110 is wearable on one human body portion of the user.
  • the human body portion may be left or right hand, a head, left or right ankle, left or right leg, a waist, or other portions.
  • the motion sensing apparatus 110 includes a motion sensor.
  • the motion sensor could be an accelerometer, a gyroscope, a magnetometer, a laser sensor, an inertial measurement unit (IMU), an infrared ray (IR) sensor, or any combination of aforementioned motion sensors.
  • the motion sensor is used for sensing its own motion, and it is acted with the human body portion on which it is placed. For example, the motion sensor detects its own position in 3-dimensional space and its own rotation.
  • the human body portion of the user may hold, wear, or carry the motion sensing apparatus 110, so that the motion sensor is acted with the human body portion. Therefore, the motion of the motion sensor may represent the motion of the human body portion.
  • the behavior-based configuration system 100 may further include one or more motion sensing apparatuses 120.
  • the motion sensing apparatus 120 could be a head-mounted display (HMD), a smartphone, a camera, a laptop, a positioning apparatus, or the like.
  • the motion sensing apparatus 120 includes an image sensor.
  • the image sensor may be a camera, such as a monochrome camera or a color camera, a depth camera, a video recorder, or another image sensor capable of capturing images.
  • the image sensor may be used to capture images toward one or more human body portions of the user, to generate images including the one or more human body portions of the user.
  • Memory 130 may be any type of fixed or movable Random-Access Memory (RAM), Read-Only Memory (ROM), flash memory, a similar device, or a combination of the above devices.
  • the memory 130 can be used to store program codes, device configurations, buffer data or permanent data (such as motion sensing data, images, motion sensing result, configurations, etc.), and these data would be introduced later.
  • the processor 150 is coupled to the memory 130, and the processor 150 is configured to load the program codes stored in the memory 130, to perform a procedure of the exemplary embodiment of the disclosure.
  • functions of the processor 150 may be implemented by using a programmable unit such as a central processing unit (CPU), a microprocessor, a microcontroller, a digital signal processing (DSP) chip, a field programmable gate array (FPGA), etc.
  • the functions of the processor 150 may also be implemented by an independent electronic device or an integrated circuit (IC), and operations of the processor 150 may also be implemented by software.
  • the processor 150 may or may not be disposed with the motion sensing apparatuses 110 and 120.
  • motion sensing apparatuses 110 and 120 and the processor 150 may further include or be connected with communication transceivers using a compatible communication technology, such as Bluetooth, Wi-Fi, IR, or a physical transmission line, to transmit/receive data with each other.
  • FIG. 2 is a schematic diagram illustrating a behavior-based configuration system 200 according to one of the exemplary embodiments of the disclosure.
  • the behavior-based configuration system 200 includes a motion sensing apparatus 110 (which is a handheld controller) and a motion sensing apparatus 120 (which is an HMD).
  • a stereo camera 121 (i.e., the image sensor) and the processor 150 are embedded in the HMD, and the stereo camera 121 may be configured to capture camera images toward the operating portion B1 (i.e., the left hand of the user) and the operating portion B2 (i.e., the right hand of the user).
  • an IMU 111 (i.e., the motion sensor) is embedded in the handheld controller to obtain the motion sensing result of the operating portion B2.
  • the behavior-based configuration system 100 or 200 further includes two ankle sensors and a waist belt.
  • the number of the motion sensing apparatuses 110 is not limited thereto.
  • FIG. 3 is a flowchart illustrating a behavior-based configuration method according to one of the exemplary embodiments of the disclosure.
  • the processor 150 determines whether the motion sensing apparatus 110 is activated based on first motion sensing data from the motion sensing apparatus 110 (step S310). Specifically, the user may hold, wear, or carry the motion sensing apparatus 110. However, the motion sensing apparatus 110 may also be placed at any place without being acted with the human body portion of the user.
  • the motion sensor of the motion sensing apparatus 110 may sense the motion of a corresponding human body portion of the user, which carries a motion sensing apparatus 110, for a time period, and the processor 150 may generate a sequence of first motion sensing data from the motion sensing result (e.g., sensed strength values, degree, etc.) of the motion sensor at multiple time points within the time period.
  • the first motion sensing data includes 3-degree-of-freedom (3-DoF) data, and the 3-DoF data are related to the rotation information of the human body portion in three-dimensional (3D) space, such as accelerations in yaw, roll, and pitch.
  • the first motion sensing data includes a relative position and/or displacement of a human body portion in the 2D/3D space.
  • the processor 150 may determine the motion of the motion sensing apparatus 110 based on the motion sensing result of the motion sensor, to determine whether the motion sensing apparatus 110 is activated. For example, the processor 150 may check whether the motion sensing apparatus 110 is still.
  • the variation of the first motion sensing data obtained from the motion sensor of the motion sensing apparatus 110 at different time points may be determined. If the value of the variation between two time points is larger than a predefined threshold, the processor 150 may determine that the motion sensing apparatus 110 is moving and activated; otherwise, the processor 150 may determine that the motion sensing apparatus 110 is not activated.
  • alternatively, the processor 150 may compare the displacement and/or rotation situation of the motion sensing apparatus 110 based on the first motion sensing data with one or more predefined trajectories and/or rotations. If the first motion sensing data meets the predefined trajectories and/or rotations, the processor 150 may determine that the motion sensing apparatus 110 is activated; otherwise, the processor 150 may determine that the motion sensing apparatus 110 is not activated.
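A minimal sketch of the variation-based activation check described above, assuming the first motion sensing data is a series of sensed values over time; the threshold value and data layout are arbitrary placeholders.

```python
import numpy as np


def is_activated(first_motion_data, threshold: float = 0.2) -> bool:
    """Return True when the variation of the first motion sensing data between
    successive time points exceeds a predefined threshold (sketch only; the
    threshold and data layout are assumptions, not from the disclosure)."""
    variation = np.abs(np.diff(np.asarray(first_motion_data, dtype=float), axis=0))
    return bool(np.any(variation > threshold))
```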
  • the processor 150 may analyze the second motion sensing data to determine which human body portion of the user is acted with the motion sensing apparatus 110 (step S330). Specifically, the second motion sensing data is related to the motion of the human body portion. In one embodiment, the second motion sensing data is obtained from the motion sensing apparatus 120, and the processor 150 generates the second motion sensing data based on images captured by the image sensor of the motion sensing apparatus 120. In one embodiment, the processor 150 may detect whether one or more human body portions are present in the image. In some embodiments, the human body portion in the image would be identified through a machine learning technology (such as deep learning, artificial neural network (ANN), or support vector machine (SVM), etc.). In another embodiment, the human body portion may be identified through other object identification technologies, such as a binary classifier, adaptive boosting (Adaboost), etc.
  • the processor 150 may generate the second motion sensing data according to the motion of the human body portion in the image.
  • the sensing strength and the pixel position corresponding to the human body portion in the image can be used for estimating depth information of the human body portion (i.e., a distance relative to the motion sensing apparatus 120 or another reference apparatus) and for estimating the 2D position of the human body portion on a plane parallel to the motion sensing apparatus 120.
  • the processor 150 can generate a 3D position in a predefined coordinate system according to the distance and the 2D position of the human body portion.
  • the processor 150 may further estimate the displacement and the rotation data of the human body portion according to multiple positions at different time points, so as to generate 6-degree-of-freedom (6-DoF) data (which would be considered as the second motion sensing data).
  • in other embodiments, 3-DoF data, or a relative position and/or displacement of the human body portion in the 2D/3D space, could be the second motion sensing data.
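The back-projection step described above could look roughly like the following, assuming a pinhole camera model with known intrinsics (fx, fy, cx, cy are assumptions, not from the disclosure); the displacement estimated between time points would then feed the 6-DoF second motion sensing data.

```python
import numpy as np


def pixel_to_3d(pixel_xy, depth, fx, fy, cx, cy):
    """Back-project a detected body portion from its 2D pixel position and an
    estimated depth into a 3D position in the camera coordinate system
    (pinhole-camera sketch; intrinsics are assumed)."""
    u, v = pixel_xy
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth], dtype=float)


def displacement_between(positions, timestamps):
    """Estimate displacement and average velocity of the body portion between
    the first and last tracked time points, as one ingredient of 6-DoF data."""
    positions = np.asarray(positions, dtype=float)
    displacement = positions[-1] - positions[0]
    elapsed = float(timestamps[-1] - timestamps[0])
    velocity = displacement / elapsed if elapsed > 0 else np.zeros(3)
    return displacement, velocity
```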
  • the processor 150 may further identify the gesture of the hand in the image, or identify whether the motion sensing apparatus 110 exists in the image.
  • the second motion sensing data is obtained from the motion sensing apparatus 110, and the processor 150 generates the second motion sensing data based on motion sensing result of the motion sensor of the motion sensing apparatus 110.
  • the generation of the second motion sensing data can refer to the generation of the first motion sensing data, and its detailed description would be omitted.
  • the second motion sensing data is obtained from the motion sensing apparatus 110 and the motion sensing apparatus 120, and the processor 150 generates the second motion sensing data based on both motion sensing result of the motion sensor of the motion sensing apparatus 110 and images captured by the image sensor of the motion sensing apparatus 120.
  • the image could be used for estimating the position of the human body portion
  • the motion sensing result could be used for estimating the rotation situation of the human body portion.
  • both the image and the motion sensing result can be used for determining the position of the human body portion.
  • the second motion sensing data may record the position and rotation data based on the motion sensing result and the position and rotation data based on the image, respectively.
  • the processor 150 may determine whether the second motion sensing data meets a condition to generate the analyzed result.
  • the condition is related to the motion of the human body portion detected based on the second motion sensing data. It is assumed that the behavior of the user can be used to estimate which human body portion is carrying/wearing/holding the motion sensing apparatus 110. For example, when the user holds a handheld controller, the user may lift his arm. For another example, when the user wears ankle sensors, the user may try to walk. On the other hand, a human has a pair of hands, arms, legs, and feet. Sometimes, the displacement and the rotation may be different for the two human body portions in each aforementioned human body portion pair and can be used to estimate which side of the pair is acted with the motion sensing apparatus 110.
  • the condition is related to the motion of the human body portion present in the images obtained from the motion sensing apparatus 120. It is assumed that the user may move the human body portion which carries the motion sensing apparatus 110. In some embodiments, each image may be divided into two or more areas, and the area where the human body portion exists can be used to determine which human body portion is moving. For example, when the user raises the right hand, the right hand may exist on the right side of an image.
  • the trajectory of the human body portion in the image can be used to determine which human body portion is moving. For example, when the user walks, the knee of the user may move from the bottom to the middle of the image, so as to determine that a leg is moving.
  • the gesture of the user's hand in the image can be used to determine which human body portion uses the motion sensing apparatus 110.
  • for example, the fingertip of the thumb faces toward the right side in the image, so as to determine which hand holds the motion sensing apparatus 110.
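A toy heuristic for the image-area condition mentioned a few paragraphs above; splitting the image into left/right halves and the hand labels are illustrative assumptions only, not the claimed method.

```python
def guess_moving_hand(bbox_center_x: float, image_width: int) -> str:
    """If the moving body portion is detected in the right half of the image,
    guess the right hand, otherwise the left hand (illustrative heuristic)."""
    return "right hand" if bbox_center_x > image_width / 2 else "left hand"
```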
  • the condition is related to the motion of the human body portion detected in the motion sensing result.
  • the displacement and the rotation of the human body portion based on the motion sensing result can be used to determine which human body portion is moving. For example, a wave motion is detected, so as to determine that a hand performs the wave motion.
  • the human body portion rotates horizontally, so as to determine that the user twists the waist.
  • the position of the human body portion based on the motion sensing result can be used to determine which human body portion carries the motion sensing apparatus 110.
  • the processor 150 may estimate that the human body portion is the left hand.
  • the condition is related to the motion of the human body portion detected in both the motion sensing result and the images.
  • the displacement, the position and/or the rotation of the human body portion can be determined based on the combination of the motion sensing result and the images, and the displacement, the position and/or the rotation can be used to estimate which human body portion is moving or wears/carries/holds the motion sensing apparatus 110, as mentioned above.
  • the processor 150 may identify the human body portion in the image and determine whether the motion of the human body portion is identical in both the motion sensing result and the images. The displacement, the position and/or the rotation in both the motion sensing result and the images may be compared. If the compared result is identical, the processor 150 determines that the condition is met and determines that the identified human body portion is acted with the motion sensing apparatus 110. On the other hand, if the compared result is not identical, the processor 150 determines that the condition is not met.
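One way to phrase that consistency check, as a hedged sketch: a body portion identified in the images is treated as acted with the motion sensing apparatus 110 when its image-based trajectory and the trajectory derived from the motion sensing result (near-)coincide; the tolerance value is an arbitrary assumption.

```python
import numpy as np


def condition_met(imu_positions, image_positions, tolerance: float = 0.05) -> bool:
    """Compare the trajectory derived from the motion sensing result with the
    trajectory of the identified body portion in the images; if they match
    within a tolerance, the condition is considered met (sketch only)."""
    imu_positions = np.asarray(imu_positions, dtype=float)
    image_positions = np.asarray(image_positions, dtype=float)
    if imu_positions.shape != image_positions.shape:
        return False
    # mean Euclidean distance between corresponding positions over time
    error = np.linalg.norm(imu_positions - image_positions, axis=-1).mean()
    return error < tolerance
```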
  • FIG. 4 is a schematic diagram illustrating a motion tracking method according to one of the exemplary embodiments of the disclosure.
  • the human body portions B1 and B2 exist in the field of view FOV of the image sensor of the motion sensing apparatus 120.
  • for the human body portion B1, there is no position data based on the motion sensing result obtained from the motion sensing apparatus 110 that is identical to the position data based on the image.
  • for the human body portion B2, the position data based on the motion sensing result obtained from the motion sensing apparatus 110 is identical to the position data based on the image.
  • the processor 150 may therefore determine that the human body portion B2 (i.e., the right hand) holds the motion sensing apparatus 110.
  • the processor 150 may use the third motion sensing data obtained from another motion sensing apparatus (which can be the motion sensing apparatus 120 or other motion sensing apparatuses different from the motion sensing apparatus 110) to sense the motion of the human body portion. It is assumed that the motion sensing apparatus 110 is not used by the user, so that the motion sensing result obtained from the motion sensing apparatus 110 is not reliable, and other motion sensing data would be needed.
  • the third motion sensing data may be generated based on the images or other data.
  • the processor 150 may use another motion sensing data from the image sensor of the HMD to determine the motion of the user's hand.
  • the processor 150 may configure a first operating mode for the first human body portion acted with the motion sensing apparatus 110 based on the analyzed result of the second motion sensing data in a first time period (step S350). Specifically, the analyzed result is related to the human body portion acted with the motion sensing apparatus 110.
  • the processor 150 may configure a first operating mode of the motion sensing apparatus 110 for the determined human body portion, which is the first human body portion. In one embodiment, the first operating mode is related to the right side or the left side of the human body portion pair. For example, the processor 150 configures the right hand operating mode or the left hand operating mode for the handheld controller.
  • the first operating mode could be related to a scenario, a command, a motion sensing mechanism, etc.
  • the first operating mode may be used for the motion sensing apparatus 120 or other external apparatuses.
  • the first time period is a duration when the first operating mode is configured. In some embodiments, the first time period may be ended if the first human body portion is not acted with the motion sensing apparatus 110.
  • the processor 150 may configure a second operating mode for the second human body portion acted with the motion sensing apparatus 110 based on the analyzed result of the second motion sensing data in a second time period (step S370). Similar to step S350, the processor 150 may configure a second operating mode of the motion sensing apparatus 110 for the determined human body portion, which is the second human body portion.
  • the second operating mode may be the same as or different from the first operating mode. For example, the first operating mode is related to the right side of the human body portion pair, and the second operating mode is related to the left side of the human body portion pair.
  • the first and second operating modes are both the operating mode for a user interface.
  • the second operating mode also could be related to a scenario, a command, a motion sensing mechanism, etc., and the second operating mode may be used for the motion sensing apparatus 120 or other external apparatuses.
  • the second time period is a duration when the second operating mode is configured. In some embodiments, the second time period may be ended if the second human body portion is not acted with the motion sensing apparatus 110. It should be noted that the second time period may or may not overlap with the first time period.
  • the motion sensing data obtained from the motion sensing apparatus 110 or 120 could be used to control the motion of a corresponding body portion of an avatar.
  • the motion sensing data is related to the left leg being raised in the real world, and the left leg of the avatar may be raised accordingly in the virtual world.
  • the processor 150 may move a first body portion of an avatar corresponding to the first human body portion of the user in the first operating mode, and move a second body portion of the avatar corresponding to the second human body portion of the user in the second operating mode.
  • the body portion of the avatar could be a hand, a head, left or right ankle, left or right leg, a waist, or other portions.
  • the first body portion and the first human body portion correspond to the left hand of the user, and the second body portion and the second human body portion correspond to the right hand of the user.
  • the motion information of the left hand of the avatar may be generated according to the motion of the left hand of the user with the left-hand mode of a handheld controller, and the motion information of the right hand of the avatar may be generated according to the motion of the right hand of the user with the right-hand mode of the same handheld controller in different time periods.
  • the configuration of the motion sensing apparatus 110 can be set automatically based on the detected behavior of the user. For example, when the user holds a handheld controller by the right hand and waves the right hand in the real world, the processor 150 may configure the right hand operating mode for the handheld controller and an avatar of the user may wave its right hand in the virtual world. And then, when the user holds the handheld controller by the left hand and waves the left hand in the real world, the same handheld controller may be switched to the left hand operating mode and the avatar of the user may wave its left hand in the virtual world.
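The hand-over example above could be sketched as follows; the mode strings, avatar structure, and function name are assumptions for illustration only.

```python
# Illustrative only: the analyzed result selects the controller's operating mode
# and drives the matching body portion of the avatar.
avatar = {"left hand": "idle", "right hand": "idle"}


def apply_motion(holding_hand: str, motion: str) -> str:
    """Switch the controller to the operating mode of the hand that currently
    holds it and mirror the detected motion on the avatar's same body portion."""
    mode = f"{holding_hand} operating mode"
    avatar[holding_hand] = motion
    return mode


# First time period: the right hand holds the controller and waves.
apply_motion("right hand", "wave")   # avatar's right hand waves
# Second time period: the controller changes hands and the left hand waves.
apply_motion("left hand", "wave")    # avatar's left hand waves
```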
  • the exemplary embodiments described above depicted a behavior-based configuration method and a behavior-based configuration system, which may analyze the motion sensing data to determine the motion of the human body portion, and further determine which human body portion is acted with the motion sensing apparatus. Then, the operating mode for the determined human body portion can be configured. Accordingly, it is convenient for the user to operate with the motion sensing apparatus.
EP19213155.5A 2019-12-03 2019-12-03 Behavior-based configuration method and behavior-based configuration system Withdrawn EP3832434A1 (de)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP19213155.5A EP3832434A1 (de) 2019-12-03 2019-12-03 Behavior-based configuration method and behavior-based configuration system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP19213155.5A EP3832434A1 (de) 2019-12-03 2019-12-03 Behavior-based configuration method and behavior-based configuration system

Publications (1)

Publication Number Publication Date
EP3832434A1 2021-06-09

Family

ID=68766612

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19213155.5A Withdrawn EP3832434A1 (de) 2019-12-03 2019-12-03 Behavior-based configuration method and behavior-based configuration system

Country Status (1)

Country Link
EP (1) EP3832434A1 (de)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090007001A1 (en) * 2007-06-28 2009-01-01 Matsushita Electric Industrial Co., Ltd. Virtual keypad systems and methods
US20140333534A1 (en) * 2013-05-09 2014-11-13 Samsung Electronics Co., Ltd. Input apparatus, pointing apparatus, method for displaying pointer, and recordable medium
US20150212699A1 (en) * 2014-01-27 2015-07-30 Lenovo (Singapore) Pte. Ltd. Handedness for hand-held devices
US20190236344A1 (en) * 2018-01-29 2019-08-01 Google Llc Methods of determining handedness for virtual controllers

Similar Documents

Publication Publication Date Title
EP3547216B1 (de) Deep learning for predicting three-dimensional (3D) gaze
JP6885935B2 (ja) Eye pose identification using eye features
JP7178403B2 (ja) Detailed eye shape model for robust biometric applications
US11301677B2 (en) Deep learning for three dimensional (3D) gaze prediction
WO2019176308A1 (ja) Information processing device, information processing method, and program
CN109993115A (zh) Image processing method and apparatus, and wearable device
US11029753B2 (en) Human computer interaction system and human computer interaction method
US11460912B2 (en) System and method related to data fusing
US10996743B2 (en) Electronic system and controller and the operating method for the same
JP6507252B2 (ja) Device operation apparatus, device operation method, and electronic device system
KR20200144196A (ko) Electronic device and method for providing a function of the electronic device using a corneal image
EP3832434A1 (de) Behavior-based configuration method and behavior-based configuration system
US20210165485A1 (en) Behavior-based configuration method and behavior-based configuration system
US9830512B2 (en) Method and apparatus for tracking gaze
TWI748299B (zh) Motion sensing data generating method and motion sensing data generating system
CN113031755A (zh) Behavior-based configuration method and behavior-based configuration system
EP3832436A1 (de) Motion sensing data generating method and motion sensing data generating system
TW202122970A (zh) Behavior-based configuration method and behavior-based configuration system
JP2021089693A (ja) Behavior-based configuration method and behavior-based configuration system
US20210157395A1 (en) Motion sensing data generating method and motion sensing data generating system
CN113031753A (zh) Motion sensing data generating method and motion sensing data generating system
JP2021089692A (ja) Motion sensing data generating method and motion sensing data generating system
US20200089940A1 (en) Human behavior understanding system and method
US11783492B2 (en) Human body portion tracking method and human body portion tracking system
US11892625B2 (en) Method for determining posture of user, host, and computer readable medium

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20211210