CN116718175A - Glasses posture adjusting method, electronic device and storage medium - Google Patents
- Publication number
- CN116718175A (application number CN202310493180.0A)
- Authority
- CN
- China
- Prior art keywords
- glasses
- data
- sliding
- posture
- pressure
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02C—SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
- G02C11/00—Non-optical adjuncts; Attachment thereof
- G02C11/10—Electronic devices other than hearing aids
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01L—MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
- G01L5/00—Apparatus for, or methods of, measuring force, work, mechanical power, or torque, specially adapted for specific purposes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01V—GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
- G01V7/00—Measuring gravitational fields or waves; Gravimetric prospecting or detecting
- G01V7/02—Details
- G01V7/06—Analysis or interpretation of gravimetric records
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
Abstract
The application discloses a glasses posture adjustment method, an electronic device and a storage medium, belonging to the technical field of wearable devices. The glasses posture adjustment method comprises the following steps: collecting target type data of the glasses in a wearing state; determining sliding posture data of the glasses based on the change of the target type data; obtaining posture control parameters of the glasses according to the sliding posture data; and adjusting the posture of the glasses based on the posture control parameters so as to inhibit sliding.
Description
Technical Field
The present application relates to the field of wearable devices, and in particular, to a method for adjusting the posture of eyeglasses, an electronic device, and a storage medium.
Background
During long-term wear or during movement, glasses tend to slip down, which affects the user's overall visual experience and may even damage the glasses.
Disclosure of Invention
In a first aspect, the present application provides a method for adjusting the posture of eyeglasses, comprising:
collecting target type data of the glasses in a wearing state;
determining sliding gesture data of the glasses based on the change of the target type data;
acquiring attitude control parameters of the glasses according to the sliding attitude data of the glasses;
and adjusting the posture of the glasses based on the posture control parameters so as to inhibit sliding.
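For illustration only (not part of the claims), the four steps above can be sketched as a single detection-and-adjustment pass. All thresholds, gains and names below are hypothetical assumptions, not values from the application:

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    temple_pressure: float    # N, from the temple pressure sensors
    nose_pad_pressure: float  # N, from the nose pad pressure sensor
    accel: tuple              # m/s^2, (x, y, z) from the acceleration sensor

# Hypothetical pressure band for the normal wearing state
PRESSURE_RANGE = (0.8, 1.6)  # N

def detect_slide(reading, pressure_range=PRESSURE_RANGE):
    """Step 120: flag a slide when any pressure leaves the normal band."""
    lo, hi = pressure_range
    return not (lo <= reading.temple_pressure <= hi
                and lo <= reading.nose_pad_pressure <= hi)

def compute_control_parameters(reading):
    """Step 130: map the temple pressure deficit to an extension distance.
    The linear gain (10 mm per newton) is arbitrary, for illustration."""
    lo, _ = PRESSURE_RANGE
    deficit = max(0.0, lo - reading.temple_pressure)
    return {"temple_extension_mm": round(10.0 * deficit, 2)}

# Step 110: one reading collected in the wearing state
reading = SensorReading(temple_pressure=0.5, nose_pad_pressure=1.0,
                        accel=(0.0, -0.3, 9.7))
params = compute_control_parameters(reading) if detect_slide(reading) else None
```

In a real device, step 140 would then drive the temple and nose pad actuators with the computed parameters.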
In some embodiments, the collecting the target type data of the glasses in the wearing state includes: and collecting pressure data, acceleration data and gravity data of the glasses.
In some embodiments, the determining the slide pose data of the glasses based on the change in the target type data comprises:
and comparing the target type data with the reference data of the glasses in a normal wearing state, and determining the sliding gesture data of the glasses.
In some embodiments, the collecting the target type data of the glasses in the wearing state includes: and collecting sliding touch data of the glasses.
In some embodiments, determining the slide attitude data of the glasses based on the change in the target type data comprises: determining the magnitude and direction of the sliding of the glasses according to the collected sliding touch data of the glasses.
In some embodiments, the adjusting the pose of the glasses based on the pose control parameters to inhibit slipping comprises: based on the attitude control parameters, the attitude of the glasses is adjusted, so that the attitude of the glasses is restored to a normal wearing state.
In some embodiments, the reference data includes a pressure threshold range of the glasses and a position offset threshold range of the glasses.
In some embodiments, the comparing the target type data with the reference data of the glasses in the normal wearing state, determining the sliding gesture data of the glasses includes:
comparing the pressure data of the glasses with a pressure threshold range of the glasses;
determining that the pressure data of the glasses are not in the pressure threshold range of the glasses, obtaining the direction offset of the glasses by combining the position offset threshold range of the glasses based on the acceleration data of the glasses and the gravity data of the glasses, and recording the current pressure value of the glasses;
and determining sliding gesture data of the glasses based on the pressure value and the direction offset.
In some embodiments, the comparing the pressure data of the glasses to the pressure threshold range of the glasses comprises at least one of:
comparing the leg pressure data of the glasses with a leg pressure threshold range of the glasses;
and comparing the nose support pressure data of the glasses with a nose support pressure threshold range of the glasses.
In some embodiments, the obtaining the attitude control parameter of the glasses according to the sliding attitude data of the glasses includes at least one of the following:
obtaining a glasses leg direction offset of the glasses based on the sliding gesture data of the glasses, and obtaining a flexibility parameter based on the glasses leg direction offset of the glasses;
obtaining a nose support direction offset of the glasses based on the sliding gesture data of the glasses, and obtaining a regulation degree parameter based on the nose support direction offset of the glasses;
the telescopic degree parameter is used for controlling the telescopic distance of the glasses legs of the glasses, and the adjusting degree parameter is used for controlling the adjusting range of the nose pads of the glasses.
In some embodiments, the adjusting the pose of the glasses based on the pose control parameters to inhibit slip comprises at least one of:
based on the expansion and contraction degree parameters, the length of the glasses legs of the glasses is adjusted, so that the postures of the glasses legs of the glasses are restored to a normal wearing state;
based on the adjustment degree parameters, the angle of the nose pad of the glasses is adjusted, so that the posture of the nose pad of the glasses is restored to a normal wearing state.
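A minimal sketch of these two mappings (the gains and limits are illustrative assumptions, not values from the application): the temple direction offset is mapped to a telescoping distance, and the nose pad direction offset to an adjustment angle.

```python
def telescoping_parameter(temple_offset_mm, gain=0.8, max_extension_mm=8.0):
    """Telescoping distance of the temples from the temple direction offset."""
    return min(max_extension_mm, gain * abs(temple_offset_mm))

def adjustment_parameter(nose_pad_offset_mm, gain_deg=2.0, max_angle_deg=15.0):
    """Rotation angle of the nose pads from the nose pad direction offset."""
    return min(max_angle_deg, gain_deg * abs(nose_pad_offset_mm))

extension = telescoping_parameter(5.0)  # temples extend by 4.0 mm
angle = adjustment_parameter(4.0)       # nose pads rotate by 8.0 degrees
```

Capping both parameters keeps the actuators within their mechanical travel regardless of how large the measured offset is.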
In a second aspect, the present application provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing a method for adjusting the attitude of spectacles according to any of the above when executing the program.
For example, the electronic device is a pair of glasses including a motion sensor coupled to the processor and configured to collect target type data of the pair of glasses in a worn state, the target type data including pressure data, acceleration data, and gravity data.
For example, the electronic device is a pair of glasses including a haptic sensor coupled to the processor and configured to collect target type data of the pair of glasses in a worn state, the target type data including sliding haptic data.
In a third aspect, the present application provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method of adjusting the attitude of eyeglasses as described in any of the above.
In a fourth aspect, the present application provides a computer program product comprising a computer program which when executed by a processor implements a method of adjusting the attitude of spectacles as described in any of the above.
Drawings
To more clearly illustrate the technical solutions of the present application or of the prior art, the drawings used in the description of the embodiments or of the prior art are briefly introduced below. The drawings described below illustrate some embodiments of the application; other drawings can be derived from them by a person skilled in the art without inventive effort.
Fig. 1 is a flowchart of a method for adjusting an attitude of an eyeglass according to an embodiment of the present application;
fig. 2 is a schematic diagram of the overall structure of AR glasses according to an embodiment of the present application;
FIG. 3 is a flowchart of comparing target type data with reference data of glasses in a normal wearing state to determine a sliding gesture of the glasses according to an embodiment of the present application;
fig. 4 is a schematic physical structure of an electronic device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present application, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terms "first", "second" and the like in the description and claims are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It is to be understood that terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can operate in sequences other than those illustrated or described herein. Objects distinguished by "first" and "second" are generally of one type, and their number is not limited; the first object may, for example, be one or more than one. Furthermore, in the description and claims, "and/or" denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and following objects.
Glasses are composed of two parts: lenses and a frame. In common eyewear product types, the frame includes two temples (glasses legs) and two nose pads. When worn, the two glasses legs rest above the user's two ears, the two nose pads rest on the two sides of the user's nose bridge, and the user's eyes observe the images presented by the lenses, achieving the corresponding visual effect. During long-term wear, for example when wearing glasses for extended study, typing or reading, the glasses easily slip down under their own weight; when wearing glasses during sports such as swimming, running or basketball, the glasses easily slip because of changes in the motion posture of the human body.
Although the above takes dual-temple glasses as an example, other types of eyewear, such as single-temple glasses, are also suitable for use with the present application.
When the glasses slip during wear, the user usually has to adjust them manually to restore the normal wearing state. This is cumbersome, interrupts the user's visual experience, and degrades the overall experience of using the glasses. Moreover, manual adjustment is generally limited to repositioning the glasses relative to the face; it cannot adjust the structure of the glasses to make them fit the face more closely, so its anti-slip effect is poor and it cannot fundamentally solve the slipping problem. After manual adjustment, the glasses may well slip again during subsequent wear.
The above-mentioned normal wearing state includes a state in which the user feels comfortable when wearing the glasses or the wearing habit of the user is satisfied.
Therefore, the present application provides a glasses posture adjustment method, an electronic device and a storage medium. Target type data of the glasses in a wearing state are collected; sliding posture data of the glasses are determined based on the change of the target type data; posture control parameters of the glasses are obtained according to the sliding posture data; and the posture of the glasses is adjusted based on the posture control parameters so as to inhibit sliding, with a high degree of intelligence and automation. The application enables the user to obtain a continuous visual experience, at least partially solves the problem of glasses slipping, and has a good anti-slip effect.
In embodiments of the present application, the glasses may be conventional glasses without display capability, whose lenses are ordinary optical lenses such as near-sighted lenses, far-sighted lenses or plano lenses; or glasses without display capability but with audio capability, such as Bluetooth glasses. The glasses may also be glasses with display capability: for example, the lenses themselves are display devices such as Micro OLED, Micro LED, LCD or OLED display screens; or the display source is a Liquid Crystal on Silicon (LCoS), Liquid Crystal Display (LCD), Digital Micromirror Device (DMD), Digital Light Processing (DLP), silicon-based OLED, Micro LED or Micro OLED device whose output is projected, for example through an optical waveguide, to form a display in front of the user's eyes, as in AR glasses.
Fig. 1 is a flowchart of a method for adjusting an attitude of glasses according to an embodiment of the present application. As shown in fig. 1, there is provided a glasses posture adjusting method including the steps of: step 110, step 120, step 130 and step 140. The method flow steps are only one possible implementation of the application.
The method for adjusting the posture of the glasses may be executed by the glasses themselves, or by an edge device communicatively connected to the glasses. The edge device may be a terminal, a server, or another device with computing resources.
Step 110, collecting target type data of the glasses in a wearing state.
The target type data refers to data reflecting physical states of the glasses, such as movement state data of the glasses.
Alternatively, the glasses may be smart glasses, such as AR glasses, or ordinary glasses provided with a motion sensor, such as near-sighted glasses, presbyopic glasses or sunglasses. AR glasses are taken as an example below.
In some embodiments, the target type data includes: pressure data, acceleration data, gravity data. The collecting the target type data of the glasses in the wearing state comprises the following steps: and collecting pressure data, acceleration data and gravity data of the glasses.
The pressure data of the glasses refer to pressure data generated by contact of the glasses with the facial skin of the user. Pressure data of the glasses can be collected by a pressure sensor.
The acceleration data of the glasses refer to acceleration data when the glasses slide down. Acceleration data of the glasses can be collected through an acceleration sensor.
The gravity data of the glasses refer to gravity data when the glasses slide down, and include direction data of the glasses slide down. Gravity data of the glasses can be collected through a gravity sensor.
The pressure sensor, the acceleration sensor and the gravity sensor may be provided separately or may be at least one sensor component for collecting the target data integrally in the same sensor.
In this embodiment, the analysis of the posture of the glasses is facilitated by collecting the pressure data, acceleration data and gravity data of the glasses in the wearing state.
Fig. 2 is a schematic diagram of an overall structure of AR glasses according to an embodiment of the present application. As shown in fig. 2, an AR glasses 200 includes two glasses frames 210, a beam 220 and a glasses leg 230, the two glasses frames 210 are parallel to each other and fixedly connected through the beam 220, lenses are disposed in the glasses frames 210, the outer side of the glasses frame 210 is hinged with the telescopic glasses leg 230, and the inner side of the glasses frame 210 is rotatably connected with a rotatable nose pad 240; the rear end of the glasses leg 230 is provided with a first pressure sensor 231, the nose pad 240 is provided with a second pressure sensor 241, and the cross beam 220 is provided with an acceleration sensor 221 and a gravity sensor 222.
Alternatively, the acceleration sensor 221 and the gravity sensor 222 may be provided on the temple 230.
The first pressure sensor 231 is used for acquiring pressure data of the legs of the glasses, the second pressure sensor 241 is used for acquiring pressure data of the nose pads, the acceleration sensor 221 is used for acquiring acceleration data of the glasses, and the gravity sensor 222 is used for acquiring gravity data of the glasses.
In some embodiments, the target type data includes: sliding the haptic data. The collecting the target type data of the glasses in the wearing state comprises the following steps: and collecting sliding touch data of the glasses.
The sliding touch data refers to touch data generated on the skin by the skin relative to the glasses during the sliding process of the glasses, and can be acquired through a sliding sensor.
The sliding sensor is one of the touch sensors, can be used for judging and measuring sliding generated by an object, and can be classified into a non-directional type, a unidirectional type and an omni-directional type according to the presence or absence of a sliding direction detection function.
In some embodiments, to determine the direction of the sliding of the glasses, it is desirable to collect sliding haptic data using an omni-directional sliding sensor.
In the embodiment, the sliding touch data of the glasses in the wearing state is collected, so that the sliding postures of the glasses can be analyzed conveniently.
For example, the slip sensor may be positioned in the middle of the temple, nose pad, or other suitable location where contact with a person's head, face, etc. may occur.
And 120, determining sliding gesture data of the glasses based on the change of the target type data.
Optionally, the sliding posture of the glasses may be sliding to the left, sliding to the right, or sliding downward. The sliding posture data represent the sliding posture of the glasses and can reflect the degree of sliding.
In some embodiments, the change in the target type data includes: a change in pressure data, a change in acceleration data, a change in gravity data.
For example, determining slip attitude data for the eyewear based on the change in the target type data includes:
and comparing the target type data with the reference data of the glasses in a normal wearing state, and determining the sliding gesture data of the glasses.
Specifically, with the glasses in the normal wearing state, baseline data for the state indexes of the glasses are collected at a preset frequency or in real time, and the reference data are obtained from these baseline data through data processing and statistical analysis.
For example, the reference data may also be preset.
Wherein the reference data comprises a threshold range of state indexes of the glasses, and can be used for calibrating wearing states of the glasses.
For example, if the target type data exceeds the range indicated by the reference data, it is determined that the glasses slip, and thus the slip posture data of the glasses can be obtained.
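As an illustrative sketch (the application does not fix the statistic), such a reference threshold range could be built as a mean ± k·σ band over baseline samples collected in the normal wearing state, with any later reading outside the band treated as a slip:

```python
import statistics

def reference_range(samples, k=3.0):
    """Threshold range from baseline readings in the normal wearing state."""
    mu = statistics.fmean(samples)
    sigma = statistics.pstdev(samples)
    return (mu - k * sigma, mu + k * sigma)

def slipped(value, threshold_range):
    """A reading outside the reference range indicates a slip."""
    lo, hi = threshold_range
    return not (lo <= value <= hi)

# Hypothetical nose pad pressure baseline (N) and the resulting band
baseline = [1.18, 1.22, 1.20, 1.25, 1.15]
band = reference_range(baseline)
```

The same band construction applies per index (temple pressure, nose pad pressure, position offset), each with its own baseline.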
In the embodiment of the application, the sliding gesture data of the glasses can be accurately determined by comparing the target type data corresponding to the glasses with the reference data.
In some embodiments, the change in the target type data includes a change in sliding haptic data. Further, determining slip attitude data of the glasses based on the change in the target type data includes:
and determining the sliding size and direction of the glasses according to the collected sliding touch data of the glasses.
For example, the sliding haptic data are collected by an omni-directional sensor. The omni-directional sensor uses a metal ball whose surface is coated with insulating material so as to form conductive and non-conductive areas distributed along lines of longitude and latitude. When sliding occurs, the metal ball rotates, and the conductive and non-conductive areas on the sphere alternately contact the electrodes, generating on-off signals; by counting and evaluating these on-off signals, the magnitude and direction of the sliding can be measured. Based on the sliding touch data collected by the omni-directional sensor while the glasses are worn, the magnitude and direction of the sliding of the glasses can be accurately determined.
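An illustrative decoding of such on-off pulse trains, assuming the sensor front end has already reduced them to signed pulse counts per axis (the scale factor is hypothetical):

```python
import math

def decode_slide(pulses_x, pulses_y, mm_per_pulse=0.5):
    """Slide magnitude (mm) and direction (degrees, 0 = +x axis)
    from signed per-axis pulse counts of an omni-directional slip sensor."""
    dx = pulses_x * mm_per_pulse
    dy = pulses_y * mm_per_pulse
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))

magnitude, direction = decode_slide(0, -6)  # six pulses straight downward
```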
And 130, obtaining attitude control parameters of the glasses according to the sliding attitude data of the glasses.
The posture control parameter refers to a physical parameter that affects the state change of the glasses, such as a telescoping degree parameter, a regulating degree parameter, and the like.
In some embodiments, the slip gesture data of the glasses may be converted into gesture control parameters through a calculation formula.
And 140, adjusting the posture of the glasses based on the posture control parameters so as to inhibit sliding.
Specifically, based on the attitude control parameters, the attitude of the glasses is adjusted, for example, the length of the temples of the glasses is adjusted according to the telescoping degree parameters, and the angle of the nose pads of the glasses is adjusted according to the adjustment degree parameters.
In embodiments of the present application, inhibiting slipping includes at least partially reducing or minimizing slipping of the eyeglasses, stopping slipping of the eyeglasses, returning the eyeglasses to a normal wear state, and the like.
According to the embodiments of the present application, the target type data of the glasses in the wearing state are collected, the sliding posture data of the glasses are determined from the change of the target type data, the posture control parameters of the glasses are obtained according to the sliding posture data, and the posture of the glasses is adjusted based on the posture control parameters so as to inhibit sliding. The degree of intelligence and automation is high, the user obtains a continuous visual experience, the problem of glasses slipping can be fundamentally addressed, and the anti-slip effect is good.
It should be noted that each embodiment of the present application may be freely combined, exchanged in order, or separately executed, and does not need to rely on or rely on a fixed execution sequence.
In some embodiments, the adjusting the pose of the glasses based on the pose control parameters to inhibit slipping comprises: based on the attitude control parameters, the attitude of the glasses is adjusted, so that the attitude of the glasses is restored to a normal wearing state.
The posture of the glasses is adjusted based on the posture control parameters so that the posture of the glasses is restored to the normal wearing state, in which the field of view is clear, the comfort level is good, and the user can continue viewing without interruption.
In the embodiment of the application, the posture of the glasses is adjusted based on the posture control parameters, so that the posture of the glasses is restored to a normal wearing state, and the continuous visual experience of the user can be obtained.
In some embodiments, the reference data includes a pressure threshold range of the glasses and a position offset threshold range of the glasses.
It is understood that the pressure threshold range of the glasses refers to the threshold range of the pressure data of the glasses in the normal wearing state, and the position offset threshold range of the glasses refers to the threshold range of the position offset of the glasses in the normal wearing state.
In this embodiment, the adjustment of the posture of the glasses is facilitated by acquiring the pressure threshold range of the glasses and the position offset threshold range of the glasses.
Fig. 3 is a flow chart of comparing target type data with reference data of glasses in a normal wearing state to determine a sliding gesture of the glasses according to an embodiment of the present application. As shown in fig. 3, step 120 of comparing the target type data with the reference data of the glasses in the normal wearing state, determining the sliding gesture data of the glasses includes: step 121, step 122 and step 123.
Step 121, comparing the pressure data of the glasses with a pressure threshold range of the glasses;
specifically, the pressure data of the glasses are compared with the pressure threshold range of the glasses, and whether the pressure data of the glasses are within the pressure threshold range of the glasses is judged.
Step 122, determining that the pressure data of the glasses are not in the pressure threshold range of the glasses, obtaining the direction offset of the glasses by combining the position offset threshold range of the glasses based on the acceleration data of the glasses and the gravity data of the glasses, and recording the current pressure value of the glasses;
it can be understood that if the pressure data of the glasses are not within the pressure threshold range of the glasses, the glasses are in a sliding state, and then the collected acceleration data of the glasses and the gravity data of the glasses are processed, and the position offset threshold range of the glasses is used as a comparison to obtain the direction offset of the glasses.
In some embodiments, for each sliding process, the starting time point T1 and the ending time point T2 of the glasses sliding are obtained, and the acceleration data and gravity data of the glasses collected within the sliding time range (T1, T2) are processed to obtain the direction offset of the glasses.
Optionally, the directional offset is characterized by a coordinate system, including X, Y, Z axis coordinates.
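As a hedged illustration of this window-based processing, the sketch below double-integrates gravity-compensated acceleration over the sliding interval (T1, T2) to produce an X/Y/Z direction offset. The function name `direction_offset`, the fixed sample period `dt`, and the integration scheme are assumptions for illustration; the patent does not specify how the acceleration and gravity data are combined.

```python
import numpy as np

def direction_offset(acc, grav, t1, t2, dt=0.01):
    """Estimate the glasses' direction offset (X/Y/Z displacement) over the
    sliding window [t1, t2] by double-integrating linear acceleration.

    acc, grav : (N, 3) arrays of raw accelerometer and gravity samples.
    Hypothetical sketch -- the integration scheme is an assumption.
    """
    acc = np.asarray(acc, dtype=float)
    grav = np.asarray(grav, dtype=float)
    i1, i2 = int(t1 / dt), int(t2 / dt)        # window indices for (T1, T2)
    linear = acc[i1:i2] - grav[i1:i2]          # remove the gravity component
    velocity = np.cumsum(linear * dt, axis=0)  # first integration -> velocity
    offset = np.sum(velocity * dt, axis=0)     # second integration -> displacement
    return offset                              # [offsetX, offsetY, offsetZ]
```

In practice such dead-reckoning drifts quickly, which is presumably why the embodiment restricts the integration to the short window between T1 and T2.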
If the pressure data of the glasses is within the pressure threshold range of the glasses, it is indicated that the glasses do not slip.
And step 123, determining sliding gesture data of the glasses based on the pressure value and the direction offset.
That is, the sliding gesture data of the glasses in this embodiment include the pressure value and the direction offset, which together reflect the degree to which the glasses have slid.
In this embodiment, the pressure data of the glasses are compared with the pressure threshold range of the glasses; when the pressure data are not within that range, the collected acceleration data and gravity data of the glasses are processed to obtain the direction offset of the glasses, so that the sliding gesture data of the glasses can be accurately determined from the pressure value and the direction offset.
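The decision flow of steps 121 to 123 can be sketched as follows. The helper name `detect_slide` and the clamping of the offset against its threshold range are illustrative assumptions; the embodiment only states that the position offset threshold range is used as a comparison.

```python
def detect_slide(pressure, pressure_range, offset, offset_range):
    """Steps 121-123 as a sketch: compare the pressure with its threshold
    range; if it is out of range, the glasses are sliding, and the recorded
    pressure value plus the direction offset form the sliding gesture data.
    All names and the clamping step are illustrative assumptions."""
    p_lo, p_hi = pressure_range
    if p_lo <= pressure <= p_hi:
        return None                      # step 121: within range, no slide
    # Step 122: out of range -> sliding; bound the X/Y/Z offset against the
    # position offset threshold range (one plausible reading of "comparison").
    o_lo, o_hi = offset_range
    bounded = tuple(max(o_lo, min(o_hi, v)) for v in offset)
    # Step 123: sliding gesture data = current pressure value + direction offset.
    return {"pressure": pressure, "offset": bounded}
```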
In some embodiments, the comparing the pressure data of the glasses to the pressure threshold range of the glasses comprises at least one of:
comparing the temple pressure data of the glasses with the temple pressure threshold range of the glasses;
and comparing the nose pad pressure data of the glasses with the nose pad pressure threshold range of the glasses.
Optionally, the pressure data of the glasses includes at least one of temple pressure data and nose pad pressure data.
Optionally, the range of pressure thresholds for the glasses includes at least one of a range of temple pressure thresholds and a range of nose pad pressure thresholds.
The temple pressure data refer to the pressure generated by contact between the temples and the skin, and can be collected by pressure sensors arranged on the temples.
The nose pad pressure data refer to the pressure generated by contact between the nose pads and the skin, and can be collected by a pressure sensor arranged on the nose pads.
In this embodiment, the temple pressure data are compared with the temple pressure threshold range, and the nose pad pressure data are compared with the nose pad pressure threshold range, making it easy to judge whether each reading falls within its range and, therefore, whether the glasses are in a sliding state.
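A minimal sketch of this "at least one of" comparison, assuming each available reading is tested against its own threshold range; the function names and the keyword-argument interface are illustrative, not from the patent:

```python
def out_of_range(value, threshold_range):
    """True when a sensor reading falls outside its (low, high) range."""
    lo, hi = threshold_range
    return value < lo or value > hi

def is_sliding(temple_p=None, temple_range=None, nose_p=None, nose_range=None):
    """Either the temple or the nose pad pressure leaving its threshold
    range indicates a sliding state. Missing readings are simply skipped,
    matching the 'at least one of' phrasing."""
    sliding = False
    if temple_p is not None and out_of_range(temple_p, temple_range):
        sliding = True
    if nose_p is not None and out_of_range(nose_p, nose_range):
        sliding = True
    return sliding
```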
In some embodiments, step 122 comprises:
when the temple pressure data of the glasses are determined not to be within the temple pressure threshold range of the glasses, obtaining the direction offset of the glasses based on the acceleration data and gravity data of the glasses in combination with the position offset threshold range of the glasses, and recording the current temple pressure value of the glasses;
wherein the position offset threshold range of the glasses includes the temple position offset threshold range of the glasses.
It can be understood that when the temple pressure data is determined not to be within the temple pressure threshold range, the glasses are in a sliding state, the collected acceleration data of the glasses and the gravity data of the glasses are processed, the temple position offset threshold range is used as a comparison, the direction offset of the glasses is obtained, and the temple pressure value of the current glasses is recorded.
In this embodiment, when the temple pressure data are determined not to be within the temple pressure threshold range, the collected acceleration data and gravity data of the glasses are processed to obtain the direction offset, from which the temple control parameters can then be obtained by conversion.
In some embodiments, step 122 comprises:
when the nose pad pressure data of the glasses are determined not to be within the nose pad pressure threshold range of the glasses, obtaining the direction offset of the glasses based on the acceleration data and gravity data of the glasses in combination with the position offset threshold range of the glasses, and recording the current nose pad pressure value of the glasses;
wherein the position offset threshold range of the glasses includes the nose pad position offset threshold range of the glasses.
It can be understood that if the nose pad pressure data are determined not to be within the nose pad pressure threshold range, the glasses are in a sliding state; the collected acceleration data and gravity data of the glasses are then processed, with the nose pad position offset threshold range used as a reference, to obtain the direction offset of the glasses, and the current nose pad pressure value of the glasses is recorded.
In this embodiment, when the nose pad pressure data are determined not to be within the nose pad pressure threshold range, the direction offset is obtained by processing the collected acceleration data and gravity data of the glasses, from which the nose pad control parameters can then be obtained by conversion.
In some embodiments, the magnitude and direction of the glasses' sliding are determined from collected sliding touch data of the glasses: when the glasses slide relative to the head at the wearing positions, the electrode on-off signals in a sliding touch sensor are counted to judge the magnitude and direction of the slide.
The magnitude and direction of the slide obtained from the sliding touch data constitute the sliding gesture data of the glasses.
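One plausible reading of the electrode-counting step is sketched below: each sample is taken to be the index of the electrode currently switched on, so counting index changes gives the slide magnitude and their overall sign gives the direction. This encoding is an assumption; the patent does not detail the sensor's signal format.

```python
def slide_from_electrodes(signals):
    """Estimate slide magnitude and direction from a sliding-touch sensor's
    electrode on/off signals. Each sample is the index of the electrode
    currently 'on'; index changes are counted for magnitude, and their sum's
    sign gives the direction (+1 / -1, or 0 for no slide). A simplified
    sketch under an assumed signal encoding."""
    steps = [b - a for a, b in zip(signals, signals[1:]) if b != a]
    magnitude = sum(abs(s) for s in steps)          # electrodes traversed
    direction = 0 if magnitude == 0 else (1 if sum(steps) > 0 else -1)
    return magnitude, direction
```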
In some embodiments, step 130, obtaining the posture control parameters of the glasses according to the sliding gesture data of the glasses, includes at least one of the following:
obtaining the temple direction offset of the glasses based on the sliding gesture data of the glasses, and obtaining a telescoping degree parameter based on the temple direction offset of the glasses;
obtaining the nose pad direction offset of the glasses based on the sliding gesture data of the glasses, and obtaining an adjustment degree parameter based on the nose pad direction offset of the glasses;
wherein the telescoping degree parameter is used for controlling the telescopic distance of the temples of the glasses, and the adjustment degree parameter is used for controlling the adjustment range of the nose pads of the glasses.
Optionally, the temple directional offset comprises a directional offset of one or both temples of the glasses; the nose pad directional offset comprises a directional offset of one or both nose pads of the glasses.
In some embodiments, the telescoping degree parameter and the adjustment degree parameter are obtained based on the sliding gesture data of the glasses, and the conversion process can be represented by the following formula:
(E, A) = transfer(P, D, offsetX, offsetY, offsetZ)
where E is the telescoping degree parameter, A is the adjustment degree parameter, P is the pressure value, D is the direction value (left or right: a positive value indicates the left direction and a negative value the right direction), offsetX is the offset on the X axis, offsetY the offset on the Y axis, and offsetZ the offset on the Z axis.
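The patent gives only the signature of transfer, not its body. A hypothetical linear realization is sketched below; the gains `k_e` and `k_a` and the use of the Euclidean offset magnitude are assumptions for illustration, not the patent's mapping.

```python
def transfer(P, D, offsetX, offsetY, offsetZ, k_e=0.5, k_a=2.0):
    """Hypothetical realization of (E, A) = transfer(P, D, offsetX, offsetY, offsetZ).

    E: telescoping degree parameter for the temples.
    A: adjustment degree parameter for the nose pads.
    P: pressure value; D: direction value (+ for left, - for right).
    The linear mapping and gains k_e, k_a are illustrative assumptions.
    """
    slide = (offsetX**2 + offsetY**2 + offsetZ**2) ** 0.5  # offset magnitude
    E = D * k_e * slide * P   # temple extension scales with pressure and offset
    A = k_a * slide           # nose-pad adjustment scales with offset magnitude
    return E, A
```

A real implementation would presumably be calibrated per frame geometry and clamped to the actuators' travel limits.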
In this embodiment, the temple direction offset of the glasses is obtained from the sliding gesture data, yielding the telescoping degree parameter; the nose pad direction offset is obtained from the same data, yielding the adjustment degree parameter. This makes it possible to adjust the temples and/or the nose pads as conditions require, improving the accuracy of the posture adjustment of the glasses.
In some embodiments, step 140, adjusting the posture of the glasses based on the posture control parameters to inhibit sliding, includes at least one of the following:
adjusting the length of the temples of the glasses based on the telescoping degree parameter, so that the posture of the temples of the glasses is restored to the normal wearing state;
In this embodiment, the temples are retractable temples, such as electrically retractable temples. Illustratively, the electrically retractable temples are driven to extend or retract based on the telescoping degree parameter, adjusting the length of the temples and restoring their posture to the normal wearing state.
adjusting the angle of the nose pads of the glasses based on the adjustment degree parameter, so that the posture of the nose pads of the glasses is restored to the normal wearing state.
In some embodiments, the nose pads are rotatable nose pads (rotatable relative to the surface of the nose to form the desired horizontal and longitudinal opening angles); for example, a rotating motor is provided on the glasses with a nose pad mounted at one end of its rotating shaft, so that the rotation of the nose pad is adjustable. Illustratively, the rotating motor is driven to rotate based on the adjustment degree parameter, which in turn rotates the nose pad, adjusts its angle and inhibits the sliding of the glasses.
In some embodiments, an electromagnetic module is arranged inside each nose pad; by controlling the electrical parameters of the electromagnetic module, the magnetic attraction between the two nose pads, and hence the distance between them, is controlled, indirectly adjusting how tightly the nose pads contact the nose and inhibiting the sliding of the glasses.
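Putting the actuation embodiments together, step 140 might be sketched as clamping the control parameters to the hardware's travel limits and emitting actuator commands. The limit values and the command dictionary format are illustrative assumptions, not specified by the patent.

```python
def apply_posture_control(E, A, max_extend_mm=8.0, max_angle_deg=25.0):
    """Sketch of step 140: bound the telescoping degree parameter E and the
    adjustment degree parameter A by the actuators' travel limits, then emit
    commands for the temple motors and the nose pad rotating motor.
    Limit values and the command format are illustrative."""
    extend = max(-max_extend_mm, min(max_extend_mm, E))   # temple motor travel
    angle = max(-max_angle_deg, min(max_angle_deg, A))    # nose-pad motor angle
    return {"temple_extend_mm": extend, "nose_pad_angle_deg": angle}
```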
Fig. 4 illustrates a physical schematic diagram of an electronic device. As shown in Fig. 4, the electronic device may include: processor 410, communication interface (Communications Interface) 420, memory 430, communication bus 440, and sensor 450, wherein processor 410, communication interface 420, memory 430, and sensor 450 communicate with each other via communication bus 440. The processor 410 may invoke logic instructions in the memory 430 to perform the method for adjusting the posture of the glasses, the method comprising: collecting target type data of the glasses in a wearing state; determining sliding gesture data of the glasses based on the change of the target type data; obtaining posture control parameters of the glasses according to the sliding gesture data of the glasses; and adjusting the posture of the glasses based on the posture control parameters so as to inhibit sliding.
Further, the logic instructions in the memory 430 described above may be implemented in the form of software functional units and may be stored in a computer-readable storage medium when sold or used as a stand-alone product. Based on this understanding, the technical solution of the present application may be embodied, essentially or in the part contributing to the prior art, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
In some embodiments, the electronic device is a pair of glasses including a motion sensor coupled to the processor and configured to collect target type data of the pair of glasses in a worn state, the target type data including pressure data, acceleration data, and gravity data. The processor executes the method for adjusting the posture of the glasses according to the foregoing method embodiment, which is not described herein.
For example, the motion sensor may include a plurality of pressure sensors, acceleration sensors, gravity sensors, and the like.
For example, the motion sensor may be integrated with a variety of data acquisition functions. Such as MEMS accelerometers that integrate gravity data and acceleration data acquisition functions.
In some embodiments, the electronic device is an eyeglass comprising a haptic sensor coupled to the processor and configured to collect target type data of the eyeglass in a worn state, the target type data comprising sliding haptic data. The processor executes the method for adjusting the posture of the glasses according to the foregoing method embodiment, which is not described herein.
For example, the tactile sensor includes an omni-directional slide sensor.
In another aspect, the present application also provides a computer program product comprising a computer program that can be stored on a non-transitory computer-readable storage medium. When the computer program is executed by a processor, the computer can execute the method for adjusting the posture of the glasses provided by the above method embodiments, the method comprising: collecting target type data of the glasses in a wearing state; determining sliding gesture data of the glasses based on the change of the target type data; obtaining posture control parameters of the glasses according to the sliding gesture data of the glasses; and adjusting the posture of the glasses based on the posture control parameters so as to inhibit sliding.
In yet another aspect, the present application further provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method for adjusting the posture of the glasses provided by the above method embodiments, the method comprising: collecting target type data of the glasses in a wearing state; determining sliding gesture data of the glasses based on the change of the target type data; obtaining posture control parameters of the glasses according to the sliding gesture data of the glasses; and adjusting the posture of the glasses based on the posture control parameters so as to inhibit sliding. The apparatus embodiments described above are merely illustrative: the units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the application without undue burden.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course may be implemented by means of hardware. Based on this understanding, the foregoing technical solution may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.
Claims (15)
1. A method for adjusting the posture of glasses, characterized by comprising the following steps:
collecting target type data of the glasses in a wearing state;
determining sliding gesture data of the glasses based on the change of the target type data;
acquiring attitude control parameters of the glasses according to the sliding attitude data of the glasses;
and adjusting the posture of the glasses based on the posture control parameters so as to inhibit sliding.
2. The method for adjusting the posture of the glasses according to claim 1, wherein the collecting the target type data of the glasses in the wearing state includes: and collecting pressure data, acceleration data and gravity data of the glasses.
3. The glasses posture adjustment method according to claim 2, wherein the determining the sliding gesture data of the glasses based on the change in the target type data comprises:
and comparing the target type data with the reference data of the glasses in a normal wearing state, and determining the sliding gesture data of the glasses.
4. The method for adjusting the posture of the glasses according to claim 1, wherein the collecting the target type data of the glasses in the wearing state includes: and collecting sliding touch data of the glasses.
5. The method according to claim 4, wherein the determining the sliding gesture data of the glasses based on the change in the target type data comprises: determining the magnitude and direction of the sliding of the glasses according to the collected sliding touch data of the glasses.
6. The method according to claim 1, wherein the adjusting the posture of the glasses based on the posture control parameter to suppress slipping comprises: based on the attitude control parameters, the attitude of the glasses is adjusted, so that the attitude of the glasses is restored to a normal wearing state.
7. A glasses posture adjustment method according to claim 3, wherein the reference data includes a pressure threshold range of the glasses and a position offset threshold range of the glasses.
8. The method according to claim 7, wherein the comparing the target type data with the reference data of the glasses in a normal wearing state and determining the sliding gesture data of the glasses comprises:
comparing the pressure data of the glasses with a pressure threshold range of the glasses;
when the pressure data of the glasses are determined not to be within the pressure threshold range of the glasses, obtaining the direction offset of the glasses based on the acceleration data and the gravity data of the glasses in combination with the position offset threshold range of the glasses, and recording the current pressure value of the glasses;
and determining sliding gesture data of the glasses based on the pressure value and the direction offset.
9. The eyewear attitude adjustment method of claim 8, wherein the comparing the pressure data of the eyewear to the pressure threshold range of the eyewear comprises at least one of:
comparing the temple pressure data of the glasses with the temple pressure threshold range of the glasses;
and comparing the nose pad pressure data of the glasses with the nose pad pressure threshold range of the glasses.
10. The method for adjusting the posture of the glasses according to any one of claims 1 to 9, wherein the step of obtaining the posture control parameter of the glasses according to the sliding posture data of the glasses includes at least one of the following:
obtaining the temple direction offset of the glasses based on the sliding gesture data of the glasses, and obtaining a telescoping degree parameter based on the temple direction offset of the glasses;
obtaining the nose pad direction offset of the glasses based on the sliding gesture data of the glasses, and obtaining an adjustment degree parameter based on the nose pad direction offset of the glasses;
wherein the telescoping degree parameter is used for controlling the telescopic distance of the temples of the glasses, and the adjustment degree parameter is used for controlling the adjustment range of the nose pads of the glasses.
11. The method of claim 10, wherein the adjusting the attitude of the glasses based on the attitude control parameters to suppress slipping comprises at least one of:
adjusting the length of the temples of the glasses based on the telescoping degree parameter, so that the posture of the temples of the glasses is restored to the normal wearing state;
adjusting the angle of the nose pads of the glasses based on the adjustment degree parameter, so that the posture of the nose pads of the glasses is restored to the normal wearing state.
12. Electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method for adjusting the pose of spectacles according to any of claims 1-11 when executing the computer program.
13. The electronic device of claim 12, wherein the electronic device is a pair of glasses comprising a motion sensor coupled to the processor and configured to collect target type data for the pair of glasses in a worn state, the target type data comprising pressure data, acceleration data, and gravity data.
14. The electronic device of claim 12, wherein the electronic device is a pair of glasses comprising a haptic sensor coupled to the processor and configured to gather target type data for the pair of glasses in a worn state, the target type data comprising sliding haptic data.
15. A non-transitory computer readable storage medium having stored thereon a computer program, wherein the computer program when executed by a processor implements the method of adjusting the pose of eyeglasses according to any of claims 1 to 11.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310493180.0A CN116718175A (en) | 2023-05-04 | 2023-05-04 | Glasses posture adjusting method, electronic device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116718175A true CN116718175A (en) | 2023-09-08 |
Family
ID=87872325
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||