CN109407848B - Step detection method and device - Google Patents

Step detection method and device

Info

Publication number
CN109407848B
CN109407848B (granted from application CN201811295929.6A)
Authority
CN
China
Prior art keywords
detection
infrared
convex polygon
foot
barycentric coordinate
Prior art date
Legal status
Active
Application number
CN201811295929.6A
Other languages
Chinese (zh)
Other versions
CN109407848A (en)
Inventor
左延坤
杨翼
Current Assignee
Guangdong Yuanchuang Intelligent Technology Co ltd
Original Assignee
Guangdong Yuanchuang Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Yuanchuang Intelligent Technology Co ltd filed Critical Guangdong Yuanchuang Intelligent Technology Co ltd
Priority to CN201811295929.6A priority Critical patent/CN109407848B/en
Publication of CN109407848A publication Critical patent/CN109407848A/en
Application granted granted Critical
Publication of CN109407848B publication Critical patent/CN109407848B/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a method and a device for detecting steps, belonging to the technical field of detection. The method comprises the following steps: acquiring a plurality of first detection points on the periphery of a front sole and a plurality of second detection points on the periphery of a rear sole in a detection area; connecting the first detection points in sequence to form a first convex polygon, and connecting the second detection points in sequence to form a second convex polygon; calculating a first barycentric coordinate of the first convex polygon and a second barycentric coordinate of the second convex polygon; and obtaining the moving direction and the moving distance of the footstep based on the first barycentric coordinate and the second barycentric coordinate. The method and the device solve the problem that the step position and the step moving direction of a user cannot be accurately detected in VR, greatly improving the user experience; the hardware of the step detection device consists of conventional electronic components, which greatly reduces the hardware cost; and the device can be directly fixed on the periphery of a game platform, making it very easy to install and maintain.

Description

Step detection method and device
Technical Field
The invention belongs to the technical field of detection, and particularly relates to a method and a device for detecting steps.
Background
Virtual Reality (VR) is a computer simulation technology for creating and experiencing a virtual world: a computer generates a simulated environment that fuses multi-source information into interactive, three-dimensional dynamic views with simulated physical behaviors, immersing the user in that environment.
VR mainly involves the simulated environment, perception, natural skills and sensing equipment. Natural skills refer to a person's head rotation, eye and hand gestures, foot movement and other human actions; the computer processes the data corresponding to the participant's actions, responds to the user's input in real time, and feeds the responses back to the user's senses.
Currently, in the VR field, there are two methods for step detection:
the first is to place several cameras around the platform and judge the moving range and distance of the footsteps by capturing footstep images. Its disadvantages are high cost, complex technology and poor detection accuracy.
The second is to wear a device with a speed sensor on the foot and judge the moving amplitude and distance of the foot by calculating the angular velocity. Its disadvantages are high cost, poor detection accuracy, and inability to judge the moving direction of the footsteps.
Disclosure of Invention
In order to solve the above problems in the prior art, the present invention provides a method and an apparatus for detecting steps.
The technical scheme of the invention is as follows:
in one aspect, the present invention provides a method for detecting steps, comprising:
acquiring a plurality of first detection points on the periphery of a front sole and a plurality of second detection points on the periphery of a rear sole in a detection area;
the first detection points are connected in sequence to form a first convex polygon, and the second detection points are connected in sequence to form a second convex polygon;
respectively calculating to obtain a first barycentric coordinate of the first convex polygon and a second barycentric coordinate of the second convex polygon;
and obtaining the moving direction and the moving distance of the footstep based on the first barycentric coordinate and the second barycentric coordinate.
Optionally, the method for obtaining a plurality of first detection points on the periphery of the front sole and a plurality of second detection points on the periphery of the rear sole in the detection region includes:
sequentially scanning by a plurality of groups of infrared detection devices arranged around the detection area to respectively obtain a first shielding area shielded by the front sole and a second shielding area shielded by the rear sole;
and obtaining a plurality of first detection points according to the outer edge infrared intersection points of the first shielding area, and obtaining a plurality of second detection points according to the outer edge infrared intersection points of the second shielding area.
Optionally, the method for obtaining the first barycentric coordinate of the first convex polygon through calculation includes:
dividing the first convex polygon into a plurality of triangles;
respectively calculating to obtain the gravity center and the area of each triangle;
and calculating to obtain the first barycentric coordinate based on a preset algorithm according to the barycenter and the area of each triangle.
Further optionally, the method for obtaining the first barycentric coordinate based on the preset algorithm includes:
calculating a weighted average of the gravity centers of the triangles, with each triangle's area as its weight.
In another aspect, the invention provides a step detection device, which comprises a step tray and a plurality of infrared detection units arranged around the step tray; the infrared detection units are electrically connected with a controller, and the step detection device detects the moving direction and the moving distance of the user's foot by applying the above step detection method.
Preferably, each infrared detection unit includes an infrared emitting plate and an infrared receiving plate symmetrically arranged on two sides of the step tray; a plurality of infrared emitting tubes are arranged side by side on the infrared emitting plate, and a plurality of infrared receiving tubes are arranged in sequence on the infrared receiving plate at positions corresponding to the infrared emitting tubes.
Preferably, the step tray has an octagonal structure, and 4 infrared emitting plates and 4 corresponding infrared receiving plates are sequentially arranged on the periphery of the step tray.
Compared with the prior art, the technical scheme provided by the invention has the following beneficial effects or advantages:
the method and the device for detecting the steps solve the problem that the step position and the step moving direction of the user cannot be accurately detected in VR, and greatly improve the experience of the user; moreover, the hardware of the step detection device provided by the invention consists of conventional electronic components, so that the hardware cost is greatly reduced; can be directly fixed on the periphery of the game platform, and is very easy to install and maintain.
Drawings
FIG. 1 is a flowchart of a method for detecting steps according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a first convex polygon in an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a division of a first convex polygon according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a step detection device according to an embodiment of the present invention;
FIG. 5 is a cross-sectional view of a step detection device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As shown in fig. 1, an embodiment of the present invention provides a method for detecting steps, including:
step S1: a plurality of first detection points on the periphery of the front sole and a plurality of second detection points on the periphery of the rear sole in the detection area are obtained.
In a specific implementation process, there are many methods for obtaining a plurality of first detection points on the periphery of the front sole and a plurality of second detection points on the periphery of the rear sole in the detection region. Optionally, in an embodiment of the present invention, the method specifically includes:
sequentially scanning by a plurality of groups of infrared detection devices arranged around the detection area to respectively obtain a first shielding area shielded by the front sole and a second shielding area shielded by the rear sole;
and obtaining a plurality of first detection points according to the outer edge infrared intersection points of the first shielding area, and obtaining a plurality of second detection points according to the outer edge infrared intersection points of the second shielding area.
The infrared detection device has low cost and is convenient to install and maintain.
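By way of illustration only (this sketch is not part of the patent text), the outer-edge infrared intersection points can be pictured in the simplest case of two orthogonal emitter/receiver arrays with evenly spaced beams: the detection points fall where the outermost blocked beams of the two arrays cross. The function name, beam pitch and blocked-beam indices below are illustrative assumptions.

```python
def beam_intersections(blocked_x, blocked_y, pitch=10.0):
    """Illustrative sketch: blocked_x holds the indices of blocked beams that run
    parallel to the y-axis (each beam at x = index * pitch); blocked_y holds the
    indices of blocked beams that run parallel to the x-axis (each at y = index * pitch).
    Returns the outer-edge intersection points of the shielded area."""
    if not blocked_x or not blocked_y:
        return []
    xs = (min(blocked_x) * pitch, max(blocked_x) * pitch)
    ys = (min(blocked_y) * pitch, max(blocked_y) * pitch)
    # Corners of the axis-aligned shielded region, i.e. where the outermost
    # blocked beams of the two arrays intersect.
    return [(x, y) for x in xs for y in ys]

# Example: beams 3-5 blocked on one array, beams 7-9 on the other.
print(beam_intersections([3, 4, 5], [7, 8, 9]))
# [(30.0, 70.0), (30.0, 90.0), (50.0, 70.0), (50.0, 90.0)]
```

With four emitter/receiver pairs around an octagonal tray, as in the device described later, the same idea yields more intersection points and a tighter outline of each sole.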
After completion of step S1, step S2 is executed: the first detection points are connected in sequence to form a first convex polygon, and the second detection points are connected in sequence to form a second convex polygon. Fig. 2 is a schematic structural view of the first convex polygon.
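The patent does not state how the detection points are ordered before being connected. A minimal sketch, assuming the points already lie on a convex outline as in fig. 2, is to sort them by angle around their mean point; the function name below is illustrative.

```python
import math

def order_convex(points):
    """Sort detection points counter-clockwise around their mean point so that
    connecting them in sequence traces a convex polygon (assumes the points
    already lie on a convex outline)."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    return sorted(points, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))

print(order_convex([(30.0, 70.0), (50.0, 90.0), (30.0, 90.0), (50.0, 70.0)]))
# [(30.0, 70.0), (50.0, 70.0), (50.0, 90.0), (30.0, 90.0)]
```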
After the first convex polygon and the second convex polygon are formed, step S3 is executed: and respectively calculating to obtain a first barycentric coordinate of the first convex polygon and a second barycentric coordinate of the second convex polygon.
In a specific implementation process, in order to quickly and accurately obtain the first barycentric coordinate, a method adopted in an embodiment of the present invention specifically includes:
first, the first convex polygon is divided into a plurality of triangles, and as shown in fig. 3, the first convex polygon is divided into three triangles a, b, and c.
After the division, the gravity center and the area of each triangle are obtained through calculation respectively.
Specifically, the gravity center of triangle a is calculated as follows:
the coordinates of the three first detection points A, B and C of triangle a are A(x1, y1), B(x2, y2) and C(x3, y3);
the abscissa of the gravity center is Wa_x = (x1 + x2 + x3)/3;
the ordinate of the gravity center is Wa_y = (y1 + y2 + y3)/3.
The area of triangle a is calculated as:
Sa = |x1(y2 - y3) + x2(y3 - y1) + x3(y1 - y2)| / 2.
The gravity center (Wb_x, Wb_y) and area Sb of triangle b, and the gravity center (Wc_x, Wc_y) and area Sc of triangle c, can be obtained by the same method.
And after the gravity center and the area of each triangle are obtained, calculating and obtaining the first gravity center coordinate based on a preset algorithm according to the gravity center and the area of each triangle.
Specifically, the method for obtaining the first barycentric coordinate based on the preset algorithm includes:
and calculating the gravity center of each triangle by an area weight to obtain a weighted average value, wherein the specific calculation formula is as follows:
x=(Wax×Sa+Wbx×Sb+Wcx×Sc...)/(Sa+Sb+Sc+...);
y=(Way×Sa+Wy2×Sb+Wcy×Sc...)/(Sa+Sb+Sc+...)。
the weighted average is the first barycentric coordinate.
The second barycentric coordinate can be obtained by the same calculation.
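The calculation described above can be condensed into a single routine: fan-triangulate the convex polygon from its first vertex, then take the area-weighted average of the triangle centroids. The sketch below is illustrative; the example coordinates are not taken from the patent.

```python
def polygon_barycenter(vertices):
    """Barycenter of a convex polygon via fan triangulation from the first vertex
    and an area-weighted average of the triangle centroids."""
    x0, y0 = vertices[0]
    cx = cy = total = 0.0
    for (x1, y1), (x2, y2) in zip(vertices[1:], vertices[2:]):
        # Twice the signed area of triangle (v0, v1, v2); the constant factor
        # cancels in the weighted average, and the sign is consistent for a
        # convex fan, so it can be used directly as the weight.
        s = x0 * (y1 - y2) + x1 * (y2 - y0) + x2 * (y0 - y1)
        cx += s * (x0 + x1 + x2) / 3.0
        cy += s * (y0 + y1 + y2) / 3.0
        total += s
    return cx / total, cy / total

print(polygon_barycenter([(30.0, 70.0), (50.0, 70.0), (50.0, 90.0), (30.0, 90.0)]))
# (40.0, 80.0)
```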
After completion of step S3, step S4 is executed: and obtaining the moving direction and the moving distance of the footstep based on the first barycentric coordinate and the second barycentric coordinate.
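The patent does not give explicit formulas for step S4. One possible reading, sketched below purely as an assumption, is to take the moving direction from the rear-to-front barycenter vector of the current frame and the moving distance from the frame-to-frame displacement of the foot center (the midpoint of the two barycenters).

```python
import math

def step_motion(front_prev, rear_prev, front_now, rear_now):
    """Assumed interpretation of step S4: heading from the rear-to-front
    barycenter vector, distance from the displacement of the foot center
    (midpoint of the two barycenters) between two scans."""
    cx_prev = (front_prev[0] + rear_prev[0]) / 2.0
    cy_prev = (front_prev[1] + rear_prev[1]) / 2.0
    cx_now = (front_now[0] + rear_now[0]) / 2.0
    cy_now = (front_now[1] + rear_now[1]) / 2.0
    distance = math.hypot(cx_now - cx_prev, cy_now - cy_prev)
    # Heading angle in degrees of the rear-to-front barycenter vector.
    heading = math.degrees(math.atan2(front_now[1] - rear_now[1],
                                      front_now[0] - rear_now[0]))
    return heading, distance

print(step_motion((40.0, 80.0), (40.0, 50.0), (40.0, 95.0), (40.0, 65.0)))
# (90.0, 15.0)
```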
The step detection method provided by the embodiment of the invention can accurately detect the step position and the step moving direction of the user, and greatly improves the experience of the user.
Corresponding to the above step detection method, an embodiment of the present invention further provides a step detection device. As shown in fig. 4 and 5, the device includes a step tray 1 and a plurality of infrared detection units 2 disposed around the step tray; the infrared detection units 2 are electrically connected to a controller, and the step detection device detects the moving direction and the moving distance of the user's foot by applying the above step detection method.
In a specific implementation process, in order to obtain the step position information more accurately, the infrared detection unit 2 in the embodiment of the present invention preferably includes an infrared emitting plate 21 and an infrared receiving plate 22 symmetrically disposed on two sides of the step tray. A plurality of infrared emitting tubes are disposed side by side on the infrared emitting plate 21, and a plurality of infrared receiving tubes are disposed in sequence on the infrared receiving plate 22 at positions corresponding to the infrared emitting tubes. Arranging the infrared emitting tubes and infrared receiving tubes side by side makes the infrared coverage denser, so that the step position information can be obtained more accurately.
In a specific implementation process, the step tray 1 in the embodiment of the present invention preferably has an octagonal structure, with 4 infrared emitting plates 21 and 4 corresponding infrared receiving plates 22 disposed in sequence around its periphery. The octagonal structure allows the sole to be scanned from multiple angles, so that more first detection points and second detection points are obtained and more accurate step position information is derived.
The hardware of the step detection device provided by the embodiment of the invention consists of conventional electronic components, so that the hardware cost is greatly reduced; the device can be directly fixed on the periphery of the game platform and is very easy to install and maintain.
In one or more embodiments provided by the present invention, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and decorations can be made without departing from the principle of the present invention, and these modifications and decorations should also be regarded as the protection scope of the present invention.

Claims (7)

1. A method for detecting steps, the method comprising:
acquiring a plurality of first detection points on the periphery of a front sole and a plurality of second detection points on the periphery of a rear sole in a detection area;
the first detection points are connected in sequence to form a first convex polygon, and the second detection points are connected in sequence to form a second convex polygon;
respectively calculating to obtain a first barycentric coordinate of the first convex polygon and a second barycentric coordinate of the second convex polygon;
and obtaining the moving direction and the moving distance of the footstep based on the first barycentric coordinate and the second barycentric coordinate.
2. The method of claim 1, wherein the step of obtaining a plurality of first detection points at the periphery of the forefoot and a plurality of second detection points at the periphery of the rearfoot in the detection region comprises:
sequentially scanning by a plurality of groups of infrared detection devices arranged around the detection area to respectively obtain a first shielding area shielded by the front sole and a second shielding area shielded by the rear sole;
and obtaining a plurality of first detection points according to the outer edge infrared intersection points of the first shielding area, and obtaining a plurality of second detection points according to the outer edge infrared intersection points of the second shielding area.
3. The method of claim 1, wherein the step of calculating the first barycentric coordinate of the first convex polygon comprises:
dividing the first convex polygon into a plurality of triangles;
respectively calculating to obtain the gravity center and the area of each triangle;
and calculating to obtain the first barycentric coordinate based on a preset algorithm according to the barycenter and the area of each triangle.
4. The method of claim 3, wherein the step of calculating the first barycentric coordinate based on a predetermined algorithm comprises:
and calculating a weighted average of the gravity centers of the triangles, with each triangle's area as its weight.
5. A step detection device, comprising a step tray and a plurality of infrared detection units arranged around the step tray, the infrared detection units being electrically connected with a controller, characterized in that the step detection device detects the moving direction and the moving distance of a user's foot by applying the step detection method of any one of claims 1-4.
6. The step detection device as claimed in claim 5, wherein the infrared detection unit comprises an infrared emitting plate and an infrared receiving plate symmetrically disposed on two sides of the step tray, a plurality of infrared emitting tubes are disposed side by side on the infrared emitting plate, and a plurality of infrared receiving tubes are disposed in sequence on the infrared receiving plate at positions corresponding to the infrared emitting tubes.
7. The step detection device as claimed in claim 6, wherein the step tray has an octagonal structure, with 4 infrared emitting plates and 4 corresponding infrared receiving plates sequentially disposed around its periphery.
CN201811295929.6A 2018-11-01 2018-11-01 Step detection method and device Active CN109407848B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811295929.6A CN109407848B (en) 2018-11-01 2018-11-01 Step detection method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811295929.6A CN109407848B (en) 2018-11-01 2018-11-01 Step detection method and device

Publications (2)

Publication Number Publication Date
CN109407848A CN109407848A (en) 2019-03-01
CN109407848B (en) 2021-07-13

Family

ID=65471152

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811295929.6A Active CN109407848B (en) 2018-11-01 2018-11-01 Step detection method and device

Country Status (1)

Country Link
CN (1) CN109407848B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102495672A (en) * 2011-10-20 2012-06-13 广州市迪拓信息科技有限公司 Position judging method in touch control
CN105101400A (en) * 2014-05-16 2015-11-25 中国民用航空总局第二研究所 Unknown node positioning method of utilizing multi-hop nodes to reduce possible position area
CN105759967A (en) * 2016-02-19 2016-07-13 电子科技大学 Global hand gesture detecting method based on depth data
CN105999619A (en) * 2016-04-01 2016-10-12 厦门鑫奥力电器有限公司 Intelligent running machine and control method thereof
CN107481267A (en) * 2017-08-14 2017-12-15 华南理工大学 A kind of shooting projection interactive system and method based on binocular vision
CN108283793A (en) * 2018-03-10 2018-07-17 杭州虚现科技有限公司 A kind of Omni-mobile platform and the method that double-legged information is accurately tracked based on this platform

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10724864B2 (en) * 2014-06-17 2020-07-28 Chief Architect Inc. Step detection methods and apparatus

Also Published As

Publication number Publication date
CN109407848A (en) 2019-03-01

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No.84, Lai'an 2nd Street, Guangzhou Economic and Technological Development Zone, Guangdong 510730

Applicant after: Guangdong Yuanchuang Intelligent Technology Co.,Ltd.

Address before: No.84, Lai'an 2nd Street, Guangzhou Economic and Technological Development Zone, Guangdong 510730

Applicant before: GUANGZHOU YUANCHUANG NETWORK TECHNOLOGY Co.,Ltd.

GR01 Patent grant
CB03 Change of inventor or designer information

Inventor after: Zuo Yankun

Inventor after: Yang Ji

Inventor before: Zuo Yankun

Inventor before: Yang Yi
