CN207380667U - Augmented reality interactive system based on radar eye - Google Patents


Info

Publication number
CN207380667U
CN207380667U (Application No. CN201720994909.2U)
Authority
CN
China
Prior art keywords
module
data
radar
communication module
radar eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201720994909.2U
Other languages
Chinese (zh)
Inventor
Wang Rui (王锐)
Qian Xuelei (钱学雷)
Zhao Yang (赵阳)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Shenmigu Digi Tech Co ltd
Original Assignee
Suzhou Mysterious Valley Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Mysterious Valley Digital Technology Co Ltd
Priority to CN201720994909.2U
Application granted
Publication of CN207380667U
Legal status: Active, Current
Anticipated expiration

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The utility model discloses an augmented reality interactive system based on a radar eye. The system comprises: a radar eye measurement module, connected with the communication module, which detects the distance between objects in the interactive environment and the radar eye and their positions relative to the radar eye, obtains interactive environment image data, and transmits these data to the data module; a touch display module, connected with the communication module, which transmits user interaction information to the data module and receives and displays the interaction result data returned in response; a communication module, connected with the radar eye measurement module, the touch display module and the data module, which supports communication among them; and a data module, connected with the communication module, which identifies the motion features of objects in the interactive environment and transmits the recognition results to the touch display module for display. The utility model features high immersion, real-time interaction and high precision, supports user gesture input, and can support multi-point gesture recognition and interaction between external objects and the virtual environment.

Description

Augmented reality interactive system based on radar eye
Technical field
The utility model belongs to the field of augmented reality, and more particularly relates to an augmented reality interactive system based on a radar eye.
Background technology
Among the many forms of interactive display, the touch screen is the most widely used and best known. As market demand grows, the required touch screen sizes keep increasing, but capacitive screens, which offer the most stable touch performance, cannot exceed 50 inches due to limitations of technology and manufacturing processes. This imbalance between supply and demand has driven the separation of the screen from the touch system. Mid-air touch systems have emerged; most current applications judge a person's body language based on infrared sensing to interact with the on-screen picture. However, infrared light is easily disturbed by ambient light, causing false triggers or failures to respond, which seriously affects usability, so the usage scenarios are limited.
With the development of Internet of Things technology and the progress of virtual reality technology, people's forms of interaction with the digital world and electronic products are no longer limited to mouse, keyboard and touch controllers. Users increasingly favor immersive, highly participatory augmented reality or mixed reality, in which the user's own limb movements or gestures control objects or content in the real or virtual environment. Existing approaches such as eye-tracking control, blink control and motion sensing require special cameras and image recognition processing, which are costly and have relatively low recognition rates; wearable capture devices are accurate but too expensive for ordinary augmented reality applications. Therefore, there is a need for an augmented reality interactive system with high immersion, real-time interaction and high precision that supports user gesture input, multi-point gesture recognition, and interaction between external objects and the virtual environment.
Utility model content
To solve the above problems in the prior art, the utility model proposes an augmented reality interactive system based on a radar eye.
The augmented reality interactive system based on a radar eye proposed by the utility model includes a radar eye measurement module, a touch display module, a communication module and a data module, wherein:
The radar eye measurement module is connected with the communication module and is used for detecting the distance between objects in the interactive environment and the radar eye and their positions relative to the radar eye, obtaining interactive environment image data, and transmitting the position detection data, the interactive environment image data and the radar eye parameters to the data module through the communication module;
The touch display module is connected with the communication module and is used for transmitting user interaction information to the data module through the communication module, and for receiving and displaying the interaction result data sent by the data module in response to user operations;
The communication module is connected with the radar eye measurement module, the touch display module and the data module, and is used for supporting communication among the radar eye measurement module, the touch display module and the data module;
The data module is connected with the communication module, and is used for identifying the motion features of objects in the interactive environment and transmitting the recognition results through the communication module to the touch display module for display.
Optionally, the radar eye measurement module includes at least one radar eye device.
Optionally, the at least one radar eye device is placed in cascade.
Optionally, the at least one radar eye device is placed in parallel in sequence.
Optionally, the communication module is a near-field communication module and/or a long-distance communication module.
Optionally, the data module includes a data input module, a data processing module and a data output module, wherein:
The data input module is connected with the radar eye measurement module and the touch display module through the communication module, and is used for obtaining position detection data, interactive environment image data, radar eye parameters and user interaction information;
The data processing module is connected with the data input module and is used for processing the data obtained by the data input module;
The data output module is connected with the data processing module, and is used for outputting the result data processed by the data processing module through the communication module to the touch display module for display.
Optionally, the data processing module includes an image processing circuit, an image-of-interest recognition circuit, a feature judgment circuit and a data processing circuit, wherein:
The image processing circuit is used for converting the interactive environment image data generated by the radar eye measurement module into image data convenient for processing;
The image-of-interest recognition circuit is connected with the image processing circuit and is used for obtaining the image data information of objects of interest in the interactive environment;
The feature judgment circuit is connected with the image-of-interest recognition circuit, and is used for judging the feature data of objects of interest based on their image data information in the interactive environment;
The data processing circuit is connected with the feature judgment circuit, and is used for processing the obtained feature data and transmitting the processed feature data through the data output module to the touch display module for display.
Optionally, the objects of interest include interaction objects and/or operator gestures.
Optionally, the feature data of the objects of interest include gesture and/or motion information.
Optionally, the touch display module is a touch-sensitive liquid crystal screen.
The utility model features high immersion, real-time interaction and high precision, supports user gesture input, and can support multi-point gesture recognition and interaction between external objects and the virtual environment.
Description of the drawings
Fig. 1 is a structural block diagram of the augmented reality interactive system based on a radar eye according to one embodiment of the utility model.
Fig. 2 is a schematic diagram of the recognition region of cascaded radar eye devices according to one embodiment of the utility model.
Specific embodiment
To make the purpose, technical solution and advantages of the utility model more clearly understood, the utility model is further described below with reference to specific embodiments and the accompanying drawings.
Fig. 1 is a structural block diagram of the augmented reality interactive system based on a radar eye according to one embodiment of the utility model. As shown in Fig. 1, the system comprises a radar eye measurement module, a touch display module, a communication module and a data module, wherein:
The radar eye measurement module is connected with the communication module and is used for detecting the distance between objects in the interactive environment and the radar eye and their positions relative to the radar eye, obtaining interactive environment image data, and transmitting the position detection data, the interactive environment image data and the radar eye parameters to the data module through the communication module for object motion recognition processing;
The touch display module is connected with the communication module and is the important part of the interactive system that displays the interaction results and interaction objects of the augmented reality interactive system. It is used for transmitting user interaction information to the data module through the communication module, provides data support for the data module to analyze and process external gesture operations and touch operations, and at the same time receives and displays the interaction result data sent by the data module in response to user operations;
The communication module mainly handles external data input and the output of internal processing results in the interactive system, and is used for supporting communication among the radar eye measurement module, the touch display module and the data module;
The data module is the core of the augmented reality interactive system, and is used for identifying the motion features of objects in the interactive environment and transmitting the recognition results through the communication module to the touch display module for display.
In one embodiment of the utility model, the radar eye measurement module includes at least one radar eye device.
In one embodiment of the utility model, the communication module may be a near-field communication module, a long-distance communication module, or both a near-field communication module and a long-distance communication module.
In one embodiment of the utility model, the touch display module is a touch-sensitive display screen such as a touch-sensitive liquid crystal screen.
The data module further comprises a data input module, a data processing module and a data output module, wherein:
The data input module is connected with the radar eye measurement module and the touch display module through the communication module, and is used for obtaining position detection data, interactive environment image data, radar eye parameters and user interaction information;
The data processing module is connected with the data input module and is used for processing the data obtained by the data input module, mainly the interactive environment image data acquired by scanning of the radar eye measurement module and the touch point data of touch operations on the touch screen;
The data output module is connected with the data processing module, and is used for outputting the result data processed by the data processing module through the communication module to the touch display module for display.
The data processing module includes an image processing circuit, an image-of-interest recognition circuit, a feature judgment circuit and a data processing circuit, wherein:
The image processing circuit is used for converting the interactive environment image data generated by the radar eye measurement module into image data convenient for processing;
The image-of-interest recognition circuit is used for obtaining the image data information of objects of interest in the interactive environment, where the objects of interest include interaction objects, operator gestures and the like. The image-of-interest recognition circuit can use the Blob recognition method, which obtains the image data information of objects of interest in an environment by rejecting the background image;
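The Blob idea above — rejecting the background image and keeping connected foreground regions — can be sketched as follows. This is a minimal pure-Python illustration, not the patent's implementation; the grid representation, threshold and minimum area are illustrative assumptions:

```python
import numpy as np
from collections import deque

def blob_detect(frame, background, thresh=10, min_area=3):
    """Background-rejection blob detection (simplified Blob-method sketch).

    frame, background: 2D arrays of scan samples.
    Returns a list of blobs, each with its pixel count and centroid.
    """
    # Reject the background: keep cells that differ strongly from it.
    fg = np.abs(frame.astype(float) - background.astype(float)) > thresh
    labels = np.zeros(fg.shape, dtype=int)
    blobs, next_label = [], 0
    h, w = fg.shape
    for i in range(h):
        for j in range(w):
            if fg[i, j] and labels[i, j] == 0:
                # Flood-fill one 4-connected foreground component.
                next_label += 1
                pixels, q = [], deque([(i, j)])
                labels[i, j] = next_label
                while q:
                    y, x = q.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and fg[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = next_label
                            q.append((ny, nx))
                if len(pixels) >= min_area:
                    ys, xs = zip(*pixels)
                    blobs.append({"area": len(pixels),
                                  "centroid": (sum(ys) / len(ys), sum(xs) / len(xs))})
    return blobs

# A static background with one 3x3 "object of interest" added to the frame.
bg = np.zeros((10, 10))
fr = bg.copy()
fr[2:5, 3:6] = 50
print(blob_detect(fr, bg))  # one blob: area 9, centroid (3.0, 4.0)
```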
The feature judgment circuit is used for judging features of objects of interest, such as gestures and motions, based on the image data information of the objects of interest in the interactive environment;
The feature judgment circuit first judges the left and right hands of the interacting person in the interactive environment, and then analyzes multiple frames of interactive environment image data in combination with the Kalman prediction algorithm to further identify the person's gestures, such as dragging, zooming and rotating. For example, based on the obtained image data combined with the coordinate data of touch points on the touch screen, it can analyze and identify the gesture information of the interacting person and the spatial position information of external interaction objects.
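The Kalman prediction step mentioned above can be illustrated with a minimal constant-velocity filter over a 2D point trajectory. The state layout and the noise parameters q and r are illustrative assumptions, not values from the patent:

```python
import numpy as np

class KalmanCV:
    """Constant-velocity Kalman filter for one 2D touch/gesture point."""

    def __init__(self, x0, y0, dt=1.0, q=1e-3, r=1e-1):
        self.x = np.array([x0, y0, 0.0, 0.0])   # state: px, py, vx, vy
        self.P = np.eye(4)
        self.F = np.array([[1, 0, dt, 0],        # constant-velocity motion model
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], float)
        self.H = np.array([[1, 0, 0, 0],         # we only measure position
                           [0, 1, 0, 0]], float)
        self.Q = q * np.eye(4)
        self.R = r * np.eye(2)

    def step(self, zx, zy):
        # Predict, then correct with the measured point.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        z = np.array([zx, zy])
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                        # filtered position

    def predict_next(self):
        return (self.F @ self.x)[:2]             # one-frame-ahead prediction

# Feed a rightward drag of one pixel per frame; after a few frames the
# filter predicts continued motion to the right.
kf = KalmanCV(0.0, 0.0)
for t in range(1, 8):
    kf.step(float(t), 0.0)
nx, ny = kf.predict_next()
print(round(nx, 1), round(ny, 1))  # next point predicted near x = 8, y = 0
```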
The data processing circuit is used for processing the obtained feature data and transmitting the processed feature data through the data output module to the touch display module for display.
The feature judgment circuit is further configured to distinguish whether an operation is a single-point operation or a continuous gesture operation according to a preset touch contact time threshold t. When an operation is judged to be a single-point operation, the user's touch operation is matched against the preset single-point operation gestures; when a gesture is matched successfully, the corresponding interaction result response is generated and sent through the communication module to the touch display module for display. Continuous gesture operations can be divided into single-point and multi-point continuous gesture operations. The basic principle is to draw the touch-point movement trajectory from the set of touch-point data, and to analyze and match the operator's gesture type, such as dragging the displayed scene or zooming the displayed scene, from the coordinates of the touch points at different times combined with the Kalman prediction method. Some gesture recognition can also realize simulated mouse operations. The generated interaction result response is likewise sent through the communication module to the touch display module for display.
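The single-point versus continuous-gesture decision based on the contact-time threshold t can be sketched roughly as follows; the threshold values, the travel tolerance, and the gesture labels are hypothetical, and real gesture matching would be far richer:

```python
def classify_touch(events, t_threshold=0.2, move_eps=5.0):
    """Classify one contact as a single-point 'tap' or a continuous gesture.

    events: list of (timestamp_seconds, x, y) samples for one contact.
    t_threshold: contact-time threshold t from the description (value assumed);
    move_eps: maximum travel in pixels for a tap (assumed).
    """
    if not events:
        return "none"
    duration = events[-1][0] - events[0][0]
    dx = events[-1][1] - events[0][1]
    dy = events[-1][2] - events[0][2]
    travel = (dx * dx + dy * dy) ** 0.5
    # Short contact with little travel -> single-point operation.
    if duration <= t_threshold and travel <= move_eps:
        return "tap"
    # Otherwise a continuous gesture; crude dominant-direction test.
    if abs(dx) > abs(dy):
        return "drag-horizontal"
    return "drag-vertical"

print(classify_touch([(0.00, 100, 100), (0.05, 101, 100)]))  # tap
print(classify_touch([(0.00, 100, 100), (0.50, 220, 104)]))  # drag-horizontal
```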
Specifically, the feature judgment circuit can be configured to recognize regular-shaped objects such as circles, rectangles and five-pointed stars. From the interactive environment image data detected by the radar eye measurement module, the boundary information of an object in the interactive environment can be extracted using an image binarization method. Any shape can accordingly be expressed as a plane curve f in polar coordinates, where f is a multivalued function satisfying f(0) = f(2π). For two shapes of the same type, the corresponding plane curves f1 and f2 are related as shown below:

f1(x) = a · f2((x + b) mod 2π)

where x ∈ [0, 2π], b ∈ [0, 2π], and a represents the volume or area ratio of the two figures of the same shape type.
Thus the polar-coordinate contour curve function f of a regular shape can be obtained. Using the parallel-line cutting method, a class interval can be intercepted on this polar-coordinate contour curve with the parallel line r = r2: with r = f(θ) the equivalence-class multivalued function of an arbitrary curve, θ ∈ [0, 2π], the intercepted interval is the set of θ positions on the curve where r ≥ r2. Through the parallel-line cutting method, the shape distance between two shapes can be obtained, from which the similarity of the two shapes can be judged. The shape distance function can be expressed as:

d(fi, gi) = (1/I) · Σ(k=1..I) |fi(θk) − gi(θk)|

This function can be regarded as a discriminant function for judging whether two figures are similar or of the same type. Here gi denotes the polar-coordinate contour curve function of the object whose shape is to be judged, fi denotes the known contour curve function of a determined shape, and I denotes the number of corresponding height values taken on the two plane curves, that is, the number of sampled values of r in the equivalence-class function r = f(θ). The larger I is, the more precisely the two curves are characterized, and the confidence of the similarity value depends on the number of samples; for a given sample number I, a smaller shape distance means the two shapes are closer. In practical applications, a suitable value of I and a precision threshold T can be chosen according to the required accuracy and the size of the object shape; when d(fi, gi) ≤ T, the object to be judged and the compared object are considered substantially similar in shape. Objects with regular shapes in the interactive environment can be recognized by the above method and then classified and stored. During interaction, the position and other operation information of a recognized object can then be processed according to its unique ID, thereby speeding up data processing and interaction response.
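The shape-distance comparison can be illustrated with a small sketch that samples two contours into polar functions and averages their pointwise differences. The single-valued nearest-angle sampling and the averaged form of d are simplifying assumptions on top of the description above:

```python
import math

def polar_contour(points, center, n_samples=36):
    """Sample a closed contour into a polar function r = f(theta).

    points: contour points (x, y); center: a reference point inside the shape.
    Returns r at n_samples evenly spaced angles (nearest-angle match) —
    a single-valued approximation of the multivalued f in the description.
    """
    cx, cy = center
    polar = [(math.atan2(y - cy, x - cx) % (2 * math.pi),
              math.hypot(x - cx, y - cy)) for x, y in points]
    samples = []
    for k in range(n_samples):
        theta = 2 * math.pi * k / n_samples
        # take r from the contour point nearest in angle (wrap-aware)
        r = min(polar,
                key=lambda p: min(abs(p[0] - theta),
                                  2 * math.pi - abs(p[0] - theta)))[1]
        samples.append(r)
    return samples

def shape_distance(f, g):
    """d(f, g) = (1/I) * sum_k |f(theta_k) - g(theta_k)| over I sampled angles."""
    assert len(f) == len(g)
    return sum(abs(a - b) for a, b in zip(f, g)) / len(f)

# Two circles of radius 10 and 10.5 around the origin: small shape distance,
# so they would pass a threshold test like d <= T with T = 1.0.
circle1 = [(10.0 * math.cos(2 * math.pi * t / 50),
            10.0 * math.sin(2 * math.pi * t / 50)) for t in range(50)]
circle2 = [(10.5 * math.cos(2 * math.pi * t / 50),
            10.5 * math.sin(2 * math.pi * t / 50)) for t in range(50)]
f = polar_contour(circle1, (0, 0))
g = polar_contour(circle2, (0, 0))
print(shape_distance(f, g))  # 0.5 (each sample differs by the radius gap)
```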
Since the recognition range of a current radar eye is a sector, with a maximum recognition angle of 270° and a maximum recognition distance of 40 m, the maximum refresh rate of a single radar eye device used in the interactive system is 40 Hz, its maximum recognition range is 40 m × 20 m, and its resolution is 1 cm to 10 cm. The recognition region of a single radar eye device usually cannot cover the entire interaction area, so the interactive system places multiple radar eye devices in parallel in sequence and cascades them to form a larger recognition range. For example, two cascaded radar eye devices form a new rectangular recognition region; the objects to be interacted with must be located within this rectangular region. The width of the rectangular recognition region is related to the height of the overlapping region of the cascaded devices, and its length is related to the positions of the cascaded devices, the distance between them, and the recognizable angular range of the cascaded devices. The rectangular recognition region is shown in Fig. 2: A denotes the recognizable interaction area formed by one radar eye device, i.e., the region enclosed by the left arc; B denotes the recognizable interaction area formed by the other radar eye device, i.e., the region enclosed by the right arc; and C denotes the new recognizable interaction area formed by cascading the two radar eye devices, i.e., the region enclosed by the rectangular frame.
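The sizing of the rectangular region C can be illustrated with a simplified geometric sketch that accounts only for the devices' range, not their 270° angular limits or mounting orientation (the symmetric layout and sample dimensions are assumptions):

```python
import math

def cascade_rectangle(separation, rng, width):
    """Maximum height of a rectangle covered by BOTH of two cascaded radar eyes.

    The two devices sit at (0, 0) and (separation, 0); a point is covered when
    it lies within range `rng` of both. For a rectangle of the given width
    centered between the devices, the far corners are the binding constraint.
    """
    half = separation / 2 + width / 2   # x-distance from a device to the far corner
    if half >= rng:
        return 0.0                      # too wide to be covered by both devices
    return math.sqrt(rng ** 2 - half ** 2)

# Two 40 m range devices placed 10 m apart, rectangle 20 m wide:
h = cascade_rectangle(10.0, 40.0, 20.0)
print(round(h, 2))  # 37.08 — far corner at 15 m x-offset: sqrt(40^2 - 15^2)
```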
A cascaded device region needs to record the identification number and the spatial position information of each radar eye device. In region C, the image data generated by the sensing scans of the cascaded radar eye devices is denoised, yielding the distance and angle of an object of interest in region C relative to the cascaded radar eye devices. In the data module, the spatial position information of the corresponding radar eye device can be obtained from its identification number, and the absolute spatial position of the object of interest can be computed from the radar ray emitted by the radar eye together with its angle to the horizontal plane. For a moving object, the movement trajectory of the object of interest can be obtained by analyzing its spatial position across different data frames. The obtained spatial coordinates are converted into screen coordinates of the touch display module and output through the data output module to the external touch display module for display.
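The polar-to-world-to-screen conversion described above can be sketched as follows; the device pose and the world-to-screen mapping parameters are illustrative assumptions, not values from the description:

```python
import math

def radar_to_screen(device_pos, device_heading, r, theta,
                    world_origin, scale, screen_h):
    """Convert a radar (range, bearing) detection to screen pixel coordinates.

    device_pos: radar eye position in world metres (looked up by device ID);
    device_heading: its mounting angle in radians;
    (r, theta): measured distance and bearing of the object of interest;
    world_origin, scale, screen_h: world-to-screen mapping (assumed values).
    """
    # Polar measurement -> absolute world position of the object.
    ang = device_heading + theta
    wx = device_pos[0] + r * math.cos(ang)
    wy = device_pos[1] + r * math.sin(ang)
    # World position -> screen pixels (y axis flipped for screen space).
    sx = (wx - world_origin[0]) * scale
    sy = screen_h - (wy - world_origin[1]) * scale
    return round(sx), round(sy)

# Object 5 m straight ahead of a radar eye at world (2, 0) facing +y,
# mapped onto a 1080-pixel-high screen at 100 px/m with origin (0, 0):
print(radar_to_screen((2.0, 0.0), math.pi / 2, 5.0, 0.0,
                      (0.0, 0.0), 100.0, 1080))  # (200, 580)
```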
The specific embodiments described above further explain the purpose, technical solution and beneficial effects of the utility model in detail. It should be understood that the above are merely specific embodiments of the utility model and are not intended to limit it; any modification, equivalent substitution, improvement and the like made within the spirit and principle of the utility model shall be included within the scope of protection of the utility model.

Claims (6)

1. An augmented reality interactive system based on a radar eye, characterized in that the system comprises at least one radar eye device, a touch screen, a communication module and a data module, wherein:
The radar eye device is connected with the communication module;
The touch screen is connected with the communication module;
The communication module is connected with the radar eye device, the touch screen and the data module;
The data module is connected with the communication module.
2. The system according to claim 1, characterized in that the at least one radar eye device is placed in cascade.
3. The system according to claim 1 or 2, characterized in that the at least one radar eye device is placed in parallel in sequence.
4. The system according to claim 1, characterized in that the communication module is a near-field communication module and/or a long-distance communication module.
5. The system according to claim 1, characterized in that the data module includes a data input module, a data processing module and a data output module, wherein:
The data input module is connected with the radar eye device and the touch screen through the communication module;
The data processing module is connected with the data input module;
The data output module is connected with the data processing module.
6. The system according to claim 5, characterized in that the data processing module includes an image processing circuit, an image-of-interest recognition circuit, a feature judgment circuit and a data processing circuit, wherein:
The image-of-interest recognition circuit is connected with the image processing circuit;
The feature judgment circuit is connected with the image-of-interest recognition circuit;
The data processing circuit is connected with the feature judgment circuit.
CN201720994909.2U 2017-08-10 2017-08-10 Augmented reality interactive system based on radar eye Active CN207380667U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201720994909.2U CN207380667U (en) 2017-08-10 2017-08-10 Augmented reality interactive system based on radar eye

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201720994909.2U CN207380667U (en) 2017-08-10 2017-08-10 Augmented reality interactive system based on radar eye

Publications (1)

Publication Number Publication Date
CN207380667U true CN207380667U (en) 2018-05-18

Family

ID=62342477

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201720994909.2U Active CN207380667U (en) 2017-08-10 2017-08-10 Augmented reality interactive system based on radar eye

Country Status (1)

Country Link
CN (1) CN207380667U (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111773658A (en) * 2020-07-03 2020-10-16 珠海金山网络游戏科技有限公司 Game interaction method and device based on computer vision library
CN111773658B (en) * 2020-07-03 2024-02-23 珠海金山数字网络科技有限公司 Game interaction method and device based on computer vision library
CN112904999A (en) * 2020-12-30 2021-06-04 江苏奥格视特信息科技有限公司 Augmented reality somatosensory interaction method and system based on laser radar


Legal Events

Date Code Title Description
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230904

Address after: Room 2639, 2nd Floor, Building 17, No. 17 Tianzhu Jiayuan, Tianzhu Town, Shunyi District, Beijing, 101312

Patentee after: BEIJING SHENMIGU DIGI-TECH Co.,Ltd.

Address before: 215100, No. 58 Nantiancheng Road, Suzhou High-Speed Rail New Town, Jiangsu

Patentee before: SUZHOU MANAVR DIGITAL TECHNOLOGY CO.,LTD.

TR01 Transfer of patent right