CN108170277A - A kind of device and method of intelligent visual interaction - Google Patents
A kind of device and method of intelligent visual interaction
- Publication number
- CN108170277A (application CN201810014215.7A)
- Authority
- CN
- China
- Prior art keywords
- terminal
- unit
- screen
- radar
- processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
Abstract
The invention discloses a device for intelligent visual interaction, comprising a first terminal, a first processor, a radar device and/or several second terminals. The first terminal is the target terminal of interactive control and comprises a display screen unit, a processor unit and/or a screen representation and transmission unit. The first processor comprises a device access service unit, a parameter configuration unit, a coordinate calculation unit, a storage unit and a gesture recognition calculation unit. The radar device comprises a radar unit, a radar power supply unit, a radar pulse transmitting and receiving unit and a radar signal acquisition unit. The second terminal comprises a display screen unit and a processor unit. The invention belongs to the technical field of human-computer interaction devices. To overcome the technical problems in the prior art of intelligent interaction with non-touch screens and of virtual visualization of human-computer interaction, the invention proposes a device and method for intelligent visual interaction that achieves accurate and real-time human-computer interaction control.
Description
Technical field
The invention belongs to the technical field of human-computer interaction devices, and more particularly to intelligent interconnection and interaction technology in the Internet of Things field; in particular, it relates to a device and method for intelligent visual interaction.
Background technology
Existing human-computer interaction devices include the early traditional input devices such as keyboards and mice connected to a host processor, and the touch devices built into smart devices. For example, Chinese utility model patent CN201510567852, "A device air-gap control method", discloses measuring and calculating hand gesture motion with a 60 GHz millimeter-wave radar integrated into a wearable device, so that the user can operate and control the device without a stylus or touch screen. Integrating radar into a wearable device for interaction in this way has the following defects. First, radar ranging has a blind zone: no signal is acquired within a certain range, typically within ten-odd centimeters (refer to radar technical specifications and engineering practice for details); therefore, when the distance between the gesture and the radar device integrated in the wearable device falls within the blind zone, or when the multi-point interaction coordinates of a gesture are collinear with the origin of the radar device, such a scheme cannot recognize the gesture correctly in either of these two situations. Second, the distance information of the hand motion is measured by a pulsed radar signal and the velocity information by a continuous-wave radar; fusing the two kinds of information into the spatial perception of the VR video played by the wearable device is extremely difficult to do accurately and synchronously, which causes a degree of lag and the dizziness typical of VR experiences. Third, the technical solution cannot achieve a what-you-see-is-what-you-get visual presentation, so the interactive control experience is poor.
In addition, existing conventional input devices and control methods cannot meet the needs of Internet of Things development. Traditional mice, keyboards and styluses must be carried separately and physically wired; smartphones, tablets and other devices based on screen touch control are limited by device screen size and finger contact, and when there is moisture or sweat on the hand, gesture positioning and recognition efficiency degrade sharply. A new kind of input device is therefore needed that breaks away from the touch-interaction limitations of current smart devices and meets the demands of a wider range of application scenarios.
Invention content
To overcome the above technical problems in the prior art of intelligent interaction with non-touch screens and of virtual visualization of human-computer interaction, the present invention proposes a device and method for intelligent visual interaction that achieves accurate and real-time human-computer interaction control.
The technical solution adopted by the present invention is as follows. The intelligent visual interaction device of the present invention comprises a first terminal, a first processor, a radar device and/or several second terminals.
The first terminal is the target terminal of interactive control and comprises a display screen unit, a processor unit and/or a screen representation and transmission unit. The first terminal is connected to the first processor by serial port and/or network. The processor unit is responsible for serial and/or network communication, calculation processing and presentation of interaction content; its serial/network communication and processing module includes a unit for receiving interactive control commands. The calculation processing module obtains the logical screen resolution, reports it to the first processor, and executes and processes interactive instructions; the logical screen resolution includes horizontal and vertical sizes, in pixels. The interaction content presentation module renders the interaction content for display on the first terminal's screen. The screen representation and transmission unit computes the operable elements in the user interface and/or encodes the entire user interface content as images or video, and transmits one of these two representations to the first processor.
The first processor comprises a device access service unit, a parameter configuration unit, a coordinate calculation unit, a storage unit and a gesture recognition calculation unit. The device access service unit is responsible for connecting the first terminal and at least one second terminal. The parameter configuration unit receives the logical resolution of the first terminal, the physical screen resolution of at least one second terminal and the assembly parameter information of the radar device, and issues them to the first processor system for calculation. The coordinate calculation unit maps each coordinate point in the coordinate set of the hand motion to the corresponding coordinate point under the first terminal's logical resolution, based on the logical resolution of the first terminal, the physical screen resolution of the second terminal, the position type of the radar device's mounting point and its position coordinates relative to the top-left vertex of the second terminal's physical screen. The storage unit stores the results of the coordinate calculation unit as coordinate point sets at different points in continuous time, together with predefined gesture information data and the action meaning corresponding to each gesture. The gesture recognition calculation unit computes the current gesture motion type from the change of the stored coordinate points along the time axis, with reference to the gesture information predefined in the storage unit.
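The mapping performed by the coordinate calculation unit can be sketched as a simple resolution rescaling. This is a minimal illustration under assumptions: the patent lists the inputs (physical resolution, logical resolution, mount offset) but gives no formula, and the helper name is hypothetical.

```python
def to_logical(x_phys, y_phys, phys_w, phys_h, logical_w, logical_h):
    """Scale a hand-motion point from the second terminal's physical
    screen resolution to the first terminal's logical resolution.
    (Hypothetical helper; the patent does not specify the arithmetic.)"""
    return (round(x_phys * logical_w / phys_w),
            round(y_phys * logical_h / phys_h))

# e.g. a point at the centre of a 3840x2160 physical screen maps to the
# centre of a 1920x1080 logical screen:
print(to_logical(960, 540, 3840, 2160, 1920, 1080))  # → (480, 270)
```

In a fuller implementation the radar mount offset (x0, y0) would first be subtracted to translate radar-relative coordinates into screen coordinates before this rescaling step.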
The radar device comprises a radar unit, a radar power supply unit, a radar pulse transmitting and receiving unit and a radar signal acquisition unit. The radar unit includes the physical radar hardware body, an electric motor module and a serial or network access module. The electric motor module rotates at a specified working frequency to complete scanning over a certain angular range of freedom; the serial or network access module provides the network communication and transmission of control signals and radar measurement data between the radar device and the first processor. The radar power supply unit is the power input module that drives the radar module. The radar pulse transmitting and receiving unit sends laser pulse signals and receives the reflected signals, so as to measure the distance of the hand motion. The radar signal acquisition unit acquires the coordinates of each point of the hand motion and computes the resulting coordinate set.
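The pulse-ranging step above is a time-of-flight measurement: distance is half the round-trip travel of the reflected pulse. A minimal sketch, assuming laser-pulse propagation at the speed of light (the patent names pulse ranging but gives no formula; the function name is an assumption):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second, in air (approximately)

def pulse_distance(round_trip_time_s):
    """Distance from the radar to the hand, from the round-trip time
    of a reflected laser pulse (time-of-flight ranging)."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0
```

For a hand one metre away the round trip is only about 6.7 nanoseconds, which is why practical scanning lidars of this class often measure by triangulation or phase shift instead; the sketch shows the time-of-flight relation only.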
The second terminal comprises a display screen unit and a processor unit. The second terminal is connected to the radar device by serial port and/or network communication, while the second terminal and the first processor are connected over the network. The first function of the display screen unit is to show the content of the first terminal's user interface sent over by the first processor; the second function is that this display screen is also used for human-computer interaction input control, the first function providing a what-you-see-is-what-you-get virtual interactive interface for the second. The processor unit receives and processes the content of the first terminal's user interface sent by the first processor, calibrates and obtains the assembly parameter information of the radar device, and obtains the coordinate set of the hand motion. The assembly parameter information of the radar device indicates the position of the radar device's mounting point as coordinates relative to the top-left vertex of the second terminal's physical screen.
The invention also discloses a method of intelligent visual interaction. The method uses a radar shadow compensation technique to solve the occlusion problem of linear multi-point interaction in the prior art; the interaction area is covered accurately and without any dead angle. The radar device may be mounted at any position around the second terminal's display screen, which, particularly for extra-large LED mosaic screens, frees the equipment from site and space constraints on installation and deployment.
Further, the method is based on the scanning range region of the radar device, whose scanning range is expressed as [Dmin, Dmax]. The assembly parameter information of the radar device calibrated by the second terminal indicates the position of the radar device's mounting point as coordinates (x0, y0) relative to the top-left vertex of the second terminal's physical screen. These relative coordinates are determined by calculation so that the scanning range [Dmin, Dmax] of the radar device covers the interactive input control screen area of the second terminal. In particular, when the radar device's mounting point is in a horizontal position, measurement and calculation must ensure that x0 is at least Dmin; similarly, for a vertical mounting position, measurement and calculation must ensure that y0 is at least Dmin. If the maximum distance of the scanned screen area exceeds Dmax, a second radar device must be installed on the symmetrically opposite side of the screen to ensure that scanning detection covers the interaction area of the entire second terminal display screen. The method can compute and represent coordinates accurately down to pixel-level units.
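The two placement constraints above — the mount offset must clear the blind zone Dmin, and a second radar is needed when the farthest screen point exceeds Dmax — can be checked with a short sketch. Assumptions: a horizontal-edge mount, screen dimensions and offsets in the same length unit, and a hypothetical helper name; the patent states the constraints but not this code.

```python
import math

def placement_check(screen_w, screen_h, x0, y0, d_min, d_max):
    """Check a radar mount at offset (x0, y0) from the screen's top-left
    vertex against the scan range [d_min, d_max].
    Returns (blind_zone_ok, needs_second_radar)."""
    blind_zone_ok = x0 >= d_min  # horizontal mount: x0 must be at least Dmin
    # farthest screen corner from the mounting point
    corners = [(0, 0), (screen_w, 0), (0, screen_h), (screen_w, screen_h)]
    farthest = max(math.hypot(cx - x0, cy - y0) for cx, cy in corners)
    needs_second_radar = farthest > d_max
    return blind_zone_ok, needs_second_radar
```

For example, a 2 m x 1 m screen with the radar mounted 0.2 m past the top-left corner, Dmin = 0.15 m and Dmax = 3 m, passes both checks; shrinking Dmax to 2 m flags the need for a second, symmetrically placed radar.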
Further, the method makes full use of the scanning working frequency of the radar device. During interactive control, the acquisition frequency of the radar device's electric motor is computed dynamically over each period. When the measured acquisition frequency is lower than the target working frequency set for human-computer interaction control, for example 25 Hz, the second terminal adjusts the motor speed setting upward through the motor's speed interface parameter. The set motor speed and the actual running working frequency thus form a closed feedback control loop, and the actual running frequency eventually reaches a stable state, i.e. the set target working frequency. The technical solution of the present invention therefore achieves the technical effect of accurate, delay-free human-computer interaction control.
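The feedback loop above can be sketched as a simple proportional controller. This is an illustration under assumptions — the patent describes a closed loop but no control law; the gain, the 1:1 speed-to-frequency model and the function name are all hypothetical:

```python
def adjust_motor_speed(measured_hz, target_hz, speed_setting, gain=0.5):
    """One step of the feedback loop: move the motor speed setting
    toward the target in proportion to the frequency error."""
    return speed_setting + gain * (target_hz - measured_hz)

# Simulated convergence toward the 25 Hz target, assuming (hypothetically)
# one hertz of scan frequency per unit of speed setting:
speed = 20.0
for _ in range(30):
    measured = speed  # stand-in for the dynamically measured acquisition frequency
    speed = adjust_motor_speed(measured, 25.0, speed)
# speed is now very close to 25.0 — the stable state the patent describes
```

A production controller would also clamp the setting to the motor's rated range and filter the frequency measurement, but the proportional step already shows why the loop settles at the target.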
Further, the present invention can achieve the technical effect of intelligent visualized human-computer interaction and control. Specifically, the screen representation and transmission unit of the first terminal computes the user interface content of the target terminal of interactive control, i.e. the first terminal. The screen representation and transmission unit is responsible for computing the operable elements in the user interface and/or encoding the entire user interface content as images or video, and transmits one of these two representations to the first processor. The first processor transmits the user interface content to the second terminal; the second terminal processes the received user interface content and displays it on its own display screen, thereby achieving the technical effect of visualized human-computer interaction.
Preferably, when the first terminal's screen resolution is greater than 1920x1080, the method used by the present invention is that the screen representation and transmission unit of the first terminal computes the operable elements in the user interface, applies gray-value sampling, and then encodes the user interface content as images or video; when the first terminal's screen resolution is less than 1920x1080, the screen representation and transmission unit of the first terminal computes the operable elements in the user interface, applies color sampling, and then encodes the user interface content as images or video.
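The resolution-based choice above reduces to a single threshold test. A minimal sketch (hypothetical helper; the patent does not say how the boundary case of exactly 1920x1080 is handled, so colour sampling is assumed there):

```python
def representation_mode(width, height):
    """Choose the screen-representation sampling mode by resolution:
    gray-value sampling above 1920x1080, colour sampling otherwise
    (the equal case is unspecified in the patent and assumed 'color')."""
    return "gray" if width * height > 1920 * 1080 else "color"

print(representation_mode(3840, 2160))  # → gray
print(representation_mode(1280, 720))   # → color
```

The apparent intent is bandwidth control: very large screens drop to grayscale sampling so the encoded stream to the first processor stays manageable.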
Further, the method supports multiple non-touch terminals, local and/or remote, jointly accessing and interactively manipulating the same target terminal, so the method can be applied to competitive interaction applications across multiple non-touch devices. The device access service unit of the first processor connects the first terminal and at least one second terminal, allowing multiple second terminals to interactively control the target device, the first terminal, at the same time. When multiple second terminals access the interaction service system, each sends a registration access service message to the first processor. The registration access service message includes the physical screen resolution of the second terminal and the assembly parameter information of the radar device calibrated by that terminal; the physical screen resolution indicates the interaction area of the second terminal's display screen, and the assembly parameter information indicates the position of the radar device's mounting point as coordinates (x0, y0) relative to the top-left vertex of the second terminal's physical screen. The first processor returns the logical resolution of the first terminal, the screen representation method type parameter computed by the first terminal's screen representation and transmission unit, and the screen representation data stream. The second terminal obtains the logical resolution of the first terminal, the screen representation method type parameter and the screen representation data stream, performs calculation processing and renders the output to the second terminal's display screen. During interaction, each second terminal acquires the coordinate set of the hand motion and transmits it to the first processor; the first processor stores and merges the coordinate sets of the multiple second terminals captured at the same instant. The coordinate calculation unit computes the coordinate point sets at different points in continuous time and, matching against the predefined gesture information data and the corresponding action meanings, determines the current gesture motion type, which the first processor transmits in real time to the first terminal to calculate and execute the interactive instruction.
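The store-and-merge step for same-instant coordinate sets from several second terminals can be sketched as follows. This is an illustrative data-structure choice, not the patent's implementation: terminal ids, the dict-of-lists input shape and the tagging scheme are assumptions.

```python
def merge_coordinate_sets(frames):
    """Merge the coordinate sets reported by several second terminals
    for the same instant, tagging each point with its terminal id so
    that later gesture matching can tell the sources apart."""
    merged = []
    for terminal_id in sorted(frames):          # deterministic order
        merged.extend((terminal_id, x, y) for x, y in frames[terminal_id])
    return merged

# Two terminals reporting one hand-motion point each at the same instant:
print(merge_coordinate_sets({2: [(1, 2)], 1: [(3, 4)]}))
# → [(1, 3, 4), (2, 1, 2)]
```

Keeping the terminal id with each point lets the first processor attribute gestures per terminal in the competitive-interaction scenario the patent describes.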
The beneficial effects obtained by the present invention with the above structure are as follows. First, the technical solution of the present invention uses a radar shadow compensation technique that solves the occlusion problem of linear multi-point interaction in the prior art; the interaction area is covered accurately without any dead angle, and the radar device may be mounted at any position around the second terminal's display screen, which, particularly for extra-large LED mosaic screens, frees installation and deployment from site and space constraints. Second, the technical solution achieves accurate, delay-free human-computer interaction control. Third, the technical solution supports what-you-see-is-what-you-get visual intelligent interaction on non-touch display terminals, enabling touch-style and/or air-gap interactive manipulation with a good natural interaction experience. Fourth, the present invention supports multiple non-touch terminals, local and/or remote, jointly accessing and interactively manipulating the same target terminal, so the technical solution can be applied to the field of competitive interaction applications across multiple non-touch devices.
Description of the drawings
Fig. 1 is a module schematic diagram of the intelligent visual interaction device of the present invention;
Fig. 2 is a schematic diagram of another embodiment of the intelligent visual interaction device of the present invention;
Fig. 3 is a structural diagram of the installation and deployment of the radar device of the present invention;
Fig. 4 is a flow diagram of the visualized human-computer interaction of the present invention.
Wherein: 1, first terminal; 2, first processor; 3, radar device; 4, second terminal; 5, radar device installation position region line.
Specific embodiment
The present invention is described in further detail below with reference to the accompanying drawings.
As shown in Fig. 1, the intelligent visual human-computer interaction device of the present invention comprises a first terminal 1, a first processor 2, a radar device 3 and/or at least one second terminal 4.
Embodiment 1. As shown in Fig. 1, the first terminal 1 is the target terminal of interactive control and comprises a display screen unit, a processor unit and/or a screen representation and transmission unit. The first terminal 1 is connected to the first processor 2 by serial port and/or network. This display screen unit does not support non-touch interaction: for example LED, LCD, CRT and IPS liquid crystal displays, the curtain or white wall used by a projection device, or a mosaic screen composed of multiple liquid crystal modules of related materials, none of which supports touch input under prior art conditions. The processor unit is responsible for serial and/or network communication, calculation processing and presentation of interaction content; the serial/network communication and processing module includes a unit for receiving interactive control commands. The calculation processing module obtains the logical screen resolution, reports it to the first processor 2, and executes and processes interactive instructions; the logical screen resolution includes horizontal and vertical sizes, in pixels. The interaction content presentation module renders the interaction content for display on the first terminal's screen. The screen representation and transmission unit computes the operable elements in the user interface and/or encodes the entire user interface content as images or video, and transmits one of these two representations to the first processor 2.
The first processor 2 comprises a device access service unit, a parameter configuration unit, a coordinate calculation unit, a storage unit, a gesture recognition calculation unit, etc. The device access service unit is responsible for connecting the first terminal 1 and at least one second terminal 4. The parameter configuration unit receives the logical resolution of the first terminal 1, the physical screen resolution of at least one second terminal 4 and the assembly parameter information of the radar device 3, and issues them to the first processor 2 system for calculation. The coordinate calculation unit maps each coordinate point of the hand-motion coordinate set to the corresponding point under the logical resolution of the first terminal 1, based on the logical resolution of the first terminal, the physical screen resolution of the second terminal 4, the position type of the mounting point of the radar device 3 and its coordinates relative to the top-left vertex of the second terminal 4's physical screen. The storage unit stores the results of the coordinate calculation unit as coordinate point sets at different points in continuous time, together with predefined gesture information data and the corresponding action meaning of each gesture. The gesture recognition calculation unit computes the current gesture motion type from the change of the stored coordinate points along different time axes, with reference to the gesture information predefined in the storage unit.
The radar device 3 comprises a radar unit, a radar power supply unit, a radar pulse transmitting and receiving unit and a radar signal acquisition unit. The radar unit includes the physical radar hardware body, an electric motor module and a serial or network access module; the electric motor module rotates at a specified working frequency to complete scanning over a certain angular range of freedom, and the serial or network access module provides the network communication and transmission of control signals and radar measurement data between the radar device 3 and the above-mentioned first processor 2. The above-mentioned radar power supply unit is the power input module that drives the radar module. The above-mentioned radar pulse transmitting and receiving unit sends laser pulse signals and receives the reflected signals, so as to measure the distance of the hand motion. The above-mentioned radar signal acquisition unit acquires the coordinates of each point of the hand motion and computes the resulting coordinate set.
The second terminal 4 comprises a display screen unit and a processor unit. The second terminal 4 is connected to the radar device by serial port and/or network communication, while the second terminal 4 and the first processor 2 are connected over the network. The first function of the display screen unit is to show the content of the first terminal's user interface sent over by the first processor; the second function is that this display screen is also used for human-computer interaction input control, the first function providing a what-you-see-is-what-you-get virtual interactive interface for the second. The processor unit receives and processes the content of the user interface of the first terminal 1 sent by the first processor 2, calibrates and obtains the assembly parameter information of the radar device, and obtains the coordinate set of the hand motion. The assembly parameter information of the radar device 3 indicates the position of the radar device's mounting point as coordinates relative to the top-left vertex of the second terminal's physical screen.
When the above-mentioned multiple second terminals 4 access the interaction service system, each sends a registration access service message to the first processor 2. The registration access service message includes the physical screen resolution of the second terminal 4 and the assembly parameter information of the above-mentioned radar device 3 calibrated by that terminal; the physical screen resolution of the second terminal 4 indicates the interaction area of the second terminal 4's display screen, and the assembly parameter information of the radar device 3 calibrated by the second terminal 4 indicates the position of the radar device's mounting point as coordinates (x0, y0) relative to the top-left vertex of the second terminal 4's physical screen. The first processor 2 returns the logical resolution of the first terminal 1, the screen representation method type parameter computed by the screen representation and transmission unit of the first terminal 1, and the screen representation data stream. The second terminal 4 obtains the logical resolution of the above-mentioned first terminal 1, the screen representation method type parameter and the screen representation data stream, performs calculation processing and renders the output to the second terminal 4's display screen. During interaction, the second terminal 4 obtains the coordinate set of the hand motion and transmits it to the first processor 2; the first processor 2 stores and merges the coordinate sets of the multiple second terminals 4 captured at the same instant. The coordinate calculation unit computes the coordinate point sets at different points in continuous time and, matching against the predefined gesture information data and the corresponding action meanings, determines the current gesture motion type; the first processor 2 transmits the current gesture motion type in real time to the first terminal 1 to calculate and execute the interactive instruction.
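The gesture-matching step just described — comparing the change of coordinate points over time against predefined gesture meanings — can be illustrated with a minimal classifier. The gesture vocabulary, threshold and function name here are all assumptions for illustration; the patent defines neither the gesture set nor the matching rule.

```python
def classify_gesture(points, threshold=50):
    """Classify a stored time series of logical coordinates as a swipe
    in one of four directions, or a 'tap' if the hand barely moved.
    (Hypothetical gesture set; the patent leaves the vocabulary open.)"""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) < threshold and abs(dy) < threshold:
        return "tap"
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

print(classify_gesture([(0, 0), (40, 5), (100, 10)]))  # → swipe_right
print(classify_gesture([(0, 0), (5, 5)]))              # → tap
```

A real system would match the whole trajectory (not just its endpoints) against the stored gesture templates, but the endpoint comparison captures the time-axis-change idea in the smallest possible form.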
Embodiment 2. The radar device 3 is physically installed and deployed near and around the first terminal 1. The first terminal 1 is connected to the radar device 3 by serial port and/or network communication, while the first terminal 1 and the first processor 2 are connected over the network, as shown in Fig. 2. The first terminal 1 includes a display screen unit and a processor unit; the display screen unit of the first terminal 1 does not support non-touch interaction: for example LED, LCD, CRT and IPS liquid crystal displays, the curtain or white wall used by a projection device, or a mosaic screen composed of multiple liquid crystal modules of related materials, none of which supports touch input under prior art conditions. The first terminal 1 reports the assembly parameter information of the radar device 3, the logical resolution of the first terminal 1, the physical screen resolution of the first terminal 1 and similar information to the first processor 2. During interaction, the first terminal 1 obtains the coordinate set of the hand motion and transmits it to the first processor 2; the first processor 2 stores and computes the coordinate set of the first terminal 1, and the coordinate calculation unit computes the coordinate point sets at different points in continuous time. Matching against the predefined gesture information data and corresponding action meanings, the first processor 2 determines the current gesture motion type and transmits it in real time to the first terminal 1 to calculate and execute the interactive instruction. Touch-style operation on the display screen of the first terminal 1 thus achieves direct human-computer interactive control of the first terminal 1. Further preferably, a partition board or device of transparent material is installed in front of the display screen of the first terminal 1, with this partition and the radar device 3 in the same plane; the technical solution of this embodiment achieves a non-contact, air-gap interactive experience on the display screen of the first terminal 1.
Embodiment 3. The structural diagram of the installation and deployment of the radar device 3 is shown in Fig. 3.
The scanning range region of the radar device 3 is expressed as [Dmin, Dmax]. The assembly parameter information of the above-mentioned radar device 3 calibrated by the above-mentioned second terminal 4 indicates the position of the radar device 3's mounting point as coordinates (x0, y0) relative to the top-left vertex of the second terminal's physical screen; these relative coordinates are determined by calculation so that the scanning range [Dmin, Dmax] of the radar device 3 covers the interactive input control screen area of the second terminal 4. Preferably, taking into account the radar's scan blind zone and specification parameters, the mounting point of the radar device 3 can be chosen on the radar device installation position region line 5 and installed along the perimeter of this line. The radar mounting must ensure that the minimum scanning distance within the interaction interface region is Dmin; if the maximum scanning distance exceeds Dmax, a second radar device 3 must be installed on the symmetrically opposite side of the screen to ensure that radar scanning detection covers the interaction area of the entire second terminal 4 display screen.
Embodiment 4: a flow diagram of visualized human-computer interaction, as shown in Figure 4.
When the screen resolution of the first terminal 1 is greater than 1920x1080, the technical solution adopted by the present invention is that the screen representation and transmission unit of the first terminal computes the operable elements in the user interface, applies gray-value sampling, and then encodes the user interface content as an image and/or video. When the screen resolution of the first terminal is less than 1920x1080, the screen representation and transmission unit of the first terminal 1 computes the operable elements in the user interface, applies color sampling, and then encodes the user interface content as an image and/or video. The first processor 2 returns the logical resolution size of the first terminal 1, the screen representation method type indication parameter computed by the screen representation and transmission unit of the first terminal 1, and the screen representation data stream. The second terminal 4 obtains the logical resolution size of the first terminal 1, the screen representation method type indication parameter of the first terminal 1, and the screen representation data stream, performs calculation processing, and renders the output to the display screen of the second terminal 4, thereby achieving the technical effect of visualized human-computer interaction.
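The resolution-dependent sampling choice above can be summarized in a small sketch. The dictionary keys, and the use of total pixel count as the comparison (the patent only writes "greater than 1920x1080"), are assumptions for illustration.

```python
def choose_screen_encoding(width, height):
    """Sketch of the Embodiment-4 rule: screens larger than 1920x1080 get
    gray-value sampling (smaller payload), others get full color sampling;
    in both cases the result is image/video encoded. Comparing by total
    pixel count is an assumption not stated in the patent."""
    if width * height > 1920 * 1080:
        return {"sampling": "gray", "encoding": "image/video"}
    return {"sampling": "color", "encoding": "image/video"}
```

A 4K first terminal would therefore stream a gray-sampled representation, while a 720p terminal keeps full color.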
The present invention and its embodiments have been described above; this description is not restrictive, and what is shown in the accompanying drawings is only one of the embodiments of the present invention, the practical scheme not being limited thereto. In summary, if a person of ordinary skill in the art, enlightened by it and without departing from the spirit of the invention, non-inventively designs scheme modes and embodiments similar to this technical solution, they shall all fall within the scope of protection of the present invention.
Claims (7)
1. A device for intelligent visual interaction, characterized by comprising a first terminal, a first processor, a radar device and/or several second terminals;
the first terminal is the target terminal of the interactive control and comprises a display screen unit, a processor unit and/or a screen representation and transmission unit; the first terminal is connected with the first processor through a serial port and/or a network; the processor unit is responsible for serial-port and/or network communication, calculation processing, and the presentation of interaction content; its serial-port and/or network communication and processing module comprises a unit for receiving interactive control commands; the calculation processing module is responsible for obtaining the logical screen resolution size and reporting it to the first processor, and for the execution and processing of interactive instructions; the logical screen resolution includes horizontal and vertical sizes, in units of pixels; the interaction content presentation module is responsible for rendering the interaction content for display on the display screen of the first terminal; the screen representation and transmission unit is responsible for computing the operable elements in the user interface and/or encoding the entire user interface content as an image or video, one of these two modes being selected for transmission to the first processor;
the first processor comprises a device access service unit, a parameter configuration unit, a coordinate calculating unit, a storage unit and a gesture recognition computing unit; the device access service unit is responsible for accessing the first terminal and at least one second terminal; the parameter configuration unit is responsible for receiving the logical resolution size of the first terminal, receiving the resolution size of the physical screen of the at least one second terminal, and receiving the assembly parameter information of the radar device, issuing them to the first processor system for calculation; the coordinate calculating unit calculates, according to the logical resolution size of the first terminal, the resolution size of the physical screen of the second terminal, the position type of the radar device's mounting point, and its relative position coordinates with respect to the top-left vertex of the second terminal's physical screen, the coordinate point to which each coordinate point of the coordinate set of the hand action corresponds under the logical resolution of the first terminal; the storage unit stores the results of the coordinate calculating unit's processing of the different coordinate point sets over continuous time, together with pre-defined gesture information data and the action meaning corresponding to each gesture; the gesture recognition computing unit calculates the current gesture action type according to the changes of the stored coordinate point sets along the time axis, with reference to the gesture information pre-defined in the storage unit;
the radar device comprises a radar unit, a radar power supply unit, a radar pulse transmitting and receiving unit and a radar signal collecting unit; the radar unit comprises the radar physical hardware body, an electric motor module and a serial-port or network access module; the electric motor module is a unit used to rotate at a specified working frequency to complete the scanning work over a certain angular range of freedom; the serial-port or network access module provides network communication and transmission of control signals and radar measurement data between the radar device and the first processor; the radar power supply unit is the power entry module that drives the operation of the radar module; the radar pulse transmitting and receiving unit is used for transmitting laser pulse signals and receiving the reflected signals, so as to measure the distance of the hand action; the radar signal collecting unit is used for acquiring the coordinates of each point of the hand action and computing them into a coordinate set;
the second terminal comprises a display screen unit and a processor unit; the second terminal is connected with the radar device through serial-port and/or network communication, while the second terminal and the first processor are connected over a network; the first function of the display screen unit is to display the content of the user interface of the first terminal sent over by the first processor, and the second function is that this display screen is also used for human-computer interaction input control, the first function providing a what-you-see-is-what-you-get virtual interactive interface control for the second function; the processor unit is responsible for receiving and processing the content of the user interface of the first terminal sent over by the first processor, for calibrating and obtaining the assembly parameter information of the radar device, and for obtaining the coordinate set of the hand action; the assembly parameter information of the radar device indicates the position of the radar device's mounting point as relative position coordinates with respect to the top-left vertex of the second terminal's physical screen.
2. A method of intelligent visual interaction, characterized in that the method uses a radar shadow compensation technique to solve the occlusion problem of linear multi-point interaction in the prior art; the interaction area is covered accurately and without any dead angle; and installation and deployment of the radar device at any position around the display screen of the second terminal is supported, so that, particularly in the field of extra-large LED mosaic screens, the limitations of site and space are overcome.
3. The method of intelligent visual interaction according to claim 2, characterized in that: the method is specifically based on the scanning range of the radar device, expressed as [Dmin, Dmax]; the assembly parameter information of the radar device calibrated by the second terminal indicates the position of the radar device's mounting point as relative position coordinates (x0, y0) with respect to the top-left vertex of the second terminal's physical screen; the relative position coordinates of the mounting point with respect to the second terminal's screen are determined by calculation from the size of the interactive input control screen covered on the second terminal and the scanning range [Dmin, Dmax] of the radar device; in particular, when the mounting point of the radar device is installed in a horizontal position, measurement and calculation are required to determine that x0 exceeds Dmin, and likewise, when the mounting point is installed in a vertical position, measurement and calculation are required to determine that y0 exceeds Dmin; if the maximum distance of the scanned screen area exceeds Dmax, a second radar device must additionally be installed on the symmetric side of the screen; the method thereby guarantees that scanning detection covers the interaction area of the entire display screen of the second terminal and can be accurately calculated and represented at pixel-level units.
4. The method of intelligent visual interaction according to claim 3, characterized in that: the method makes full use of the scanning working frequency of the radar device; during interactive control, the acquisition frequency of the radar device's electric motor is calculated cyclically and dynamically; when the obtained acquisition frequency is lower than the target working frequency set for the human-computer interactive control, the second terminal adjusts the rotational speed of the electric motor upward through the interface parameter for setting the motor speed; the input set speed of the electric motor and the actual running working frequency thus constitute a closed-loop feedback control system, and the final actual running frequency reaches a steady state, namely the set target working frequency.
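The closed-loop frequency regulation of claim 4 can be sketched as follows. Here `measure` and `set_rpm` stand in for the radar's real serial-port or network interfaces, and the step size and the toy motor model are illustrative assumptions:

```python
def regulate_motor_frequency(target_hz, measure, set_rpm, step_rpm=30.0,
                             max_cycles=200, tol_hz=0.1):
    """Each control cycle, compare the measured scan frequency with the
    target; while it falls short, raise the rpm setpoint, closing the
    feedback loop until the actual frequency settles at the target."""
    rpm = 0.0
    for _ in range(max_cycles):
        actual = measure(rpm)             # sample the motor's real frequency
        if actual >= target_hz - tol_hz:  # steady state reached
            return rpm, actual
        rpm += step_rpm                   # frequency too low: raise setpoint
        set_rpm(rpm)                      # push the new speed to the motor
    return rpm, measure(rpm)

# Toy plant standing in for the real motor: ~1 Hz per 60 rpm with 5% loss.
rpm, freq = regulate_motor_frequency(
    target_hz=10.0,
    measure=lambda r: r / 60.0 * 0.95,
    set_rpm=lambda r: None,
)
```

With this toy plant, the setpoint climbs until the measured frequency clears the 10 Hz target, then holds, which is the steady state the claim describes.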
5. The method of intelligent visual interaction according to claim 2, characterized in that: the method in particular has the screen representation and transmission unit of the first terminal compute and process the user interaction interface content of the target terminal of the interactive control, i.e. the first terminal; the screen representation and transmission unit of the first terminal is responsible for computing the operable elements in the user interface and/or encoding the entire user interface content as an image or video, one of these two modes being selected for transmission to the first processor; the first processor transmits the user interface content to the second terminal, and the second terminal computes the obtained user interaction interface content and displays it on the display screen of the second terminal, thereby achieving the technical effect of visualized human-computer interaction.
6. The method of intelligent visual interaction according to claim 5, characterized in that: when the screen resolution of the first terminal is greater than 1920x1080, the method adopted by the present invention is that the screen representation and transmission unit of the first terminal computes the operable elements in the user interface, applies gray-value sampling, and encodes the user interface content as an image and/or video; when the screen resolution of the first terminal is less than 1920x1080, the screen representation and transmission unit of the first terminal computes the operable elements in the user interface, applies color sampling, and encodes the user interface content as an image and/or video.
7. The method of intelligent visual interaction according to claim 2, characterized in that: the method supports multiple terminals, locally and/or via non-touch remote access, jointly manipulating the interaction of the same target terminal, and the method can be applied to competitive interaction applications across multiple non-touch devices; the device access service unit included in the first processor is responsible for accessing the first terminal and at least one second terminal, realizing simultaneous interactive control of the target device, the first terminal, by multiple second terminals; when multiple second terminals access the interaction service system, each sends a registration access service message to the first processor; the registration access service message includes the resolution size of the second terminal's physical screen and the assembly parameter information of the radar device calibrated by the second terminal; the resolution size of the second terminal's physical screen indicates the interaction area of the second terminal's display screen, and the assembly parameter information of the radar device calibrated by the second terminal indicates the position of the radar device's mounting point as relative position coordinates (x0, y0) with respect to the top-left vertex of the second terminal's physical screen; the first processor returns the logical resolution size of the first terminal, the screen representation method type indication parameter computed by the screen representation and transmission unit of the first terminal, and the screen representation data stream; the second terminal obtains the logical resolution size of the first terminal, the screen representation method type indication parameter of the first terminal and the screen representation data stream, performs calculation processing, and renders the output to the second terminal's display screen; during the interaction, each second terminal obtains the coordinate set of the hand action and transmits it to the first processor; the first processor stores and merges the coordinate sets of the multiple second terminals at the same moment; the coordinate calculating unit computes the results of the different coordinate point sets over continuous time and, using the pre-defined gesture information data and the action meanings corresponding to the gestures, matches and calculates the current gesture action type; the first processor transmits the current gesture action type in real time to the first terminal, which calculates and executes the interactive instruction.
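The merge-then-match pipeline of claim 7 (coordinate sets from several second terminals combined per moment, then compared with pre-defined gestures) can be sketched minimally. The two gesture labels and the drift threshold are illustrative assumptions, not taken from the patent:

```python
def merge_and_classify(streams):
    """Merge per-timestamp coordinate sets arriving from several second
    terminals, then match the merged trajectory's horizontal drift against
    two pre-defined gestures (swipe left / swipe right)."""
    merged = {}
    for stream in streams:                  # one stream per second terminal
        for t, points in stream.items():
            merged.setdefault(t, []).extend(points)
    # Centroid of the merged points at each timestamp, in time order.
    centroids = [
        (sum(p[0] for p in pts) / len(pts), sum(p[1] for p in pts) / len(pts))
        for _, pts in sorted(merged.items())
    ]
    if len(centroids) < 2:
        return "none"
    dx = centroids[-1][0] - centroids[0][0]  # net horizontal drift
    if dx > 50:
        return "swipe_right"
    if dx < -50:
        return "swipe_left"
    return "none"
```

Two terminals each reporting a rightward-moving hand at timestamps 0 and 1 would merge into a single rightward trajectory and match the swipe-right gesture.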
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810014215.7A CN108170277B (en) | 2018-01-08 | 2018-01-08 | Intelligent visual interaction device and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108170277A true CN108170277A (en) | 2018-06-15 |
CN108170277B CN108170277B (en) | 2020-12-11 |
Family
ID=62517662
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810014215.7A Active CN108170277B (en) | 2018-01-08 | 2018-01-08 | Intelligent visual interaction device and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108170277B (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109104635A (en) * | 2018-08-31 | 2018-12-28 | 四川长虹电器股份有限公司 | The method and system of instant delivery screen picture |
CN109828695A (en) * | 2018-12-29 | 2019-05-31 | 合肥金诺数码科技股份有限公司 | A kind of large-screen interactive system based on laser radar positioning |
CN109975799A (en) * | 2019-03-13 | 2019-07-05 | 谭伟 | A kind of method and its system of radar identification material |
CN109975769A (en) * | 2019-03-13 | 2019-07-05 | 谭伟 | It is a kind of for showing interactive radar module and its display exchange method |
CN110058727A (en) * | 2019-03-13 | 2019-07-26 | 谭伟 | A kind of interactive system and its method of integrated radar |
CN110111785A (en) * | 2019-04-29 | 2019-08-09 | 深圳前海微众银行股份有限公司 | Exchange and interdynamic method, apparatus, equipment and computer readable storage medium |
CN114063821A (en) * | 2021-11-15 | 2022-02-18 | 深圳市海蓝珊科技有限公司 | Non-contact screen interaction method |
WO2023273517A1 (en) * | 2021-07-02 | 2023-01-05 | 深圳Tcl新技术有限公司 | Screen control method and apparatus, storage medium, and terminal device |
WO2023011296A1 (en) * | 2021-08-04 | 2023-02-09 | 北京字跳网络技术有限公司 | Interaction method, electronic device, storage medium and program product |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101055193A (en) * | 2006-04-12 | 2007-10-17 | 株式会社日立制作所 | Noncontact input operation device for in-vehicle apparatus |
CN102445694A (en) * | 2011-09-20 | 2012-05-09 | 中南大学 | Navigation robot obstacle detection method and system |
CN205485300U (en) * | 2016-03-22 | 2016-08-17 | 智合新天(北京)传媒广告股份有限公司 | 3D holographically projected demonstrates platform alternately |
KR20170030980A (en) * | 2015-09-10 | 2017-03-20 | 엘지전자 주식회사 | Watch-type mobile terminal and operating method thereof |
WO2017111358A1 (en) * | 2015-12-24 | 2017-06-29 | Samsung Electronics Co., Ltd. | User terminal device, and mode conversion method and sound system for controlling volume of speaker thereof |
CN106980368A (en) * | 2017-02-28 | 2017-07-25 | 深圳市未来感知科技有限公司 | A kind of view-based access control model calculating and the virtual reality interactive device of Inertial Measurement Unit |
Also Published As
Publication number | Publication date |
---|---|
CN108170277B (en) | 2020-12-11 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||