CN107357422A - Video camera projection interaction touch control method, device and computer-readable recording medium - Google Patents


Info

Publication number
CN107357422A
CN107357422A (application CN201710506544.9A)
Authority
CN
China
Prior art keywords
hand
image
video camera
finger
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710506544.9A
Other languages
Chinese (zh)
Other versions
CN107357422B (en)
Inventor
宋呈群
程俊
方璡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Shenzhen Tencent Computer Systems Co Ltd
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Shenzhen Tencent Computer Systems Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS, Shenzhen Tencent Computer Systems Co Ltd filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN201710506544.9A priority Critical patent/CN107357422B/en
Publication of CN107357422A publication Critical patent/CN107357422A/en
Application granted granted Critical
Publication of CN107357422B publication Critical patent/CN107357422B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Position Input By Displaying (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present invention relates to the field of pattern recognition and provides a camera-projection interactive touch method, device and computer-readable storage medium that realize high-precision gesture recognition in camera-projection interactive touch at low cost. The method includes: extracting the user's hand and hand-shadow image from the images captured by the camera, using background subtraction and inter-frame differencing; judging whether the user's finger is in contact with the projection screen, by judging whether the fingertip and the fingertip's shadow merge in the hand and hand-shadow image; and, if the user's finger is in contact with the projection screen, determining the touch position at which the finger contacts the projection screen. On the one hand, the technical solution provided by the invention needs only a single camera to extract the hand motion used for interaction and to make the touch judgment, and it guarantees high gesture-recognition accuracy even under complex lighting; on the other hand, no expensive imaging equipment is required, which reduces the implementation cost of camera-projection interactive touch.

Description

Video camera-projection interaction touch control method, device and computer-readable recording medium
Technical field
The invention belongs to the field of human-computer interaction, and more particularly relates to a camera-projection interactive touch method, a device and a computer-readable storage medium.
Background technology
A camera-projection interactive touch system can project the computer's display onto any plane and let the user operate the computer bare-handed. It gives the user a vivid, natural way to interact with the computer by finger, for example clicking a virtual keyboard, dragging objects, opening files, or flipping through web pages.
At present, many camera-projection interactive touch methods exist. One existing method uses a depth camera to extract the hand region from the surrounding scene for gesture recognition. However, depth cameras are expensive and suffer interference from ambient illumination; the camera-projection interactive touch system built on this existing method is therefore both costly and of limited accuracy.
This technical problem urgently awaits a solution.
Summary of the invention
The present invention provides a camera-projection interactive touch method, device and computer-readable storage medium, to realize high-precision gesture recognition in camera-projection interactive touch at low cost.
A first aspect of the present invention provides a camera-projection interactive touch method, applied to a camera-projection interactive touch system composed of a computer, a projector, a camera and a projection screen. The method includes:
extracting the user's hand and hand-shadow image from the images captured by the camera, using background subtraction and inter-frame differencing;
judging whether the user's finger is in contact with the projection screen, by judging whether the fingertip and the fingertip's shadow merge in the hand and hand-shadow image; and
if the user's finger is in contact with the projection screen, determining the touch position at which the finger contacts the projection screen.
A second aspect of the present invention provides a camera-projection interactive touch device, applied to a camera-projection interactive touch system composed of a computer, a projector, a camera and a projection screen. The device includes:
an extraction module, for extracting the user's hand and hand-shadow image from the images captured by the camera, using background subtraction and inter-frame differencing;
a judging module, for judging whether the user's finger is in contact with the projection screen, by judging whether the fingertip and the fingertip's shadow merge in the hand and hand-shadow image; and
a determining module, for determining, if the user's finger is in contact with the projection screen, the touch position at which the finger contacts the projection screen.
A third aspect of the present invention provides a terminal device, including a memory, a processor, and a computer program stored in the memory and runnable on the processor. When the processor executes the computer program, the following steps are realized:
extracting the user's hand and hand-shadow image from the images captured by the camera, using background subtraction and inter-frame differencing;
judging whether the user's finger is in contact with the projection screen, by judging whether the fingertip and the fingertip's shadow merge in the hand and hand-shadow image; and
if the user's finger is in contact with the projection screen, determining the touch position at which the finger contacts the projection screen.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing a computer program. When the computer program is executed by a processor, the following steps are realized:
extracting the user's hand and hand-shadow image from the images captured by the camera, using background subtraction and inter-frame differencing;
judging whether the user's finger is in contact with the projection screen, by judging whether the fingertip and the fingertip's shadow merge in the hand and hand-shadow image; and
if the user's finger is in contact with the projection screen, determining the touch position at which the finger contacts the projection screen.
From the technical solution provided above it can be seen that, on the one hand, only a single camera, rather than multiple cameras, is needed to extract the hand motion used for interaction and make the touch judgment, and, through the improved multi-frame differencing method, high gesture-recognition accuracy is guaranteed even under complex lighting; on the other hand, no expensive depth camera or high-resolution CCD camera is required, which reduces the implementation cost of camera-projection interactive touch.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the accompanying drawings used in the embodiments or in the description of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of the camera-projection interactive touch method provided by an embodiment of the present invention;
Fig. 2 is a schematic structural diagram of the camera-projection interactive touch device provided by an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of the camera-projection interactive touch device provided by another embodiment of the present invention;
Fig. 4 is a schematic structural diagram of the camera-projection interactive touch device provided by yet another embodiment of the present invention;
Fig. 5 is a schematic structural diagram of the terminal device provided by an embodiment of the present invention.
Detailed description of the embodiments
In the following description, specific details such as particular system structures and techniques are set forth for the purpose of illustration rather than limitation, so that the embodiments of the present invention can be thoroughly understood. However, it will be clear to those skilled in the art that the present invention can also be practiced in other embodiments without these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits and methods are omitted, so that unnecessary detail does not obscure the description of the invention.
To illustrate the technical solution of the present invention, specific embodiments are described below.
Fig. 1 is a schematic flowchart of the camera-projection interactive touch method provided by an embodiment of the present invention. The method can be applied to a camera-projection interactive touch system composed of a computer, a projector, a camera and a projection screen, and mainly includes the following steps S101 to S103, described in detail below:
S101: extract the user's hand and hand-shadow image from the images captured by the camera, using background subtraction and inter-frame differencing.
Unlike the prior art, the present invention needs only one camera, rather than at least two, to capture images. It should be noted that the "user" in the embodiments of the present invention is the person who touches the projection screen with a finger in the camera-projection interactive touch method and thereby interacts with the projected content. As one embodiment of the invention, extracting the user's hand and hand-shadow image from the camera-captured images using background subtraction and inter-frame differencing can be realized through the following steps S1011 to S1014:
S1011: extract the user's hand and hand-shadow image from the images captured by the camera using inter-frame differencing, obtaining n image-extraction results of the inter-frame differencing method, where n is an integer greater than 1.
First, the image captured by the camera is mapped to the projected image by geometric correction. The geometric correction proceeds as follows: (1) load a checkerboard image on the computer and send it to the projector; (2) the projector projects the checkerboard onto the projection screen; (3) capture the projected picture with a fixed camera; (4) detect the checkerboard corners in the camera's view; (5) compute the 3×3 homography matrix H between the two images from the positions of the detected corners and of the corresponding checkerboard corners in the projected image; (6) geometrically correct the camera-captured image using the 3×3 homography matrix H. In step (4), the checkerboard corners are detected as follows: (1) convert the RGB image to grayscale; (2) find the edges of the grayscale image; (3) detect straight lines in the edge image using the Hough transform; (4) fit a quadrilateral from those lines; (5) find the corners of the quadrilateral.
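Assuming H has already been estimated from the corner correspondences in step (5), applying it in step (6) is a projective mapping of pixel coordinates. A minimal numpy sketch (the scaling matrix H below is a made-up example, not one computed from detected corners):

```python
import numpy as np

def apply_homography(H, pts):
    """Map N x 2 pixel coordinates through a 3 x 3 homography H."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((pts.shape[0], 1))])  # to homogeneous coords
    mapped = homog @ H.T                                   # project through H
    return mapped[:, :2] / mapped[:, 2:3]                  # back to Cartesian

# Hypothetical H: pure scaling from a 320x240 camera view to a 640x480 projection
H = np.diag([2.0, 2.0, 1.0])
corners = [[0, 0], [320, 0], [320, 240], [0, 240]]
print(apply_homography(H, corners))
```

In practice H would be estimated from four or more corner correspondences, e.g. by the direct linear transform; only the application step is shown here.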
Second, determine which pixels belong to the hand and the hand shadow. If the image is projected directly onto the screen, the current frame captured by the camera should be identical to the previous captured frame; if a hand is in front of the projection screen, the reflectivity of the projected picture changes. The reflectivity change at pixel [x, y] can be computed from the reflectivity a[x, y]:
a[x, y] = Ik[x, y] / Ik-1[x, y]
where Ik is the grayscale image of the frame captured by the camera, Ik-1 is the grayscale image of the frame preceding Ik, Ik[x, y] is the gray value of pixel [x, y] in the grayscale image Ik (Ik also generally denotes the k-th frame itself), and Ik-1[x, y] is the gray value of pixel [x, y] in the grayscale image Ik-1 (Ik-1 also generally denotes the previous frame, i.e. the (k−1)-th frame itself).
If pixel [x, y] satisfies the condition
a[x, y] < 1 − s, or a[x, y] > 1 + s,
then pixel [x, y] belongs to the hand or the hand shadow. In the formula above, s is a reflectivity tolerance, whose usual range is [0.5, 0.8].
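A minimal numpy sketch of this pixel test, assuming grayscale frames stored as arrays and the tolerance s from above (the small eps guarding against division by zero is an addition, not part of the patent):

```python
import numpy as np

def hand_shadow_mask(gray_k, gray_prev, s=0.65, eps=1e-6):
    """Mark pixels whose reflectivity a = I_k / I_{k-1} falls outside [1-s, 1+s]."""
    a = gray_k.astype(float) / (gray_prev.astype(float) + eps)
    return (a < 1.0 - s) | (a > 1.0 + s)

prev = np.full((2, 2), 100, dtype=np.uint8)
cur = np.array([[100, 10], [150, 100]], dtype=np.uint8)
print(hand_shadow_mask(cur, prev))  # only the strongly darkened pixel (value 10) is marked
```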
Finally, extract the hand and hand-shadow image using inter-frame differences between frames. The embodiment of the present invention uses the frames Ik[x, y], Ik-5[x, y], Ik-10[x, y] and Ik-15[x, y] for the inter-frame differencing. Assuming n = 3, the three image-extraction results of the inter-frame differencing method are the binary masks Tk5(x, y), Tk10(x, y) and Tk15(x, y), obtained by applying the reflectivity condition above to the frame pairs (Ik, Ik-5), (Ik, Ik-10) and (Ik, Ik-15) respectively, where al[x, y] = Ik[x, y] / Ik-l[x, y] denotes the reflectivity between the current frame Ik[x, y] and the frame l frames earlier.
S1012: extract the user's hand and hand-shadow image from the images captured by the camera using background subtraction, obtaining n image-extraction results of the background-subtraction method.
Background subtraction is essentially similar to inter-frame differencing; only the object the live frame is compared against differs: background subtraction detects the moving region by comparing the real-time frame with a background image. In the embodiment of the present invention, the continuously changing projected image is taken as the background image. As above, assuming n = 3, the three image-extraction results of the background-subtraction method, denoted Tk'(x, y), are obtained by applying the same reflectivity condition to the difference between the background image and the geometrically corrected video frames, where ab[x, y] = Ik[x, y] / Ib[x, y] is the reflectivity between the current frame Ik and the background image (written Ib here).
S1013: perform a logical AND between each of the n image-extraction results of the inter-frame differencing method obtained in step S1011 and the n image-extraction results of the background-subtraction method obtained in step S1012, obtaining n intermediate extraction images of the hand and hand shadow.
Taking n = 3 as an example, ANDing the image-extraction results of the inter-frame differencing method obtained in step S1011 with the image-extraction results of the background-subtraction method obtained in step S1012 gives the three intermediate extraction images of the hand and hand shadow:
D1 = Tk5(x, y) ∩ Tk'(x, y), D2 = Tk10(x, y) ∩ Tk'(x, y), D3 = Tk15(x, y) ∩ Tk'(x, y)
S1014: perform a logical OR over the n intermediate extraction images of the hand and hand shadow obtained in step S1013; the result D of the logical OR serves as the extraction result of the hand and hand-shadow image.
Taking n = 3 as above, D1, D2 and D3 are ORed together, and the result D of the logical OR serves as the extraction result of the hand and hand-shadow image.
Because of noise, and because small parts of the target are often similar in color or gray level to the background image, the image obtained after binarization often contains many isolated points and holes, which disturb moving-target detection. It is therefore necessary to remove the isolated points and fill the small holes by erosion and dilation. That is, after the logical OR over the n intermediate extraction images obtained in step S1013 produces the result D, D is further binarized and then processed with morphological erosion and dilation: for example, D can first be binarized, and then eroded and dilated; the result of D after binarization, morphological erosion and dilation serves as the final extraction result of the hand and hand-shadow image.
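Steps S1013–S1014 and the morphological clean-up can be sketched in plain numpy; the 3×3 structuring element, the opening-then-closing order, and the hand-rolled morphology are assumptions (the patent specifies only erosion and dilation):

```python
import numpy as np

def _neighborhoods(padded, shape):
    # The nine 3x3-shifted views of a padded binary image
    return [padded[i:i + shape[0], j:j + shape[1]] for i in range(3) for j in range(3)]

def binary_dilate(m):
    # A pixel is set if any pixel in its 3x3 neighbourhood is set
    return np.logical_or.reduce(_neighborhoods(np.pad(m, 1), m.shape))

def binary_erode(m):
    # A pixel is set only if its whole 3x3 neighbourhood is set (True padding keeps borders)
    return np.logical_and.reduce(_neighborhoods(np.pad(m, 1, constant_values=True), m.shape))

def fuse_masks(frame_diffs, bg_diffs):
    """AND each inter-frame mask with its background mask (D1..Dn), OR the results,
    then open/close to drop isolated points and fill small holes."""
    intermediates = [f & b for f, b in zip(frame_diffs, bg_diffs)]   # S1013
    D = np.logical_or.reduce(intermediates)                          # S1014
    D = binary_dilate(binary_erode(D))   # opening: removes isolated points
    D = binary_erode(binary_dilate(D))   # closing: fills small holes
    return D
```

A lone noise pixel does not survive the opening, while a solid 4×4 "hand" block passes through unchanged.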
S102: judge whether the user's finger is in contact with the projection screen, by judging whether the fingertip and the fingertip's shadow merge in the extracted hand and hand-shadow image of the user.
As one embodiment of the invention, judging whether the user's finger is in contact with the projection screen by judging whether the fingertip and the fingertip's shadow merge in the hand and hand-shadow image can be realized through the following steps S1021 to S1023:
S1021: build a vertical-line model.
S1022: scan the hand and hand-shadow image extracted in step S101 with the vertical-line model.
S1023: if, while step S1022 scans from start to end, the vertical-line model encounters a bounded run of 0-valued pixels in its middle, and the pixels at both ends of that run have gray value 255, it is determined that the user's finger is in contact with the projection screen; otherwise it is determined that the user's finger is not in contact with the projection screen.
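A sketch of the S1021–S1023 scan along one vertical line of the binarized image (255 = hand/shadow foreground, 0 = background; the exact orientation and placement of the scan line are assumptions):

```python
def touch_detected(column):
    """Return True if the line contains a bounded run of 0-pixels whose
    immediate neighbours on both sides have gray value 255 (rule S1023)."""
    n = len(column)
    i = 0
    while i < n:
        if column[i] == 0 and i > 0 and column[i - 1] == 255:
            j = i
            while j < n and column[j] == 0:   # walk through the run of 0s
                j += 1
            if j < n and column[j] == 255:    # run is closed by a 255 pixel
                return True
            i = j
        else:
            i += 1
    return False

print(touch_detected([255, 255, 0, 0, 255]))  # True: bounded 0-run between 255s
print(touch_detected([255, 0, 0]))            # False: the 0-run reaches the end
```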
S103: if the user's finger is in contact with the projection screen, determine the touch position at which the finger contacts the projection screen.
Specifically, if the user's finger is in contact with the projection screen, determining the touch position at which the finger contacts the screen can proceed as follows: if the user is on the right side of the projection screen, the foreground pixel in the leftmost column of the extracted hand and hand-shadow image is taken as the fingertip pixel, and the position of that foreground pixel is the touch position at which the user's finger contacts the projection screen; if the user is on the left side of the projection screen, the foreground pixel in the rightmost column of the extracted hand and hand-shadow image is taken as the fingertip pixel, and the position of that foreground pixel is the touch position at which the user's finger contacts the projection screen.
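The left/right column rule of S103 can be sketched as follows (the array layout, the side naming, and the choice of the topmost pixel within the selected column are assumptions):

```python
import numpy as np

def touch_position(mask, user_side):
    """mask: boolean hand-and-shadow image; user_side: 'right' or 'left' of the screen.
    Returns the (row, col) of the fingertip pixel, or None if no foreground exists."""
    cols = np.where(mask.any(axis=0))[0]                 # columns containing foreground
    if cols.size == 0:
        return None
    col = cols[0] if user_side == 'right' else cols[-1]  # leftmost vs rightmost column
    row = int(np.where(mask[:, col])[0][0])              # first foreground pixel in that column
    return (row, int(col))

mask = np.zeros((5, 5), dtype=bool)
mask[2, 1] = True   # hypothetical fingertip when the user stands on the right
mask[3, 4] = True   # hypothetical fingertip when the user stands on the left
print(touch_position(mask, 'right'), touch_position(mask, 'left'))
```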
As the camera-projection interactive touch method exemplified in Fig. 1 shows, on the one hand only a single camera, rather than multiple cameras, is needed to extract the hand motion used for interaction and make the touch judgment, and, through the improved multi-frame differencing, high gesture-recognition accuracy is guaranteed even under complex lighting; on the other hand, no expensive depth camera or high-resolution CCD camera is required, which reduces the implementation cost of camera-projection interactive touch.
Fig. 2 is a schematic diagram of the camera-projection interactive touch device provided by an embodiment of the present invention, mainly comprising an extraction module 201, a judging module 202 and a determining module 203, detailed as follows:
the extraction module 201, for extracting the user's hand and hand-shadow image from the images captured by the camera, using background subtraction and inter-frame differencing;
the judging module 202, for judging whether the user's finger is in contact with the projection screen, by judging whether the fingertip and the fingertip's shadow merge in the hand and hand-shadow image extracted by the extraction module 201; and
the determining module 203, for determining, if the user's finger is in contact with the projection screen, the touch position at which the finger contacts the projection screen.
It should be noted that, since the device provided by the embodiment of the present invention is based on the same conception as the method embodiment of the present invention, its technical effects are the same as those of the method embodiment; for details, refer to the description in the method embodiment, which is not repeated here.
The extraction module 201 of the example of Fig. 2 may include a first extraction unit 301, a second extraction unit 302, a first logical-operation unit 303 and a second logical-operation unit 304, as in the camera-projection interactive touch device of the example of Fig. 3, where:
the first extraction unit 301, for extracting the user's hand and hand-shadow image from the images captured by the camera using inter-frame differencing, obtaining n image-extraction results of the inter-frame differencing method, where n is an integer greater than 1;
the second extraction unit 302, for extracting the user's hand and hand-shadow image from the images captured by the camera using background subtraction, obtaining n image-extraction results of the background-subtraction method;
the first logical-operation unit 303, for performing a logical AND between each of the n image-extraction results of the inter-frame differencing method and the n image-extraction results of the background-subtraction method, obtaining n intermediate extraction images of the hand and hand shadow; and
the second logical-operation unit 304, for performing a logical OR over the n intermediate extraction images of the hand and hand shadow, the result D of the logical OR serving as the extraction result of the hand and hand-shadow image.
The camera-projection interactive touch device of the example of Fig. 3 may further include a processing module 401, as in the camera-projection interactive touch device of the example of Fig. 4. The processing module 401 is used to binarize, erode and dilate the D obtained by the second logical-operation unit 304; the result of binarization, morphological erosion and dilation serves as the final extraction result of the hand and hand-shadow image.
The judging module 202 of the example of Fig. 2 may include a model-construction unit, a scanning unit and a determining unit, where:
the model-construction unit, for building a vertical-line model;
the scanning unit, for scanning the hand and hand-shadow image with the vertical-line model; and
the determining unit, for determining that the user's finger is in contact with the projection screen if, while the scanning unit scans from start to end, the vertical-line model encounters a bounded run of 0-valued pixels in its middle and the pixels at both ends of that run have gray value 255, and otherwise determining that the user's finger is not in contact with the projection screen.
The determining module 203 of the example of Fig. 2 may include a first touch-position determining unit and a second touch-position determining unit, where:
the first touch-position determining unit, for, if the user is on the right side of the projection screen, taking the foreground pixel in the leftmost column of the extracted hand and hand-shadow image as the fingertip pixel, the position of that foreground pixel being the touch position at which the user's finger contacts the projection screen; and
the second touch-position determining unit, for, if the user is on the left side of the projection screen, taking the foreground pixel in the rightmost column of the extracted hand and hand-shadow image as the fingertip pixel, the position of that foreground pixel being the touch position at which the user's finger contacts the projection screen.
Fig. 5 is a schematic structural diagram of the terminal device provided by an embodiment of the present invention. As shown in Fig. 5, the terminal device 5 of this embodiment includes a processor 50, a memory 51, and a computer program 52 stored in the memory 51 and runnable on the processor 50, for example the program of the camera-projection interactive touch method. When executing the computer program 52, the processor 50 realizes the steps of the camera-projection interactive touch method embodiment above, such as steps S101 to S103 shown in Fig. 1. Alternatively, when executing the computer program 52, the processor 50 realizes the functions of the modules/units of the device embodiments above, such as the functions of the extraction module 201, the judging module 202 and the determining module 203 shown in Fig. 2.
Exemplarily, the computer program 52 of the camera-projection interactive touch method mainly includes: extracting the user's hand and hand-shadow image from the images captured by the camera, using background subtraction and inter-frame differencing; judging whether the user's finger is in contact with the projection screen, by judging whether the fingertip and the fingertip's shadow merge in the hand and hand-shadow image; and, if the user's finger is in contact with the projection screen, determining the touch position at which the finger contacts the projection screen. The computer program 52 can be divided into one or more modules/units, which are stored in the memory 51 and executed by the processor 50 to carry out the present invention. The one or more modules/units can be a series of computer-program instruction segments capable of completing specific functions, the instruction segments describing the execution process of the computer program 52 in the terminal device 5. For example, the computer program 52 can be divided into the functions of the extraction module 201, the judging module 202 and the determining module 203 (modules in a virtual device), whose specific functions are as follows: the extraction module 201, for extracting the user's hand and hand-shadow image from the images captured by the camera using background subtraction and inter-frame differencing; the judging module 202, for judging whether the user's finger is in contact with the projection screen by judging whether the fingertip and the fingertip's shadow merge in the hand and hand-shadow image; and the determining module 203, for determining, if the user's finger is in contact with the projection screen, the touch position at which the finger contacts the projection screen.
The terminal device 5 may include, but is not limited to, the processor 50 and the memory 51. Those skilled in the art will understand that Fig. 5 is merely an example of the terminal device 5 and does not limit it; the terminal device may include more or fewer components than shown, combine certain components, or use different components; for example, it may also include input/output devices, network-access devices, buses, and so on.
The so-called processor 50 can be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and so on. A general-purpose processor can be a microprocessor, or the processor can be any conventional processor.
The memory 51 can be an internal storage unit of the terminal device 5, such as a hard disk or main memory of the terminal device 5. The memory 51 can also be an external storage device of the terminal device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card equipped on the terminal device 5. Further, the memory 51 can include both an internal storage unit of the terminal device 5 and an external storage device. The memory 51 is used to store the computer program and the other programs and data needed by the terminal device. The memory 51 can also be used to temporarily store data that has been output or is about to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the division of the above functional units and modules is used as an example. In practical applications, the above functions may be assigned to different functional units and modules as needed; that is, the internal structure of the device may be divided into different functional units or modules to complete all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit; the above integrated unit may be implemented in the form of hardware, or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for the convenience of distinguishing them from each other, and are not intended to limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding process in the foregoing method embodiments, which will not be repeated here.
In the above embodiments, the description of each embodiment has its own emphasis. For a part that is not detailed or recorded in a certain embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and the design constraints of the technical solution. Skilled artisans may implement the described functions using different methods for each specific application, but such implementations should not be considered beyond the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed device/terminal device and method may be implemented in other ways. For example, the device/terminal device embodiments described above are merely schematic. For example, the division of the modules or units is only a division of logical functions; there may be other division manners in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place, or they may be distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware, or in the form of a software functional unit.
If the integrated module/unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such understanding, the present invention may implement all or part of the flow in the methods of the above embodiments by instructing relevant hardware through a computer program. The computer program of the camera-projection interactive touch control method may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of each of the above method embodiments can be implemented, namely: extracting the hand and hand-shadow image of a user from the images captured by the video camera using background subtraction and the inter-frame difference method; judging whether the finger of the user is in contact with the projection screen by judging whether the fingertip and the fingertip shadow merge in the hand and hand-shadow image; and, if the finger of the user is in contact with the projection screen, determining the touch position at which the finger of the user contacts the projection screen. The computer program includes computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electric carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in a jurisdiction; for example, in certain jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electric carrier signals and telecommunication signals. The above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent substitutions for some of the technical features therein; and these modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and should all be included within the protection scope of the present invention.

Claims (10)

  1. A camera-projection interactive touch control method, applied to a camera-projection interactive touch control system composed of a computer, a projector, a video camera and a projection screen, characterized in that the method comprises:
    extracting the hand and hand-shadow image of a user from the images captured by the video camera using background subtraction and the inter-frame difference method;
    judging whether the finger of the user is in contact with the projection screen by judging whether the fingertip and the fingertip shadow merge in the hand and hand-shadow image;
    if the finger of the user is in contact with the projection screen, determining the touch position at which the finger of the user contacts the projection screen.
  2. The camera-projection interactive touch control method according to claim 1, characterized in that the extracting the hand and hand-shadow image of the user from the images captured by the video camera using background subtraction and the inter-frame difference method comprises:
    extracting the hand and hand-shadow image of the user from the images captured by the video camera using the inter-frame difference method to obtain n image extraction results of the inter-frame difference method, n being an integer greater than 1;
    extracting the hand and hand-shadow image of the user from the images captured by the video camera using background subtraction to obtain n image extraction results of background subtraction;
    performing a logical AND operation on the n image extraction results of the inter-frame difference method and the n image extraction results of background subtraction, respectively, to obtain n intermediate extraction images of the hand and the hand shadow;
    performing a logical OR operation on the n intermediate extraction images of the hand and the hand shadow, the result D of the logical OR operation serving as the extraction result of the hand and hand-shadow image.
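The combination step recited in claim 2 can be sketched as follows (an illustrative Python fragment on flat lists of 0/255 pixels; function and variable names are hypothetical, not from the patent):

```python
def combine_masks(frame_diff_results, bg_sub_results):
    # Pairwise logical AND of the i-th inter-frame-difference result and the
    # i-th background-subtraction result yields n intermediate masks...
    intermediates = [[255 if a == 255 and b == 255 else 0
                      for a, b in zip(fd, bg)]
                     for fd, bg in zip(frame_diff_results, bg_sub_results)]
    # ...whose logical OR across all n masks yields the extraction result D.
    length = len(intermediates[0])
    return [255 if any(m[i] == 255 for m in intermediates) else 0
            for i in range(length)]
```

The AND keeps only pixels confirmed by both cues within each frame, while the OR accumulates the hand region over the n frames so that slowly moving parts of the hand are not lost.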
  3. The camera-projection interactive touch control method according to claim 2, characterized in that the method further comprises:
    performing binarization, morphological erosion and dilation processing on D, the result of the binarization, morphological erosion and dilation processing serving as the final extraction result of the hand and hand-shadow image.
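A minimal sketch of the post-processing in claim 3, shown in one dimension for brevity (hypothetical Python; a real implementation would operate on 2D images with a 2D structuring element, e.g. via OpenCV):

```python
def binarize(pixels, thresh=128):
    # Map grayscale values to the two levels 0 and 255.
    return [255 if p >= thresh else 0 for p in pixels]

def erode(pixels):
    # 1D morphological erosion with a 3-pixel window: a foreground pixel
    # survives only if both of its neighbours are also foreground,
    # which removes isolated noise specks.
    n = len(pixels)
    return [255 if 0 < i < n - 1
                   and pixels[i - 1] == pixels[i] == pixels[i + 1] == 255
            else 0
            for i in range(n)]

def dilate(pixels):
    # 1D morphological dilation: a pixel becomes foreground if any pixel in
    # its 3-pixel neighbourhood is foreground, restoring the eroded extent.
    n = len(pixels)
    return [255 if any(pixels[j] == 255
                       for j in range(max(0, i - 1), min(n, i + 2)))
            else 0
            for i in range(n)]
```

Erosion followed by dilation (a morphological opening) cleans speckle noise from the mask D while largely preserving the shape of the hand and shadow regions.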
  4. The camera-projection interactive touch control method according to claim 1, characterized in that the judging whether the finger of the user is in contact with the projection screen by judging whether the fingertip and the fingertip shadow merge in the hand and hand-shadow image comprises:
    building a vertical linear model;
    scanning the hand and hand-shadow image using the vertical linear model;
    if, during the scanning process from beginning to end, the vertical linear model contains a finite middle portion of 0-valued pixels, and the gray values of the two pixels at the ends of the finite portion of 0-valued pixels are both 255, determining that the finger of the user is in contact with the projection screen; otherwise, determining that the finger of the user is not in contact with the projection screen.
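Claim 4's scan criterion can be sketched by taking one vertical scan line as a Python list of gray values (hypothetical names; this illustrates the claim as worded, not the patented code):

```python
def finger_touches(scan_line):
    # Return True when the scan line contains a finite interior run of
    # 0-valued pixels whose two bounding pixels both have gray value 255,
    # which claim 4 takes as the sign that the fingertip and its shadow
    # have merged at the screen surface.
    n = len(scan_line)
    i = 0
    while i < n:
        if scan_line[i] == 0:
            j = i
            while j < n and scan_line[j] == 0:
                j += 1                     # j: first non-zero pixel after run
            if (i > 0 and j < n
                    and scan_line[i - 1] == 255 and scan_line[j] == 255):
                return True                # finite 0-run bounded by 255
            i = j
        else:
            i += 1
    return False
```

A run of zeros touching either end of the scan line is not "finite" in the claim's sense and is therefore ignored.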
  5. The camera-projection interactive touch control method according to claim 1, characterized in that, if the finger of the user is in contact with the projection screen, the determining the touch position at which the finger of the user contacts the projection screen comprises:
    if the user is located on the right side of the projection screen, determining the foreground pixel of the leftmost column in the extracted hand and hand-shadow image as the pixel of the fingertip image, the position of the foreground pixel of the leftmost column being the touch position at which the finger of the user contacts the projection screen;
    if the user is located on the left side of the projection screen, determining the foreground pixel of the rightmost column in the extracted hand and hand-shadow image as the pixel of the fingertip image, the position of the foreground pixel of the rightmost column being the touch position at which the finger of the user contacts the projection screen.
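Claim 5's column rule can be sketched as follows (hypothetical Python, with the extracted mask as a 2D list of 0/255 pixels):

```python
def touch_position(mask, user_side):
    # When the user stands to the right of the screen, the fingertip is the
    # foreground pixel in the leftmost column that contains one; when the
    # user stands to the left, it is in the rightmost such column.
    # Returns (row, col) of that pixel, or None if no foreground exists.
    width = len(mask[0])
    columns = (range(width) if user_side == 'right'
               else range(width - 1, -1, -1))
    for c in columns:
        for r in range(len(mask)):
            if mask[r][c] == 255:
                return (r, c)
    return None
```

The intuition is that the arm always extends from the user's side of the screen, so the extreme column of the hand region on the opposite side must belong to the fingertip.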
  6. A camera-projection interactive touch control device, applied to a camera-projection interactive touch control system composed of a computer, a projector, a video camera and a projection screen, characterized in that the device comprises:
    an extraction module, configured to extract the hand and hand-shadow image of a user from the images captured by the video camera using background subtraction and the inter-frame difference method;
    a judging module, configured to judge whether the finger of the user is in contact with the projection screen by judging whether the fingertip and the fingertip shadow merge in the hand and hand-shadow image;
    a determining module, configured to, if the finger of the user is in contact with the projection screen, determine the touch position at which the finger of the user contacts the projection screen.
  7. The camera-projection interactive touch control device according to claim 6, characterized in that the extraction module comprises:
    a first extraction unit, configured to extract the hand and hand-shadow image of the user from the images captured by the video camera using the inter-frame difference method to obtain n image extraction results of the inter-frame difference method, n being an integer greater than 1;
    a second extraction unit, configured to extract the hand and hand-shadow image of the user from the images captured by the video camera using background subtraction to obtain n image extraction results of background subtraction;
    a first arithmetic logic unit, configured to perform a logical AND operation on the n image extraction results of the inter-frame difference method and the n image extraction results of background subtraction, respectively, to obtain n intermediate extraction images of the hand and the hand shadow;
    a second arithmetic logic unit, configured to perform a logical OR operation on the n intermediate extraction images of the hand and the hand shadow, the result D of the logical OR operation serving as the extraction result of the hand and hand-shadow image.
  8. The camera-projection interactive touch control device according to claim 7, characterized in that the device further comprises:
    a processing module, configured to perform binarization, morphological erosion and dilation processing on D, the result of the binarization, morphological erosion and dilation processing serving as the final extraction result of the hand and hand-shadow image.
  9. A terminal device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 5.
  10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 4.
CN201710506544.9A 2017-06-28 2017-06-28 Camera-projection interactive touch control method, device and computer readable storage medium Active CN107357422B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710506544.9A CN107357422B (en) 2017-06-28 2017-06-28 Camera-projection interactive touch control method, device and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN107357422A true CN107357422A (en) 2017-11-17
CN107357422B CN107357422B (en) 2023-04-25

Family

ID=60272598

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710506544.9A Active CN107357422B (en) 2017-06-28 2017-06-28 Camera-projection interactive touch control method, device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN107357422B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108874030A (en) * 2018-04-27 2018-11-23 努比亚技术有限公司 Wearable device operating method, wearable device and computer readable storage medium
CN109375833A (en) * 2018-09-03 2019-02-22 深圳先进技术研究院 A kind of generation method and equipment of touch command
CN112882563A (en) * 2019-11-29 2021-06-01 中强光电股份有限公司 Touch projection system and method thereof
CN114020145A (en) * 2021-09-30 2022-02-08 联想(北京)有限公司 Method, device and equipment for interacting with digital content and readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101409004A (en) * 2008-11-24 2009-04-15 浙江大学 Safety defense monitoring method based on Symbian intelligent mobile phone platform
CN101719015A (en) * 2009-11-03 2010-06-02 上海大学 Method for positioning finger tips of directed gestures
US20120249422A1 (en) * 2011-03-31 2012-10-04 Smart Technologies Ulc Interactive input system and method
CN102799875A (en) * 2012-07-25 2012-11-28 华南理工大学 Tracing method of arbitrary hand-shaped human hand
CN104167006A (en) * 2014-07-10 2014-11-26 华南理工大学 Gesture tracking method of any hand shape
CN105023231A (en) * 2015-07-23 2015-11-04 四川数智通软件有限责任公司 Bus data acquisition method based on video recognition and cell phone GPS
CN106774846A (en) * 2016-11-24 2017-05-31 中国科学院深圳先进技术研究院 Alternative projection method and device



Also Published As

Publication number Publication date
CN107357422B (en) 2023-04-25

Similar Documents

Publication Publication Date Title
JP7357998B2 (en) Image processing methods, smart devices and computer programs
CN107357422A (en) Video camera projection interaction touch control method, device and computer-readable recording medium
CN113012059B (en) Shadow elimination method and device for text image and electronic equipment
EP2863362B1 (en) Method and apparatus for scene segmentation from focal stack images
EP3819820B1 (en) Method and apparatus for recognizing key identifier in video, device and storage medium
CN108596944A Method, apparatus and terminal device for extracting a moving target
CN110070551B (en) Video image rendering method and device and electronic equipment
CN112200187A (en) Target detection method, device, machine readable medium and equipment
CN108961183B (en) Image processing method, terminal device and computer-readable storage medium
CN107688824A Picture matching method and terminal device
CN110047122A (en) Render method, apparatus, electronic equipment and the computer readable storage medium of image
CN113095106A (en) Human body posture estimation method and device
CN107209556B (en) System and method for processing depth images capturing interaction of an object relative to an interaction plane
CN112367559B (en) Video display method and device, electronic equipment, server and storage medium
CN107851308A (en) system and method for identifying target object
CN109920018A Neural-network-based black-and-white photograph color recovery method, device and storage medium
CN110991310A (en) Portrait detection method, portrait detection device, electronic equipment and computer readable medium
CN111144337A (en) Fire detection method and device and terminal equipment
CN105049706A (en) Image processing method and terminal
CN110503704A (en) Building method, device and the electronic equipment of three components
CN106774846B (en) Alternative projection method and device
CN114119964A (en) Network training method and device, and target detection method and device
CN114758268A (en) Gesture recognition method and device and intelligent equipment
CN113139905B (en) Image stitching method, device, equipment and medium
CN114373078A (en) Target detection method and device, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant