CN110390693A - Control method, control device and terminal device - Google Patents

Control method, control device and terminal device

Info

Publication number
CN110390693A
Authority
CN
China
Prior art keywords
target object
image
processed
objective plane
recognized
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910577667.0A
Other languages
Chinese (zh)
Other versions
CN110390693B (en)
Inventor
邓生全
宋来喜
曾宏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen World Vision Technology Co Ltd
Original Assignee
Shenzhen World Vision Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen World Vision Technology Co Ltd filed Critical Shenzhen World Vision Technology Co Ltd
Priority to CN201910577667.0A priority Critical patent/CN110390693B/en
Publication of CN110390693A publication Critical patent/CN110390693A/en
Application granted granted Critical
Publication of CN110390693B publication Critical patent/CN110390693B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0616 Means for conducting or scheduling competition, league, tournaments or rankings
    • G06T5/70
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2102/00 Application of clubs, bats, rackets or the like to the sporting activity; particular sports involving the use of balls and clubs, bats, rackets, or the like
    • A63B2102/24 Ice hockey
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/10 Positions
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/806 Video cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30221 Sports video; Sports image
    • G06T2207/30224 Ball; Puck

Abstract

The present invention is applicable to the field of control technology, and provides a control method, a control device and a terminal device. The control method comprises: obtaining, by a camera, at least two frames of to-be-processed images of a target plane, where the to-be-processed images contain a target object; preprocessing the at least two frames of to-be-processed images to obtain at least two frames of to-be-recognized images; obtaining, according to the at least two frames of to-be-recognized images, movement information of the target object on the target plane; estimating, according to the movement information, an estimated intersection point between the target object and a specified region of the target plane, and an estimated time at which the target object reaches the specified region; and controlling, according to the estimated intersection point and the estimated time, an interception module to move so that the interception module reaches the estimated intersection point and intercepts the target object. With the present invention, people's demand for convenient and entertaining interactive activities can be met and user experience can be improved.

Description

Control method, control device and terminal device
Technical field
The present invention belongs to the field of control technology, and in particular relates to a control method, a control device and a terminal device.
Background art
With the improvement of living standards, people's demand for rich and varied recreational activities keeps growing. The inventors have found that ball games and similar amusements usually require several participants and place high demands on venues and personnel. It is therefore desirable to provide convenient and entertaining interactive activities so as to improve user experience.
Summary of the invention
In view of this, embodiments of the present invention provide a control method, a control device and a terminal device, so as to meet people's demand for convenient and entertaining interactive activities and improve user experience.
A first aspect of the embodiments of the present invention provides a control method, comprising:
obtaining, by a camera, at least two frames of to-be-processed images of a target plane, where the to-be-processed images contain a target object;
preprocessing the at least two frames of to-be-processed images to obtain at least two frames of to-be-recognized images;
obtaining, according to the at least two frames of to-be-recognized images, movement information of the target object on the target plane;
estimating, according to the movement information, an estimated intersection point between the target object and a specified region of the target plane, and an estimated time at which the target object reaches the specified region;
controlling, according to the estimated intersection point and the estimated time, an interception module to move, so that the interception module reaches the estimated intersection point and intercepts the target object.
A second aspect of the embodiments of the present invention provides a control device, comprising:
a photographing module, configured to obtain, by a camera, at least two frames of to-be-processed images of a target plane, where the to-be-processed images contain a target object;
a first processing module, configured to preprocess the at least two frames of to-be-processed images to obtain at least two frames of to-be-recognized images;
a second processing module, configured to obtain, according to the at least two frames of to-be-recognized images, movement information of the target object on the target plane;
an estimation module, configured to estimate, according to the movement information, an estimated intersection point between the target object and a specified region of the target plane, and an estimated time at which the target object reaches the specified region;
a control module, configured to control, according to the estimated intersection point and the estimated time, an interception module to move, so that the interception module reaches the estimated intersection point and intercepts the target object.
A third aspect of the embodiments of the present invention provides a terminal device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the steps of the method described above.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps of the method described above.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects. In the embodiments of the present invention, at least two frames of to-be-processed images of a target plane are obtained by a camera, the to-be-processed images containing a target object; the at least two frames of to-be-processed images are preprocessed to obtain at least two frames of to-be-recognized images; movement information of the target object on the target plane is obtained according to the at least two frames of to-be-recognized images; according to the movement information, an estimated intersection point between the target object and a specified region of the target plane, and an estimated time at which the target object reaches the specified region, are estimated; and according to the estimated intersection point and the estimated time, an interception module is controlled to move so that it reaches the estimated intersection point and intercepts the target object. From the at least two frames of to-be-recognized images, the embodiments of the present invention can obtain the movement information, on the target plane, of a target object such as a ball that can slide or roll on the target plane, and can thus estimate the intersection point between the target object and the specified region of the target plane and the time at which the target object reaches that intersection point, so that the target object can be intercepted. The user can then interact with the interception module by pushing the target object. Therefore, the embodiments of the present invention can provide interactive entertainment for users, with strong entertainment value, good user experience, and high practicality and ease of use.
Brief description of the drawings
In order to describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the accompanying drawings in the following description are merely some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic flowchart of the control method provided by Embodiment 1 of the present invention;
Fig. 2 is a schematic flowchart of the control method provided by Embodiment 2 of the present invention;
Fig. 3 shows an arrangement of the camera provided by Embodiment 2 of the present invention;
Fig. 4 is a schematic diagram of the control device provided by Embodiment 3 of the present invention;
Fig. 5 is a schematic diagram of the terminal device provided by Embodiment 4 of the present invention.
Detailed description of the embodiments
In the following description, specific details such as particular system structures and techniques are set forth for the purpose of illustration rather than limitation, in order to provide a thorough understanding of the embodiments of the present invention. However, it will be clear to those skilled in the art that the present invention may also be implemented in other embodiments without these specific details. In other cases, detailed descriptions of well-known systems, devices, circuits and methods are omitted so that unnecessary detail does not obscure the description of the present invention.
It should be understood that, when used in this specification and the appended claims, the term "comprising" indicates the presence of the described features, integers, steps, operations, elements and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or combinations thereof.
It should also be understood that the terminology used in this description of the invention is for the purpose of describing particular embodiments only and is not intended to limit the present invention. As used in this description and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" used in the description of the invention and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, to mean "once it is determined", "in response to determining", "once [the described condition or event] is detected" or "in response to detecting [the described condition or event]".
In order to illustrate the technical solutions of the present invention, specific embodiments are described below.
Fig. 1 is a schematic flowchart of the control method provided by Embodiment 1 of the present invention.
In the embodiment of the present invention, the control method may be implemented by a terminal device that contains an information processing module. The information processing module in the terminal device may include one or more modules, for example one or more CPUs and one or more GPUs, and may be arranged in various ways; for example, it may include a single-chip microcomputer. In addition, the information processing module may be coupled to the camera and to other devices or modules such as a mechanical arm, a motor or a mechanical module (for example the interception module). For example, it may communicate through a wired network or a wireless network (such as Bluetooth communication, Wi-Fi communication, the 3rd generation mobile communication technology (3G), the 4th generation mobile communication technology (4G) or the 5th generation mobile communication technology (5G)), so that it can obtain image data and can control the motor to drive the mechanical arm or other mechanical modules. It should be noted that the camera, the mechanical arm, the mechanical module and other such devices may be part of the terminal device, or may be used as external devices, in which case the external devices are connected externally to the terminal device.
As shown in Fig. 1, the control method may comprise the following steps.
Step S101: obtain, by a camera, at least two frames of to-be-processed images of a target plane, where the to-be-processed images contain a target object.
In the embodiment of the present invention, the target plane can be configured according to the actual application scenario. For example, the target plane may include a horizontal tabletop; the horizontal tabletop may have a boundary, and the target object may be placed on the horizontal tabletop and may move on it, for example by sliding or rolling. Of course, the target plane may also be another plane; for example, it may not lie in the horizontal plane and may have a certain tilt angle.
For example, the target object may be an ice-hockey puck, a billiard ball, a slidable block or the like. The target object can move on the target plane, for example by rolling or sliding. In an actual scenario, a user may apply force to the target object directly or indirectly, so that the target object moves on the target plane.
The camera can be used to acquire images of a specific region. For example, the camera may be fixed at a designated position in advance; in this case, the shooting region corresponding to the images collected by the camera is determined.
In the embodiment of the present invention, the interval between the acquisition times of the to-be-processed images may be preset according to the actual motion of the target object, the size of the movement region, the performance of the camera and other such information, and is not limited here.
Optionally, the target plane is a preset horizontal tabletop, and the target object is an object that can move on the preset horizontal tabletop.
The horizontal tabletop may be provided with a corresponding boundary to limit the movement range of the target object.
Step S102: preprocess the at least two frames of to-be-processed images to obtain at least two frames of to-be-recognized images.
In the embodiment of the present invention, the preprocessing can be used to improve the accuracy of subsequent image recognition. For example, the preprocessing may include noise filtering, grayscale processing, image enhancement and the like.
Optionally, preprocessing the at least two frames of to-be-processed images comprises:
performing grayscale processing on each to-be-processed image, and performing noise filtering on the to-be-processed images after grayscale processing, to obtain the to-be-recognized images.
In the embodiment of the present invention, the noise filtering can be implemented in several ways; for example, the to-be-processed images may be processed by a preset filter (such as a median filter). In addition, in some embodiments, the preprocessing is not limited to grayscale processing and noise filtering of the to-be-processed images, and may also include other processing.
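Illustratively, such a preprocessing step could be sketched as follows using OpenCV; the function name, the median kernel size and the synthetic test frame are illustrative assumptions rather than part of this embodiment.

```python
import cv2
import numpy as np

def preprocess(frame_bgr, median_kernel=5):
    """Grayscale conversion followed by median filtering, as described above."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)  # grayscale processing
    denoised = cv2.medianBlur(gray, median_kernel)      # noise filtering
    return denoised

if __name__ == "__main__":
    # Synthetic 480x640 color frame with salt-and-pepper noise, for demonstration only.
    frame = np.zeros((480, 640, 3), dtype=np.uint8)
    noise = np.random.randint(0, 256, frame.shape[:2], dtype=np.uint8)
    frame[noise > 250] = 255
    to_be_recognized = preprocess(frame)
    print(to_be_recognized.shape, to_be_recognized.dtype)  # (480, 640) uint8
```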
Step S103: obtain, according to the at least two frames of to-be-recognized images, movement information of the target object on the target plane.
In the embodiment of the present invention, the coordinates of the target object in each to-be-recognized image can be identified by methods such as edge detection, and the coordinates of the target object in each to-be-recognized image can then be converted into the position of the target object on the target plane. For example, the contour information of the target object in each to-be-recognized image can be extracted, and the feature-point coordinates of the target object in the to-be-recognized image can be calculated from the contour information. Alternatively, the target object can be identified in the to-be-recognized images by means such as neural networks, and its coordinates obtained in that way; for example, the neural network may include VggNet, GoogLeNet, ResNet and the like.
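Illustratively, a minimal contour-based sketch of this detection step is given below; it assumes the target object appears as the largest bright blob in the to-be-recognized image, and the threshold and names are illustrative assumptions.

```python
import cv2
import numpy as np

def locate_target(gray_img, threshold=127):
    """Return the (x, y) pixel coordinates of the target's centroid, or None."""
    _, binary = cv2.threshold(gray_img, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)   # assume the target is the largest blob
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # centroid used as the feature point

if __name__ == "__main__":
    # Synthetic frame: a bright disk at pixel (400, 150) on a dark background.
    img = np.zeros((480, 640), dtype=np.uint8)
    cv2.circle(img, (400, 150), 20, 255, -1)
    print(locate_target(img))  # roughly (400.0, 150.0)
```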
The movement information may include one or more of the movement speed, acceleration, direction of motion, motion path and other such information.
In the embodiment of the present invention, since at least two frames of to-be-recognized images acquired at different times are available, the coordinates of the target object in the to-be-recognized images at the different acquisition times can be obtained and converted into positions of the target object on the target plane, so that the movement information of the target object is obtained from its positions on the target plane at the different acquisition times. For example, for a puck placed on the target plane and pushed by a user, movement information such as the direction of motion and the movement speed of the puck on the target plane can be calculated from the puck coordinates corresponding to two acquisition times.
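Illustratively, the movement information can be estimated by finite differences from two timestamped plane positions, as sketched below; the names and example figures are illustrative assumptions.

```python
import math

def movement_info(pos1, t1, pos2, t2):
    """Estimate speed (units/s) and heading (radians) from two timestamped positions."""
    dt = t2 - t1
    if dt <= 0:
        raise ValueError("timestamps must be strictly increasing")
    vx = (pos2[0] - pos1[0]) / dt
    vy = (pos2[1] - pos1[1]) / dt
    speed = math.hypot(vx, vy)
    heading = math.atan2(vy, vx)   # direction of motion on the target plane
    return {"vx": vx, "vy": vy, "speed": speed, "heading": heading}

# Example: a puck moved from (0.20 m, 0.10 m) to (0.35 m, 0.40 m) in 0.1 s.
print(movement_info((0.20, 0.10), 0.0, (0.35, 0.40), 0.1))
```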
Step S104: estimate, according to the movement information, an estimated intersection point between the target object and a specified region of the target plane, and an estimated time at which the target object reaches the specified region.
In the embodiment of the present invention, the specified region may be preset by a developer or a user. For example, if the developer sets an edge line on one side of the possible movement region of the target object and expects the target object to be intercepted so that it cannot leave the preset movement region and enter the region on the other side of the edge line, then the specified region may be the region on the other side of the edge line.
For example, for a puck placed on the target plane and pushed by a user, the estimated intersection point between the puck and a preset edge line, and the estimated time at which the puck reaches the edge line, can be estimated from information in the movement information such as the puck's movement speed and direction of motion.
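Illustratively, under the simplifying assumption of straight-line motion at constant speed (friction and rebounds off the frame ignored), the estimated intersection point and arrival time can be sketched as below, taking the edge line as the horizontal line y = y_edge; all names and figures are illustrative assumptions.

```python
def predict_crossing(pos, velocity, y_edge):
    """Linear extrapolation: where and when the object crosses the line y = y_edge."""
    x, y = pos
    vx, vy = velocity
    if vy == 0 or (y_edge - y) / vy <= 0:
        return None  # not moving toward the edge line
    time_to_edge = (y_edge - y) / vy          # estimated arrival time (seconds from now)
    x_cross = x + vx * time_to_edge           # estimated intersection point on the edge
    return (x_cross, y_edge), time_to_edge

# Example: puck at (0.35 m, 0.40 m) moving with v = (1.5, 3.0) m/s toward y = 1.0 m.
print(predict_crossing((0.35, 0.40), (1.5, 3.0), 1.0))
```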
Step S105: control, according to the estimated intersection point and the estimated time, an interception module to move, so that the interception module reaches the estimated intersection point and intercepts the target object.
In the embodiment of the present invention, the interception module may be driven by a motor. The interception module may use different components depending on the actual application scenario; for example, it may include a mechanical arm, a baffle, a hand-shaped component or a component of another form. In addition, the interception module may also carry specific mechanical structures that make the subsequent motion of the target object better match the user's needs. For example, a storage pocket or a storage basket may be arranged on the interception module, so that the target object falls into the pocket or basket after being intercepted, which facilitates collection by the user; or a cushioning component may be arranged on the interception module, so that after the target object is intercepted its rebound speed is reduced or it does not rebound at all. Of course, in some other application scenarios the interception module may also include a component for increasing the rebound speed of the ball. The specific arrangement of the interception module can be chosen according to user needs.
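Illustratively, one possible (but not prescribed) way to act on the prediction is to convert the lateral coordinate of the estimated intersection point into a setpoint for the motor that carries the interception module along a rail; the rail limits and the steps-per-metre figure below are purely illustrative assumptions.

```python
def carriage_setpoint(x_cross_m, rail_min_m=0.05, rail_max_m=0.95, steps_per_m=5000):
    """Convert a predicted intersection x-coordinate into a clamped motor step target."""
    x = min(max(x_cross_m, rail_min_m), rail_max_m)   # keep the carriage on the rail
    return int(round(x * steps_per_m))

# Example: predicted crossing at x = 0.65 m gives a step target for the motor driver.
print(carriage_setpoint(0.65))  # 3250
```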
Optionally, the interception module includes a baffle, and the baffle carries a cushion.
In the embodiment of the present invention, the material and area of the baffle are not restricted here. For example, the cushion on the baffle may be made of a porous material such as sponge. With the cushion, the rebound speed of the target object after it is intercepted by the interception module can be reduced, or the target object may not rebound at all, which facilitates the user's subsequent handling.
A specific implementation of this embodiment in a concrete application scenario is illustrated below with a specific example.
In the embodiment of the present invention, the target object may be a puck, and the target plane may be a tabletop arranged on a platform. The tabletop may include a frame to limit the movement region of the puck. A user can launch the puck in a particular way (for example with a pusher, a hand or a foot), so that the puck moves on the tabletop. The camera arranged at a specific position above the tabletop then acquires at least two frames of to-be-processed images of the tabletop at preset time intervals; grayscale processing and noise filtering are applied to the to-be-processed images to obtain to-be-recognized images; the movement information of the puck on the tabletop is obtained from the at least two frames of to-be-recognized images, so as to judge when the puck will reach the interception region on the other side of the tabletop. The interception module can then intercept the puck, so that the user can repeatedly launch the puck for entertainment or training. A device built on this example is simple to assemble and disassemble, easy to maintain, friendly in user experience and fast in operation, and can be used in scenarios such as commercial exhibitions and shopping malls.
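Illustratively, the flow of this example can be strung together as in the sketch below, which runs detection and prediction on two synthetic frames and uses an assumed uniform pixels-to-metres scale in place of the full coordinate mapping of Embodiment 2; every constant and name here is an illustrative assumption.

```python
import cv2
import numpy as np

SCALE = 0.002   # assumed metres per pixel (stand-in for the full plane mapping)
Y_EDGE = 0.90   # assumed y-coordinate of the interception edge line, in metres

def detect(gray):
    """Centroid of the largest bright blob, in pixels."""
    _, binary = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    m = cv2.moments(max(contours, key=cv2.contourArea))
    return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

def synthetic_frame(center):
    img = np.zeros((480, 640), dtype=np.uint8)
    cv2.circle(img, center, 15, 255, -1)
    return img

# Two acquisitions 0.05 s apart: the puck moves from (300, 100) to (310, 130) pixels.
frames = [(0.00, synthetic_frame((300, 100))), (0.05, synthetic_frame((310, 130)))]
(t1, f1), (t2, f2) = frames
p1 = detect(cv2.medianBlur(f1, 5)) * SCALE        # plane position at t1
p2 = detect(cv2.medianBlur(f2, 5)) * SCALE        # plane position at t2
v = (p2 - p1) / (t2 - t1)                          # plane velocity, m/s
eta = (Y_EDGE - p2[1]) / v[1]                      # time until the edge line is reached
x_cross = p2[0] + v[0] * eta                       # predicted interception point
print(f"move interceptor to x = {x_cross:.3f} m, arrival in {eta:.2f} s")
```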
From the at least two frames of to-be-recognized images, the embodiment of the present invention can obtain the movement information, on the target plane, of a target object such as a ball that can slide or roll on the target plane, and can thus estimate the intersection point between the target object and the specified region of the target plane and the time at which the target object reaches that intersection point, so that the target object can be intercepted. The user can then interact with the interception module by pushing the target object. Therefore, the embodiment of the present invention can provide interactive entertainment for users, with strong entertainment value, good user experience, and high practicality and ease of use.
On the basis of the above embodiment, Fig. 2 is a schematic flowchart of the control method provided by Embodiment 2 of the present invention. As shown in Fig. 2, the method may comprise the following steps.
Step S201: obtain, by a camera, at least two frames of to-be-processed images of a target plane, where the to-be-processed images contain a target object and the shooting regions corresponding to the to-be-processed images are identical.
In the embodiment of the present invention, the camera can be set at a fixed position in advance to acquire the to-be-processed images, so that the shooting region corresponding to each to-be-processed image is identical. In this way, the coordinate systems corresponding to the to-be-processed images are mutually consistent, which facilitates subsequent image processing.
Step S202: preprocess the at least two frames of to-be-processed images to obtain at least two frames of to-be-recognized images.
Step S203: determine the mapping relationship between the image coordinate system of the to-be-processed images and the target plane coordinate system.
In the embodiment of the present invention, the mapping relationship between the image coordinate system of the to-be-processed images and the target plane coordinate system can be determined according to the shooting parameters of the camera, the relative positional relationship between the camera and the target plane, and other such information.
An illustrative way of determining the mapping relationship between the image coordinate system of the to-be-processed images and the target plane coordinate system is described below with a specific example.
Fig. 3 shows an arrangement of the camera, where the rectangular frame is the movement region of the target object. The projection Oc of the camera's optical center onto the target plane can be taken as the origin of the target plane coordinate system. Then, according to the distance H from the optical center of the camera to its vertical projection on the target plane, the number of horizontal pixels Sx of the to-be-processed image, the number of vertical pixels Sy of the to-be-processed image, the maximum angle α between the longitudinal reflected light path and the target plane when the camera shoots, the minimum angle β between the longitudinal reflected light path and the target plane when the camera shoots, and the maximum angle γ between the lateral reflected light path and the target plane when the camera shoots, the mapping relationship between the image coordinate system of the to-be-processed image and the target plane coordinate system can be established. With this mapping relationship, the position in the target plane coordinate system of any pixel in the to-be-processed image can be determined.
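The exact formulas relating H, Sx, Sy, α, β and γ to the mapping are not reproduced here. Illustratively, an alternative and commonly used way to obtain an equivalent pixel-to-plane mapping is a planar homography computed from four reference points whose coordinates are known in both systems; the reference coordinates below are illustrative assumptions, not values from this embodiment.

```python
import cv2
import numpy as np

# Four reference points: pixel coordinates in the image and the corresponding
# positions (in metres) on the target plane. These values are illustrative only.
pixel_pts = np.float32([[50, 60], [590, 55], [600, 420], [40, 430]])
plane_pts = np.float32([[0.0, 0.0], [1.2, 0.0], [1.2, 0.8], [0.0, 0.8]])

H = cv2.getPerspectiveTransform(pixel_pts, plane_pts)   # image -> plane homography

def pixel_to_plane(px, py):
    """Map one pixel coordinate to target plane coordinates using the homography."""
    src = np.float32([[[px, py]]])
    (x, y), = cv2.perspectiveTransform(src, H)[0]
    return float(x), float(y)

print(pixel_to_plane(320, 240))   # roughly the middle of the tabletop
```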
Step S204: identify the first coordinates of the target object in each to-be-recognized image.
In the embodiment of the present invention, the first coordinates are coordinates in the image coordinate system of the to-be-recognized image. For example, the first coordinates may be the coordinates of the center point of the target object in each to-be-recognized image; alternatively, they may be the coordinates of another feature point of the target object (for example its front end along the direction of motion), which is not limited here.
Step S205: convert, according to the mapping relationship between the image coordinate system and the target plane coordinate system, the first coordinates into the second coordinates of the target object in the target plane coordinate system.
The second coordinates indicate the positions of the target object on the target plane at the acquisition times of the respective to-be-processed images.
Step S206: obtain, according to the second coordinates of the target object corresponding to each to-be-recognized image and the timestamps corresponding to the to-be-processed images, the movement information of the target object on the target plane.
In the embodiment of the present invention, the timestamps indicate the acquisition times of the to-be-processed images.
Step S207: estimate, according to the movement information, an estimated intersection point between the target object and a specified region of the target plane, and an estimated time at which the target object reaches the specified region.
Step S208: control, according to the estimated intersection point and the estimated time, an interception module to move, so that the interception module reaches the estimated intersection point and intercepts the target object.
Steps S202, S207 and S208 of this embodiment are respectively the same as or similar to steps S102, S104 and S105 described above; see the related descriptions of steps S102, S104 and S105 for details, which are not repeated here.
In the embodiment of the present invention, the shooting regions corresponding to the to-be-processed images are identical, so the coordinate systems corresponding to the to-be-processed images are mutually consistent, which facilitates subsequent image processing. By determining the mapping relationship between the image coordinate system of the to-be-processed images and the target plane coordinate system, the coordinates of the target object on the target plane can be obtained from its coordinates in the images, so that information such as the movement speed and direction of motion of the target object can be predicted relatively accurately from less data, and the target object can be intercepted.
It should be understood that the sequence numbers of the steps in the above embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and does not constitute any limitation on the implementation of the embodiments of the present invention.
Fig. 4 is a schematic diagram of the control device provided by Embodiment 3 of the present invention. For convenience of description, only the parts relevant to the embodiment of the present invention are shown.
The control device 400 includes:
a photographing module 401, configured to obtain, by a camera, at least two frames of to-be-processed images of a target plane, where the to-be-processed images contain a target object;
a first processing module 402, configured to preprocess the at least two frames of to-be-processed images to obtain at least two frames of to-be-recognized images;
a second processing module 403, configured to obtain, according to the at least two frames of to-be-recognized images, movement information of the target object on the target plane;
an estimation module 404, configured to estimate, according to the movement information, an estimated intersection point between the target object and a specified region of the target plane, and an estimated time at which the target object reaches the specified region;
a control module 405, configured to control, according to the estimated intersection point and the estimated time, an interception module to move, so that the interception module reaches the estimated intersection point and intercepts the target object.
Optionally, the first processing module 402 is specifically configured to:
perform grayscale processing on each to-be-processed image, and perform noise filtering on the to-be-processed images after grayscale processing, to obtain the to-be-recognized images.
Optionally, the target plane is a preset horizontal tabletop, and the target object is an object that can move on the preset horizontal tabletop.
Optionally, the interception module includes a baffle, and the baffle carries a cushion.
Optionally, the shooting regions corresponding to the to-be-processed images are identical;
correspondingly, the control device 400 further includes:
a determining module, configured to determine the mapping relationship between the image coordinate system of the to-be-processed images and the target plane coordinate system;
and the second processing module 403 specifically includes:
a recognition unit, configured to identify the first coordinates of the target object in each to-be-recognized image;
a calculation unit, configured to convert, according to the mapping relationship between the image coordinate system and the target plane coordinate system, the first coordinates into the second coordinates of the target object in the target plane coordinate system;
a processing unit, configured to obtain, according to the second coordinates of the target object corresponding to each to-be-recognized image and the timestamps corresponding to the to-be-processed images, the movement information of the target object on the target plane.
From the at least two frames of to-be-recognized images, the embodiment of the present invention can obtain the movement information, on the target plane, of a target object such as a ball that can slide or roll on the target plane, and can thus estimate the intersection point between the target object and the specified region of the target plane and the time at which the target object reaches that intersection point, so that the target object can be intercepted. The user can then interact with the interception module by pushing the target object. Therefore, the embodiment of the present invention can provide interactive entertainment for users, with strong entertainment value, good user experience, and high practicality and ease of use.
Fig. 5 is a schematic diagram of the terminal device provided by Embodiment 4 of the present invention. As shown in Fig. 5, the terminal device 5 of this embodiment includes a processor 50, a memory 51, and a computer program 52 stored in the memory 51 and executable on the processor 50. The terminal device may also include a camera and/or an interception module, each coupled to the processor. In addition, the camera and/or the interception module may also be connected externally to the terminal device as external devices; the connection and communication modes between the camera and/or the interception module and the processor can be configured according to the actual application scenario and are not restricted here.
When executing the computer program 52, the processor 50 implements the steps in the above control method embodiments, for example steps 101 to 105 shown in Fig. 1. Alternatively, when executing the computer program 52, the processor 50 implements the functions of the modules/units in the above device embodiments, for example the functions of modules 401 to 405 shown in Fig. 4.
For example, the computer program 52 may be divided into one or more modules/units, which are stored in the memory 51 and executed by the processor 50 to complete the present invention. The one or more modules/units may be a series of computer program instruction segments capable of completing specific functions, and the instruction segments are used to describe the execution process of the computer program 52 in the terminal device 5. For example, the computer program 52 may be divided into a photographing module, a first processing module, a second processing module, an estimation module and a control module, whose specific functions are as follows:
the photographing module, configured to obtain, by a camera, at least two frames of to-be-processed images of a target plane, where the to-be-processed images contain a target object;
the first processing module, configured to preprocess the at least two frames of to-be-processed images to obtain at least two frames of to-be-recognized images;
the second processing module, configured to obtain, according to the at least two frames of to-be-recognized images, movement information of the target object on the target plane;
the estimation module, configured to estimate, according to the movement information, an estimated intersection point between the target object and a specified region of the target plane, and an estimated time at which the target object reaches the specified region;
the control module, configured to control, according to the estimated intersection point and the estimated time, an interception module to move, so that the interception module reaches the estimated intersection point and intercepts the target object.
The terminal device 5 may be a computing device such as a desktop computer, a notebook, a palmtop computer or a cloud server. The terminal device may include, but is not limited to, the processor 50 and the memory 51. Those skilled in the art will understand that Fig. 5 is only an example of the terminal device 5 and does not constitute a limitation on the terminal device 5, which may include more or fewer components than shown, or combine certain components, or use different components; for example, the terminal device may also include input/output devices, network access devices, buses and the like.
The processor 50 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 51 may be an internal storage unit of the terminal device 5, such as a hard disk or memory of the terminal device 5. The memory 51 may also be an external storage device of the terminal device 5, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card or a flash card equipped on the terminal device 5. Further, the memory 51 may include both an internal storage unit of the terminal device 5 and an external storage device. The memory 51 is used to store the computer program and other programs and data required by the terminal device, and may also be used to temporarily store data that has been output or is to be output.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the division of the above functional units and modules is only used as an example. In practical applications, the above functions can be assigned to different functional units and modules as needed; that is, the internal structure of the device can be divided into different functional units or modules to complete all or part of the functions described above. The functional units in the embodiments may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for the convenience of distinguishing them from each other and are not intended to limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In the above embodiments, each embodiment has its own emphasis in the description; for parts not detailed or described in a certain embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled persons may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed device/terminal device and method may be implemented in other ways. For example, the device/terminal device embodiments described above are only illustrative; for example, the division of the modules or units is only a logical function division, and there may be other division manners in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated module/unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the above embodiment methods of the present invention can also be completed by instructing relevant hardware through a computer program. The computer program may be stored in a computer-readable storage medium, and when executed by a processor, the computer program can implement the steps of the above method embodiments. The computer program includes computer program code, which may be in source code form, object code form, an executable file, certain intermediate forms, or the like. The computer-readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and so on. It should be noted that the content contained in the computer-readable medium can be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer-readable media do not include electrical carrier signals and telecommunication signals.
The above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions described in the foregoing embodiments, or replace some of the technical features with equivalents; and such modifications or replacements do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and should all be included within the protection scope of the present invention.

Claims (10)

1. A control method, characterized by comprising:
obtaining, by a camera, at least two frames of to-be-processed images of a target plane, where the to-be-processed images contain a target object;
preprocessing the at least two frames of to-be-processed images to obtain at least two frames of to-be-recognized images;
obtaining, according to the at least two frames of to-be-recognized images, movement information of the target object on the target plane;
estimating, according to the movement information, an estimated intersection point between the target object and a specified region of the target plane, and an estimated time at which the target object reaches the specified region;
controlling, according to the estimated intersection point and the estimated time, an interception module to move, so that the interception module reaches the estimated intersection point and intercepts the target object.
2. The control method according to claim 1, characterized in that preprocessing the at least two frames of to-be-processed images comprises:
performing grayscale processing on each to-be-processed image, and performing noise filtering on the to-be-processed images after grayscale processing, to obtain the to-be-recognized images.
3. The control method according to claim 1, characterized in that the target plane is a preset horizontal tabletop, and the target object is an object that can move on the preset horizontal tabletop.
4. The control method according to claim 1, characterized in that the interception module includes a baffle, and the baffle carries a cushion.
5. The control method according to any one of claims 1 to 4, characterized in that the shooting regions corresponding to the to-be-processed images are identical;
correspondingly, before obtaining, according to the at least two frames of to-be-recognized images, the movement information of the target object on the target plane, the method further comprises:
determining the mapping relationship between the image coordinate system of the to-be-processed images and the target plane coordinate system;
and obtaining, according to the at least two frames of to-be-recognized images, the movement information of the target object on the target plane comprises:
identifying the first coordinates of the target object in each to-be-recognized image;
converting, according to the mapping relationship between the image coordinate system and the target plane coordinate system, the first coordinates into the second coordinates of the target object in the target plane coordinate system;
obtaining, according to the second coordinates of the target object corresponding to each to-be-recognized image and the timestamps corresponding to the to-be-processed images, the movement information of the target object on the target plane.
6. A control device, characterized by comprising:
a photographing module, configured to obtain, by a camera, at least two frames of to-be-processed images of a target plane, where the to-be-processed images contain a target object;
a first processing module, configured to preprocess the at least two frames of to-be-processed images to obtain at least two frames of to-be-recognized images;
a second processing module, configured to obtain, according to the at least two frames of to-be-recognized images, movement information of the target object on the target plane;
an estimation module, configured to estimate, according to the movement information, an estimated intersection point between the target object and a specified region of the target plane, and an estimated time at which the target object reaches the specified region;
a control module, configured to control, according to the estimated intersection point and the estimated time, an interception module to move, so that the interception module reaches the estimated intersection point and intercepts the target object.
7. The control device according to claim 6, characterized in that the first processing module is specifically configured to:
perform grayscale processing on each to-be-processed image, and perform noise filtering on the to-be-processed images after grayscale processing, to obtain the to-be-recognized images.
8. The control device according to claim 6 or 7, characterized in that the shooting regions corresponding to the to-be-processed images are identical;
correspondingly, the control device further comprises:
a determining module, configured to determine the mapping relationship between the image coordinate system of the to-be-processed images and the target plane coordinate system;
and the second processing module specifically comprises:
a recognition unit, configured to identify the first coordinates of the target object in each to-be-recognized image;
a calculation unit, configured to convert, according to the mapping relationship between the image coordinate system and the target plane coordinate system, the first coordinates into the second coordinates of the target object in the target plane coordinate system;
a processing unit, configured to obtain, according to the second coordinates of the target object corresponding to each to-be-recognized image and the timestamps corresponding to the to-be-processed images, the movement information of the target object on the target plane.
9. A terminal device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the control method according to any one of claims 1 to 5.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the control method according to any one of claims 1 to 5.
CN201910577667.0A 2019-06-28 2019-06-28 Control method, control device and terminal equipment Active CN110390693B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910577667.0A CN110390693B (en) 2019-06-28 2019-06-28 Control method, control device and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910577667.0A CN110390693B (en) 2019-06-28 2019-06-28 Control method, control device and terminal equipment

Publications (2)

Publication Number Publication Date
CN110390693A true CN110390693A (en) 2019-10-29
CN110390693B CN110390693B (en) 2022-02-22

Family

ID=68285879

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910577667.0A Active CN110390693B (en) 2019-06-28 2019-06-28 Control method, control device and terminal equipment

Country Status (1)

Country Link
CN (1) CN110390693B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113448427A (en) * 2020-03-24 2021-09-28 华为技术有限公司 Equipment control method, device and system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN204864942U (en) * 2015-08-07 2015-12-16 昆山塔米机器人有限公司 Interactive platform of desktop puck robot
CN107261474A (en) * 2017-07-21 2017-10-20 佛山科学技术学院 A kind of intelligent billiards training system
CN107680122A (en) * 2017-10-12 2018-02-09 昆山塔米机器人有限公司 The Forecasting Methodology and device of ice hockey track on a kind of table
CN109366501A (en) * 2018-12-13 2019-02-22 中国石油大学(华东) Gas ice hockey robot control method, device and gas ice hockey equipment
CN109395370A (en) * 2018-12-21 2019-03-01 杭州青杉奇勋科技有限公司 A kind of dual-purpose type household soccer table
CN109740441A (en) * 2018-12-10 2019-05-10 平安科技(深圳)有限公司 Object detection method, device and terminal device based on image recognition
CN208990191U (en) * 2018-10-15 2019-06-18 昆山塔米机器人有限公司 A kind of Zhuo Shang ice hockey robot

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN204864942U (en) * 2015-08-07 2015-12-16 昆山塔米机器人有限公司 Interactive platform of desktop puck robot
CN107261474A (en) * 2017-07-21 2017-10-20 佛山科学技术学院 A kind of intelligent billiards training system
CN107680122A (en) * 2017-10-12 2018-02-09 昆山塔米机器人有限公司 The Forecasting Methodology and device of ice hockey track on a kind of table
CN208990191U (en) * 2018-10-15 2019-06-18 昆山塔米机器人有限公司 A kind of Zhuo Shang ice hockey robot
CN109740441A (en) * 2018-12-10 2019-05-10 平安科技(深圳)有限公司 Object detection method, device and terminal device based on image recognition
CN109366501A (en) * 2018-12-13 2019-02-22 中国石油大学(华东) Gas ice hockey robot control method, device and gas ice hockey equipment
CN109395370A (en) * 2018-12-21 2019-03-01 杭州青杉奇勋科技有限公司 A kind of dual-purpose type household soccer table

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王子亨 et al.: "Nonlinear camera calibration method" (摄像机非线性标定方法), 《计算机工程与设计》 (Computer Engineering and Design) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113448427A (en) * 2020-03-24 2021-09-28 华为技术有限公司 Equipment control method, device and system
CN113448427B (en) * 2020-03-24 2023-09-12 华为技术有限公司 Equipment control method, device and system
US11880220B2 (en) 2020-03-24 2024-01-23 Huawei Technologies Co., Ltd. Device control method, apparatus, and system

Also Published As

Publication number Publication date
CN110390693B (en) 2022-02-22

Similar Documents

Publication Publication Date Title
CN107564012B (en) Augmented reality method and device for unknown environment
AU2016310451B2 (en) Eyelid shape estimation using eye pose measurement
TWI520102B (en) Tracking method
CN109737974A (en) A kind of 3D navigational semantic map updating method, device and equipment
CN110276317B (en) Object size detection method, object size detection device and mobile terminal
CN104732587B (en) A kind of indoor 3D semanteme map constructing method based on depth transducer
US8998718B2 (en) Image generation system, image generation method, and information storage medium
EP3729381A1 (en) Viewpoint dependent brick selection for fast volumetric reconstruction
CN105069751B (en) A kind of interpolation method of depth image missing data
CN107527046B (en) Unlocking control method and related product
CN105989608B (en) A kind of vision capture method and device towards intelligent robot
CN105260726B (en) Interactive video biopsy method and its system based on human face posture control
JP2016534461A (en) Method and apparatus for representing a physical scene
CN102184531A (en) Deep map confidence filtering
CN109035330A (en) Cabinet approximating method, equipment and computer readable storage medium
US11508141B2 (en) Simple environment solver using planar extraction
CN108985220A (en) A kind of face image processing process, device and storage medium
CN108509857A (en) Human face in-vivo detection method, electronic equipment and computer program product
CN109359514A (en) A kind of gesture tracking identification federation policies method towards deskVR
US9280209B2 (en) Method for generating 3D coordinates and mobile terminal for generating 3D coordinates
CN109800699A (en) Image-recognizing method, system and device
CN106203364B (en) System and method is tried in a kind of interaction of 3D glasses on
CN109886101A (en) Posture identification method and relevant apparatus
JP2015184054A (en) Identification device, method, and program
CN110390693A (en) A kind of control method, control device and terminal device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant