CN109621401A - Interactive game system and control method - Google Patents

Interactive game system and control method

Info

Publication number
CN109621401A
Authority
CN
China
Prior art keywords
game system
control method
identification pattern
image capturing
visible image
Prior art date
Legal status
Pending
Application number
CN201811640775.XA
Other languages
Chinese (zh)
Inventor
Lin Shaomeng (林绍孟)
Current Assignee
Guangzhou Mingchao Interactive Technology Co., Ltd.
Original Assignee
Guangzhou Mingchao Interactive Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Guangzhou Mingchao Interactive Technology Co., Ltd.
Priority to CN201811640775.XA
Publication of CN109621401A
Legal status: Pending (current)

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/212: Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F 13/213: Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/35: Details of game servers
    • A63F 13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/55: Controlling game characters or game objects based on the game progress
    • A63F 13/65: Generating or modifying game content before or while executing the game program, automatically by game devices or servers from real world data, e.g. measurement in live racing competition

Abstract

The present invention provides an interactive game system and control method in the field of game device design. The system includes a wearable device and a motion capture device. The outer surface of the wearable device is circumferentially provided with an identification pattern comprising a dark portion and a light portion. The motion capture device is provided with an infrared camera and a visible-light camera controlled by a processor; the infrared camera and the visible-light camera have the same focal length and the same lens axis. The identification pattern is captured simultaneously by the infrared camera and the visible-light camera and recognized by the processor. The game system and control method can capture the game user's motion at low cost and build, in a virtual scene or a real scene, a character model synchronized with the game user's motion.

Description

Interactive game system and control method
Technical field
The invention belongs to the field of game device design, and more particularly relates to an interactive game system with a motion capture function and a control method therefor.
Background technique
In remote interactive games with networking capability, mixed reality (MR) or virtual reality (VR) technology can be used to provide an immersive game experience. A typical implementation captures the game user's limb movements with a camera, or with motion capture devices attached to the user's body, and generates limb-movement data; a character model that changes synchronously with the limb-movement data is then built in the game, and finally the synchronized character model is merged with a virtual scene (VR) or a real scene (MR) to realize the game function.
In the prior art, capturing the game user's limb movements with a camera is inexpensive, but the processor must screen and process the dynamic elements of every frame of the captured video, in particular discriminate the game user's contour, which is computationally intensive. This easily causes an excessive delay between the character model's movements and the game user's real actions, degrading the game experience; fast limb movements cannot be resolved, motion detail is lost, and the system is easily disturbed by other, irrelevant moving objects. Capturing the limb movements with motion capture devices worn on the body gives a small delay between the character model and the game user, loses little motion detail, and is not disturbed by other moving objects nearby, but it costs more and requires many attachments on the user's limbs, which in turn affects the game experience.
Summary of the invention
It is an object of the present invention to provide an interactive game system and control method that can capture the game user's motion at low cost and build, in a virtual scene or a real scene, a character model synchronized with the game user's motion.
The technical solution provided by the present invention is as follows:
An interactive game system includes a wearable device and a motion capture device. The outer surface of the wearable device is circumferentially provided with an identification pattern, and the identification pattern comprises a dark portion and a light portion. The motion capture device is provided with an infrared camera and a visible-light camera controlled by a processor; the infrared camera and the visible-light camera have the same focal length and the same lens axis. The identification pattern is captured simultaneously by the infrared camera and the visible-light camera and recognized by the processor.
In one embodiment of the above technical solution, the wearable device is provided with an ultrasonic signal generator controlled by a driver, and the motion capture device is provided with an ultrasonic signal receiver connected to the processor.
In one embodiment of the above technical solution, the identification pattern is an Omron ring; the dark portion is the five black circles in the Omron ring and the light portion is the white background of the Omron ring, or alternatively the dark portion is the black background of the Omron ring and the light portion is the five white circles in the Omron ring.
The present invention also provides a control method of the interactive game system, applicable to an embodiment of the above interactive game system, comprising the steps of: acquiring an infrared real-time image frame and a visible-light real-time image frame simultaneously with the infrared camera and the visible-light camera; extracting the contour of a high-temperature region from the infrared real-time image frame, and using that contour to extract the corresponding user image region from the visible-light real-time image frame; and selecting the region of the user image region that contains the identification pattern as the final identification region.
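As a rough illustration of these three steps (not part of the patent text), the Python sketch below binarizes a registered infrared frame at an assumed body-temperature threshold, uses the resulting high-temperature contours to mask the visible-light frame, and keeps the masked region in which the identification pattern is found. The threshold value, the OpenCV calls, and the helper `contains_marker` are illustrative assumptions, not the patent's implementation.

```python
import cv2
import numpy as np

def extract_hot_contours(ir_frame_degc, body_temp_threshold=30.0):
    """Binarize a registered infrared frame (degrees Celsius) at a body-temperature
    threshold and return the contours of the resulting high-temperature regions."""
    hot = (ir_frame_degc > body_temp_threshold).astype(np.uint8) * 255
    contours, _ = cv2.findContours(hot, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return contours

def final_identification_region(visible_frame, ir_frame_degc, contains_marker):
    """Cut the region enclosed by each high-temperature contour out of the
    visible-light frame and keep the one that contains the identification
    pattern; contains_marker is a caller-supplied detector."""
    for contour in extract_hot_contours(ir_frame_degc):
        mask = np.zeros(ir_frame_degc.shape[:2], dtype=np.uint8)
        cv2.drawContours(mask, [contour], -1, 255, thickness=cv2.FILLED)
        region = cv2.bitwise_and(visible_frame, visible_frame, mask=mask)
        if contains_marker(region):
            return region        # final identification region
    return None                  # no region with the pattern in this frame pair
```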
In one embodiment of the above method, when the contour of a high-temperature region is extracted from the infrared real-time image frame, a temperature gradient threshold and a width are set; when the temperature gradient of the transition zone of that width between a high-temperature region and a low-temperature region is greater than the temperature gradient threshold, the transition zone is extracted as part of the contour of the high-temperature region.
In one embodiment of the above method, when the contour of a high-temperature region is extracted from the infrared real-time image frame, only the contour of the high-temperature region containing the identification pattern is extracted.
The present invention also provides another control method of the interactive game system, applicable to an embodiment of the above interactive game system, comprising the steps of: receiving, with the ultrasonic signal receiver, the signal emitted by the ultrasonic signal generator; calculating the distance between the wearable device and the motion capture device from that signal; and adjusting the focal length of the infrared camera and the visible-light camera according to that distance.
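For the distance step, a minimal one-way time-of-flight sketch is given below; it assumes the receiver knows the emission instant (for example through a shared trigger or a known emission schedule), which the patent does not specify.

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # in air at roughly 20 degrees Celsius

def wearable_distance_m(emit_time_s: float, receive_time_s: float) -> float:
    """Distance between the wearable device and the motion capture device from
    the one-way travel time of the ultrasonic pulse (emit instant assumed known)."""
    travel_time_s = receive_time_s - emit_time_s
    return SPEED_OF_SOUND_M_PER_S * travel_time_s
```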
In one embodiment of the above method, the transmission frequency of the ultrasonic signal generator is specified by the processor.
One aspect of the present invention has the following beneficial effects: the temperature gradient information acquired by the infrared camera can be used to extract the contour of the human body and thereby identify the game user's outline, which saves the processor the computation needed to separate the human body from the environment. At the same time, to keep the fields of view of the infrared camera and the visible-light camera consistent and to provide a unified focusing capability, the distance to the game user is measured by ultrasound; together with the identification pattern, this allows a clear human contour to be extracted quickly. By the above principle, the game system and control method can, at a low computational cost, capture the game user's motion and build, in a virtual scene or a real scene, a character model synchronized with the game user's motion.
Detailed description of the invention
Fig. 1 is a block diagram of the interactive game system in one embodiment of the invention;
Fig. 2 is a schematic diagram of the identification pattern of the wearable device in one embodiment of the invention;
Fig. 3 is a schematic diagram of the control method of the interactive game system in one embodiment of the invention.
Reference numerals: 10 wearable device; 11 identification pattern; 12 ultrasonic signal generator; 13 driver; 20 motion capture device; 21 processor; 22 ultrasonic signal receiver; 23 infrared camera; 24 visible-light camera; 30 display.
Specific embodiment
The technical solution provided by the invention is further explained clearly and completely below in combination with specific embodiments.
Embodiment one
As shown in Fig. 1, this embodiment first provides an interactive game system in which a game user can perform interactive game actions toward a virtual character in the virtual environment provided by the system. The virtual character may be an NPC created in the software of the game system, or an avatar mapped into the game from another, remote game user. The interactive game system of this embodiment includes a wearable device 10 and a motion capture device 20; a processor 21 for running the game software is provided inside the motion capture device 20, and the motion capture device 20 is connected to a display 30 for showing the game picture, where the display 30 may be a projector. During play, the game user wears the wearable device 10 and makes game actions; the wearable device 10 assists the motion capture device 20 in capturing the game user's actions, which are converted into a virtual character model that takes part in the trigger judgments and the visual presentation of the game software. Specifically, the interactive game system of this embodiment is a single-player game: after the motion capture device 20 captures a game action, only the processor 21 of the motion capture device judges the game user's behavior and triggers different game tasks according to that behavior, without connecting to a remote server and without a server-established interactive scene of multiple virtual characters.
As shown in Fig. 2, in this embodiment the wearable device 10 is a waist belt provided with identification patterns 11 having obvious light and dark contrast; the identification patterns 11 are arranged circumferentially around the belt and cover its surface. Specifically, this embodiment selects an Omron ring as the identification pattern: a pattern formed by at least five circles on a solid background, in which the circles are black and form the dark portion of the identification pattern, and the solid background is white and forms the light portion. Under an ordinary thermal radiation background the dark portion and the light portion show a certain temperature difference in infrared imaging, because the dark portion absorbs more radiation and is warmer while the light portion absorbs less and is cooler. Even when the pattern is attached to the game user's body, whose temperature is similar, it still forms a pattern in the infrared image that a computer can clearly recognize. In other words, because the belt carries an identification pattern 11 composed of a dark portion and a light portion, the motion capture device, when imaging the scene in infrared, can identify among two or more human bodies the one bearing the identification pattern as the game user. In this embodiment the solid background of the Omron ring may instead be black, with all circles white. In other embodiments of the invention, the dark portion and the light portion only need a color difference in the infrared image frame sufficient for the processor to distinguish them. In still other embodiments, the identification pattern may be an LP Code, a two-dimensional code, or another pattern that a computer can easily recognize with a matching algorithm.
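One conceivable way for software to test whether a candidate image region carries such a circle-based pattern is to count dark circles on a light background with a Hough circle transform. The sketch below only illustrates that idea; it is not the recognition algorithm of the patent, and all parameter values are arbitrary. A function like this could serve as the `contains_marker` detector assumed in the earlier sketch.

```python
import cv2

def looks_like_marker(region_bgr, expected_circles=5):
    """Heuristic check for an identification pattern made of several dark
    circles on a light background (illustrative only; thresholds arbitrary)."""
    gray = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)            # suppress noise before the Hough transform
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=10,
        param1=100, param2=30, minRadius=3, maxRadius=40)
    return circles is not None and circles.shape[1] >= expected_circles
```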
In this embodiment, the motion capture device 20 is provided with an infrared camera 23 and a visible-light camera 24 controlled by the processor 21. The infrared camera 23 and the visible-light camera 24 have the same focal length and the same lens axis, so that at any moment they can synchronously acquire image frames of the same field of view, and the processor 21 can superimpose the infrared image frame and the visible-light image frame. The identification pattern is captured simultaneously by the infrared camera and the visible-light camera and recognized by the processor.
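Because the two cameras nominally share the same field of view, superimposing the frames can be as simple as applying a fixed mapping measured once at calibration time. The sketch below uses a homography with made-up calibration points and is an assumption for illustration, not the patent's registration procedure.

```python
import cv2
import numpy as np

# Four matching points picked once at calibration time (illustrative values).
IR_POINTS       = np.float32([[0, 0], [639, 0], [639, 479], [0, 479]])
VISIBLE_POINTS  = np.float32([[2, 1], [641, 2], [640, 481], [1, 480]])
H_IR_TO_VISIBLE = cv2.getPerspectiveTransform(IR_POINTS, VISIBLE_POINTS)

def register_ir_to_visible(ir_frame, visible_shape_hw):
    """Warp the infrared frame into the visible-light frame's pixel grid so
    the processor can superimpose the two image frames."""
    h, w = visible_shape_hw
    return cv2.warpPerspective(ir_frame, H_IR_TO_VISIBLE, (w, h))
```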
In this embodiment, the wearable device 10 further includes an ultrasonic signal generator 12 controlled by a driver 13. Through its configured program, the driver 13 controls the ultrasonic signal generator 12 to emit ultrasound of a specified wavelength. The wavelength may be specified by parameters stored in the internal memory of the driver 13, or by a wavelength-setting signal sent from the motion capture device 20 to the driver 13; the wavelength-setting signal is transmitted from the wireless communication module of the motion capture device 20 to the wireless transmission module of the wearable device 10. The motion capture device 20 is provided with an ultrasonic signal receiver 22 connected to the processor. After the ultrasonic signal receiver 22 receives the ultrasound emitted by the wearable device 10, it is used, on the one hand, to calculate the distance between the motion capture device and the wearable device, and that distance is used to synchronously adjust the focal length of the infrared camera and the visible-light camera; on the other hand, the Doppler effect is used to obtain the trend of the wearable device's motion relative to the motion capture device, so as to predict the game user's direction of movement.
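As a sketch of the Doppler part (textbook acoustics for a moving source and a stationary receiver, not code from the patent), the radial velocity of the wearable device can be estimated from the emitted and received frequencies as follows; positive values indicate the game user is approaching the motion capture device.

```python
SPEED_OF_SOUND_M_PER_S = 343.0

def radial_velocity_m_per_s(emitted_hz: float, received_hz: float) -> float:
    """Speed of the wearable (ultrasonic source) along the line of sight to the
    stationary receiver; positive means the game user is approaching."""
    # Moving source, stationary observer: f_r = f_e * c / (c - v)  =>  v = c * (f_r - f_e) / f_r
    return SPEED_OF_SOUND_M_PER_S * (received_hz - emitted_hz) / received_hz

# Example: a 40 kHz pulse received as 40.2 kHz indicates the user is moving
# toward the capture device at roughly 1.7 m/s.
```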
This embodiment also provides a control method of the interactive game system for the above interactive game system. The motion capture device 20 acquires an infrared real-time image frame and a visible-light real-time image frame simultaneously with the infrared camera 23 and the visible-light camera 24, extracts the contour of the high-temperature region from the infrared frame, uses that contour to extract the corresponding user image region from the visible-light frame, and selects the region of the user image region that contains the identification pattern as the final identification region. Specifically, as shown in Fig. 3, at a given synchronized instant A is the real-time image frame acquired by the visible-light camera 24 and B is the real-time image frame acquired by the infrared camera 23. In the prior art, the boundary between the person and the surrounding environment in frame A is generally distinguished by machine learning; when two or more human bodies appear in A, it is difficult to tell their boundaries apart, which affects the judgment of the game user's actions. In Fig. 3 it can clearly be seen that, in frame B, the game user has an obvious boundary, or contour, against the surrounding environment, and the bright region enclosed by that boundary is the high-temperature region, namely the human body. When the processor 21 receives the data carried by A and B simultaneously, it can extract the person in A according to the boundary provided by B and thus separate the game user from the environment. If two or more infrared-recognizable objects appear in the field of view of A and B, two or more boundaries appear in B; the processor 21 then chooses, among the regions enclosed by the boundaries, the region carrying the identification pattern 11 (omitted in the figure) as the identification region, which solves the multi-body problem. In Fig. 3, C is the human body extracted by the processor 21 using the boundary in B, and D is the virtual character generated by the processor from C.
Specifically, when extracting the contour of the high-temperature region from B, it can be seen that the image shown by B is a kind of temperature gradient map. A temperature gradient threshold and a width in units of pixels can be set in the processor 21; when the temperature gradient of the transition zone of that width between the high-temperature region and the low-temperature region is greater than the threshold, the transition zone is extracted as part of the contour of the high-temperature region. For example, in this embodiment the width may be set to a length of 20 pixels; if the temperature change in B over that width is greater than a temperature gradient threshold of 10 °C, the midpoint of that width is taken as a contour point. When multiple contour points have been extracted from B and the spacing between contour points is less than 2 pixels, connecting all the contour points in sequence gives the contour of the high-temperature region.
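A literal reading of this rule, one image row at a time, might look like the sketch below. The scan direction, the handling of repeated detections within one transition zone, and the 2-pixel linking of contour points are simplified away, and the frame is assumed to be a 2-D NumPy array of temperatures in degrees Celsius; none of this is the patent's exact procedure.

```python
def contour_points_by_gradient(ir_frame_degc, width_px=20, grad_threshold_degc=10.0):
    """Scan each row with a window of width_px; where the temperature change
    across the window exceeds grad_threshold_degc, record the window midpoint
    as a contour point (row, col). Linking points closer than 2 px into a
    closed contour is left out of this sketch."""
    points = []
    rows, cols = ir_frame_degc.shape
    for r in range(rows):
        row = ir_frame_degc[r]
        for c in range(cols - width_px):
            if abs(row[c + width_px] - row[c]) > grad_threshold_degc:
                points.append((r, c + width_px // 2))
    return points
```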
This embodiment also provides, for the above interactive game system, a control method of the interactive game system that may include the following steps (a code sketch follows the step list):
S100: receiving, with the ultrasonic signal receiver, the signal emitted by the ultrasonic signal generator;
S200: calculating the distance between the wearable device and the motion capture device from the signal;
S300: adjusting the focal length of the infrared camera and the visible-light camera according to the distance.
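As a sketch of S300 under a thin-lens assumption (the patent does not state how focus is driven), the lens-to-sensor distance that focuses a subject at the measured distance is computed below and applied identically to both cameras, since they share the same focal length.

```python
def focus_image_distance_m(focal_length_m: float, subject_distance_m: float) -> float:
    """Thin-lens image distance that brings a subject at subject_distance_m into
    focus; applied identically to the infrared and visible-light cameras since
    they share the same focal length (sketch only, thin-lens assumption)."""
    if subject_distance_m <= focal_length_m:
        raise ValueError("subject closer than the focal length cannot be focused")
    return (focal_length_m * subject_distance_m) / (subject_distance_m - focal_length_m)

# Example: with an 8 mm lens and the user 2.0 m away, the sensor should sit
# about 8.032 mm behind the lens; the same value is applied to both cameras.
```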
Embodiment two
This embodiment provides an interactive game system with networking capability. It differs from Embodiment One in that it further includes a remote server running the game; the remote server is networked with the motion capture device and realizes the game functions by distributed computing, that is, some or all of the functions of the processor 21 in Embodiment One are performed by the remote server. Correspondingly, the control methods provided in Embodiment One may also be performed partly or wholly by the remote server.

Claims (9)

1. An interactive game system, comprising a wearable device and a motion capture device, characterized in that: the outer surface of the wearable device is circumferentially provided with an identification pattern, the identification pattern comprising a dark portion and a light portion; the motion capture device is provided with an infrared camera and a visible-light camera controlled by a processor, the infrared camera and the visible-light camera having the same focal length and the same lens axis; and the identification pattern is captured simultaneously by the infrared camera and the visible-light camera and recognized by the processor.
2. The interactive game system according to claim 1, characterized in that: the wearable device is provided with an ultrasonic signal generator controlled by a driver, and the motion capture device is provided with an ultrasonic signal receiver connected to the processor.
3. The interactive game system according to claim 1, characterized in that: the identification pattern is an Omron ring; the dark portion is the five black circles in the Omron ring and the light portion is the white background of the Omron ring; or the dark portion is the black background of the Omron ring and the light portion is the five white circles in the Omron ring.
4. A control method of an interactive game system, characterized in that it is used for the interactive game system according to claim 1 and comprises the steps of: acquiring an infrared real-time image frame and a visible-light real-time image frame simultaneously with the infrared camera and the visible-light camera; extracting the contour of a high-temperature region from the infrared real-time image frame, and using the contour to extract the corresponding user image region from the visible-light real-time image frame; and selecting the region of the user image region that contains the identification pattern as the final identification region.
5. The control method of an interactive game system according to claim 4, characterized in that, when the contour of a high-temperature region is extracted from the infrared real-time image frame, a temperature gradient threshold and a width are set.
6. The control method of an interactive game system according to claim 5, characterized in that, when the temperature gradient of the transition zone of said width between the high-temperature region and a low-temperature region is greater than the temperature gradient threshold, the transition zone is extracted as the contour of the high-temperature region.
7. The control method of an interactive game system according to claim 4, characterized in that, when the contour of a high-temperature region is extracted from the infrared real-time image frame, only the contour of the high-temperature region containing the identification pattern is extracted.
8. A control method of an interactive game system, characterized in that it is used for the interactive game system according to claim 2 and comprises the steps of: receiving, with the ultrasonic signal receiver, the signal emitted by the ultrasonic signal generator; calculating the distance between the wearable device and the motion capture device from the signal; and adjusting the focal length of the infrared camera and the visible-light camera according to the distance.
9. The control method of an interactive game system according to claim 7, characterized in that the transmission frequency of the ultrasonic signal generator is specified by the processor.
CN201811640775.XA 2018-12-29 2018-12-29 Interactive game system and control method Pending CN109621401A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811640775.XA CN109621401A (en) 2018-12-29 2018-12-29 Interactive game system and control method


Publications (1)

Publication Number Publication Date
CN109621401A (en) 2019-04-16

Family

ID=66054747

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811640775.XA Pending CN109621401A (en) 2018-12-29 2018-12-29 Interactive game system and control method

Country Status (1)

Country Link
CN (1) CN109621401A (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622591A (en) * 2012-01-12 2012-08-01 北京理工大学 3D (three-dimensional) human posture capturing and simulating system
CN105320249A (en) * 2014-06-13 2016-02-10 广州杰赛科技股份有限公司 Interactive method for reality enhancement
CN105987693A (en) * 2015-05-19 2016-10-05 北京蚁视科技有限公司 Visual positioning device and three-dimensional surveying and mapping system and method based on visual positioning device
CN105844240A (en) * 2016-03-23 2016-08-10 深圳云天励飞技术有限公司 Method and device for detecting human faces in infrared temperature measurement system
CN206584210U (en) * 2017-03-21 2017-10-24 湖南工程学院 A kind of many people's three-dimensional space position harvesters
CN107102749A (en) * 2017-04-23 2017-08-29 吉林大学 A kind of three-dimensional pen type localization method based on ultrasonic wave and inertial sensor
CN109069920A (en) * 2017-08-16 2018-12-21 广东虚拟现实科技有限公司 Hand-held controller, method for tracking and positioning and system
CN207732844U (en) * 2017-12-06 2018-08-14 深圳市灼华互娱科技有限公司 A kind of novel facial expression and motion capture system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Xiao Zelong et al., "Principles and System Design of Radio Short-Range Detection" (《无线电近程探测原理与系统设计》), 31 May 2018 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
    Address after: 510000 No. 16 Keyun Road, Tianhe District, Guangzhou City, Guangdong Province
    Applicant after: GUANGZHOU MINGCHAO INTERACTIVE TECHNOLOGY Co.,Ltd.
    Address before: 510000 No. 16 Keyun Road, Tianhe District, Guangzhou City, Guangdong Province: Room 402-407 (Office only)
    Applicant before: GUANGZHOU MINGCHAO INTERACTIVE TECHNOLOGY Co.,Ltd.
RJ01 Rejection of invention patent application after publication
    Application publication date: 20190416