CN105160296A - Pen-wielding motion capturing system and device and painting style simulating method - Google Patents

Pen-wielding motion capturing system and device and painting style simulating method

Info

Publication number
CN105160296A
Authority
CN
China
Prior art keywords
pen
brush
style
module
feedback
Prior art date
Legal status
Granted
Application number
CN201510436345.6A
Other languages
Chinese (zh)
Other versions
CN105160296B (en)
Inventor
谢宁
王盛
胡斌
Current Assignee
Zhejiang Chinsen Information Technology Co Ltd
Original Assignee
Zhejiang Chinsen Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Chinsen Information Technology Co Ltd filed Critical Zhejiang Chinsen Information Technology Co Ltd
Priority to CN201510436345.6A priority Critical patent/CN105160296B/en
Publication of CN105160296A publication Critical patent/CN105160296A/en
Application granted granted Critical
Publication of CN105160296B publication Critical patent/CN105160296B/en
Expired - Fee Related

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to the field of image processing, in particular to a pen-wielding motion capture system and device and a painting style simulation method. The system comprises a master control module, a video acquisition module, an image projection module, a data storage module, and a processing module that samples, extracts and synchronizes the pen-wielding posture data of a brush and forms an artistically stylized effect. Following the steps of the painting style simulation method, pictures input by a user are rendered with this artistic style. The method not only applies the learned artistic style to arbitrarily input pictures, but also achieves a three-dimensional, integrated and continuous analysis of style.

Description

Pen-wielding motion capture system and device and painting style simulation method
Technical Field
The invention relates to the field of image processing, in particular to a pen-wielding motion capture system, a pen-wielding motion capture device and a painting style simulation method.
Background
In recent years, the Chinese digital content industry has developed rapidly; in particular, the quality of digital entertainment products such as animation and games has improved quickly, and the industry has gradually gained the potential to catch up with foreign products. As the market matures, however, consumer demand for digital entertainment products keeps rising in both quantity and quality. The production of digital entertainment products (games and animation) therefore places increasingly strict requirements on content creation platforms and digital content processing technology, and accelerating the production of high-quality animation and game products has become a challenge for the industry. From the technical angle of digital hand drawing, the problem addressed here is how to improve the quality and yield of stylized effects in digital painting.
Computer graphics is widely used in digital content processing. In particular, non-photorealistic rendering techniques for hand-drawn artistic stylization are implemented in many digital creative software tools (e.g., Adobe Photoshop, Corel Painter, GIMP). Artists use this software to create characters for animation and games, large realistic scenes and striking visual effects. Among the many algorithms that imitate natural hand-drawn effects, stroke-based rendering is the most widely applied, and it is used extensively for animation, game characters, props and scene visualization.
The workflow in existing drawing software mainly reuses brush texture bitmaps: existing stroke texture maps are cut, deformed and spliced to generate new stroke effects. However, to create more realistic characters and large-scale scenes, designers have to make fine manual adjustments to each stroke. This excessively manual, cumbersome and time-consuming operation has become one of the bottlenecks restricting digital content creation. More importantly, in order to unify the artistic style across a series of digital entertainment works, a design team needs to follow a strict, standardized procedure for parameter adjustment. Such customized operations are treated as "stylized behavior" in the present method.
For a team of many people, achieving a uniform artistic style requires each member to carry out very tedious and time-consuming standardized manual operations. Manual setup for stylizing works thus becomes another major bottleneck restricting digital content creation. In short, because creative software lacks sufficient intelligent processing of digital content, over-reliance on manual operation and the difficulty of configuring artistic stylization hinder the production of cultural creative products, especially high-quality hand-drawn digital content (animation, games). Intelligent processing technology for digital content is urgently needed to break this deadlock. For the key technologies of intelligent digital content processing and media authoring software on a digital media content platform, learning and simulating painting-behavior style is a very challenging task.
Traditional artistic stylization algorithms focus on analysing the painting itself and expressing its style characteristics. Such methods analyse only static works, and their analysis of style is flat, localized and non-continuous.
Disclosure of Invention
The invention aims to provide a pen-wielding motion capture system, a pen-wielding motion capture device and a painting style simulation method, with the purpose of capturing a painter's pen-wielding behaviour with dedicated equipment, learning it with a computer program, and simulating the painting to assist creation.
The technical purpose of the invention is realized by the following technical scheme:
A pen-wielding motion capture system comprises: a master control module for generating control signals; a video acquisition module for acquiring image data; an image projection module for playing video signals; a data storage module for storing pen-wielding video data; and a processing module for sampling, extracting and synchronizing the wielding-posture data of the brush. The processing module is provided with a feature vector expression function and a feedback style behavior feature expression function, constructed from the image data acquired by the video acquisition module and used to generate atomic feature information samples, together with an intelligent assisted hand-drawing stylization optimization strategy.
The image data collected by the video acquisition module is played back as a video signal by the image projection module; the processing module then samples, extracts and synchronizes the wielding-posture data according to the feature vector expression function, the feedback style behavior feature expression function and the intelligent assisted hand-drawing stylization optimization strategy, forming a specific style that is recorded in the data storage module.
Preferably, the feature vector expression function is constructed from the atomic feature vectors; x is an M-dimensional binary marker indicating whether an atomic vector participates in the construction of a given feedback feature; Z and U represent prior knowledge: Z marks specific atomic features as participating in the feedback feature, and U represents the probability distribution of a pair of atomic features occurring together with respect to that feedback feature.
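The formula images for this function are not reproduced in the published text. A minimal sketch of one plausible form, in which every symbol below (the gated sum, the notation phi, f_i, M) is an assumption rather than the patent's own notation, is:

```latex
% Assumed construction: M atomic feature vectors f_1,...,f_M are gated by
% the binary marker x into the feedback feature representation.
\phi(F; x) \;=\; \sum_{i=1}^{M} x_i\, f_i, \qquad x_i \in \{0, 1\}
% Prior knowledge: Z_i = 1 marks atomic feature i as participating in the
% feedback feature; U_{ij} is the prior probability that atomic features
% i and j occur together with respect to that feedback feature.
```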
Preferably, the feedback style behavior feature expression function R is defined in terms of s, a and f, where s denotes the state at a given time, a denotes the action at that time, and f denotes the posture information of the brush at that time.
Preferably, the intelligent assisted hand-drawing stylization optimization strategy comprises a deterministic policy whose decision parameters are drawn from a prior distribution governed by a hyper-parameter. The expected feedback value can then be expressed as a function of this hyper-parameter, where the data D, a set of states and actions, serves as the training set; the hyper-parameter is optimized to maximize the expected feedback.
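The corresponding formulas are likewise missing from the published text. A hedged sketch of what such a formulation commonly looks like, with every concrete functional form below an assumption rather than the patent's own equation, is:

```latex
% Feedback for taking action a in state s with brush posture f
% (a linear form is assumed here):
R(s, a, f) \;=\; \theta^{\top} \phi(s, a, f)
% Deterministic policy with decision parameters w drawn from a prior
% governed by the hyper-parameter \omega:
a = \pi_{w}(s), \qquad w \sim p(w \mid \omega)
% Expected feedback over the training data D = \{(s_t, a_t)\}, and the
% hyper-parameter that maximizes it:
J(\omega) = \mathbb{E}_{w \sim p(w \mid \omega)}\!\Big[\textstyle\sum_{t} R(s_t, a_t, f_t)\Big],
\qquad \omega^{*} = \arg\max_{\omega} J(\omega)
```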
A method for simulating a painting style with the pen-wielding motion capture system comprises the following steps:
step 1, the image projection module plays video signals.
And 2, the video acquisition module acquires and records image data of the pen touch drawing process.
And 3, sampling, extracting and synchronizing the pen carrying posture of the brush in the pen touch drawing process by the processing module.
And 4, extracting and analyzing atomic features by the processing module according to the pen-moving posture acquired in the step 3.
And 5, constructing a characteristic vector expression function and a feedback style behavior characteristic expression function R and forming a specific style by the processing module according to the atomic characteristics in the step 4.
Step 6, introducing the step 5 to form a behavior characteristic expression function with a specific style, and adopting a deterministic decisionAnd on the basis of drawing prior knowledge, a random mode is introduced into drawing behavior parameters in the model to form an intelligent auxiliary hand-drawing stylized optimization strategy.
And 7, setting a brush intelligent agent, and drawing according to the pen-moving posture formed in the step to obtain the artistic stylized result of the image data in the step 1.
Preferably, the method for extracting atomic features in step 4 comprises the following steps:
1) segmenting the image data acquired in step 1 into RGB (Red, Green, Blue) component images;
2) segmenting the image data acquired in step 1 into HSV (Hue, Saturation, Value), grayscale and saturation images;
3) comparing the red component image from step 1) with the saturation component of the HSV model from step 2) to search for a target track point of the image; if the difference between the red and saturation components is small, the target track point is found;
4) comparing the difference between the red and saturation components with a preset threshold value; if it does not exceed the threshold, re-executing step 3); if it is larger than the threshold, searching for the atomic features of the stroke extracted with a Principal Component Analysis (PCA) algorithm;
5) establishing a subfolder and automatically storing the acquired image data;
6) saving the gray-scale image.
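A minimal code sketch of steps 1)-4) above, assuming OpenCV and NumPy and treating the "small red-versus-saturation difference" test as a per-pixel absolute difference against the preset threshold (the patent does not spell out the exact comparison):

```python
import cv2
import numpy as np

def find_track_points(frame_bgr, threshold=30):
    """Locate candidate target track points by comparing the red channel
    with the HSV saturation channel, following steps 1)-4) above."""
    # step 1): split into colour components (OpenCV stores frames as BGR)
    b, g, r = cv2.split(frame_bgr)
    # step 2): convert to HSV and keep the saturation and grayscale images
    sat = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)[:, :, 1]
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # step 3): a small red-vs-saturation difference marks a track point
    diff = cv2.absdiff(r, sat)
    track_mask = diff < threshold
    # step 4): pixels exceeding the threshold are passed on to the
    # PCA-based stroke feature extraction instead (not shown here)
    needs_pca = diff >= threshold
    return track_mask, needs_pca, gray  # gray kept for step 6) archiving
```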
Preferably, the brush agent in step 7 models the brush as a paper-plane brush-mark model; the model is given an autonomous policy so that, without manual control, the brush mark can wield the pen and lay down strokes on its own, completing autonomous displacement, direction changes and size changes.
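A minimal sketch of such a paper-plane brush-mark model; the state variables and the step interface are assumptions, since the patent names only the three autonomous actions (displacement, direction change, size change):

```python
import numpy as np

class BrushMark:
    """Toy brush-mark model on the paper plane, driven by an autonomous policy."""
    def __init__(self, x=0.0, y=0.0, direction=0.0, size=1.0):
        self.x, self.y = x, y        # position on the paper plane
        self.direction = direction   # heading of the pen tip (radians)
        self.size = size             # footprint radius of the mark

    def step(self, displacement, turn, scale):
        """Apply one autonomous action: move, change direction, change size."""
        self.direction += turn
        self.x += displacement * np.cos(self.direction)
        self.y += displacement * np.sin(self.direction)
        self.size = max(0.1, self.size * scale)
        return (self.x, self.y, self.direction, self.size)

# Example: a fixed action sequence stands in for the learned policy.
mark = BrushMark()
for _ in range(5):
    mark.step(displacement=1.0, turn=0.1, scale=1.02)
```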
Specifically, the feature vectors of the atomic feature information samples extracted in step 4 form the training set data D; a specific style is formed according to step 5, and the parameters are learned and assigned using the training set D; the brush agent's policy then undergoes self-optimizing learning against the feedback style behavior feature expression function R, so that picture-assisted hand-drawing stylization can be realized.
Preferably, the method in step 4) for finding the atomic features of the extracted brush strokes with a Principal Component Analysis (PCA) algorithm comprises: splitting the video into a sequence of frames and analysing how strokes are generated as the brush moves; for each frame, computing the principal-axis information of the moving brush with the PCA algorithm; and, from the principal-axis information, locating and computing the motion posture information of the controlled brush, including the movement speed, the direction of the pen tip, the stroke posture, the position relative to the environment (the target stroke shape) and regular elements that change over time.
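A hedged sketch of the per-frame PCA computation, assuming the brush pixels have already been segmented into a binary mask; the principal axis of the pixel cloud stands in for the pen-tip direction, and differencing the centroid across frames gives the movement speed:

```python
import numpy as np

def brush_pose_from_mask(mask, prev_centroid=None, fps=30.0):
    """Principal-axis (PCA) pose of the brush pixels in one video frame."""
    ys, xs = np.nonzero(mask)                    # coordinates of brush pixels
    pts = np.stack([xs, ys], axis=1).astype(float)
    centroid = pts.mean(axis=0)
    # PCA: eigen-decomposition of the covariance of the centred pixel cloud
    cov = np.cov((pts - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    major_axis = eigvecs[:, np.argmax(eigvals)]  # principal axis of the brush
    direction = np.arctan2(major_axis[1], major_axis[0])
    speed = None
    if prev_centroid is not None:                # movement speed between frames
        speed = np.linalg.norm(centroid - prev_centroid) * fps
    return {"centroid": centroid,
            "direction": direction,
            "elongation": float(eigvals.max() / max(eigvals.min(), 1e-9)),
            "speed": speed}
```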
A pen-wielding motion capture device for collecting image data for the painting style simulation method comprises a frame provided with a stroke-drawing working area, a collection board arranged in the working area for capturing pen-wielding postures, an image acquisition device for recording the painting action, a projection device for projecting images into the working area, and a master control computer for controlling the whole device and handling data storage and processing; the master control computer is provided with the data storage module and the processing module.
To facilitate the capture of pen-wielding postures, a transparent acrylic plate supports the stroke-drawing working area.
Traditional painting materials, such as a writing brush, ordinary rice paper and ink, are used as the drawing tools.
The image acquisition device comprises a camera arranged at the centre of the bottom surface of the frame.
The projection device, also arranged at the centre of the bottom surface of the frame, lets the user observe a real picture of the object to be drawn while painting.
In conclusion, the invention has the following beneficial effects:
1. Because the pen-wielding motion capture system and device record the artistic style and form an artistically stylized result from an ordinary photo, the painting style simulation method of the invention can render any input photo in that artistic style.
2. The invention effectively captures and analyses the habitual pen-wielding behaviour of the hand-drawing process; it is not limited to static works, but achieves a three-dimensional, integrated and continuous analysis of style.
Drawings
FIG. 1 is a schematic diagram of the operation flow of the painting style simulation method based on the pen-wielding motion capture system.
FIG. 2 is a schematic flow chart of the atomic feature extraction method of step 4 of the present invention.
FIG. 3 is a schematic diagram of a pen motion capture device of the present invention.
FIG. 4 is a schematic diagram of the present invention for extracting atomic features of brush strokes based on Principal Component Analysis (PCA) algorithm.
FIG. 5 is a schematic diagram of the architecture of the brush intelligent agent of the present invention.
In the figure, 1, frame; 2. collecting a plate; 3. an image acquisition device; 4. a projection device; 5. the main control computer.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
A pen-wielding motion capture system comprises: a master control module for generating control signals; a video acquisition module for acquiring image data; an image projection module for playing video signals; a data storage module for storing pen-wielding video data; and a processing module for sampling, extracting and synchronizing the wielding-posture data of the brush, the processing module being provided with a feature vector expression function constructed from the image data acquired by the video acquisition module, a feedback style behavior feature expression function R, and an intelligent assisted hand-drawing stylization optimization strategy P.
In particular, the feature vector expression function is constructed from the atomic feature vectors:
x is an M-dimensional binary marker indicating whether an atomic vector participates in the construction of a given feedback feature;
Z and U represent prior knowledge: Z marks specific atomic features as participating in the feedback feature, and U represents the probability distribution of a pair of atomic features occurring together with respect to that feedback feature.
The feedback style behavior feature expression function R is defined in terms of s, a and f, where s denotes the state at a given time, a denotes the action at that time, and f denotes the posture information of the brush at that time.
The deterministic policy's decision parameters are drawn from a prior distribution governed by a hyper-parameter.
The expected feedback value can be expressed as a function of this hyper-parameter, where the data D, a set of states and actions, serves as the training set.
The hyper-parameter is optimized to maximize the expected feedback.
As shown in FIG. 1, the painting style simulation method based on the pen-wielding motion capture system and device comprises the following steps:
step 1, the image projection module plays video signals.
And 2, the video acquisition module acquires image data of the pen-touch drawing process.
And 3, sampling, extracting and synchronizing the pen carrying posture of the brush in the pen touch drawing process by the processing module.
Step 4: the processing module extracts and analyses atomic features from the wielding postures acquired in step 3; as shown in FIG. 2, this comprises the following steps:
1) segmenting the image data acquired in step 1 into RGB (Red, Green, Blue) component images;
2) segmenting the image data acquired in step 1 into HSV (Hue, Saturation, Value), grayscale and saturation images;
3) comparing the red component image from step 1) with the saturation component of the HSV model from step 2) to search for a target track point of the image; if the difference between the red and saturation components is small, the target track point is found;
4) comparing the difference between the red and saturation components with a preset threshold value; if it does not exceed the threshold, re-executing step 3); if it is larger than the threshold, searching for the atomic features of the brush extracted with a Principal Component Analysis (PCA) algorithm: the video is split into a sequence of frames, the stroke-generation process during brush movement is analysed, the principal-axis information of the moving brush is computed with PCA for each frame, and from the principal-axis information the motion posture of the controlled brush is located and computed, including the movement speed, the direction of the pen tip, the stroke posture, the position relative to the environment (the target stroke shape) and regular elements that change over time;
5) establishing a subfolder and automatically storing the acquired image data;
6) saving the gray-scale image.
Step 5: from the atomic features of step 4, the processing module constructs the feature vector expression function and the feedback style behavior feature expression function R and forms a specific style.
Step 6: the behavior feature expression function with the specific style formed in step 5 is introduced; a deterministic policy is adopted and, on the basis of painting prior knowledge, a stochastic component is introduced into the painting behavior parameters of the model, forming the intelligent assisted hand-drawing stylization optimization strategy.
Step 7: a brush agent is set up and paints according to the wielding postures formed in the preceding steps, yielding the artistically stylized result of the image data from step 1.
As shown in FIG. 3, the pen-wielding motion capture device comprises a frame 1 provided with a stroke-drawing working area, a collection board 2 arranged in the working area for capturing pen-wielding postures, an image acquisition device 3 for recording the painting action, a projection device 4 for projecting images into the working area, and a master control computer 5 for controlling the whole device and handling data storage and processing; the master control computer 5 is provided with the data storage module and the processing module.
In use, step 1: the projection device 4 projects image data onto the collection board 2.
Step 2: the image acquisition device 3 acquires and records image data of the stroke-drawing process.
Step 3: while the drawing tool is painting, the processing module of the master control computer 5 samples, extracts and synchronizes the wielding posture of the brush during the stroke-drawing process.
Step 4: the processing module of the master control computer 5 extracts and analyses atomic features from the wielding postures acquired in step 3, as shown in FIG. 4.
Step 5: from the atomic features of step 4, the processing module of the master control computer 5 constructs the feature vector expression function and the feedback style behavior feature expression function R and forms a specific style.
Step 6: the behavior feature expression function with the specific style formed in step 5 is introduced; a deterministic policy is adopted and, on the basis of painting prior knowledge, a stochastic component is introduced into the painting behavior parameters of the model, forming the intelligent assisted hand-drawing stylization optimization strategy.
Step 7: a brush agent is set up and paints according to the wielding postures formed in the preceding steps, yielding the artistically stylized result of the image data from step 1. Painting here means that the brush is modelled as a paper-plane brush-mark model; the model is given an autonomous policy so that, without manual control, the brush mark can wield the pen and lay down strokes on its own, completing autonomous displacement, direction changes and size changes.
Specifically, as shown in FIG. 5, the feature vectors of the atomic feature information samples extracted in step 4 form the training set data D; a specific style is then formed according to step 5, and the parameters are learned and assigned using the training set D; the brush agent's policy then undergoes self-optimizing learning against the feedback style behavior feature expression function R, achieving picture-assisted hand-drawing stylization.
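A minimal sketch of this self-optimizing loop, assuming a linear reward learned from the training set D and a simple random-search update of the policy hyper-parameter; the patent does not disclose the concrete optimizer, so this is illustrative only:

```python
import numpy as np

def optimise_policy(features_D, reward_weights, n_iters=200, sigma=0.1, seed=0):
    """Random-search a policy hyper-parameter omega to maximise the expected
    feedback R over the training-set features (illustrative stand-in, not the
    patent's actual algorithm)."""
    rng = np.random.default_rng(seed)
    dim = features_D.shape[1]
    omega = np.zeros(dim)

    def expected_feedback(w):
        # linear reward averaged over D, minus a quadratic penalty standing
        # in for the prior over the decision parameters
        return float(np.mean(features_D @ (reward_weights * w)) - 0.5 * np.dot(w, w))

    best = expected_feedback(omega)
    for _ in range(n_iters):
        candidate = omega + sigma * rng.standard_normal(dim)
        value = expected_feedback(candidate)
        if value > best:                         # keep improving candidates only
            omega, best = candidate, value
    return omega, best

# Example with synthetic data standing in for D and the reward weights:
D = np.random.default_rng(1).standard_normal((100, 8))
omega_star, J = optimise_policy(D, reward_weights=np.ones(8))
```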
When the user inputs a real photo, it can be stylized as described above by the brush agent.
The embodiment described above is intended only to explain the invention and does not limit it; after reading this specification, those skilled in the art may modify the embodiment as needed without inventive contribution, and all such modifications are protected by patent law within the scope of the claims of the invention.

Claims (10)

1. A pen-wielding motion capture system, comprising:
the main control module is used for generating a control signal;
the video acquisition module is used for acquiring image data;
the image projection module is used for playing the video signal;
the data storage module is used for storing pen-moving video data;
the processing module is used for sampling, extracting and synchronizing the pen-carrying posture data of the pen brush;
the method is characterized in that: the processing module is also provided with a feature vector expression function which is constructed for the image data acquired by the video acquisition module and used for generating an atom feature information sample and a feedback style behavior feature expression function which is used for analyzing and forming a specific style; and an intelligent auxiliary hand-drawing stylized optimization strategy is arranged.
2. The pen-wielding motion capture system of claim 1, wherein: the feature vector expression function is constructed from the atomic feature vectors;
x is an M-dimensional binary marker indicating whether an atomic vector participates in the construction of a given feedback feature;
Z and U represent prior knowledge: Z marks specific atomic features as participating in the feedback feature, and U represents the probability distribution of a pair of atomic features occurring together with respect to that feedback feature.
3. The pen-wielding motion capture system of claim 1 or 2, wherein: the feedback style behavior feature expression function R is defined in terms of s, a and f, where s denotes the state at a given time, a denotes the action at that time, and f denotes the posture information of the brush at that time.
4. The pen-wielding motion capture system of claim 1 or 2, wherein: the intelligent assisted hand-drawing stylization optimization strategy comprises a deterministic policy whose decision parameters are drawn from a prior distribution governed by a hyper-parameter;
the expected feedback value can be expressed as a function of this hyper-parameter, where the data D, a set of states and actions, serves as the training set;
the hyper-parameter is optimized to maximize the expected feedback.
5. The pen-wielding motion capture system of claim 3, wherein: the intelligent assisted hand-drawing stylization optimization strategy comprises a deterministic policy whose decision parameters are drawn from a prior distribution governed by a hyper-parameter;
the expected feedback value can be expressed as a function of this hyper-parameter, where the data D, a set of states and actions, serves as the training set;
the hyper-parameter is optimized to maximize the expected feedback.
6. A painting style simulation method is characterized by comprising the following steps:
step 1, playing a video signal by an image projection module;
step 2, the video acquisition module acquires and records image data of a pen-touch drawing process;
step 3, the processing module samples, extracts and synchronizes the pen carrying posture of the brush in the process of drawing the pen touch;
step 4, the processing module extracts and analyzes atomic features according to the pen-moving posture collected in the step 3;
step 5, the processing module constructs the feature vector expression function and the feedback style behavior feature expression function R from the atomic features of step 4 and forms a specific style;
step 6, the behavior feature expression function with the specific style formed in step 5 is introduced; a deterministic policy is adopted and, on the basis of painting prior knowledge, a stochastic component is introduced into the painting behavior parameters of the model, forming the intelligent assisted hand-drawing stylization optimization strategy;
and step 7, a brush agent is set up and paints according to the wielding postures formed in the preceding steps, yielding the artistically stylized result of the image data of step 1.
7. The painting style simulation method according to claim 6, wherein the extraction of atomic features in step 4 comprises the following steps:
1) segmenting the image data acquired in step 1 into RGB (Red, Green, Blue) component images;
2) segmenting the image data acquired in step 1 into HSV (Hue, Saturation, Value), grayscale and saturation images;
3) comparing the red component image from step 1) with the saturation component of the HSV model from step 2) to search for a target track point of the image; if the difference between the red and saturation components is small, the target track point is found;
4) comparing the difference between the red and saturation components with a preset threshold value; if it does not exceed the threshold, re-executing step 3); if it is larger than the threshold, searching for the atomic features of the stroke extracted with a Principal Component Analysis (PCA) algorithm;
5) establishing a subfolder and automatically storing the acquired image data;
6) saving the gray-scale image.
8. The painting style simulation method according to claim 7, wherein the search in step 4) for the atomic features of the extracted strokes with a Principal Component Analysis (PCA) algorithm comprises: splitting the video into a sequence of frames and analysing the stroke-generation process during brush movement; for each frame, computing the principal-axis information of the moving brush with the PCA algorithm; and, from the principal-axis information, locating and computing the motion posture information of the controlled brush, including the movement speed, the direction of the pen tip, the stroke posture, the position relative to the environment (the target stroke shape) and regular elements that change over time.
9. The painting style simulation method according to claim 6, 7 or 8, wherein the brush agent in step 7 models the brush as a paper-plane brush-mark model; the model is given an autonomous policy so that, without manual control, the brush mark can wield the pen and lay down strokes on its own, completing autonomous displacement, direction changes and size changes.
10. A pen-wielding motion capture device, characterized by comprising a frame (1) provided with a stroke-drawing working area, a collection board (2) arranged in the working area for capturing pen-wielding postures, an image acquisition device (3) for recording the painting action, a projection device (4) for projecting images into the working area, and a master control computer (5) for controlling the whole device and handling data storage and processing, the master control computer (5) being provided with a data storage module and a processing module.
CN201510436345.6A 2015-07-23 2015-07-23 Pen-wielding motion capture system and device and painting style simulation method Expired - Fee Related CN105160296B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510436345.6A CN105160296B (en) 2015-07-23 2015-07-23 Pen-wielding motion capture system and device and painting style simulation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510436345.6A CN105160296B (en) 2015-07-23 2015-07-23 Pen-wielding motion capture system and device and painting style simulation method

Publications (2)

Publication Number Publication Date
CN105160296A true CN105160296A (en) 2015-12-16
CN105160296B CN105160296B (en) 2019-06-14

Family

ID=54801148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510436345.6A Expired - Fee Related CN105160296B (en) 2015-07-23 2015-07-23 Pen-wielding motion capture system and device and painting style simulation method

Country Status (1)

Country Link
CN (1) CN105160296B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050169552A1 (en) * 1997-07-15 2005-08-04 Kia Silverbrook Method of automatic image processing
CN101794454A (en) * 2010-04-08 2010-08-04 西安交通大学 Oil painting stylizing method based on image
CN101976128A (en) * 2010-10-11 2011-02-16 庄永基 Digital calligraphy/painting real-time data acquisition simulation system and acquisition method thereof
CN103218074A (en) * 2013-03-06 2013-07-24 广东欧珀移动通信有限公司 Method for realizing touch stroke structure of intelligent terminal and intelligent terminal

Also Published As

Publication number Publication date
CN105160296B (en) 2019-06-14

Similar Documents

Publication Publication Date Title
US11532172B2 (en) Enhanced training of machine learning systems based on automatically generated realistic gameplay information
US9449253B2 (en) Learning painting styles for painterly rendering
Mohr et al. Retargeting video tutorials showing tools with surface contact to augmented reality
Spencer ZBrush character creation: advanced digital sculpting
US10071316B2 (en) Systems and methods for creating a playable video game from a three-dimensional model
US20190054378A1 (en) Systems and methods for creating a playable video game from a three-dimensional model
Kudoh et al. Painting robot with multi-fingered hands and stereo vision
Guo et al. Creature grammar for creative modeling of 3D monsters
Feng et al. Magictoon: A 2d-to-3d creative cartoon modeling system with mobile ar
Guo et al. ShadowPainter: Active learning enabled robotic painting through visual measurement and reproduction of the artistic creation process
CN105160296B (en) Pen-wielding motion capture system and device and painting style simulation method
Xie et al. Stroke-based stylization by learning sequential drawing examples
Jiang et al. Animation scene generation based on deep learning of CAD data
US12079908B2 (en) Generating artwork tutorials
Xu et al. AnimateZoo: Zero-shot Video Generation of Cross-Species Animation via Subject Alignment
Liu et al. Automatic Generation of Animation Special Effects Based on Computer Vision Algorithms
Zhou et al. The design of man-machine finger-guessing game based on the hand gesture of the IntelliSense
Shen et al. Animation Scene Object Recognition and Modeling Based on Computer Vision Technology
US12056510B2 (en) Generating artwork tutorials
Liu et al. Image Feature Extraction and Interactive Design of Cultural and Creative Products Based on Deep Learning
US11954943B2 (en) Method for generating synthetic data
Wang et al. Animation Design Based on Anatomically Constrained Neural Networks
Lin et al. Intelligent Assessment of Advertising Art Design Based on Reinforcement Learning and Computer Vision
Yan et al. Application of Computer Vision-based Chinese Painting Stroke Recognition and Simulation System
Choi et al. Extended spatial keyframing for complex character animation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
Granted publication date: 20190614
Termination date: 20210723