CN109528217A - A kind of mood detection and method for early warning based on physiological vibrations analysis

A kind of mood detection and method for early warning based on physiological vibrations analysis

Info

Publication number
CN109528217A
Authority
CN
China
Prior art keywords
mood
physiological
amplitude
physiological vibrations
early warning
Prior art date
Legal status
Pending
Application number
CN201811200522.0A
Other languages
Chinese (zh)
Inventor
王春雷
毛鹏轩
尉迟学彪
赵晓伟
段志成
Current Assignee
Beijing Jinsi Technology Co Ltd
Original Assignee
Beijing Jinsi Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Jinsi Technology Co Ltd
Priority to CN201811200522.0A
Publication of CN109528217A
Legal status: Pending

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165 - Evaluating the state of mind, e.g. depression, anxiety
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1118 - Determining activity level
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/74 - Details of notification to user or communication with user or patient; user input means
    • A61B 5/746 - Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/15 - Biometric patterns based on physiological signals, e.g. heartbeat, blood flow

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Psychiatry (AREA)
  • Physiology (AREA)
  • Theoretical Computer Science (AREA)
  • Psychology (AREA)
  • Educational Technology (AREA)
  • Developmental Disabilities (AREA)
  • Human Computer Interaction (AREA)
  • Child & Adolescent Psychology (AREA)
  • General Physics & Mathematics (AREA)
  • Hospice & Palliative Care (AREA)
  • Multimedia (AREA)
  • Social Psychology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The invention discloses an emotion detection and early-warning method based on physiological vibration analysis. By analysing human physiological vibration signals, the method achieves non-contact detection of, and early warning about, a person's emotional state. It comprises six functional modules: video data stream input, physiological vibration signal extraction, physiological-vibration emotion calculation, computation model training and optimization, emotional state determination, and abnormal-emotion early-warning push.

Description

A kind of mood detection and method for early warning based on physiological vibrations analysis
Technical field
The present invention relates to the technical field of emotion recognition, and in particular to an emotion detection and early-warning method based on physiological vibration analysis.
Background technique
Emotion is a state that combines a person's feelings, thoughts and behaviour, and it plays an important role in interpersonal communication. It comprises a person's psychological reaction to external or internal stimuli, together with the physiological responses that accompany that reaction. In everyday work and life, the influence of emotion is felt everywhere. In medical care, if the emotional state of a patient can be known, in particular a patient with an expression disorder, nursing interventions can be tailored to the patient's mood and the quality of care improved. In product development, if the emotional state of users while they use a product can be identified, the user experience can be understood and the product's functions and design improved to better fit user needs. In all kinds of human-computer interaction systems, if the system can recognize a person's emotional state, the interaction becomes friendlier and more natural. Analysing and identifying emotion is therefore an important cross-disciplinary research topic spanning neuroscience, psychology, cognitive science, computer science and artificial intelligence.
The view that emotion recognition is universal can be traced back to Charles Robert Darwin's 1872 book The Expression of the Emotions in Man and Animals, in which he argued that human emotions and expressions are innate and universal, so that people can recognize the emotions and expressions of people from different cultures and ethnic groups. Since the 1960s many psychologists have reached the conclusion, through research, that emotion recognition is universal. Ekman and Izard proposed that humans share six basic facial expressions: happiness, anger, fear, sadness, disgust and surprise. Other psychologists, however, hold that the expression and recognition of emotion are learned and culturally specific, and that this cultural difference shows up both in the intensity of facial expressions and in the bodily aspects of emotional experience.
Just as different emotions are induced in different ways, emotion recognition methods also differ. Common methods fall into two broad classes: recognition based on non-physiological signals and recognition based on physiological signals. Methods based on non-physiological signals mainly include the recognition of facial expressions and of speech intonation. Facial expression recognition identifies different emotions from the correspondence between expressions and emotions: under a specific emotional state people produce specific facial muscle movements and expression patterns; for example, the corners of the mouth turn up and ring-shaped folds appear around the eyes when a person is cheerful, while the brows knit and the eyes widen when a person is angry. At present, facial expression recognition is mostly realized with image recognition methods. Speech intonation recognition exploits the differences in spoken language under different emotional states: a cheerful speaker tends to use a livelier intonation, whereas an irritated speaker sounds comparatively flat. The advantages of methods based on non-physiological signals are that they are simple to operate and require no special equipment. The drawback is that the reliability of the recognition cannot be guaranteed, because people can mask their true emotions by faking facial expressions and speech intonation, and this camouflage is often hard to detect. In addition, for disabled people with certain conditions, methods based on non-physiological signals are often difficult to apply.
Emotion recognition methods based on physiological signals mainly comprise recognition based on the autonomic nervous system and recognition based on the central nervous system. Methods based on the autonomic nervous system identify the corresponding emotional state by measuring physiological signals such as heart rate, skin resistance and respiration; methods based on the central nervous system identify the corresponding emotion by analysing the different signals the brain produces under different emotional states. Such physiological-signal methods are hard to fake and achieve higher recognition rates than methods based on non-physiological signals. In general, however, although they can obtain truthful data, they usually require the tested individual to wear corresponding signal-acquisition equipment, which makes data collection difficult and severely limits the practical application scenarios.
Summary of the invention
To overcome the above deficiencies of the prior art, the present invention provides an emotion detection and early-warning method based on physiological vibration analysis. The method can acquire truthful emotion data without requiring the tested individual to wear any signal-acquisition equipment, so it achieves non-contact detection of the tested individual's emotional state and is applicable to many practical scenarios.
The technical scheme adopted by the invention is as follows:
Because the micro-vibration of human muscles is linked to emotional reflexes and directly reflects a person's emotional state, the present invention identifies the emotional state of a person mainly by capturing and analysing video of the muscle vibration of the face and neck. The concrete functional flow is shown in Fig. 1.
The physiological vibration signal extraction module is responsible for analysing the front-end video data, locating the individual in it and extracting features, testing and analysing the vibration of each facial and neck muscle of the tested individual frame by frame, extracting related physiological parameters such as vibration amplitude and frequency, and outputting them to the physiological-vibration emotion computation module.
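As an illustration only, the following sketch shows how such a signal-extraction step could be realized; it assumes OpenCV for face detection and simple frame differencing for the per-point displacement amplitude, neither of which is specified by the patent, and all function and variable names are hypothetical.

```python
import cv2
import numpy as np

def extract_vibration_signal(video_path, max_frames=300):
    """Hypothetical sketch: per-point displacement amplitudes V_{x,y,i} of the
    face-and-neck region, obtained by simple frame differencing."""
    cap = cv2.VideoCapture(video_path)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    roi_box, prev_roi, displacements = None, None, []
    while len(displacements) < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if roi_box is None:
            faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            if len(faces) == 0:
                continue
            x, y, w, h = faces[0]
            # extend the box downward so the neck region is included
            roi_box = (x, y, w, min(int(1.4 * h), gray.shape[0] - y))
        x, y, w, h = roi_box
        roi = gray[y:y + h, x:x + w].astype(np.float32)
        if prev_roi is not None:
            # displacement amplitude of each point between consecutive frames
            displacements.append(np.abs(roi - prev_roi))
        prev_roi = roi
    cap.release()
    return np.stack(displacements) if displacements else None
```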
The physiological-vibration emotion computation module is responsible for analysing and calculating the entered physiological parameters such as muscle vibration amplitude and frequency, thereby detecting the emotional state of the tested individual and producing an emotional state value for that individual.
The computation model training and optimization module is responsible for iteratively optimizing the physiological-vibration emotion computation model by stochastic gradient descent, so as to improve the accuracy of the emotion determination results.
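A minimal sketch of what iterative optimization by stochastic gradient descent could look like, assuming the emotion computation model is reduced to a linear mapping from extracted amplitude/frequency features to an emotional-state value; the feature vectors, labels and squared-error loss are placeholders, not details taken from the patent.

```python
import numpy as np

def sgd_optimize(features, labels, lr=0.01, epochs=50):
    """Hypothetical sketch: fit weights mapping vibration features to emotion scores."""
    rng = np.random.default_rng(0)
    w = np.zeros(features.shape[1])
    b = 0.0
    for _ in range(epochs):
        for idx in rng.permutation(len(features)):
            x, y = features[idx], labels[idx]
            pred = x @ w + b          # predicted emotional-state value
            err = pred - y            # gradient of the squared error
            w -= lr * err * x
            b -= lr * err
    return w, b
```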
The emotional state determination module is responsible for judging the tested individual's emotional state value; if the value exceeds a preset threshold, the abnormal-emotion early-warning push module is triggered.
The abnormal-emotion early-warning push module is responsible for pushing the received warning information to the preset receiving terminal in real time.
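The determination and push steps amount to a threshold comparison followed by forwarding the warning to a preconfigured receiving terminal. The sketch below illustrates this; the threshold value and the HTTP transport to the receiving terminal are illustrative assumptions, not details given in the patent.

```python
import json
import urllib.request

ALERT_THRESHOLD = 0.75        # illustrative value; the patent only says "preset threshold"
RECEIVER_URL = "http://example.com/emotion-alert"   # hypothetical receiving terminal

def check_and_push(subject_id, emotion_value):
    """Trigger the abnormal-emotion warning push when the state value exceeds the threshold."""
    if emotion_value <= ALERT_THRESHOLD:
        return False
    payload = json.dumps({"subject": subject_id, "value": emotion_value}).encode()
    req = urllib.request.Request(RECEIVER_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=5)   # push the warning in real time
    return True
```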
Compared with the prior art, the beneficial effects of the present invention are: the invention does not require the tested individual to wear any signal-acquisition equipment, so it acquires the physiological signals without contact and thus detects the tested individual's emotional state without contact, avoiding discomfort to the people being tested; at the same time, because the detection is contactless, the invention is convenient to use, relatively low in cost, hard to notice, and applicable to many practical scenarios.
Detailed description of the invention
Fig. 1 is the functional flow chart of the method.
Specific embodiment
The following further describes the present invention with reference to the drawings.
The principle of emotion detection based on human physiological vibration is that human physiological vibration image data can be captured by a standard video camera system. Each pixel in the image reflects the frequency or amplitude of vibration at that point, and the combination of the frequency and amplitude parameters reflects the person's emotional and physiological characteristics, so subtle emotional changes can be distinguished by computing mathematical parameters and frequency-distribution histograms of the observed conditions.
If adjacent points of an individual all differ, an image of every point of the object will appear during detection and analysis. A false-colour scale displays the accumulated displacement amplitude at every point; the resulting image bears a faint resemblance to the actual colour image of the object, and the final result is obtained through the colour scale and the displayed frequency (Hz). In a concrete implementation, facial video images can be captured by a camera, and the camera and calculation software process the frequency and amplitude of pixel changes so as to measure the small movements of the face and neck, then compute emotional parameters such as the individual's stress, aggressiveness and anxiety from the amplitude and frequency, thereby measuring the individual's current emotional state.
The core of the technique is to obtain a set of visualizable variables through a video image analysis algorithm and use them to evaluate a person's psychological and emotional state. Using video image analysis, the physiological parameters of the person are matched according to human physiological vibration: over several seconds of video, the pixel changes between the tens of frames are analysed by mathematical algorithms to obtain the physiological signal parameters corresponding to the person's three-dimensional spatial state, and a dedicated algorithm confirms the person's emotional state at different times and places, achieving contactless emotion recognition based on video analysis. Because the amplitude and frequency produced by the vibration of a person's head change over time and at every point in space, the system stores, for each pixel of each frame per second, the ratio of motion produced by the overall movement, which reflects the relative motion of the stored images; this information can be reduced to millimetres or micrometres and expressed through the colour scale. When the face stays at the same position in the image, the relative amplitude is processed automatically by the system.
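As a concrete, assumed illustration of this description, the sketch below accumulates the per-point displacement amplitudes over n frames, estimates a dominant per-point frequency from the frame-to-frame differences, and renders the accumulated amplitude as a false-colour map; the FFT-based frequency estimate and the OpenCV colormap stand in for the colour scale the text mentions and are not taken from the patent.

```python
import cv2
import numpy as np

def amplitude_frequency_maps(displacements, fps=30.0):
    """displacements: array of shape (n_frames, H, W) of per-point displacement amplitudes."""
    # accumulated amplitude of the displacement at every point over all n frames
    amplitude = displacements.sum(axis=0)
    # dominant per-point frequency estimated from the frame-to-frame differences (FFT peak)
    spectrum = np.abs(np.fft.rfft(displacements, axis=0))
    freqs = np.fft.rfftfreq(displacements.shape[0], d=1.0 / fps)
    frequency = freqs[spectrum[1:].argmax(axis=0) + 1]   # skip the DC component
    # false-colour rendering of the accumulated amplitude
    norm = cv2.normalize(amplitude, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    false_colour = cv2.applyColorMap(norm, cv2.COLORMAP_JET)
    return amplitude, frequency, false_colour
```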
The following is an example of the implementation process of the physiological-vibration emotion computation module.
The amplitude of each point in the video image is determined by the following formula:

where x and y denote the coordinates of the point in the image, n denotes the total number of frames, and V_{x,y,i} denotes the displacement amplitude of the point in frame i.
The frequency of each point is determined by the following formula:

where Δ_i denotes the difference of the i-th point of the image between successive frames.
According to the reflex relationship between human physiological vibration and emotional state, the emotional tension of the tested individual is calculated by the following formula:

where the quantities are, respectively: the net amplitude of the thermal vibration image of the left part of the tested individual; the net amplitude of the thermal vibration image of the right part; the maximum of the two amplitudes; the maximum frequency of the vibration image of the left part; the maximum frequency of the thermal vibration image of the right part; the maximum of the two frequencies; and n, the maximum heat value occupied by the tested individual.
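The exact tension formula is not reproduced in the text above. Purely to illustrate how the named quantities (left and right net amplitudes, left and right maximum frequencies, their maxima, and the heat value n) might be assembled into a single score, one could write something like the following hypothetical placeholder; it is not the formula claimed in the patent.

```python
def emotional_tension(amp_left, amp_right, freq_left, freq_right, n_heat):
    """Hypothetical placeholder combining the quantities named in the description
    into a scalar tension score; NOT the patented formula."""
    amp_max = max(amp_left, amp_right)
    freq_max = max(freq_left, freq_right)
    # asymmetry between the two sides, weighted by the dominant frequency and heat value
    asymmetry = abs(amp_left - amp_right) / (amp_max + 1e-9)
    return asymmetry * freq_max * n_heat
```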

Claims (9)

1. An emotion detection and early-warning method based on physiological vibration analysis, which achieves non-contact detection of and early warning about a person's emotional state by analysing human physiological vibration signals, and which comprises six functional modules: video data stream input, physiological vibration signal extraction, physiological-vibration emotion calculation, computation model training and optimization, emotional state determination, and abnormal-emotion early-warning push.
2. The method according to claim 1, characterized in that: the human physiological vibration signal is obtained by video capture of the muscle vibration of a person's face and neck; the human physiological vibration signal is embodied in the two forms of frequency and amplitude.
3. The method according to claim 1, characterized in that: the physiological vibration signal extraction module is responsible for locating the individual in the video data stream and extracting features, testing and analysing frame by frame the facial and neck muscle vibration of each individual, extracting from it the amplitude and frequency parameters of the physiological vibration, and outputting them to the physiological-vibration emotion computation module.
4. The method according to claim 1, characterized in that: the physiological-vibration emotion computation module is responsible for performing quantitative calculation on the entered amplitude and frequency parameters of the physiological vibration, thereby obtaining the emotional state value of the tested individual.
5. The method according to claim 1, characterized in that: the computation model training and optimization module iteratively optimizes the physiological-vibration emotion computation model by stochastic gradient descent, so as to improve the accuracy of the emotion determination results.
6. The method according to claim 1, characterized in that: the emotional state determination module is responsible for judging the tested individual's emotional state value, and if the value exceeds a preset threshold the abnormal-emotion early-warning push module is triggered.
7. The method according to claim 3, characterized in that: the amplitude parameter is determined by a formula in which x and y denote the coordinates of the point in the image, n denotes the total number of frames, and V_{x,y,i} denotes the displacement amplitude of the point in frame i.
8. The method according to claim 3, characterized in that: the frequency parameter is determined by a formula in which Δ_i denotes the difference of the i-th point of the image between successive frames.
9. The method according to claim 4, characterized in that: the physiological-vibration emotion computation module calculates the tension of the tested individual by a formula whose quantities are, respectively: the net amplitude of the thermal vibration image of the left part of the tested individual; the net amplitude of the thermal vibration image of the right part; the maximum of the two amplitudes; the maximum frequency of the vibration image of the left part; the maximum frequency of the thermal vibration image of the right part; the maximum of the two frequencies; and n, the maximum heat value occupied by the tested individual.
CN201811200522.0A 2018-10-16 2018-10-16 A kind of mood detection and method for early warning based on physiological vibrations analysis Pending CN109528217A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811200522.0A CN109528217A (en) 2018-10-16 2018-10-16 A kind of mood detection and method for early warning based on physiological vibrations analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811200522.0A CN109528217A (en) 2018-10-16 2018-10-16 A kind of mood detection and method for early warning based on physiological vibrations analysis

Publications (1)

Publication Number Publication Date
CN109528217A true CN109528217A (en) 2019-03-29

Family

ID=65843738

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811200522.0A Pending CN109528217A (en) 2018-10-16 2018-10-16 A kind of mood detection and method for early warning based on physiological vibrations analysis

Country Status (1)

Country Link
CN (1) CN109528217A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100152600A1 (en) * 2008-04-03 2010-06-17 Kai Sensors, Inc. Non-contact physiologic motion sensors and methods for use
CN105651377A (en) * 2016-01-11 2016-06-08 衢州学院 Video data mining-based non-contact object vibration frequency measurement method
US20170367651A1 (en) * 2016-06-27 2017-12-28 Facense Ltd. Wearable respiration measurements system
CN106250877A (en) * 2016-08-19 2016-12-21 深圳市赛为智能股份有限公司 Near-infrared face identification method and device
CN106618608A (en) * 2016-09-29 2017-05-10 金湘范 Device and method for monitoring dangerous people based on video psychophysiological parameters
CN107169426A (en) * 2017-04-27 2017-09-15 广东工业大学 A kind of detection of crowd's abnormal feeling and localization method based on deep neural network
CN207367228U (en) * 2017-08-25 2018-05-15 太原康祺科技发展有限公司 Potential emotional intelligence analysis system device applied to careers guidance
CN207367229U (en) * 2017-08-25 2018-05-15 太原康祺科技发展有限公司 Applied to the potential emotional intelligence analysis system device detected before particular job post

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110781719A (en) * 2019-09-02 2020-02-11 中国航天员科研训练中心 Non-contact and contact cooperative mental state intelligent monitoring system
CN111524601A (en) * 2020-04-26 2020-08-11 华东师范大学 Psychological state testing and evaluating method based on vestibular nerve reflex
CN111631735A (en) * 2020-04-26 2020-09-08 华东师范大学 Abnormal emotion monitoring and early warning method based on video data vibration frequency
CN112150759A (en) * 2020-09-23 2020-12-29 北京安信智文科技有限公司 Real-time monitoring and early warning system and method based on video algorithm
CN112932485A (en) * 2021-01-03 2021-06-11 金纪高科智能科技(北京)有限公司 Non-contact type conversation confidence rate testing system and method
CN113647950A (en) * 2021-08-23 2021-11-16 北京图安世纪科技股份有限公司 Psychological emotion detection method and system
CN113837125A (en) * 2021-09-28 2021-12-24 杭州聚视鼎特科技有限公司 Cadre psychological mood trend digital management system

Similar Documents

Publication Publication Date Title
CN109528217A (en) A kind of mood detection and method for early warning based on physiological vibrations analysis
Gunes et al. From the lab to the real world: Affect recognition using multiple cues and modalities
Chen et al. Eyebrow emotional expression recognition using surface EMG signals
RU2708807C2 (en) Algorithm of integrated remote contactless multichannel analysis of psychoemotional and physiological state of object based on audio and video content
CN112766173B (en) Multi-mode emotion analysis method and system based on AI deep learning
Lu et al. Quantifying limb movements in epileptic seizures through color-based video analysis
Kortelainen et al. Multimodal emotion recognition by combining physiological signals and facial expressions: a preliminary study
Wei et al. Real-time facial expression recognition for affective computing based on Kinect
Kaiser et al. Automated coding of facial behavior in human-computer interactions with FACS
CN111920420A (en) Patient behavior multi-modal analysis and prediction system based on statistical learning
Abd Latif et al. Thermal imaging based affective state recognition
Chiarugi et al. Facial Signs and Psycho-physical Status Estimation for Well-being Assessment.
Dinculescu et al. Novel approach to face expression analysis in determining emotional valence and intensity with benefit for human space flight studies
Montenegro et al. Emotion understanding using multimodal information based on autobiographical memories for Alzheimer’s patients
Pantic et al. Facial gesture recognition in face image sequences: A study on facial gestures typical for speech articulation
Kandemir et al. Facial expression classification with Haar features, geometric features and cubic Bézier curves
KR101940673B1 (en) Evaluation Method of Empathy based on micro-movement and system adopting the method
Wang et al. MGEED: A Multimodal Genuine Emotion and Expression Detection Database
Zhang et al. Auxiliary diagnostic system for ADHD in children based on AI technology
Powar An approach for the extraction of thermal facial signatures for evaluating threat and challenge emotional states
Montenegro et al. Cognitive behaviour analysis based on facial information using depth sensors
Syamsuddin Profound correlation of human and NAO-robot interaction through facial expression controlled by EEG sensor
Mousavi et al. Emotion Recognition in Adaptive Virtual Reality Settings: Challenges and Opportunities
Rivera et al. Development of an automatic expression recognition system based on facial action coding system
Turabzadeh Automatic emotional state detection and analysis on embedded devices

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication (application publication date: 20190329)