CN107203266A - A VR-based data processing method - Google Patents

A VR-based data processing method

Info

Publication number
CN107203266A
CN107203266A
Authority
CN
China
Prior art keywords
equipment
target area
light emitting source
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710349766.4A
Other languages
Chinese (zh)
Inventor
向敏明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongguan Huarui Electronic Technology Co Ltd
Original Assignee
Dongguan Huarui Electronic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongguan Huarui Electronic Technology Co Ltd
Priority to CN201710349766.4A
Publication of CN107203266A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q30/0601 Electronic shopping [e-shopping]
    • G06Q30/0641 Shopping interfaces
    • G06Q30/0643 Graphical representation of items or shoppers

Abstract

The invention provides a VR-based data processing method. The method includes: a human-body information detection device acquires body contour data of a user; a VR device receives the body contour data sent by the human-body information detection device; the VR device receives a selection instruction from the user; the VR device determines the corresponding garment according to the selection instruction; the VR device adjusts the size data corresponding to the garment according to the body contour data to obtain a virtual image of the garment; the VR device captures a real-time image of the user; the VR device superimposes the virtual image and the real-time image to generate a target image; and the VR device displays the target image to the user.

Description

A VR-based data processing method
Technical field
The present invention relates to the field of communications, and in particular to a VR-based data processing method.
Background technology
With the development of VR technology, the prior art includes technical solutions that implement a virtual fitting-room function.
However, existing virtual fitting-room solutions often perform a simple layer replacement, in which a garment image replaces part of the human-body image. The resulting fitting display is stiff, the match between garment and body is poor, and the user experience suffers.
Summary of the invention
The invention provides a VR-based data processing method.
The VR-based data processing method provided by the present invention includes:
a human-body information detection device acquires body contour data of a user;
a VR device receives the body contour data sent by the human-body information detection device;
the VR device receives a selection instruction from the user;
the VR device determines the corresponding garment according to the selection instruction;
the VR device adjusts the size data corresponding to the garment according to the body contour data to obtain a virtual image of the garment;
the VR device captures a real-time image of the user;
the VR device superimposes the virtual image and the real-time image to generate a target image;
the VR device displays the target image to the user.
Optionally, acquiring the body contour data of the user by the human-body information detection device includes:
the human-body information detection device detects the user through multiple range sensors provided on the device;
the human-body information detection device generates the body contour data of the user from the distance data returned by the multiple range sensors.
Optionally, generating the body contour data of the user from the distance data returned by the multiple range sensors includes:
the human-body information detection device maps the multiple distance readings to coordinate points in a spatial coordinate system;
the human-body information detection device connects the coordinate points to form envelope data;
the human-body information detection device generates the body contour data of the user from the envelope data.
Optionally, superimposing the virtual image and the real-time image by the VR device to generate the target image includes:
the VR device divides the real-time image into regions;
the VR device sets a virtual light source;
the VR device adjusts the color value of each region according to the virtual light source to obtain a background image;
the VR device superimposes the virtual image on the background image to obtain the target image.
Optionally, adjusting the color value of each region by the VR device according to the virtual light source includes:
the VR device obtains the color value of the virtual light source;
the VR device determines a target area, the target area being the region of the real-time image illuminated by the virtual light source;
the VR device determines a target color value according to the color value of the virtual light source and the color value of the target area;
the VR device replaces the color value of the target area with the target color value.
Optionally, before the VR device determines the target color value according to the color value of the virtual light source and the color value of the target area, the method further includes:
the VR device determines a reflection coefficient of the target area;
the reflection coefficient is positively correlated with the reflectance of the material of the illuminated object corresponding to the target area.
Optionally, after the VR device determines the reflection coefficient of the target area, the method further includes:
the VR device calculates a tone weight of the target area according to the reflection coefficient, the scene distance between the virtual light source and the target area, and a scene transparency parameter;
determining the target color value according to the color value of the virtual light source and the color value of the target area then includes:
the VR device determines the target color value according to the color value of the virtual light source, the color value of the target area, and the tone weight of the target area.
Optionally, calculating the tone weight of the target area according to the reflection coefficient, the scene distance between the virtual light source and the target area, and the scene transparency parameter includes:
the VR device obtains the scene distance between the virtual light source and the target area and the scene transparency parameter;
the VR device determines an attenuation coefficient according to the scene distance and the scene transparency parameter, the attenuation coefficient being positively correlated with the scene distance and negatively correlated with the scene transparency parameter;
the VR device attenuates the light-intensity parameter of the virtual light source according to the attenuation coefficient to obtain an interim light-intensity parameter;
the VR device corrects the interim light-intensity parameter using the reflection coefficient to obtain a target light-intensity parameter;
the VR device calculates the tone weight of the target area according to the target light-intensity parameter.
Optionally, determining the target color value according to the color value of the virtual light source, the color value of the target area, and the tone weight of the target area includes:
the VR device enhances the color value of the virtual light source using the tone weight to obtain a first weighted value;
the VR device attenuates the color value of the target area using the tone weight to obtain a second weighted value;
the VR device takes the difference between the first weighted value and the second weighted value as the target color value.
In the present invention, a human-body information detection device acquires body contour data of a user; a VR device receives the body contour data sent by the detection device; the VR device receives a selection instruction from the user; the VR device determines the corresponding garment according to the selection instruction; the VR device adjusts the size data corresponding to the garment according to the body contour data to obtain a virtual image of the garment; the VR device captures a real-time image of the user; the VR device superimposes the virtual image and the real-time image to generate a target image; and the VR device displays the target image to the user. Because the VR device adjusts the garment's size data to the user's body contour data before superimposing, the virtual image matches the real-time image more closely, improving the user experience.
Brief description of the drawings
Fig. 1 is a schematic flowchart of the data processing method.
Embodiment
To help those skilled in the art better understand the technical solutions of the present invention, the invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
Referring to Fig. 1, the data processing method of the present invention includes the following steps:
101. The human-body information detection device acquires body contour data of the user.
The human-body information detection device detects the user through multiple range sensors provided on the device;
the human-body information detection device generates the body contour data of the user from the distance data returned by the multiple range sensors.
Specifically, generating the body contour data of the user from the distance data returned by the multiple range sensors includes:
the human-body information detection device maps the multiple distance readings to coordinate points in a spatial coordinate system;
the human-body information detection device connects the coordinate points to form envelope data;
the human-body information detection device generates the body contour data of the user from the envelope data.
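The contour-building steps above can be sketched in code. This is a minimal illustration under assumed conditions: the patent fixes no sensor geometry, so the ring layout and the `angle` and `height` fields of each reading are hypothetical.

```python
import math

def build_contour(readings):
    """Map range-sensor readings to coordinate points and connect
    them into an envelope approximating the user's body outline.

    `readings` is a list of (angle_rad, height_m, distance_m) tuples,
    a hypothetical layout in which the sensors ring the user.
    """
    points = []
    for angle, height, distance in readings:
        # Map each distance reading to a point in a spatial
        # coordinate system centered on the sensor ring.
        x = distance * math.cos(angle)
        y = distance * math.sin(angle)
        points.append((x, y, height))
    # Connect consecutive points (closing the loop) to form the
    # envelope from which the body contour data is generated.
    envelope = [(points[i], points[(i + 1) % len(points)])
                for i in range(len(points))]
    return points, envelope

# Eight sensors at 45-degree intervals, each reading 0.4 m at 1.0 m height.
readings = [(math.radians(a), 1.0, 0.4) for a in range(0, 360, 45)]
points, envelope = build_contour(readings)
```

In practice one such ring of points per sensor height would be stacked to form the full body contour; here a single ring stands in for the whole.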
102. The VR device receives the body contour data sent by the human-body information detection device.
103. The VR device receives a selection instruction from the user.
104. The VR device determines the corresponding garment according to the selection instruction.
105. The VR device adjusts the size data corresponding to the garment according to the body contour data to obtain a virtual image of the garment.
106. The VR device captures a real-time image of the user.
107. The VR device superimposes the virtual image and the real-time image to generate a target image.
Specifically, superimposing the virtual image and the real-time image by the VR device to generate the target image includes:
the VR device divides the real-time image into regions;
the VR device sets a virtual light source;
the VR device adjusts the color value of each region according to the virtual light source to obtain a background image;
the VR device superimposes the virtual image on the background image to obtain the target image.
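The superposition just described can be sketched as follows. This is a minimal illustration, not the patent's implementation: representing images as 2D lists of RGB tuples, marking transparent garment pixels with `None`, and the `adjust_region` callback are all assumptions of this sketch.

```python
def compose_target_image(realtime, garment, adjust_region):
    """Superimpose a garment's virtual image onto a relit real-time image.

    `realtime` and `garment` are equal-sized 2D lists of RGB tuples;
    `garment` uses None for transparent pixels. `adjust_region` maps a
    pixel's color value to its relit color, standing in for the
    virtual-light-source adjustment described above.
    """
    # 1) Adjust each region's color values to obtain the background image.
    background = [[adjust_region(px) for px in row] for row in realtime]
    # 2) Superimpose the garment image: garment pixels win where opaque,
    #    otherwise the relit background shows through.
    return [
        [g if g is not None else b for g, b in zip(grow, brow)]
        for grow, brow in zip(garment, background)
    ]

rt = [[(100, 100, 100)] * 2 for _ in range(2)]      # 2x2 grey real-time image
gm = [[None, (200, 0, 0)], [None, None]]            # one opaque garment pixel
out = compose_target_image(rt, gm, lambda px: tuple(min(255, c + 20) for c in px))
```

Here the adjustment brightens every pixel uniformly; the patent's per-region, light-source-driven adjustment would replace that lambda.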
Here, adjusting the color value of each region by the VR device according to the virtual light source includes:
the VR device obtains the color value of the virtual light source;
the VR device determines a target area, the target area being the region of the real-time image illuminated by the virtual light source;
the VR device determines a target color value according to the color value of the virtual light source and the color value of the target area;
the VR device replaces the color value of the target area with the target color value.
Before the VR device determines the target color value according to the color value of the virtual light source and the color value of the target area, the method further includes:
the VR device determines a reflection coefficient of the target area;
the reflection coefficient is positively correlated with the reflectance of the material of the illuminated object corresponding to the target area.
After the VR device determines the reflection coefficient of the target area, the method further includes:
the VR device calculates a tone weight of the target area according to the reflection coefficient, the scene distance between the virtual light source and the target area, and a scene transparency parameter.
Determining the target color value according to the color value of the virtual light source and the color value of the target area then includes:
the VR device determines the target color value according to the color value of the virtual light source, the color value of the target area, and the tone weight of the target area.
Here, calculating the tone weight of the target area according to the reflection coefficient, the scene distance between the virtual light source and the target area, and the scene transparency parameter includes:
the VR device obtains the scene distance between the virtual light source and the target area and the scene transparency parameter;
the VR device determines an attenuation coefficient according to the scene distance and the scene transparency parameter, the attenuation coefficient being positively correlated with the scene distance and negatively correlated with the scene transparency parameter;
the VR device attenuates the light-intensity parameter of the virtual light source according to the attenuation coefficient to obtain an interim light-intensity parameter;
the VR device corrects the interim light-intensity parameter using the reflection coefficient to obtain a target light-intensity parameter;
the VR device calculates the tone weight of the target area according to the target light-intensity parameter.
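The attenuation and correction steps above admit many concrete formulas; the text only constrains their monotonic behavior. The sketch below picks one simple form purely for illustration: `distance / transparency` for the attenuation coefficient (which grows with distance and shrinks with transparency, as required) and a `1 / (1 + attenuation)` intensity falloff. Neither formula is specified by the patent.

```python
def tone_weight(reflection, scene_distance, transparency, base_intensity=1.0):
    """Compute a tone weight for a target area, in the spirit of the
    steps above: attenuate the light source's intensity by scene
    distance and transparency, then correct by the reflection
    coefficient of the illuminated material.
    """
    if transparency <= 0:
        raise ValueError("scene transparency must be positive")
    # Attenuation coefficient: positively correlated with distance,
    # negatively correlated with transparency (one assumed form).
    attenuation = scene_distance / transparency
    # Attenuate the source's light-intensity parameter to an interim value.
    interim_intensity = base_intensity / (1.0 + attenuation)
    # Correct the interim intensity by the reflection coefficient.
    target_intensity = interim_intensity * reflection
    # Here the target intensity is used directly as the tone weight.
    return target_intensity

w_near = tone_weight(reflection=0.8, scene_distance=1.0, transparency=1.0)
w_far = tone_weight(reflection=0.8, scene_distance=4.0, transparency=1.0)
```

As expected, a target area farther from the virtual light source receives a smaller tone weight, so the light source influences its color less.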
Here, determining the target color value according to the color value of the virtual light source, the color value of the target area, and the tone weight of the target area includes:
the VR device enhances the color value of the virtual light source using the tone weight to obtain a first weighted value;
the VR device attenuates the color value of the target area using the tone weight to obtain a second weighted value;
the VR device takes the difference between the first weighted value and the second weighted value as the target color value.
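The final blend can be sketched as below. The text states only that the light color is "enhanced" and the region color "attenuated" by the tone weight, and that the target color is their difference; multiplying by (1 + w) and (1 - w), and clamping channels to [0, 255], are assumptions of this sketch.

```python
def target_color(light_rgb, region_rgb, w):
    """Blend the virtual light source's color with the target area's
    color using tone weight w, per the steps described above."""
    def clamp(v):
        return max(0, min(255, round(v)))
    first = [c * (1.0 + w) for c in light_rgb]    # enhanced light color
    second = [c * (1.0 - w) for c in region_rgb]  # attenuated region color
    # The target color value is the difference between the two weighted values.
    return tuple(clamp(f - s) for f, s in zip(first, second))

# A warm light over a mid-grey region, with a moderate tone weight.
result = target_color((255, 240, 200), (90, 90, 90), w=0.4)
```

With this choice a larger tone weight pushes the target area toward the light source's color, which matches the intent of the correlation rules above.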
108. The VR device displays the target image to the user.
The above embodiments are merely illustrative of the technical solutions of the present invention and are not limiting. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in those embodiments may still be modified, or some of their technical features replaced with equivalents, without departing from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (9)

1. A VR-based data processing method, characterized in that the method comprises:
acquiring, by a human-body information detection device, body contour data of a user;
receiving, by a VR device, the body contour data sent by the human-body information detection device;
receiving, by the VR device, a selection instruction from the user;
determining, by the VR device, a corresponding garment according to the selection instruction;
adjusting, by the VR device, size data corresponding to the garment according to the body contour data to obtain a virtual image of the garment;
capturing, by the VR device, a real-time image of the user;
superimposing, by the VR device, the virtual image and the real-time image to generate a target image; and
displaying, by the VR device, the target image to the user.
2. The method according to claim 1, characterized in that acquiring the body contour data of the user by the human-body information detection device comprises:
detecting, by the human-body information detection device, the user through multiple range sensors provided on the device; and
generating, by the human-body information detection device, the body contour data of the user from distance data returned by the multiple range sensors.
3. The method according to claim 2, characterized in that generating the body contour data of the user from the distance data returned by the multiple range sensors comprises:
mapping, by the human-body information detection device, the multiple distance readings to coordinate points in a spatial coordinate system;
connecting, by the human-body information detection device, the coordinate points to form envelope data; and
generating, by the human-body information detection device, the body contour data of the user from the envelope data.
4. The method according to claim 3, characterized in that superimposing the virtual image and the real-time image by the VR device to generate the target image comprises:
dividing, by the VR device, the real-time image into regions;
setting, by the VR device, a virtual light source;
adjusting, by the VR device, a color value of each region according to the virtual light source to obtain a background image; and
superimposing, by the VR device, the virtual image on the background image to obtain the target image.
5. The method according to claim 4, characterized in that adjusting the color value of each region by the VR device according to the virtual light source comprises:
obtaining, by the VR device, a color value of the virtual light source;
determining, by the VR device, a target area, the target area being the region of the real-time image illuminated by the virtual light source;
determining, by the VR device, a target color value according to the color value of the virtual light source and the color value of the target area; and
replacing, by the VR device, the color value of the target area with the target color value.
6. The method according to claim 5, characterized in that before the VR device determines the target color value according to the color value of the virtual light source and the color value of the target area, the method further comprises:
determining, by the VR device, a reflection coefficient of the target area,
wherein the reflection coefficient is positively correlated with the reflectance of the material of the illuminated object corresponding to the target area.
7. The method according to claim 6, characterized in that after the VR device determines the reflection coefficient of the target area, the method further comprises:
calculating, by the VR device, a tone weight of the target area according to the reflection coefficient, a scene distance between the virtual light source and the target area, and a scene transparency parameter; and
determining the target color value according to the color value of the virtual light source and the color value of the target area comprises:
determining, by the VR device, the target color value according to the color value of the virtual light source, the color value of the target area, and the tone weight of the target area.
8. The method according to claim 7, characterized in that calculating the tone weight of the target area according to the reflection coefficient, the scene distance between the virtual light source and the target area, and the scene transparency parameter comprises:
obtaining, by the VR device, the scene distance between the virtual light source and the target area and the scene transparency parameter;
determining, by the VR device, an attenuation coefficient according to the scene distance and the scene transparency parameter, the attenuation coefficient being positively correlated with the scene distance and negatively correlated with the scene transparency parameter;
attenuating, by the VR device, a light-intensity parameter of the virtual light source according to the attenuation coefficient to obtain an interim light-intensity parameter;
correcting, by the VR device, the interim light-intensity parameter using the reflection coefficient to obtain a target light-intensity parameter; and
calculating, by the VR device, the tone weight of the target area according to the target light-intensity parameter.
9. The method according to claim 8, characterized in that determining the target color value according to the color value of the virtual light source, the color value of the target area, and the tone weight of the target area comprises:
enhancing, by the VR device, the color value of the virtual light source using the tone weight to obtain a first weighted value;
attenuating, by the VR device, the color value of the target area using the tone weight to obtain a second weighted value; and
taking, by the VR device, the difference between the first weighted value and the second weighted value as the target color value.
CN201710349766.4A 2017-05-17 2017-05-17 A VR-based data processing method Pending CN107203266A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710349766.4A 2017-05-17 2017-05-17 A VR-based data processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710349766.4A 2017-05-17 2017-05-17 A VR-based data processing method

Publications (1)

Publication Number Publication Date
CN107203266A true CN107203266A (en) 2017-09-26

Family

ID=59905817

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710349766.4A A VR-based data processing method 2017-05-17 2017-05-17 Pending

Country Status (1)

Country Link
CN (1) CN107203266A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107749924A * 2017-10-27 2018-03-02 Nubia Technology Co Ltd Method for operating a VR device connected to multiple mobile terminals, and corresponding VR device
CN109255687A * 2018-09-27 2019-01-22 姜圣元 Virtual product try-on system and try-on method
CN109461049A * 2018-10-19 2019-03-12 刘景江 Fitting method and device using virtual reality technology
CN111583415A * 2020-05-08 2020-08-25 Vivo Mobile Communication Co Ltd Information processing method and device, and electronic device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103106604A * 2013-01-23 2013-05-15 Donghua University Three-dimensional (3D) virtual fitting method based on somatosensory technology
CN103218506A * 2011-11-09 2013-07-24 Sony Corp Information processing apparatus, display control method, and program
US20160110595A1 * 2014-10-17 2016-04-21 Qiaosong Wang Fast 3D model fitting and anthropometrics using synthetic data
CN105843386A * 2016-03-22 2016-08-10 Ningbo Yuanding Electronic Technology Co Ltd Virtual fitting system for shopping malls
CN106489166A * 2014-04-11 2017-03-08 Metail Ltd Garment size recommendation and fit analysis system and method
CN106534835A * 2016-11-30 2017-03-22 Zhuhai Meizu Technology Co Ltd Image processing method and device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103218506A * 2011-11-09 2013-07-24 Sony Corp Information processing apparatus, display control method, and program
CN103106604A * 2013-01-23 2013-05-15 Donghua University Three-dimensional (3D) virtual fitting method based on somatosensory technology
CN106489166A * 2014-04-11 2017-03-08 Metail Ltd Garment size recommendation and fit analysis system and method
US20160110595A1 * 2014-10-17 2016-04-21 Qiaosong Wang Fast 3D model fitting and anthropometrics using synthetic data
CN105843386A * 2016-03-22 2016-08-10 Ningbo Yuanding Electronic Technology Co Ltd Virtual fitting system for shopping malls
CN106534835A * 2016-11-30 2017-03-22 Zhuhai Meizu Technology Co Ltd Image processing method and device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107749924A * 2017-10-27 2018-03-02 Nubia Technology Co Ltd Method for operating a VR device connected to multiple mobile terminals, and corresponding VR device
CN109255687A * 2018-09-27 2019-01-22 姜圣元 Virtual product try-on system and try-on method
CN109461049A * 2018-10-19 2019-03-12 刘景江 Fitting method and device using virtual reality technology
CN111583415A * 2020-05-08 2020-08-25 Vivo Mobile Communication Co Ltd Information processing method and device, and electronic device
CN111583415B * 2020-05-08 2023-11-24 Vivo Mobile Communication Co Ltd Information processing method and device, and electronic device

Similar Documents

Publication Publication Date Title
CN107203266A A VR-based data processing method
CN104598915B Gesture recognition method and device
CN106534835B Image processing method and device
US9058661B2 Method for the real-time-capable, computer-assisted analysis of an image sequence containing a variable pose
US9135502B2 Method for the real-time-capable, computer-assisted analysis of an image sequence containing a variable pose
Clark et al. An interactive augmented reality coloring book
CN102812474B Head recognition method
CN104392223B Human posture recognition method in two-dimensional video images
CN104778712A Method and system for pasting an image onto a human face based on affine transformation
US9432594B2 User recognition apparatus and method
CN108510594A Virtual fitting method, apparatus and terminal device
CN108830225A Method, apparatus, device and medium for detecting a target object in a terahertz image
US10984610B2 Method for influencing virtual objects of augmented reality
CN110192241A Controlling the brightness of an emissive display
CN107665506A Method and system for realizing augmented reality
CN103824089A Cascade-regression-based 3D face pose recognition method
CN106201173A Projection-based interaction control method and system for user-interactive icons
CN104574387A Image processing method in an underwater visual SLAM system
CN107665505A Method and device for realizing augmented reality based on plane detection
CN107886558A RealSense-based facial expression animation driving method
CN104915976A Image processing method and system for simulating pencil sketch
CN107665508A Method and system for realizing augmented reality
KR20170002097A Method for providing lightweight animated avatar emoticons based on sensibility
KR20220031142A Color identification using infrared imaging
CN102609964A Portrait paper-cut generation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 1242802

Country of ref document: HK

WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20170926