CN103035028B - Method and device for implementing an interactive application scene - Google Patents

Method and device for implementing an interactive application scene

Info

Publication number
CN103035028B
CN103035028B (application CN201210533188.7A)
Authority
CN
China
Prior art keywords
interactive application
area
see
target area
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201210533188.7A
Other languages
Chinese (zh)
Other versions
CN103035028A (en)
Inventor
谢桂冠
胡建华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vtron Technologies Ltd
Original Assignee
Vtron Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vtron Technologies Ltd filed Critical Vtron Technologies Ltd
Priority to CN201210533188.7A priority Critical patent/CN103035028B/en
Publication of CN103035028A publication Critical patent/CN103035028A/en
Application granted granted Critical
Publication of CN103035028B publication Critical patent/CN103035028B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the invention discloses a method and a device for implementing an interactive application scene, which reduce the drawing workload and improve the drawing efficiency when an interactive application scene is drawn. The method of the embodiment of the present invention comprises: obtaining user requirement information; performing application scene modeling according to the user requirement information; making an interactive application scene object corresponding to the modeling result; dividing the interactive application scene object into a target area and a see-through area; setting the object properties of the target area and the see-through area; and displaying the interactive application scene object. With the method and device of the embodiment of the present invention, the drawing workload is reduced and the drawing efficiency is improved when an interactive application scene is drawn.

Description

Method and device for implementing an interactive application scene
Technical field
The embodiments of the present invention relate to interactive software applications, and in particular to a method and a device for implementing an interactive application scene.
Background technology
With the rapid development of science and technology, new high-tech devices emerge constantly and their functions grow increasingly complex, so the interactive application scene, the window through which a user exchanges information with a device, becomes more and more important.
When an interactive application scene is drawn, its background color and its shape frequently have to be changed. In the prior art such changes are generally made globally: modifying the scene shape also requires reprocessing the background color of the affected sub-regions, and modifying the background color likewise requires reprocessing the shape contour of the affected sub-regions. A large amount of data therefore has to be processed, and the efficiency is relatively low.
Summary of the invention
The embodiments of the present invention provide a method and a device for implementing an interactive application scene, which reduce the drawing workload and improve the drawing efficiency when an interactive application scene is drawn.
The method for implementing an interactive application scene provided by the present embodiment comprises:
obtaining user requirement information;
performing application scene modeling according to the user requirement information;
making an interactive application scene object corresponding to the modeling result;
dividing the interactive application scene object into a target area and a see-through area;
setting the object properties of the target area and the see-through area;
displaying the interactive application scene object.
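For illustration only, a minimal C++ outline of this pipeline is sketched below. Every type and function name in it is a placeholder chosen for this sketch and does not appear in the patent; only the order of the six steps is meaningful.

```cpp
#include <utility>

struct UserRequirement {};   // user scene, interaction mode, scene pattern, function, applicable situation
struct SceneModel      {};
struct SceneObject     {};
struct Area            {};

UserRequirement GatherUserRequirement()                       { return {}; }  // step 1
SceneModel      BuildSceneModel(const UserRequirement&)       { return {}; }  // step 2
SceneObject     MakeSceneObject(const SceneModel&)            { return {}; }  // step 3
std::pair<Area, Area> DivideIntoAreas(const SceneObject&)     { return {}; }  // step 4
void SetAreaProperties(Area&, Area&, const UserRequirement&)  {}              // step 5
void DisplaySceneObject(const SceneObject&)                   {}              // step 6

void RealiseInteractiveScene()
{
    UserRequirement req   = GatherUserRequirement();
    SceneModel      model = BuildSceneModel(req);
    SceneObject     obj   = MakeSceneObject(model);
    std::pair<Area, Area> areas = DivideIntoAreas(obj);   // target area / see-through area
    SetAreaProperties(areas.first, areas.second, req);    // transparency, events, signaling
    DisplaySceneObject(obj);
}
```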
Optionally, dividing the interactive application scene object into a target area and a see-through area comprises:
obtaining the colored pixels of the target area according to the user requirement information and the modeling result;
merging all regions of the designated colored pixels as the target area;
merging all regions of non-designated colored pixels as the see-through area.
Optionally, dividing the interactive application scene object into a target area and a see-through area comprises:
designing a mask picture according to the user requirement information and the modeling result;
distinguishing the target area and the see-through area on the mask picture by different color blocks;
cutting the picture in which the interactive application scene object is located according to the color blocks corresponding to the target area and the see-through area, so as to obtain the target area and the see-through area.
Optionally, the user requirement information comprises at least one of a user scene, an interaction mode, an interaction scene pattern, a function and an applicable situation.
Optionally, the object properties comprise the transparency, events and signaling of the object.
Optionally, displaying the interactive application scene object comprises:
creating an interaction scene interface window in the application software;
displaying the interactive application scene object in the interaction scene interface window at a preset ratio.
The device for implementing an interactive application scene provided by the present embodiment comprises:
an acquiring unit, configured to obtain user requirement information;
a modeling unit, configured to perform application scene modeling according to the user requirement information;
a production unit, configured to make an interactive application scene object corresponding to the modeling result;
a division unit, configured to divide the interactive application scene object into a target area and a see-through area;
a setting unit, configured to set the object properties of the target area and the see-through area;
a display unit, configured to display the interactive application scene object.
Optionally, the division unit comprises:
an obtaining subunit, configured to obtain the colored pixels of the target area according to the user requirement information and the modeling result;
a first combination subunit, configured to merge all regions of the designated colored pixels as the target area;
a second combination subunit, configured to merge all regions of non-designated colored pixels as the see-through area.
Optionally, the division unit comprises:
a design subunit, configured to design a mask picture according to the user requirement information and the modeling result;
a distinguishing subunit, configured to distinguish the target area and the see-through area on the mask picture by different color blocks;
a cutting subunit, configured to cut the picture in which the interactive application scene object is located according to the color blocks corresponding to the target area and the see-through area, so as to obtain the target area and the see-through area.
Optionally, the display unit comprises:
a creating subunit, configured to create an interaction scene interface window in the application software;
a display subunit, configured to display the interactive application scene object in the interaction scene interface window at a preset ratio.
In the present embodiment, user requirement information is first obtained; application scene modeling is then performed according to the user requirement information; an interactive application scene object corresponding to the modeling result is made; the interactive application scene object is divided into a target area and a see-through area; the object properties of the target area and the see-through area are set; and finally the interactive application scene object is displayed. When the interactive application scene is modified, the target area and the see-through area can therefore be processed separately instead of processing the scene as a whole, so the modification workload is reduced and the efficiency of drawing the interactive application scene is improved.
Brief description of the drawings
Fig. 1 is a flowchart of a first embodiment of the method for implementing an interactive application scene according to an embodiment of the present invention;
Fig. 2 is a flowchart of a second embodiment of the method for implementing an interactive application scene according to an embodiment of the present invention;
Fig. 3 is a structural diagram of a first embodiment of the device for implementing an interactive application scene according to an embodiment of the present invention;
Fig. 4 is a structural diagram of a second embodiment of the device for implementing an interactive application scene according to an embodiment of the present invention.
Detailed description of the embodiments
The embodiments of the present invention provide a method and a device for implementing an interactive application scene, which reduce the workload and improve the efficiency when an interactive application scene is drawn.
Referring to Fig. 1, the first embodiment of the method for implementing an interactive application scene according to the embodiment of the present invention comprises the following steps:
101. Obtain user requirement information.
Before application scene modeling is performed, user requirement information can first be obtained. The user requirement information can comprise at least one of a user scene, an interaction mode, an interaction scene pattern, a function and an applicable situation.
102. Perform application scene modeling according to the user requirement information.
After the user requirement information is obtained, application scene modeling can be performed according to it; for example, a corresponding model can be drawn according to the user scene, and the model can also be modified according to the applicable situation.
103. Make an interactive application scene object corresponding to the modeling result.
After application scene modeling, an interactive application scene object corresponding to the modeling result can be made. Depending on the user requirement information and the actual interaction needs, the interactive application scene object can be a picture; the form of the picture is not limited here, and the concrete picture format can be determined according to the actual interaction needs.
104. Divide the interactive application scene object into a target area and a see-through area.
The target area is determined according to the user requirement information; it is not limited to one connected whole and can consist of several separate parts, while the see-through area is the part of the interactive application scene other than the target area. For example, for a round-table conference, an interactive application scene object containing a round table and several chairs can first be built. From the interaction demand of the round-table conference it can be seen that the round table and the chairs are the main body of the interactive application scene, so the round table and the chairs form the target area, and the part of the interactive application scene object other than the round table and the chairs forms the see-through area.
105. Set the object properties of the target area and the see-through area.
After the interactive application scene object is divided into a target area and a see-through area, the object properties of the two areas can be set according to the user requirement information. The object properties can comprise the transparency, events and signaling of the object.
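One possible way to realise the transparency property on Windows is sketched below. The patent does not prescribe a particular API, so the use of a layered window and SetLayeredWindowAttributes here is an assumption, as are the function and parameter names.

```cpp
#include <windows.h>

// Mark the interaction window as layered, key out the colour used for the
// see-through area and give the target area a uniform opacity.
void ApplySeeThroughTransparency(HWND hwnd, COLORREF seeThroughColour, BYTE targetAlpha)
{
    LONG_PTR exStyle = GetWindowLongPtr(hwnd, GWL_EXSTYLE);
    SetWindowLongPtr(hwnd, GWL_EXSTYLE, exStyle | WS_EX_LAYERED);

    // Pixels painted in seeThroughColour become fully transparent (and click-through);
    // the remaining pixels are shown with targetAlpha opacity.
    SetLayeredWindowAttributes(hwnd, seeThroughColour, targetAlpha,
                               LWA_COLORKEY | LWA_ALPHA);
}
```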
106. Display the interactive application scene object.
After the object properties of the target area and the see-through area are set, the interactive application scene object can be displayed in a predetermined interaction window.
In the present embodiment, user requirement information is first obtained; application scene modeling is then performed according to the user requirement information; an interactive application scene object corresponding to the modeling result is made; the interactive application scene object is divided into a target area and a see-through area; the object properties of the target area and the see-through area are set; and finally the interactive application scene object is displayed. When the interactive application scene is modified, the target area and the see-through area can therefore be processed separately instead of processing the scene as a whole, so the modification workload is reduced and the efficiency of drawing the interactive application scene is improved.
The first embodiment of the method for implementing an interactive application scene has been briefly described above; the second embodiment is now described in detail. Referring to Fig. 2, the second embodiment of the method for implementing an interactive application scene according to the embodiment of the present invention comprises the following steps:
201. Obtain user requirement information.
Before application scene modeling is performed, user requirement information can first be obtained. The user requirement information can comprise at least one of a user scene, an interaction mode, an interaction scene pattern, a function and an applicable situation.
202. Perform application scene modeling according to the user requirement information.
After the user requirement information is obtained, application scene modeling can be performed according to it; for example, a corresponding model can be drawn according to the user scene, and the model can also be modified according to the applicable situation.
203. Make an interactive application scene object corresponding to the modeling result.
After application scene modeling, an interactive application scene object corresponding to the modeling result can be made. Depending on the user requirement information and the actual interaction needs, the interactive application scene object can be a picture; the form of the picture is not limited here, and the concrete picture format can be determined according to the actual interaction needs.
204. Divide the interactive application scene object into a target area and a see-through area using the bitmap method or the mask method.
The target area is determined according to the user requirement information; it is not limited to one connected whole and can consist of several separate parts, while the see-through area is the part of the interactive application scene other than the target area. For example, for a round-table conference, an interactive application scene object containing a round table and several chairs can first be built. From the interaction demand of the round-table conference it can be seen that the round table and the chairs are the main body of the interactive application scene, so the round table and the chairs form the target area, and the part of the interactive application scene object other than the round table and the chairs forms the see-through area.
There are two concrete ways of dividing the interactive application scene object into a target area and a see-through area, the bitmap method and the mask method. The bitmap method is introduced first:
When the bitmap method is used to divide the interactive application scene object into a target area and a see-through area, the colored pixels of the target area are first obtained according to the user requirement information and the modeling result; for example, if the target area is drawn in black and blue, then black and blue are the colored pixels in question. All regions of the designated colored pixels are then merged to obtain the target area, which may consist of several scattered blocks; finally, all regions of non-designated colored pixels are merged to obtain the see-through area.
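A minimal Win32 sketch of the bitmap method is given below, assuming the scene picture has already been selected into a memory device context. It scans the picture for the designated colours and merges runs of such pixels into one region with CombineRgn; everything outside that region is the see-through area. The helper name and colour handling are illustrative, not part of the patent.

```cpp
#include <windows.h>
#include <vector>

// Build the target-area region by scanning the scene picture for the designated
// colours (e.g. the black and blue used to draw the round table and chairs) and
// merging horizontal runs of such pixels into one HRGN with CombineRgn.
HRGN BuildTargetRegion(HDC hdcScene, int width, int height,
                       const std::vector<COLORREF>& targetColours)
{
    HRGN target = CreateRectRgn(0, 0, 0, 0);                 // start with an empty region
    for (int y = 0; y < height; ++y) {
        int runStart = -1;
        for (int x = 0; x <= width; ++x) {
            COLORREF c = (x < width) ? GetPixel(hdcScene, x, y) : CLR_INVALID;
            bool isTarget = false;
            for (COLORREF t : targetColours)
                if (c == t) { isTarget = true; break; }
            if (isTarget && runStart < 0)
                runStart = x;                                // a run of target pixels begins
            if (!isTarget && runStart >= 0) {                // the run ends: merge it
                HRGN run = CreateRectRgn(runStart, y, x, y + 1);
                CombineRgn(target, target, run, RGN_OR);
                DeleteObject(run);
                runStart = -1;
            }
        }
    }
    // Everything inside the picture rectangle but outside this region is the see-through area.
    return target;
}
```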
The process of dividing the interactive application scene object into a target area and a see-through area with the bitmap method has been described above; the mask method of this step is introduced below:
When the mask method is used to divide the interactive application scene object into a target area and a see-through area, a mask picture is first designed according to the user requirement information and the modeling result. The target area and the see-through area are then distinguished on the mask picture by different color blocks. The mask picture can be obtained as follows: prepare a picture of the same size as the interactive application scene object, and mark the target area and the see-through area on it in different colors to obtain a mask picture that meets the requirement. Finally, the picture in which the interactive application scene object is located is cut according to the color blocks corresponding to the target area and the see-through area, so that the target area and the see-through area are obtained.
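A corresponding sketch of the mask method follows; here the region is read from the mask picture's colour blocks rather than from the scene picture itself. The helper name and the choice of one colour per block are assumptions.

```cpp
#include <windows.h>

// Build the target-area region from the mask picture: every pixel painted in the
// colour block assigned to the target area is merged into the region, so the mask,
// not the scene picture itself, decides the division.
HRGN RegionFromMask(HBITMAP hMask, COLORREF targetBlockColour)
{
    BITMAP bm = {};
    GetObject(hMask, sizeof(bm), &bm);

    HDC hdcMask = CreateCompatibleDC(nullptr);
    HGDIOBJ old = SelectObject(hdcMask, hMask);

    HRGN target = CreateRectRgn(0, 0, 0, 0);
    for (int y = 0; y < bm.bmHeight; ++y)
        for (int x = 0; x < bm.bmWidth; ++x)
            if (GetPixel(hdcMask, x, y) == targetBlockColour) {
                HRGN px = CreateRectRgn(x, y, x + 1, y + 1);
                CombineRgn(target, target, px, RGN_OR);
                DeleteObject(px);
            }

    SelectObject(hdcMask, old);
    DeleteDC(hdcMask);
    // The remaining colour block(s) of the mask correspond to the see-through area.
    return target;
}
```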
205. Set the object properties of the target area and the see-through area.
After the interactive application scene object is divided into a target area and a see-through area, the object properties of the two areas can be set according to the user requirement information. The object properties can comprise the transparency, events and signaling of the object.
206. Display the interactive application scene object.
After the object properties of the target area and the see-through area are set, an interaction scene interface window can first be created in the application software, and the interactive application scene object is then displayed in the interaction scene interface window at a preset ratio; the interactive application scene object can also be displayed directly on the desktop at a preset ratio.
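As an illustration of step 206, the following sketch draws the scene-object picture into the interaction scene interface window scaled by a preset ratio using StretchBlt; the function name and the ratio parameter are assumptions.

```cpp
#include <windows.h>

// Draw the scene-object picture into the interaction scene interface window,
// scaled by the preset ratio.
void DrawSceneObject(HWND hwndInteraction, HBITMAP hbmScene, double ratio)
{
    BITMAP bm = {};
    GetObject(hbmScene, sizeof(bm), &bm);

    HDC hdcWnd = GetDC(hwndInteraction);
    HDC hdcMem = CreateCompatibleDC(hdcWnd);
    HGDIOBJ old = SelectObject(hdcMem, hbmScene);

    SetStretchBltMode(hdcWnd, HALFTONE);                     // smoother scaling
    StretchBlt(hdcWnd, 0, 0,
               (int)(bm.bmWidth * ratio), (int)(bm.bmHeight * ratio),
               hdcMem, 0, 0, bm.bmWidth, bm.bmHeight, SRCCOPY);

    SelectObject(hdcMem, old);
    DeleteDC(hdcMem);
    ReleaseDC(hwndInteraction, hdcWnd);
}
```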
In its implementation the present invention relies on dedicated Windows functions. The main Windows functions and types used include:
The CreateRectRgn function, which creates a new rectangular region.
The CreatePolygonRgn function, which creates a region enclosed by a series of points.
The CombineRgn function, which combines two regions into a new region; by applying it repeatedly, the separately formed shapes can be merged one by one.
The SetWindowRgn function, which sets the display region of a window, that is, sets the window shape.
The ExtCreateRegion function, which is used to calculate the position at which the picture is to be clipped and finally forms the clipped shape.
The COLORREF type, which describes an RGB color and can be used to define the color of the target area.
The Windows functions and types listed above are the main functions and types used to implement the present invention. Based on the technical solution given by the method flowcharts, those skilled in the art can implement the details of the present invention, which are not described here owing to space limitations.
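By way of illustration, the sketch below ties the listed functions together: a target-area region, optionally extended with a polygonal part built by CreatePolygonRgn, is handed to SetWindowRgn so that the window shows only the target area. The surrounding function name and the example polygon are assumptions, not part of the patent.

```cpp
#include <windows.h>

// Hand the target-area region to SetWindowRgn so the window displays only the
// target area; the see-through area simply falls outside the window shape.
void ShapeInteractionWindow(HWND hwndInteraction, HRGN targetRegion)
{
    // Optionally add a further polygonal part (e.g. one chair) with CreatePolygonRgn.
    POINT chair[] = { {10, 10}, {60, 10}, {60, 60}, {10, 60} };
    HRGN extra = CreatePolygonRgn(chair, 4, WINDING);
    CombineRgn(targetRegion, targetRegion, extra, RGN_OR);
    DeleteObject(extra);

    // After this call the system owns the region, so targetRegion must not be deleted.
    SetWindowRgn(hwndInteraction, targetRegion, TRUE);
}
```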
In the present embodiment, user requirement information is first obtained; application scene modeling is then performed according to the user requirement information; an interactive application scene object corresponding to the modeling result is made; the interactive application scene object is divided into a target area and a see-through area by the bitmap method or the mask method; the object properties of the target area and the see-through area are set; and finally the interactive application scene object is displayed. When the interactive application scene is modified, the target area and the see-through area can therefore be processed separately instead of processing the scene as a whole, so the modification workload is reduced and the efficiency of drawing the interactive application scene is improved.
The second embodiment of the method for implementing an interactive application scene, and in particular the process of dividing the interactive application scene object into a target area and a see-through area, has been described in detail above. The first embodiment of the device for implementing an interactive application scene is introduced below. Referring to Fig. 3, the device embodiment of the present invention comprises:
an acquiring unit 301, configured to obtain user requirement information;
a modeling unit 302, configured to perform application scene modeling according to the user requirement information;
a production unit 303, configured to make an interactive application scene object corresponding to the modeling result;
a division unit 304, configured to divide the interactive application scene object into a target area and a see-through area;
a setting unit 305, configured to set the object properties of the target area and the see-through area;
a display unit 306, configured to display the interactive application scene object.
The division unit 304 comprises:
an obtaining subunit 3041, configured to obtain the colored pixels of the target area according to the user requirement information and the modeling result;
a first combination subunit 3042, configured to merge the regions of designated colored pixels as the target area;
a second combination subunit 3043, configured to merge the regions of non-designated colored pixels as the see-through area.
The display unit 306 comprises:
a creating subunit 3061, configured to create an interaction scene interface window in the application software;
a display subunit 3062, configured to display the interactive application scene object in the interaction scene interface window at a preset ratio.
Before application scene modeling is performed, the acquiring unit 301 can first obtain user requirement information, which can comprise at least one of a user scene, an interaction mode, an interaction scene pattern, a function and an applicable situation.
After the user requirement information is obtained, the modeling unit 302 can perform application scene modeling according to it; for example, a corresponding model can be drawn according to the user scene, and the model can also be modified according to the applicable situation.
After the modeling unit 302 performs application scene modeling, the production unit 303 can make an interactive application scene object corresponding to the modeling result. Depending on the user requirement information and the actual interaction needs, the interactive application scene object can be a picture; the form of the picture is not limited here, and the concrete picture format can be determined according to the actual interaction needs.
After the interactive application scene object has been made, the division unit 304 divides it into a target area and a see-through area.
The target area is determined according to the user requirement information; it is not limited to one connected whole and can consist of several separate parts, while the see-through area is the part of the interactive application scene other than the target area. For example, for a round-table conference, an interactive application scene object containing a round table and several chairs can first be built. From the interaction demand of the round-table conference it can be seen that the round table and the chairs are the main body of the interactive application scene, so the round table and the chairs form the target area, and the part of the interactive application scene object other than the round table and the chairs forms the see-through area.
The division unit 304 can use the bitmap method to divide the interactive application scene object into a target area and a see-through area. Referring to Fig. 3, the bitmap method in the present embodiment is introduced below:
When the bitmap method is used, the obtaining subunit 3041 first obtains the colored pixels of the target area according to the user requirement information and the modeling result; for example, if the target area is drawn in black and blue, then black and blue are the colored pixels in question. The first combination subunit 3042 then merges all regions of the designated colored pixels to obtain the target area, which may consist of several scattered blocks; the second combination subunit 3043 then merges all regions of non-designated colored pixels to obtain the see-through area.
After the division unit 304 divides the interactive application scene object into a target area and a see-through area, the setting unit 305 can set the object properties of the two areas according to the user requirement information. The object properties can comprise the transparency, events and signaling of the object.
After the setting unit 305 sets the object properties of the target area and the see-through area, the creating subunit 3061 can first create an interaction scene interface window in the application software, and the display subunit 3062 then displays the interactive application scene object in the interaction scene interface window at a preset ratio; the display unit 306 can also display the interactive application scene object directly on the desktop at a preset ratio.
In the present embodiment, the acquiring unit 301 first obtains user requirement information; the modeling unit 302 then performs application scene modeling according to the user requirement information; the production unit 303 makes an interactive application scene object corresponding to the modeling result; the division unit 304 divides the interactive application scene object into a target area and a see-through area by the bitmap method; the setting unit 305 sets the object properties of the target area and the see-through area; and finally the display unit 306 displays the interactive application scene object. When the interactive application scene is modified, the target area and the see-through area can therefore be processed separately instead of processing the scene as a whole, so the modification workload is reduced and the efficiency of drawing the interactive application scene is improved.
The first embodiment of the device for implementing an interactive application scene, and in particular the process of dividing the interactive application scene object into a target area and a see-through area with the bitmap method, has been described in detail above. The second embodiment of the device for implementing an interactive application scene is introduced below, with emphasis on the process of dividing the interactive application scene object into a target area and a see-through area with the mask method. Referring to Fig. 4, the device embodiment of the present invention comprises:
an acquiring unit 401, configured to obtain user requirement information;
a modeling unit 402, configured to perform application scene modeling according to the user requirement information;
a production unit 403, configured to make an interactive application scene object corresponding to the modeling result;
a division unit 404, configured to divide the interactive application scene object into a target area and a see-through area;
a setting unit 405, configured to set the object properties of the target area and the see-through area;
a display unit 406, configured to display the interactive application scene object.
The division unit 404 comprises:
a design subunit 4041, configured to design a mask picture according to the user requirement information and the modeling result;
a distinguishing subunit 4042, configured to distinguish the target area and the see-through area on the mask picture by different color blocks;
a cutting subunit 4043, configured to cut the picture in which the interactive application scene object is located according to the color blocks corresponding to the target area and the see-through area, so as to obtain the target area and the see-through area.
The display unit 406 comprises:
a creating subunit 4061, configured to create an interaction scene interface window in the application software;
a display subunit 4062, configured to display the interactive application scene object in the interaction scene interface window at a preset ratio.
Before application scene modeling is performed, the acquiring unit 401 can first obtain user requirement information, which can comprise at least one of a user scene, an interaction mode, an interaction scene pattern, a function and an applicable situation.
After the user requirement information is obtained, the modeling unit 402 can perform application scene modeling according to it; for example, a corresponding model can be drawn according to the user scene, and the model can also be modified according to the applicable situation.
After the modeling unit 402 performs application scene modeling, the production unit 403 can make an interactive application scene object corresponding to the modeling result. Depending on the user requirement information and the actual interaction needs, the interactive application scene object can be a picture; the form of the picture is not limited here, and the concrete picture format can be determined according to the actual interaction needs.
After the interactive application scene object has been made, the division unit 404 divides it into a target area and a see-through area.
The target area is determined according to the user requirement information; it is not limited to one connected whole and can consist of several separate parts, while the see-through area is the part of the interactive application scene other than the target area. For example, for a round-table conference, an interactive application scene object containing a round table and several chairs can first be built. From the interaction demand of the round-table conference it can be seen that the round table and the chairs are the main body of the interactive application scene, so the round table and the chairs form the target area, and the part of the interactive application scene object other than the round table and the chairs forms the see-through area.
The division unit 404 uses the mask method to divide the interactive application scene object into a target area and a see-through area. Referring to Fig. 4, the mask method in the present embodiment is introduced below:
When the mask method is used, the design subunit 4041 first designs a mask picture according to the user requirement information and the modeling result. The distinguishing subunit 4042 then distinguishes the target area and the see-through area on the mask picture by different color blocks. The mask picture can be obtained as follows: prepare a picture of the same size as the interactive application scene object, and mark the target area and the see-through area on it in different colors to obtain a mask picture that meets the requirement. Finally, the cutting subunit 4043 cuts the picture in which the interactive application scene object is located according to the color blocks corresponding to the target area and the see-through area, so that the target area and the see-through area are obtained.
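A possible sketch of the cutting step follows: the region derived from the mask picture is selected as a clip region so that only the target-area part of the scene picture is painted. The helper name and parameters are illustrative, not prescribed by the patent.

```cpp
#include <windows.h>

// Clip drawing to the mask-derived region, so only the target-area part of the
// scene picture is copied into the window; the rest remains see-through.
void CutSceneWithMaskRegion(HDC hdcWindow, HDC hdcScene,
                            int width, int height, HRGN targetRegion)
{
    int saved = SaveDC(hdcWindow);              // remember the current DC state
    SelectClipRgn(hdcWindow, targetRegion);     // restrict all drawing to the target area
    BitBlt(hdcWindow, 0, 0, width, height, hdcScene, 0, 0, SRCCOPY);
    RestoreDC(hdcWindow, saved);                // remove the clip region again
}
```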
After the division unit 404 divides the interactive application scene object into a target area and a see-through area, the setting unit 405 can set the object properties of the two areas according to the user requirement information. The object properties can comprise the transparency, events and signaling of the object.
After the setting unit 405 sets the object properties of the target area and the see-through area, the creating subunit 4061 can first create an interaction scene interface window in the application software, and the display subunit 4062 then displays the interactive application scene object in the interaction scene interface window at a preset ratio; the display unit 406 can also display the interactive application scene object directly on the desktop at a preset ratio.
In the present embodiment, the acquiring unit 401 first obtains user requirement information; the modeling unit 402 then performs application scene modeling according to the user requirement information; the production unit 403 makes an interactive application scene object corresponding to the modeling result; the division unit 404 divides the interactive application scene object into a target area and a see-through area by the mask method; the setting unit 405 sets the object properties of the target area and the see-through area; and finally the display unit 406 displays the interactive application scene object. When the interactive application scene is modified, the target area and the see-through area can therefore be processed separately instead of processing the scene as a whole, so the modification workload is reduced and the efficiency of drawing the interactive application scene is improved.
Those of ordinary skill in the art will appreciate that all or part of the steps of the methods in the above embodiments can be completed by instructing the relevant hardware through a program; the program can be stored in a computer-readable storage medium, and the storage medium mentioned above can be a read-only memory, a magnetic disk, an optical disc, or the like.
The method and device for implementing an interactive application scene provided by the present invention have been described in detail above. For those of ordinary skill in the art, both the specific embodiments and the application scope may vary according to the idea of the embodiments of the present invention. In summary, the content of this description should not be construed as limiting the present invention.

Claims (10)

1. A method for implementing an interactive application scene, characterized by comprising:
obtaining user requirement information, the user requirement information comprising at least one of a user scene, an interaction mode, an interaction scene pattern, a function and an applicable situation;
performing application scene modeling according to the user requirement information;
making an interactive application scene object corresponding to the modeling result;
dividing the interactive application scene object into a target area and a see-through area;
setting the object properties of the target area and the see-through area;
displaying the interactive application scene object.
2. The method for implementing an interactive application scene according to claim 1, characterized in that dividing the interactive application scene object into a target area and a see-through area comprises:
obtaining the colored pixels of the target area according to the user requirement information and the modeling result;
merging all regions of the designated colored pixels as the target area;
merging all regions of non-designated colored pixels as the see-through area.
3. The method for implementing an interactive application scene according to claim 1, characterized in that dividing the interactive application scene object into a target area and a see-through area comprises:
designing a mask picture according to the user requirement information and the modeling result;
distinguishing the target area and the see-through area on the mask picture by different color blocks;
cutting the picture in which the interactive application scene object is located according to the color blocks corresponding to the target area and the see-through area, so as to obtain the target area and the see-through area.
4. The method for implementing an interactive application scene according to any one of claims 1 to 3, characterized in that the user requirement information comprises at least one of a user scene, an interaction mode, an interaction scene pattern, a function and an applicable situation.
5. The method for implementing an interactive application scene according to any one of claims 1 to 3, characterized in that the object properties comprise the transparency, events and signaling of the object.
6. The method for implementing an interactive application scene according to any one of claims 1 to 3, characterized in that displaying the interactive application scene object comprises:
creating an interaction scene interface window in the application software;
displaying the interactive application scene object in the interaction scene interface window at a preset ratio.
7. A device for implementing an interactive application scene, characterized by comprising:
an acquiring unit, configured to obtain user requirement information, the user requirement information comprising at least one of a user scene, an interaction mode, an interaction scene pattern, a function and an applicable situation;
a modeling unit, configured to perform application scene modeling according to the user requirement information;
a production unit, configured to make an interactive application scene object corresponding to the modeling result;
a division unit, configured to divide the interactive application scene object into a target area and a see-through area;
a setting unit, configured to set the object properties of the target area and the see-through area;
a display unit, configured to display the interactive application scene object.
8. The device for implementing an interactive application scene according to claim 7, characterized in that the division unit comprises:
an obtaining subunit, configured to obtain the colored pixels of the target area according to the user requirement information and the modeling result;
a first combination subunit, configured to merge all regions of the designated colored pixels as the target area;
a second combination subunit, configured to merge all regions of non-designated colored pixels as the see-through area.
9. The device for implementing an interactive application scene according to claim 7, characterized in that the division unit comprises:
a design subunit, configured to design a mask picture according to the user requirement information and the modeling result;
a distinguishing subunit, configured to distinguish the target area and the see-through area on the mask picture by different color blocks;
a cutting subunit, configured to cut the picture in which the interactive application scene object is located according to the color blocks corresponding to the target area and the see-through area, so as to obtain the target area and the see-through area.
10. The device for implementing an interactive application scene according to claim 7, characterized in that the display unit comprises:
a creating subunit, configured to create an interaction scene interface window in the application software;
a display subunit, configured to display the interactive application scene object in the interaction scene interface window at a preset ratio.
CN201210533188.7A 2012-12-11 2012-12-11 Method and device for implementing an interactive application scene Expired - Fee Related CN103035028B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210533188.7A CN103035028B (en) 2012-12-11 2012-12-11 Method and device for implementing an interactive application scene


Publications (2)

Publication Number Publication Date
CN103035028A CN103035028A (en) 2013-04-10
CN103035028B true CN103035028B (en) 2015-10-07

Family

ID=48021893

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210533188.7A Expired - Fee Related CN103035028B (en) 2012-12-11 2012-12-11 Method and device for implementing an interactive application scene

Country Status (1)

Country Link
CN (1) CN103035028B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109102202A (en) * 2018-08-28 2018-12-28 职享科技(深圳)有限责任公司 Information interacting method and device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103544263B (en) * 2013-10-16 2017-05-10 广东欧珀移动通信有限公司 Rendering method and rendering device for mobile terminal

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101567017A (en) * 2008-12-31 2009-10-28 合肥工业大学 Urban-evacuation simulation method based on multi-resolution images
CN102012906A (en) * 2010-10-27 2011-04-13 南京聚社数字科技有限公司 Three-dimensional scene management platform based on SaaS architecture and editing and browsing method
CN102157016A (en) * 2011-04-26 2011-08-17 南通大学 IDL based method for three-dimensionally visualizing medical images
CN102289833A (en) * 2011-08-30 2011-12-21 北京瑞信在线系统技术有限公司 Method and device for automatically segmenting picture
CN102737394A (en) * 2012-06-20 2012-10-17 北京市网讯财通科技有限公司 Method for drawing irregular skin of windows system software

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE602007001796D1 (en) * 2006-03-10 2009-09-10 Hoffmann Marlit SEQUENCE OF SINGLE VIDEO IMAGES, DEVICE AND METHOD FOR PROVIDING A SCENE MODEL, SCENE MODEL, DEVICE AND METHOD FOR CREATING A MENU STRUCTURE AND COMPUTER PROGRAM

Also Published As

Publication number Publication date
CN103035028A (en) 2013-04-10

Similar Documents

Publication Publication Date Title
CN105512382B (en) The conversion of floor piecemeal and combined method and system based on BIM
CN101569193B (en) Method and system for video insertion
CN104484446A (en) Method and device for dynamically displaying statistical data
CN103631866B (en) Webpage display method and browser
CN111489429B (en) Image rendering control method, terminal equipment and storage medium
CN103324475A (en) Building information model (BIM) rendering optimization system and method based on IFC standard
CN106658139B (en) Focus control method and device
CN105118475A (en) Brightness or brightness and chroma adjusting method of LED display screen
CN104866567A (en) Method and apparatus for presenting business data
CN104765904B (en) Multi-specialized combination by BIM technology and transparent visual method and system
US11620039B2 (en) Performant configuration user interface
CN103473984B (en) Template-based dynamic map obtaining method in network environment
CN104699863A (en) Webpage data display system
CN103279347B (en) The synchronous method of a kind of general software product line domain model and application model
CN105808035A (en) Icon display method and apparatus
CN103035028B (en) Method and device for implementing an interactive application scene
CN105630378A (en) Double-touch screen-based three-dimensional virtual scene designing and assembling system and method
AU2014280985B2 (en) Image processing apparatus, image processing method, image processing system, and program
CN105760917A (en) Three-dimensional code coding method and system
CN105556570A (en) Generating screen data
CN103473800B (en) A kind of comprehensive dynamic dispatching method of threedimensional model
KR101359273B1 (en) Auto modeling conversion process program from 3-dimensional cad modeling to 2-dimensional cad drawings
CN105005484A (en) Event dispatching method of cross-platform game development tool
CN109104628B (en) Focus foreground generation method, storage medium, device and system of android television
CN102611906A (en) Method for displaying and editing stereoscopic video image-text label with adaptive depth

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20151007

Termination date: 20211211