CN109242963B - Three-dimensional scene simulation device and equipment - Google Patents

Three-dimensional scene simulation device and equipment

Info

Publication number
CN109242963B
Authority
CN
China
Prior art keywords
target object
scene
dimensional
dimensional model
data
Prior art date
Legal status
Active
Application number
CN201811150189.7A
Other languages
Chinese (zh)
Other versions
CN109242963A (en
Inventor
田浦延
Current Assignee
Shenzhen Fushi Technology Co Ltd
Original Assignee
Shenzhen Fushi Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Fushi Technology Co Ltd filed Critical Shenzhen Fushi Technology Co Ltd
Priority to CN201811150189.7A priority Critical patent/CN109242963B/en
Publication of CN109242963A publication Critical patent/CN109242963A/en
Application granted granted Critical
Publication of CN109242963B publication Critical patent/CN109242963B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses a three-dimensional scene simulation device and equipment. The three-dimensional scene simulation device is used for simulating the configuration condition of a target object in a scene and comprises an information acquisition module, a modeling module, a coupling module and an application module. The information acquisition module is used for acquiring three-dimensional data of the target object and three-dimensional data of the scene. The modeling module is used for respectively establishing a three-dimensional model of the target object and a three-dimensional model of the scene according to the three-dimensional data of the target object and the three-dimensional data of the scene. The coupling module is used for coupling a coordinate system of the three-dimensional model of the target object into a coordinate system of the three-dimensional model of the scene. The application module is used for configuring the three-dimensional model of the target object in the three-dimensional model of the scene to simulate the configuration condition of the target object in the scene.

Description

Three-dimensional scene simulation device and equipment
Technical Field
The application relates to the field of intelligent home, in particular to a three-dimensional scene simulation device and equipment.
Background
With rising standards of living, people often need to add new items to their homes, such as furniture or household robots. However, consumers are often unsure of the actual size of a product on the market, and must either measure it in person on site or compare listings repeatedly on the internet. Even after a product of seemingly suitable size is found and bought, it may turn out not to fit the existing home environment, or earlier measurement errors may make the purchase unsuitable and force a replacement, which is very troublesome.
Disclosure of Invention
The embodiment of the application provides a three-dimensional scene simulation device and equipment.
The three-dimensional scene simulation device is used for simulating the configuration condition of a target object in a scene and comprises an information acquisition module, a modeling module, a coupling module and an application module. The information acquisition module is used for acquiring three-dimensional data of the target object and three-dimensional data of the scene. The modeling module is used for respectively establishing a three-dimensional model of the target object and a three-dimensional model of the scene according to the three-dimensional data of the target object and the three-dimensional data of the scene. The coupling module is configured to couple a coordinate system of a three-dimensional model of the target object into a coordinate system of a three-dimensional model of the scene. The application module is used for configuring the three-dimensional model of the target object in the three-dimensional model of the scene so as to simulate the configuration condition of the target object in the scene.
The equipment of the embodiment of the application comprises the three-dimensional scene simulation device of the above embodiment. The equipment executes its corresponding functions according to the simulation results of the three-dimensional scene simulation device.
According to the three-dimensional scene simulation device and equipment, the coordinate system of the three-dimensional model of the target object is coupled into the coordinate system of the three-dimensional model of the scene. The configuration of the target object in the scene can therefore be simulated by arranging the three-dimensional model of the target object within the three-dimensional model of the scene. This approach is simple, convenient, clear and intuitive, and helps improve user experience.
Additional aspects and advantages of embodiments of the application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic block diagram of a three-dimensional scene simulation apparatus according to an embodiment of the present application;
FIG. 2 is a flow chart of a three-dimensional scene simulation method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a device according to an embodiment of the present application;
FIG. 4 is a schematic block diagram of a three-dimensional scene simulation apparatus according to a first embodiment of the present application;
FIG. 5 is a flow chart of a three-dimensional scene simulation method according to a first embodiment of the present application;
FIG. 6 is a flow chart of a three-dimensional scene simulation method according to a first embodiment of the present application;
FIG. 7 is a flow chart of a three-dimensional scene simulation method according to a first embodiment of the present application;
FIG. 8 is a flow chart of a three-dimensional scene simulation method according to a first embodiment of the present application;
FIG. 9 is a flow chart of a three-dimensional scene simulation method according to a first embodiment of the present application;
FIG. 10 is a schematic block diagram of a three-dimensional scene simulation apparatus according to a second embodiment of the present application;
FIG. 11 is a flow chart of a three-dimensional scene simulation method according to a second embodiment of the present application;
FIG. 12 is a flow chart of a three-dimensional scene simulation method according to a second embodiment of the present application;
FIG. 13 is a flow chart of a three-dimensional scene simulation method according to a second embodiment of the present application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein the same or similar reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for explaining the present application and are not to be construed as limiting the present application.
In the description of the present application, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more of the described features. In the description of the present application, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the description of the present application, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly; for example, a connection may be fixed, detachable, or integral; may be mechanical, electrical, or communicative; and may be direct, indirect through an intermediate medium, or may denote communication between the interiors of two elements or an interaction relationship between two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art according to the specific circumstances.
The following disclosure provides many different embodiments, or examples, for implementing different features of the application. In order to simplify the present disclosure, components and arrangements of specific examples are described below. They are, of course, merely examples and are not intended to limit the application. Furthermore, the present application may repeat reference numerals and/or letters in the various examples, which are for the purpose of brevity and clarity, and which do not themselves indicate the relationship between the various embodiments and/or arrangements discussed. In addition, the present application provides examples of various specific processes and materials, but one of ordinary skill in the art will recognize the application of other processes and/or the use of other materials.
It should be understood that the embodiments and/or methods described herein are exemplary in nature and should not be construed as limiting the scope of the application. The embodiments and methods described herein are only some of the numerous technical solutions covered by the technical ideas of the present application; accordingly, the steps of the described methods may be performed in the order indicated, in other orders, or simultaneously, or may be omitted in some cases, and such modifications remain within the scope covered by the claims of the present application.
Embodiments of the present application provide a three-dimensional scene simulation apparatus 10 and a three-dimensional scene simulation method.
Referring to fig. 1, a three-dimensional scene simulation apparatus 10 according to an embodiment of the present application is used for simulating a configuration of a target object in a scene. The three-dimensional scene simulation apparatus 10 includes an information acquisition module 12, a modeling module 14, a coupling module 16, and an application module 18.
The information acquisition module 12 is configured to acquire three-dimensional data of a scene to be simulated and three-dimensional data of a target object to be configured in the scene. The modeling module 14 is configured to establish a three-dimensional model of the scene and a three-dimensional model of the target object according to the three-dimensional data of the scene and the three-dimensional data of the target object, respectively. The coupling module 16 is configured to couple the coordinate system of the three-dimensional model of the target object into the coordinate system of the three-dimensional model of the scene. The application module 18 is configured to configure the three-dimensional model of the target object in the three-dimensional model of the scene to simulate the configuration of the target object in the scene. Such configuration includes, but is not limited to, moving and/or flipping the target object within the three-dimensional model of the scene, or simulating the interaction and avoidance between the target object and existing structures in the scene.
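By way of overview, the cooperation of these four modules can be sketched in Python as follows. This is an illustrative sketch only; the class and method names are assumptions, not the patent's implementation, and raw point clouds stand in for the three-dimensional models:

```python
# Illustrative sketch of the four-module pipeline; all names are assumptions.
import numpy as np

class SceneSimulator:
    def __init__(self):
        self.scene_points = None   # N x 3 scene point cloud
        self.object_points = None  # M x 3 target-object point cloud

    def acquire(self, scene_points, object_points):
        # Information acquisition module: take in 3D data of scene and object.
        self.scene_points = np.asarray(scene_points, dtype=float)
        self.object_points = np.asarray(object_points, dtype=float)

    def build_models(self):
        # Modeling module: a real system would mesh these point clouds;
        # here the raw points stand in for the two three-dimensional models.
        return self.scene_points, self.object_points

    def couple(self, R, t):
        # Coupling module: express the object's coordinates in the scene frame.
        return self.object_points @ np.asarray(R).T + np.asarray(t)

    def configure(self, R, t):
        # Application module: place the object model into the scene model;
        # interference checks and measurement would follow here.
        return self.couple(R, t)
```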
Referring to fig. 2, the three-dimensional scene simulation method according to the embodiment of the present application is used for simulating the configuration situation of a target object in a scene, and includes the following steps:
step S12: acquiring three-dimensional data of a target object and three-dimensional data of a scene;
step S14: respectively and correspondingly establishing a three-dimensional model of the target object and a three-dimensional model of the scene according to the three-dimensional data of the target object and the three-dimensional data of the scene;
step S16: coupling a coordinate system of the three-dimensional model of the target object into a coordinate system of the three-dimensional model of the scene; and
step S18: a three-dimensional model of the target object is configured in the three-dimensional model of the scene to simulate the configuration of the target object in the scene.
Referring to fig. 3, the present application further provides an apparatus 100, such as a mobile phone, a notebook computer, a tablet computer, a touch-control interactive screen, a door, a vehicle, a robot, an automatic numerical control machine, etc. The apparatus 100 includes at least one three-dimensional scene simulation device 10 of any of the embodiments described above. The apparatus executes its corresponding functions according to the simulation results of the three-dimensional scene simulation device 10, including but not limited to home design, robot control, simulated interactive games, etc.
According to the three-dimensional scene simulation device 10, the equipment 100 and the three-dimensional scene simulation method, the coordinate system of the three-dimensional model of the target object is coupled into the coordinate system of the three-dimensional model of the scene, so that the real configuration of the target object in the scene can be simulated according to how the three-dimensional model of the target object is arranged within the three-dimensional model of the scene. This is simple, convenient, clear and intuitive, and helps improve user experience.
The three-dimensional scene simulation apparatus 10 and the three-dimensional scene simulation method of the present application can be divided into two specific embodiments, and the two embodiments will be described below, respectively. It should be noted that modifications, or alternatives, made by those skilled in the art in light of the two embodiments of the application are also within the scope of the application.
Embodiment one:
referring to fig. 4, a three-dimensional scene simulation apparatus 10 according to a first embodiment of the present application is provided for simulating a configuration of a target object in a three-dimensional model of a current scene in real time. In this embodiment, the current scene is a home scene, and the target object is furniture to be placed in the home.
The three-dimensional scene simulation device 10 comprises an information acquisition module 12, a sensing module 13, a modeling module 14, a memory 15, a coupling module 16, an interaction module 17 and an application module 18, which are connected through a bus 11. The modules of the three-dimensional scene simulation device 10 exchange signals and data through the bus 11. The information acquisition module 12 is configured to acquire three-dimensional data of a scene to be simulated and three-dimensional data of a target object to be configured in the scene. In the first embodiment of the present application, the information acquisition module 12 is configured to acquire three-dimensional data of furniture and three-dimensional data of the home.
The information acquisition module 12 may acquire three-dimensional data of the target object from a merchant's server over a network. For example, the information acquisition module 12 may further comprise a search unit 122 and a download unit 124. The search unit 122 searches the merchant's website for a desired target object according to preset size data of the target object. The download unit 124 downloads the three-dimensional data of the selected target object and saves it in the local memory 15.
The information acquisition module 12 may also directly sense three-dimensional data of the scene or the target object through the sensing module 13. For example, the information acquisition module 12 may acquire the corresponding three-dimensional data by photographing the scene, or a target object in front of the camera, with a three-dimensional camera provided on a portable terminal such as a mobile phone.
The sensing module 13 comprises a three-dimensional camera. The sensing module 13 is configured to process the output data of the three-dimensional camera to obtain three-dimensional data of the target object and/or three-dimensional data of the scene, and transmit the three-dimensional data of the target object and/or the three-dimensional data of the scene to the information obtaining module 12. In this manner, the information acquisition module 12 is enabled to acquire three-dimensional data of a target object and/or three-dimensional data of a scene.
The three-dimensional camera is used for sensing three-dimensional data of a photographed object. The principle of the three-dimensional camera may be based on structured light, binocular vision, or Time of Flight (TOF), without limitation. It can be understood that, since the three-dimensional camera can sense three-dimensional data of objects in space, three-dimensional models of the target object and the scene can be reconstructed at the same scale from the sensed three-dimensional data, so that the real configuration of the target object in the scene can be simulated.
Specifically, the user, or a designated robot or other equipment, can hold the three-dimensional camera and sense complete three-dimensional data of the scene or of a target object from a plurality of preset angles. Alternatively, at a plurality of preset positions in the scene to be simulated, such as the corners of an indoor ceiling, a corresponding number of three-dimensional cameras may be mounted to sense complete three-dimensional data of the scene; the number of three-dimensional cameras is not particularly limited. The information acquisition module 12 performs denoising, stitching, matching, optimization and other processing on the three-dimensional data sensed by the three-dimensional camera, and stores the processed three-dimensional data in the memory 15 for later use.
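As an illustration of how such sensed output becomes three-dimensional data, the following sketch back-projects a single depth frame (as produced by a TOF-style three-dimensional camera) into a point cloud using a pinhole camera model. The function name and the availability of calibrated intrinsics fx, fy, cx, cy are assumptions for illustration:

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into an N x 3 point cloud
    using a pinhole camera model; fx, fy, cx, cy come from calibration."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel columns, rows
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop invalid zero-depth pixels

# Frames taken from several preset angles can then be transformed into a
# common frame and stacked (np.vstack) before denoising and matching.
```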
It will be appreciated that the sensing module 13 may also include a color camera, such that color information of the scene and the target object may be obtained by the color camera. The color three-dimensional model of the scene and the target object constructed by combining the color information acquired by the color camera and the three-dimensional data acquired by the three-dimensional camera can be more realistic.
The memory 15 may store three-dimensional data of the home and three-dimensional data of the furniture for the user, so that the modeling module 14 may read the required three-dimensional data from the memory 15 when building the three-dimensional model. The memory 15 may be a storage medium provided on the local terminal device, or may be a cloud memory on a network.
The modeling module 14 is configured to establish a three-dimensional model of the target object and a three-dimensional model of the scene according to the three-dimensional data of the target object and the three-dimensional data of the scene, respectively. The three-dimensional data includes, but is not limited to, point cloud data carrying three-dimensional coordinates, laser reflection intensity and color information. The modeling module 14 may perform preprocessing, segmentation, triangulation and mesh rendering on the point cloud data to complete the establishment of the three-dimensional model. For example, the point cloud data can be preprocessed by filtering and denoising, data reduction, data interpolation and the like. The point cloud data is then segmented, so that the whole point cloud is clustered into a plurality of point clouds, each corresponding to an independent object. The point cloud can then be triangulated using a convex hull or concave hull algorithm, which facilitates subsequent mesh rendering. After the spatial topological structure of the point cloud is obtained, textures are mapped onto the mesh, making the object more realistic.
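The denoising and triangulation steps named above can be sketched as follows, assuming NumPy and SciPy are available; the statistical denoising rule and the use of scipy.spatial.ConvexHull are illustrative choices, not the patent's prescribed implementation:

```python
import numpy as np
from scipy.spatial import ConvexHull, cKDTree

def remove_outliers(points, k=8, std_ratio=2.0):
    """Statistical denoising: drop points whose mean distance to their
    k nearest neighbours is far above the global average."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)  # nearest neighbour is the point itself
    mean_d = dists[:, 1:].mean(axis=1)
    keep = mean_d < mean_d.mean() + std_ratio * mean_d.std()
    return points[keep]

def triangulate(points):
    """Convex-hull triangulation, one of the options named above; returns
    triangle vertex indices into `points`, ready for mesh rendering."""
    return ConvexHull(points).simplices
```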
The coupling module 16 is configured to couple the coordinate system of the three-dimensional model of the target object into the coordinate system of the three-dimensional model of the scene through a coordinate transformation operation, so as to simulate the configuration of the target object in the three-dimensional model of the scene. Specifically, the coupling module 16 applies a preset coordinate transformation algorithm, such as a coordinate translation matrix and/or a coordinate rotation matrix, to convert the coordinate data of the three-dimensional model of the target object from its own reference coordinate system into coordinate data in the reference coordinate system of the scene.
It will be appreciated that, before the three-dimensional model of the target object is coupled into the three-dimensional model of the scene, its coordinate system is one established with the target object itself as the reference. The coordinate system of the three-dimensional model of the scene is established with preset points in the scene as references. Coupling the three-dimensional model of the target object into the three-dimensional model of the scene means converting the coordinate data of the three-dimensional model of the target object into coordinate data of the three-dimensional model of the scene, that is, describing and measuring the three-dimensional model of the target object using the coordinate system of the three-dimensional model of the scene.
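A minimal sketch of this coordinate coupling, assuming the object's pose in the scene is given by a rotation matrix R and a translation vector t (the function name is illustrative):

```python
import numpy as np

def couple_to_scene(object_points, R, t):
    """Convert points from the object's own reference frame into the scene's
    reference frame: p_scene = R @ p_object + t."""
    return np.asarray(object_points) @ np.asarray(R).T + np.asarray(t)

# Example: place an object 1.5 m along the scene's x-axis, rotated 90° about z.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
placed = couple_to_scene([[0.2, 0.0, 0.4]], R, t=[1.5, 0.0, 0.0])
```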
The user may issue an instruction via the interaction module 17 to control the coupling module 16 to couple the coordinate system of the three-dimensional model of the target object into the coordinate system of the three-dimensional model of the scene. The interaction module 17 includes, but is not limited to, a mouse, a keyboard, and a touch screen.
Specifically, in this embodiment, the user may first open the three-dimensional model of the home, and then drag the three-dimensional model of the furniture into the three-dimensional model of the home; or, the user can open the three-dimensional model of the furniture first and then drag the three-dimensional model of the home into the three-dimensional model of the furniture; alternatively, the user may select and open the three-dimensional model of furniture and the three-dimensional model of home at one time.
Of course, a user may couple three-dimensional models of multiple pieces of furniture to one three-dimensional model of a home at a time; may couple one three-dimensional model of furniture to three-dimensional models of multiple homes at a time; or may couple three-dimensional models of multiple pieces of furniture to three-dimensional models of multiple homes at a time. The number of furniture three-dimensional models and home three-dimensional models coupled to one another is not limited here.
The application module 18 is configured to configure a three-dimensional model of the target object in the three-dimensional model of the scene to simulate the configuration of the target object in the scene. In the first embodiment of the present application, the application module 18 is configured to configure a three-dimensional model of furniture in a three-dimensional model of home, so as to simulate the actual placement effect of the home, and provide a reference for the home design.
In the first embodiment of the present application, the application module 18 includes an interference unit 182, an interaction unit 184, and a measurement unit 186.
The interference unit 182 is configured to determine, according to preset spatial attributes of the three-dimensional data of the scene and the target object, the interference that occurs when the three-dimensional model of the target object contacts an existing structure in the three-dimensional model of the scene. The spatial attributes include a penetrable attribute and a non-penetrable attribute. When the interference unit 182 senses that two groups of spatial coordinates having the non-penetrable attribute are moved into contact with each other, it restricts them from entering each other's spatial regions, allowing movement only along the contact surface between them.
It can be understood that the spatial regions corresponding to solid objects and existing structures in the scene have the non-penetrable attribute, while the other spatial regions have the penetrable attribute. In this way, the configuration actions of the three-dimensional model of the target object in the three-dimensional model of the scene are kept consistent with reality: actions that are impossible for the target object in the real scene should not be realizable in the three-dimensional model of the scene either.
The spatial attributes of the three-dimensional data of the target object and of the scene may be preset by the user according to the actual situation, or may be assigned automatically after the modeling module 14 determines the spatial region occupied by each object from the sensed three-dimensional data, with the user then adjusting the spatial attributes according to the actual situation.
In one application scenario, after the three-dimensional model of a table is moved in a direction perpendicular to a wall of the three-dimensional model of the home until it contacts the wall, it cannot continue to move in that direction, because in the actual scene the table cannot pass through the wall. In this scenario, the two groups of spatial coordinates with the non-penetrable attribute are the three-dimensional model of the table and the wall of the three-dimensional model of the home, respectively.
In another application scenario, the spatial coordinates of the table are preset as the non-penetrable attribute, the spatial coordinates above the upper surface of the table are preset as the penetrable attribute, the spatial coordinates of the barrel of the pen container are preset as the non-penetrable attribute, and the spatial coordinates of the barrel cavity of the pen container are preset as the penetrable attribute, so that a pen can be placed in the barrel cavity, but the pen cannot penetrate the barrel, and the pen container can be moved along the upper surface of the table, but the pen container cannot penetrate the table. Note that "table" and "pen container" in the examples refer to three-dimensional models of furniture in a three-dimensional model of home, rather than table and pen container in a real-world scenario.
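A simplified sketch of such an interference check, using axis-aligned bounding boxes as a stand-in for the models' occupied space regions (a full system would test the meshes themselves and would allow sliding along the contact surface):

```python
import numpy as np

def aabb(points):
    """Axis-aligned bounding box approximating a model's space region."""
    pts = np.asarray(points)
    return pts.min(axis=0), pts.max(axis=0)

def interferes(box_a, box_b):
    """True when two non-penetrable space regions would overlap."""
    (amin, amax), (bmin, bmax) = box_a, box_b
    return bool(np.all(amax > bmin) and np.all(bmax > amin))

def try_move(object_points, delta, blocking_boxes):
    """Refuse a translation that would push a non-penetrable object into
    another non-penetrable region (table through wall, pen through barrel)."""
    moved = np.asarray(object_points) + np.asarray(delta)
    if any(interferes(aabb(moved), box) for box in blocking_boxes):
        return np.asarray(object_points)  # blocked at the contact surface
    return moved
```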
In addition to the penetrable and non-penetrable attributes, the spatial attributes may also include an elastic attribute. It will be appreciated that, for objects that may deform in a real scene, the penetrable and non-penetrable attributes are not sufficient to describe their behavior. A spatial region with the elastic attribute can deform to a preset degree according to its own elasticity when contacted by an object with the non-penetrable attribute, so that the non-penetrable object can enter the originally elastic space within a preset range.
In one scenario, the spatial coordinates of the three-dimensional model of a window covering may be preset with the elastic attribute, and the deformation amount may be set according to the deformation capability of the window covering; when the three-dimensional model of the window covering is pulled, it stretches within the deformation amount.
In another scenario, the spatial coordinates of the three-dimensional model of a sponge wall may be preset with the elastic attribute, and the deformation amount may be set according to the deformability of the sponge wall. When the three-dimensional model of the table is moved in a direction perpendicular to the surface of the three-dimensional model of the sponge wall until the two contact, the three-dimensional model of the sponge wall is dented within the deformation range; after the deformation limit is reached, the three-dimensional model of the table cannot be moved further in the original direction, and the three-dimensional model of the sponge wall maintains its deformation at that limit. In this way, the deformation of the sponge wall in the real scene under the pressure of the table is simulated.
The elastic attribute allows the simulated interactions between pieces of furniture, and between furniture and existing structures in the home, to be closer to reality in the three-dimensional model of the scene.
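The elastic attribute's behavior can be sketched as a clamped deformation, as below; the deformation limit corresponds to the preset deformation amount described above (names are illustrative):

```python
def elastic_contact(push_depth, max_deformation):
    """An elastic region yields up to its preset deformation amount, then
    behaves as non-penetrable (the sponge-wall scenario above).
    Returns the realized dent and whether further motion is blocked."""
    dent = min(push_depth, max_deformation)
    blocked = push_depth >= max_deformation
    return dent, blocked

# Pushing a table 3 cm into a sponge wall with a 2 cm deformation limit:
dent, blocked = elastic_contact(0.03, 0.02)   # dent == 0.02, blocked == True
```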
The interference unit 182 prompts the user when it judges that interference has occurred, for example by vibration, by voice broadcast, or by highlighting or flashing the display at the place where the interference occurs.
The interaction unit 184 is configured to receive a signal configuring a three-dimensional model of the target object, and change the position and morphology of the three-dimensional model of the target object in the three-dimensional model of the scene according to the signal.
In this manner, the user can change the position and morphology of the three-dimensional model of the target object in the three-dimensional model of the scene through the interaction unit 184. Specifically, a user may input configuration signals to the three-dimensional scene simulation apparatus 10 through the interaction module 17, the interaction module 17 including, but not limited to, a mouse, a keyboard, and a touch screen. For example, the user may drag or rotate the furniture three-dimensional model with a finger on the touch screen in the three-dimensional model of the home, and the interaction unit 184 changes the position and morphology of the furniture three-dimensional model in the three-dimensional model of the home according to the sensed finger movement.
The measurement unit 186 is configured to calculate relative position information, such as distance, angle and curvature, between a plurality of markers in the three-dimensional model of the scene and/or the three-dimensional model of the target object. The markers include, but are not limited to, coordinate points, lines, or patterns.
In this way, the user can configure the target object into an ideal state in the three-dimensional model of the scene, and then measure what position information of the target object or of existing structures in the scene would need to change to achieve that state, which is convenient for modifying and adjusting the furniture or the home. Similarly, the user may select measurement coordinate points in the three-dimensional scene simulation apparatus 10 through the interaction module 17, and after selecting them, click a completion button to cause the measurement unit 186 to start measuring.
For example, the user may select two measurement coordinate points at a time, and the interaction unit 184 sends their coordinate values to the measurement unit 186, which calculates the distance between the two points. Alternatively, the user may select a plurality of measurement coordinate points at a time; the interaction unit 184 sends their coordinate values to the measurement unit 186, which calculates the distance between each pair of points or the angle between the lines connecting them. The user can also set lines in the three-dimensional model of the target object and/or the scene through the interaction module 17 to measure position information such as included angles.
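The distance and angle calculations performed by the measurement unit 186 can be sketched as follows (function names are illustrative):

```python
import numpy as np

def distance(p1, p2):
    """Distance between two measurement coordinate points."""
    return float(np.linalg.norm(np.asarray(p2, float) - np.asarray(p1, float)))

def angle_deg(vertex, p1, p2):
    """Angle in degrees at `vertex` between the lines vertex-p1 and vertex-p2."""
    v1 = np.asarray(p1, float) - np.asarray(vertex, float)
    v2 = np.asarray(p2, float) - np.asarray(vertex, float)
    cos_a = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))
```

Because the models are reconstructed at the same scale as the sensed objects, these values correspond to real-world dimensions.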
The measurement unit 186 may display the measured position information in the associated three-dimensional model for the user's reference.
Note that, since the three-dimensional data is obtained from the real object and the real scene, the position information of the relevant markers in the three-dimensional model of the object and/or the scene measured by the measurement unit 186 reflects the real dimensions of the object and/or the scene.
After seeing the measured distance value, the user can input adjustment data to the three-dimensional scene simulation device 10 to observe the placement condition of the three-dimensional model of the adjusted furniture in the three-dimensional model of the home. The interaction unit 184 is configured to obtain adjustment data of the three-dimensional model of the target object and adjust the three-dimensional model of the target object in the three-dimensional model of the scene according to the adjustment data.
In this way, the user can decide whether to readjust or to save the adjustment data according to how the adjusted three-dimensional model of the furniture sits in the three-dimensional model of the home. If the user feels the adjusted furniture model still does not fit well in the home model, it can be adjusted further in a similar manner until it fits. When the user is satisfied with the placement, the adjustment data can be saved to the memory 15 through a save button, and can be sent to the furniture seller with one tap, so that the seller adjusts the size of the furniture according to the adjustment data. Of course, if the user makes multiple adjustments to the three-dimensional model of the furniture, the interaction unit 184 may send the adjustment data of each adjustment to the memory 15, so that the user can reproduce any previous adjustment afterwards.
The three-dimensional scene simulation apparatus 10 of the first embodiment of the present application may apply a three-dimensional scene simulation method. The above explanation of the embodiments and advantageous effects of the three-dimensional scene simulation apparatus 10 also applies to the three-dimensional scene simulation method of the present embodiment, and is not repeated here to avoid redundancy.
Referring to fig. 2, a three-dimensional scene simulation method according to a first embodiment of the present application is used for simulating in real time the configuration of a target object in a three-dimensional model of the current environment. In this embodiment, the current environment is a home scene, and the target object is furniture that needs to be placed in the home. The three-dimensional scene simulation method comprises the following steps:
step S12: acquiring three-dimensional data of a target object and three-dimensional data of a scene;
step S14: establishing a three-dimensional model of the target object and a three-dimensional model of the scene according to the three-dimensional data of the target object and the three-dimensional data of the scene;
step S16: coupling a coordinate system of the three-dimensional model of the target object into a coordinate system of the three-dimensional model of the scene; and
step S18: a three-dimensional model of the target object is configured in the three-dimensional model of the scene to simulate the configuration of the target object in the scene.
According to the three-dimensional scene simulation method, the coordinate system of the three-dimensional model of the target object is coupled into the coordinate system of the three-dimensional model of the scene, so that the configuration situation of the target object in the scene is simulated according to the configuration situation of the three-dimensional model of the target object in the three-dimensional model of the scene, and the three-dimensional scene simulation method is simple, convenient, clear and visual, and is beneficial to improving user experience.
Referring to fig. 5, step S18 further includes:
step S181: judging, according to preset spatial attributes of the three-dimensional data of the scene and the target object, the mutual interference that occurs when the three-dimensional model of the target object is configured in the three-dimensional model of the scene.
The spatial attributes include a penetrable attribute and a non-penetrable attribute. When two groups of spatial coordinates with the non-penetrable attribute are sensed to be moved into contact with each other, they are restricted from entering each other's spatial regions.
Referring to fig. 6, step S18 further includes:
step S182: when spatial coordinates with the non-penetrable attribute in the three-dimensional model of the target object contact spatial coordinates with the non-penetrable attribute in the three-dimensional model of the scene, prompting that the three-dimensional model of the target object interferes with the three-dimensional model of the scene.
The prompt includes, but is not limited to, a prompt in the form of a vibration, a voice broadcast, or a highlight or flashing display where interference occurs.
Referring to fig. 7, step S18 further includes:
step S183: receiving a signal for configuring the three-dimensional model of the target object, and changing the position and form of the three-dimensional model of the target object in the three-dimensional model of the scene according to the signal.
Referring to fig. 8, step S18 further includes:
step S184: acquiring a plurality of markers for measurement in the three-dimensional model of the target object and/or the scene, the markers including, but not limited to, coordinate points, lines, or patterns; and
step S185: measuring the relative position information between the markers in the three-dimensional model of the target object and/or the scene, the position information including, but not limited to, distance, angle, curvature, etc. between the markers.
Referring to fig. 9, step S18 further includes:
step S186: obtaining adjustment data of the three-dimensional model of the target object; and
step S187: adjusting the three-dimensional model of the target object in the three-dimensional model of the scene according to the adjustment data.
Embodiment two:
referring to fig. 10, a three-dimensional scene simulation apparatus 10 according to a second embodiment of the present application is provided for simulating a configuration of a target object in a three-dimensional model of a current environment in real time. In this embodiment, the current environment is a home scene, and the target object is a robot that needs to move in the home.
Including, but not limited to, a floor sweeping robot, a floor mopping robot, and a window cleaning robot.
Similar to the first embodiment of the present application, in the second embodiment of the present application, the three-dimensional scene simulation apparatus 10 includes an information acquisition module 12, a sensing module 13, a modeling module 14, a memory 15, a coupling module 16, an interaction module 17, and an application module 18, which are connected through a bus 11. The information acquisition module 12 is configured to acquire three-dimensional data of a target object and three-dimensional data of a scene. The modeling module 14 is configured to establish a three-dimensional model of the target object and a three-dimensional model of the scene according to the three-dimensional data of the target object and the three-dimensional data of the scene, respectively. The coupling module 16 is for coupling a coordinate system of a three-dimensional model of the target object into a coordinate system of a three-dimensional model of the scene. The application module 18 is configured to configure a three-dimensional model of the target object in the three-dimensional model of the scene to simulate the configuration of the target object in the scene.
Similar to the first embodiment of the present application, in the second embodiment of the present application, the application module 18 includes an interference unit 182, an interaction unit 184, and a measurement unit 186. However, in the second embodiment of the present application, the application module 18 further includes a control unit 188.
In order to avoid redundancy, the same or similar parts as those of the first embodiment in the second embodiment will not be described again.
The three-dimensional data of the target object, that is, the three-dimensional data of the robot in the second embodiment of the present application, can be acquired in a manner similar to the acquisition of the three-dimensional data of the home described above. Alternatively, the user may download it from the official website of the robot's seller; or, since the three-dimensional data of the robot may be stored in the robot's own memory, the three-dimensional scene simulation device 10 may send a request instruction to the robot so that the robot transmits its three-dimensional data to the three-dimensional scene simulation device 10. Of course, the user may also hold a mobile device including a three-dimensional camera and photograph the household robot from a plurality of angles, thereby obtaining the three-dimensional data of the robot.
The control unit 188 is configured to plan a travel route of the target object according to the three-dimensional model of the scene.
In this way, the user can plan the travel route of the three-dimensional model of the robot in the three-dimensional model of the home by means of the control unit 188, so as to achieve planning of the travel route of the robot in a realistic home scene.
Note that the control unit 188 may be configured to plan the travel route automatically, or to plan it based on a user instruction. When planning automatically, the control unit 188 determines feasible travel routes of the robot's three-dimensional model in the home's three-dimensional model according to the robot's three-dimensional model and the spatial attributes of the home's three-dimensional model, and then, considering the tasks the robot is to complete, provides a plurality of travel routes for the user to select.
When the control unit 188 plans the travel route based on a user instruction, the user can move the three-dimensional model of the household robot in the three-dimensional model of the home in a manner similar to moving the three-dimensional model of furniture in the first embodiment of the present application. After the movement is completed, the control unit 188 may send the movement route to the memory 15 for saving.
In addition, whether the control unit 188 automatically plans the travel route or plans based on the user's instructions, the user can make individual adjustments to the travel route after planning the travel route. The user may input the adjustment data through the interaction module 17, and then the control unit 188 re-plans the travel route according to the adjustment data.
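A minimal sketch of automatic route planning, assuming the home's three-dimensional model has first been projected into a two-dimensional occupancy grid of passable and blocked cells; a production planner would likely use A* and account for the robot's footprint and tasks:

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first search over an occupancy grid (True = blocked cell).
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                      # reconstruct the route
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
                    and not grid[nr][nc] and nxt not in parent:
                parent[nxt] = cell
                queue.append(nxt)
    return None  # no feasible route
```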
The target object may further include a movement device 22 and a sensing device 24.
The movement device 22 is used for driving the target object to move.
The control unit 188 is configured to control the movement device 22 to move the target object in the scene according to the pre-planned travel route of the target object.
In this manner, the user can control the movement of the target object through the control unit 188. Specifically, the movement device 22 may communicate with the control unit 188 via Bluetooth, Wi-Fi (Wireless Fidelity), or the like. The control unit 188 may send the planned travel route to the movement device 22 in the manner described above. Upon receiving a movement instruction sent by the control unit 188, the movement device 22 controls the robot to move in the real home scene according to the travel route.
The sensing device 24 includes, but is not limited to, a three-dimensional camera, an ultrasonic ranging sensor, a gyroscope, an infrared detector, and the like. The three-dimensional camera may be disposed on top of the robot and may be rotated to capture an image of the environment surrounding the robot. The gyroscope is used for sensing the motion state of the target object. The ultrasonic ranging sensor and the infrared detector are used for sensing spatial parameters such as the position, the distance, the size and the like of the obstacle in the advancing process so as to provide references for obstacle avoidance of a target object in the advancing process.
The actual travel situation often varies from the planned route, so the control unit 188 is also configured to control the target object to avoid an obstacle according to the motion state of the target object and the obstacle situation in front of the travel route, which are sensed by the sensing device 24.
After the target object avoids an obstacle, the control unit 188 judges, according to the sensed current position of the target object and with reference to factors such as path length and task completion, whether to guide the target object back to the originally planned route or to select another, better planned route.
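The decision between rejoining the original route and replanning can be sketched as follows, reusing the hypothetical plan_route from the earlier sketch and comparing remaining path lengths as a stand-in for the path-length and task-completion factors:

```python
def choose_route(grid, current, goal, original_path):
    """After obstacle avoidance: rejoin the original route or replan from the
    current position, keeping whichever remaining path is shorter."""
    fresh = plan_route(grid, current, goal)
    # Detour back to the nearest cell of the original route, then follow it.
    nearest = min(original_path,
                  key=lambda c: abs(c[0] - current[0]) + abs(c[1] - current[1]))
    back = plan_route(grid, current, nearest)
    remaining = original_path[original_path.index(nearest):]
    rejoin = back + remaining[1:] if back else None
    candidates = [p for p in (fresh, rejoin) if p]
    return min(candidates, key=len) if candidates else None
```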
The three-dimensional scene simulation apparatus 10 of the second embodiment of the present application may apply a three-dimensional scene simulation method. The above explanation of the embodiments and advantageous effects of the three-dimensional scene simulation apparatus 10 also applies to the three-dimensional scene simulation method of the present embodiment, and is not repeated here to avoid redundancy.
Referring to fig. 2, a three-dimensional scene simulation method according to a second embodiment of the present application is used for simulating in real time the configuration of a target object in a three-dimensional model of the current environment. In this embodiment, the current environment is a home scene, and the target object is a robot that needs to move in the home. The three-dimensional scene simulation method comprises the following steps:
step S12: acquiring three-dimensional data of a target object and three-dimensional data of a scene;
step S14: establishing a three-dimensional model of the target object and a three-dimensional model of the scene according to the three-dimensional data of the target object and the three-dimensional data of the scene;
step S16: coupling a coordinate system of the three-dimensional model of the target object into a coordinate system of the three-dimensional model of the scene; and
step S18: a three-dimensional model of the target object is configured in the three-dimensional model of the scene to simulate the configuration of the target object in the scene.
According to the three-dimensional scene simulation method, the coordinate system of the three-dimensional model of the target object is coupled into the coordinate system of the three-dimensional model of the scene, so that the configuration situation of the target object in the scene is simulated according to the configuration situation of the three-dimensional model of the target object in the three-dimensional model of the scene, and the three-dimensional scene simulation method is simple, convenient, clear and visual, and is beneficial to improving user experience.
Referring to fig. 11, step S18 includes:
step S188: planning a travel route of the target object in the scene according to the three-dimensional model of the target object, the three-dimensional model of the scene, and the task to be completed by the target object.
Referring to fig. 12, the target object includes a movement device 22, and step S18 includes:
step S189: the movement device 22 is controlled to drive the target object to move in the scene according to the pre-planned travel route of the target object.
Referring to fig. 13, the target object further includes a sensing device 24, and step S18 includes:
step S18a: acquiring the spatial parameters of obstacles encountered along the travel route and the motion state of the target object, as sensed by the sensing device 24.
The spatial parameters include, but are not limited to, the position of the obstacle in the scene, its distance to the target object, its own three-dimensional data, etc.;
step S18b: controlling the target object to avoid the obstacle according to the sensed motion state of the target object and the spatial parameters of the obstacle; and
step S18c: judging whether to guide the target object back to the originally planned route or to select another, better planned route, according to the sensed current position of the target object and with reference to path length and task completion.
It should be noted that, the embodiments of the present application may satisfy only one of the above embodiments or may satisfy the above embodiments simultaneously, that is, the embodiments in which one or more of the above embodiments are combined also belong to the protection scope of the embodiments of the present application.
In the description of the present specification, reference to the terms "certain embodiments," "one embodiment," "some embodiments," "an exemplary embodiment," "an example," "a particular example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and further implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a system that includes a processing module, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program is printed, as the program may be electronically captured, for instance via optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It is to be understood that portions of embodiments of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGAs), field programmable gate arrays (FPGAs), and the like.
While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the application, and that changes, modifications, substitutions and variations may be made therein by those of ordinary skill in the art without departing from the scope of the application as defined by the claims and their equivalents.

Claims (9)

1. A three-dimensional scene simulation apparatus for simulating in real time a real configuration of a target object in a scene, the three-dimensional scene simulation apparatus comprising:
a sensing module comprising a three-dimensional camera;
an information acquisition module, used for directly sensing, through the three-dimensional camera of the sensing module, three-dimensional data of the scene where the three-dimensional camera is located and three-dimensional data of a current target object;
a modeling module, used for constructing, at the same scale, a three-dimensional model of the target object and a three-dimensional model of the scene according to the three-dimensional data of the target object and the three-dimensional data of the scene sensed by the three-dimensional camera;
a coupling module, used for converting coordinate data of the three-dimensional model of the target object in its own reference coordinate system into coordinate data in the reference coordinate system of the scene according to a preset coordinate transformation algorithm, so as to couple the coordinate system of the three-dimensional model of the target object into the coordinate system of the three-dimensional model of the scene; and
an application module, used for configuring the three-dimensional model of the target object in the three-dimensional model of the scene so as to simulate the configuration condition of the target object in the scene;
wherein the application module comprises an interaction unit and a measuring and calculating unit; the interaction unit is used for receiving signals for configuring the three-dimensional model of the target object and for changing the position and shape of the three-dimensional model of the target object in the three-dimensional model of the scene according to the signals; the interaction unit is further used for setting a plurality of marks for measurement in the three-dimensional model of the target object and/or of the scene; the measuring and calculating unit is used for measuring and calculating the relative position information between the marks in the three-dimensional model of the target object and/or of the scene as a reference for adjusting the target object; and the relative position information between the marks comprises the distance, angle, or curvature between the mark points.
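
To make the coordinate coupling recited in claim 1 concrete, the following minimal sketch (in Python with NumPy, which the patent does not specify) re-expresses vertices given in the target object's own reference coordinate system in the scene's reference coordinate system via a 4x4 homogeneous transform; the helper names and the example transform are illustrative assumptions, not part of the claimed algorithm.

```python
import numpy as np

def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def couple_to_scene(object_points: np.ndarray, object_to_scene: np.ndarray) -> np.ndarray:
    """Re-express the target object's model vertices in the scene's reference frame.

    object_points: (N, 3) vertices in the object's own reference coordinate system.
    object_to_scene: 4x4 homogeneous transform from the object frame to the scene frame.
    """
    homogeneous = np.hstack([object_points, np.ones((object_points.shape[0], 1))])
    return (object_to_scene @ homogeneous.T).T[:, :3]

# Example: place the object 2 m along the scene's x-axis with no rotation.
vertices = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.2, 0.0]])
T = make_transform(np.eye(3), np.array([2.0, 0.0, 0.0]))
print(couple_to_scene(vertices, T))
```

In practice the object-to-scene transform would be produced by the preset coordinate transformation algorithm recited in the claim; a rigid transform is only one common choice.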
2. The three-dimensional scene simulation apparatus according to claim 1, wherein the application module comprises:
an interference unit, used for judging, according to preset spatial attributes of the three-dimensional data of the scene and of the target object, the interference condition that occurs when the three-dimensional model of the target object and an existing structure in the three-dimensional model of the scene come into contact with each other; the spatial attributes comprise a penetrable attribute and an impenetrable attribute, and when the interference unit senses that two spatial coordinate groups having the impenetrable attribute are moved into contact with each other, the two groups are restricted from entering each other's spatial regions.
3. The three-dimensional scene simulation apparatus according to claim 2, wherein the interference unit is configured to prompt that the three-dimensional model of the target object interferes with the three-dimensional model of the scene when the three-dimensional model of the target object and spatial coordinates in the three-dimensional model of the scene that likewise have the impenetrable attribute come into contact with each other.
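
One plausible reading of the interference unit of claims 2 and 3 is a bounding-volume contact test gated by the spatial attribute. The axis-aligned boxes, attribute flag, and return strings below are assumptions made for illustration, since the claims do not fix a geometric representation.

```python
from dataclasses import dataclass

@dataclass
class Volume:
    """Axis-aligned bounding box carrying the spatial attribute of claims 2-3."""
    lo: tuple  # (x, y, z) minimum corner
    hi: tuple  # (x, y, z) maximum corner
    penetrable: bool

def overlaps(a: Volume, b: Volume) -> bool:
    """True when the two boxes intersect on every axis."""
    return all(a.lo[i] <= b.hi[i] and b.lo[i] <= a.hi[i] for i in range(3))

def check_interference(target: Volume, structure: Volume) -> str:
    """Report interference only when both volumes are impenetrable."""
    if not overlaps(target, structure):
        return "no contact"
    if not target.penetrable and not structure.penetrable:
        return "interference: movement restricted"  # both impenetrable
    return "contact allowed"  # at least one volume is penetrable

sofa = Volume((0, 0, 0), (2, 1, 1), penetrable=False)
wall = Volume((1.5, 0, 0), (1.7, 3, 2.5), penetrable=False)
print(check_interference(sofa, wall))  # interference: movement restricted
```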
4. The three-dimensional scene simulation apparatus according to claim 1, wherein the interaction unit is configured to obtain the calculated relative position information of the marks and to display the effect of adjusting the three-dimensional model of the target object according to the relative position information of the marks.
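
The marker measurements recited in claims 1 and 4 (distance and angle between mark points) can be sketched as follows; the curvature case is omitted, and the marker coordinates are hypothetical values chosen only for the example.

```python
import numpy as np

def marker_distance(p: np.ndarray, q: np.ndarray) -> float:
    """Euclidean distance between two marker points."""
    return float(np.linalg.norm(q - p))

def marker_angle(a: np.ndarray, vertex: np.ndarray, b: np.ndarray) -> float:
    """Angle (degrees) at `vertex` formed by the rays toward markers a and b."""
    u, v = a - vertex, b - vertex
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

m1, m2, m3 = np.array([0.0, 0, 0]), np.array([1.0, 0, 0]), np.array([1.0, 1, 0])
print(marker_distance(m1, m2))   # 1.0
print(marker_angle(m1, m2, m3))  # 90.0
```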
5. The three-dimensional scene simulation apparatus according to claim 1, wherein the application module comprises a control unit for planning a travel route of the target object in the scene according to the three-dimensional model of the target object, the three-dimensional model of the scene, and a task preset for the target object.
6. The three-dimensional scene simulation apparatus according to claim 5, wherein the target object comprises a movement device, and the control unit is configured to control the movement device to drive the target object to move in the scene according to a pre-planned travel route of the target object.
7. The three-dimensional scene simulation apparatus according to claim 5, wherein the target object includes a sensing device for sensing a motion state of the target object and an obstacle condition in front of the travel route, the control unit being configured to:
acquire the motion state of the target object and the obstacle condition in front of the travel route as sensed by the sensing device;
control the target object to avoid an obstacle according to the motion state of the target object and the obstacle condition in front of the travel route; and
after the target object avoids the obstacle, judge, according to the sensed current position of the target object, the remaining path distance, and the task completion condition, whether to guide the target object back to the originally planned route or to select another, better planned route.
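
Claim 7's post-avoidance decision, whether to rejoin the originally planned route or adopt a better one, can be illustrated by comparing remaining path lengths. The waypoint representation and the remaining_length helper are assumptions for this sketch, since the claim only names the sensed position, path distance, and task completion condition as inputs.

```python
from math import dist

def remaining_length(pos, route):
    """Assumed helper: distance from `pos` to the route's next waypoint,
    plus the polyline length of the rest of the route."""
    if not route:
        return 0.0
    return dist(pos, route[0]) + sum(dist(a, b) for a, b in zip(route, route[1:]))

def choose_route(pos, original_route, replanned_route, task_complete: bool):
    """After obstacle avoidance, rejoin the original route or adopt a shorter plan."""
    if task_complete:
        return None  # nothing left to do
    if remaining_length(pos, original_route) <= remaining_length(pos, replanned_route):
        return original_route
    return replanned_route

pos = (0.5, 1.0)
original = [(1.0, 1.0), (3.0, 1.0)]
replanned = [(0.5, 2.0), (3.0, 2.0), (3.0, 1.0)]
print(choose_route(pos, original, replanned, task_complete=False))
```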
8. An apparatus comprising the three-dimensional scene simulation device according to any one of claims 1 to 7, the apparatus performing a corresponding function according to the simulation effect of the three-dimensional scene simulation device.
9. The apparatus of claim 8, wherein the corresponding functions include any one or more of home design, robot control, and simulated interactive gaming.
CN201811150189.7A 2018-09-29 2018-09-29 Three-dimensional scene simulation device and equipment Active CN109242963B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811150189.7A CN109242963B (en) 2018-09-29 2018-09-29 Three-dimensional scene simulation device and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811150189.7A CN109242963B (en) 2018-09-29 2018-09-29 Three-dimensional scene simulation device and equipment

Publications (2)

Publication Number Publication Date
CN109242963A CN109242963A (en) 2019-01-18
CN109242963B true CN109242963B (en) 2023-08-18

Family

ID=65054081

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811150189.7A Active CN109242963B (en) 2018-09-29 2018-09-29 Three-dimensional scene simulation device and equipment

Country Status (1)

Country Link
CN (1) CN109242963B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109934908B (en) * 2019-02-28 2023-06-27 东华大学 Actual scene modeling method based on unmanned aerial vehicle
CN110503040B (en) * 2019-08-23 2022-05-27 斯坦德机器人(深圳)有限公司 Obstacle detection method and device
CN111145326B (en) * 2019-12-26 2023-12-19 网易(杭州)网络有限公司 Processing method of three-dimensional virtual cloud model, storage medium, processor and electronic device
CN111832104B (en) * 2020-06-24 2023-07-28 深圳市万翼数字技术有限公司 Method for establishing three-dimensional equipment model and related equipment
CN113838209A (en) * 2021-09-09 2021-12-24 深圳市慧鲤科技有限公司 Information management method of target environment and display method of related augmented reality
CN114152241A (en) * 2021-12-07 2022-03-08 中国南方电网有限责任公司超高压输电公司广州局 Operating state monitoring system of high-voltage line emergency repair tower

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7277572B2 (en) * 2003-10-10 2007-10-02 Macpearl Design Llc Three-dimensional interior design system
WO2006121457A2 (en) * 2004-08-18 2006-11-16 Sarnoff Corporation Method and apparatus for performing three-dimensional computer modeling
CN103778538A (en) * 2012-10-17 2014-05-07 李兴斌 Furniture simulation layout method and furniture simulation layout system
CN108460840A (en) * 2018-01-17 2018-08-28 链家网(北京)科技有限公司 The methods of exhibiting and displaying device of virtual house decoration

Also Published As

Publication number Publication date
CN109242963A (en) 2019-01-18

Similar Documents

Publication Publication Date Title
CN109242963B (en) Three-dimensional scene simulation device and equipment
US10872467B2 (en) Method for data collection and model generation of house
AU2019281667B2 (en) Data collection and model generation method for house
JP6423435B2 (en) Method and apparatus for representing a physical scene
KR102197732B1 (en) Method and apparatus for generating 3d map of indoor space
EP3007129A1 (en) Modeling device, three-dimensional model generation device, modeling method, program, and layout simulator
CN105637559A (en) Structural modeling using depth sensors
WO2016065063A1 (en) Photogrammetric methods and devices related thereto
US20180204387A1 (en) Image generation device, image generation system, and image generation method
US11341716B1 (en) Augmented-reality system and method
EP4177790A1 (en) Map creation method and apparatus for self-moving device, and device and storage medium
CN112413827A (en) Intelligent air conditioner and information display method and device thereof
US20230224576A1 (en) System for generating a three-dimensional scene of a physical environment
CN109313822A (en) Virtual wall construction method and device, map constructing method, mobile electronic equipment based on machine vision
CN109064562A (en) A kind of three-dimensional scenic analogy method
Angladon et al. Room floor plan generation on a project tango device
Nóbrega et al. Design your room: adding virtual objects to a real indoor scenario
WO2023174561A1 (en) Generating synthetic interior room scene data for training ai-based modules
CN115731349A (en) Method and device for displaying house type graph, electronic equipment and storage medium
CN111784797A (en) Robot networking interaction method, device and medium based on AR
EP4275178B1 (en) Computer-implemented augmentation of interior room models
EP4275173B1 (en) Computer-implemented reconstruction of interior rooms
CN115830162B (en) House type diagram display method and device, electronic equipment and storage medium
CN115908627B (en) House source data processing method and device, electronic equipment and storage medium
US20230351706A1 (en) Scanning interface systems and methods for building a virtual representation of a location

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant