CN113436311A - House type graph generation method and device - Google Patents

House type graph generation method and device

Info

Publication number
CN113436311A
Authority
CN
China
Prior art keywords
room
house type
panoramic image
house
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010209236.1A
Other languages
Chinese (zh)
Inventor
于景铭
王家明
单成亮
赵斌强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd
Priority to CN202010209236.1A
Publication of CN113436311A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 - Geometric image transformations in the plane of the image
    • G06T3/06 - Topological mapping of higher dimensional structures onto lower dimensional surfaces

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses a house type graph (floor plan) generation method and apparatus. The method includes: acquiring a panoramic image corresponding to a room; acquiring structural information of the room from the panoramic image, where the structural information structurally describes the room; and generating a house type graph of the room from the structural information. With the method and apparatus, the house type graph can be generated automatically from the panoramic image, saving labor cost.

Description

House type graph generation method and device
Technical Field
The present application relates to the field of computer technologies, and in particular, to a house type graph generating method and apparatus.
Background
As the data-processing capability of computers keeps increasing, disruptive changes have appeared in many fields. In the traditional home industry, many products based on advanced technologies have also emerged.
In determining the layout of a house, the prior art may use a professional camera to take a large number of pictures of the house, determine various measurement data (for example, the shape of the ceiling and the length of each side), and then manually generate the house type graph from the measurement data. This approach requires professional staff, which is not only costly but also offers poor real-time performance.
Disclosure of Invention
The embodiments of the present application provide a house type graph generation method and apparatus, which at least solve the above-mentioned technical problems.
An embodiment of the present application provides a method for generating a house type graph, where the method includes: acquiring a panoramic image corresponding to a room; acquiring structural information of the room from the panoramic image, wherein the structural information is used for structurally describing the room; and generating a house type graph of the room using the structural information of the room.
An embodiment of the present application further provides a method for generating a house type graph, including: in response to a house type graph generation request being triggered, acquiring a panoramic image corresponding to a room; and displaying a house type graph of the room on a display interface, wherein the house type graph is an image generated using structural information acquired from the panoramic image.
An embodiment of the present application further provides a device for generating a house type graph, where the device includes: a processor; and a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the method as above.
Embodiments of the present application also provide a computer-readable storage medium having stored thereon computer instructions, which when executed, implement the above method.
The embodiment of the application adopts at least one technical scheme which can achieve the following beneficial effects:
the structural information in the panoramic image corresponding to the room can be extracted using a machine learning layout model, and the house type graph of the room is generated using the structural information, so that automatic generation of the house type graph is realized.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a diagram illustrating a scenario of obtaining a house type graph according to an exemplary embodiment of the present application;
FIG. 2 is a flowchart illustrating a house type graph generation method according to an exemplary embodiment of the present application;
FIG. 3A to FIG. 3F illustrate user interfaces of an electronic terminal according to an exemplary embodiment of the present application;
FIG. 4 is a block diagram illustrating a house type graph generation apparatus according to an exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
First, the structure of the system of one embodiment of the present invention is explained. As shown in fig. 1, the system architecture 100 may include electronic terminals 101, 102, 103, 104, a network 105, and a server 106. The network 105 serves as a medium for providing communication links between the terminal devices 101, 102, 103, 104 and the server 106.
In the present embodiment, the electronic terminals 101, 102, 103, and 104 shown in fig. 1 can transmit various information through the network 105. Network 105 may include various connection types, such as wired or wireless communication links, or fiber optic cables. It is noted that the wireless connection means may include, but is not limited to, a 3G/4G/5G connection, a Wi-Fi connection, a Bluetooth connection, a WiMAX connection, a Zigbee connection, a UWB connection, a local area network ("LAN"), a wide area network ("WAN"), an internet network (e.g., the Internet), and a peer-to-peer network (e.g., an ad hoc peer-to-peer network), as well as other now known or later developed network connection means. The network 105 may communicate using any currently known or future developed network protocol, such as HTTP (Hypertext Transfer Protocol), and may be interconnected with digital data communication in any form or medium (e.g., a communication network).
A user may use the electronic terminals 101, 102, 103, 104 to interact with the server 106 via the network 105 to receive or send messages or the like. Various client applications, such as a house rental application, a house sales application, an interior decoration application, and the like, may be installed on the electronic terminal 101, 102, 103, or 104.
The electronic terminal 101, 102, 103, or 104 may be any of various electronic devices that have a touch display screen, support image display, and have a shooting function, including, but not limited to, mobile terminals such as a smart phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a head-mounted display device, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PMP (portable multimedia player), and a vehicle terminal, as well as fixed terminals such as a digital TV and a desktop computer.
The server 106 may be a server providing various services; for example, after receiving a house type graph generation request transmitted by the electronic terminal 101, 102, 103, or 104, it may generate a house type graph according to an exemplary embodiment of the present application. The house type graph mentioned in the application refers to a planar spatial layout graph of a room, i.e., a graph describing the use, position, and size of the room. In practice, the layout of the room can be seen intuitively through the house type graph.
As shown in fig. 1, the system architecture 100 may further include a panoramic image capture device 20. As an example, the panoramic image capture device 20 may take panoramic photographs of different rooms of the same house at different positions in the real-world scene to obtain a plurality of panoramic images.
The panoramic image mentioned in the application refers to an image whose viewing angle exceeds the normal viewing angle of the human eye, generally covering a 360-degree horizontal viewing angle and a 180-degree vertical viewing angle, so that the surrounding scene can be mapped into a planar picture according to a specific geometric relationship.
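As an illustration of this geometric relationship, the following minimal Python sketch maps a viewing direction onto pixel coordinates of an equirectangular panorama. The 1024 × 512 image size matches the input size used later for the layout model, while the axis convention (z pointing up, image centre on the horizon) is an assumption made only for this example:

```python
import numpy as np

def direction_to_equirect_pixel(direction, width=1024, height=512):
    """Map a unit viewing direction (x, y, z) to pixel coordinates in an
    equirectangular panorama covering 360 degrees horizontally and 180
    degrees vertically. Assumed convention: z is up, and the image centre
    corresponds to longitude 0 / latitude 0 (the horizon)."""
    x, y, z = direction / np.linalg.norm(direction)
    lon = np.arctan2(y, x)                   # horizontal angle in [-pi, pi]
    lat = np.arcsin(z)                       # vertical angle in [-pi/2, pi/2]
    u = (lon / (2 * np.pi) + 0.5) * width    # column index
    v = (0.5 - lat / np.pi) * height         # row index (0 at the top)
    return u, v

# A point straight ahead on the horizon lands at the image centre.
print(direction_to_equirect_pixel(np.array([1.0, 0.0, 0.0])))  # ~ (512.0, 256.0)
```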
In the present application, the panoramic image is typically acquired using a professional panoramic image capture device 20. The panoramic image capture device 20 may be a device separate from the electronic terminal; for example, as shown in fig. 1, the panoramic image capture device 20 may be fixed to a support (e.g., a tripod) and cooperate with an application in the Android system to implement the method of the embodiment of the present application.
As another example, as shown in the electronic terminal 102 in fig. 1, the panoramic image capture device 20 may also be a device embedded in the electronic terminal 102, for example, a camera module embedded in the electronic terminal may be a panoramic image capture device including, but not limited to, a camera, a camcorder, a web camera, a surveillance camera, a medical camera, a high speed camera, a multi-camera such as a three-dimensional (3D) camera, and the like.
Furthermore, it should be understood that the number of electronic terminals, networks, and servers in FIG. 1 are merely illustrative. There may be any number of electronic terminals, networks, and servers, as desired for implementation.
As an example, after the user takes a panoramic image of a room using the panoramic image capture device 20, the panoramic image is uploaded to the server 106 directly or via the electronic terminals 101 to 104, and the server 106 performs the house type graph generation method described with reference to fig. 2.
Fig. 2 is a flowchart illustrating a house type graph generation method according to an exemplary embodiment of the present application.
In step S210, a panoramic image corresponding to a room is acquired. The panoramic image shows the surroundings of the room as completely as possible in a wide-angle manner, i.e., it can show the room over 360 degrees. As an example, this may be achieved with two fisheye cameras, each covering a 180-degree viewing angle; it may also be achieved with two catadioptric cameras, each consisting of an ordinary camera and a mirror and covering a 180-degree viewing angle.
In an implementation where images can be captured using two cameras of the electronic terminal, the two cameras can be divided into a main camera and a secondary camera. The image shot by the main camera is the main image of the scene, the image shot by the secondary camera is the secondary image of the scene, and the main image and the secondary image captured of the same scene are stitched using a panoramic stitching technique to obtain a panoramic image.
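A minimal sketch of the stitching step, using OpenCV's high-level stitching pipeline; the file names are hypothetical, and whether two views are enough for a full 360-degree panorama depends on the lenses used, so this only illustrates how the main and secondary images could be combined:

```python
import cv2

# Hypothetical file names; in practice these would be the frames captured by
# the terminal's main and secondary cameras shooting the same scene.
main_img = cv2.imread("room_main.jpg")
sub_img = cv2.imread("room_sub.jpg")

# OpenCV's stitching pipeline handles feature matching, warping and blending.
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch([main_img, sub_img])

if status == cv2.Stitcher_OK:
    cv2.imwrite("room_panorama.jpg", panorama)
else:
    print(f"Stitching failed with status {status}")
```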
In step S220, structural information of the room is obtained using the panoramic image, wherein the structural information is used for structurally describing the room. As an example, the structural information of the room may be obtained from a single panoramic image frame, which makes the operation more convenient for the user.
In implementation, the panoramic image may be input into a trained machine learning layout model to acquire the structural information of the room, where the machine learning layout model is obtained by training with a pre-acquired set of training panoramic images and the labeling information of the structural information for each training panoramic image.
Specifically, the training panoramic images forming the training panoramic image set and their labeling information for the structural information are acquired, where the structural information includes roof edge information, ground edge information, and intersection line information between intersecting walls of the training room corresponding to each training panoramic image. The roof edge information may indicate the positions of the roof edges in the panoramic image, the ground edge information may indicate the positions of the ground edges in the panoramic image, and the intersection line information may indicate the positions of the intersection lines between intersecting walls.
By way of example, after acquiring panoramic images taken for respective rooms, labeling the rooms may be performed, including labeling roof edge information, ground edge information, and intersection line information between intersecting walls in each room.
Subsequently, the machine learning layout model is constructed and its full set of network parameters is initialized. In an implementation, the machine learning layout model includes a horizontal network model (HorizonNet) coupled with a long short-term memory (LSTM) network built on a neural network model, where the neural network model may be a recurrent neural network (RNN).
The machine learning layout model is then trained using the correspondence between the training panoramic images and the labeling information, and the full network parameters are adjusted until the model meets a preset requirement. As an example, the pixel values of a training panoramic image may be used as the input vector, for example with a size of 1024 × 512; after the image is input into the machine learning layout model, the corresponding roof edge information, ground edge information, and intersection line information between intersecting walls are obtained, and the full network parameters are adjusted using the difference between this output and the labeling information until the model meets a preset requirement, for example an accuracy of 90% or higher.
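The following PyTorch sketch illustrates a HorizonNet-style column-wise layout model of the kind described above. It is a simplified stand-in rather than the patent's exact network: the layer sizes, the bidirectional LSTM, and the three per-column outputs (roof boundary, ground boundary, wall-wall corner probability) are assumptions made for the example.

```python
import torch
import torch.nn as nn

class ColumnLayoutNet(nn.Module):
    """Simplified HorizonNet-style layout model: a CNN compresses the
    1024x512 panorama into one feature vector per image column, and a
    bidirectional LSTM predicts, for every column, the roof-wall boundary,
    the ground-wall boundary and a wall-wall (corner) probability."""

    def __init__(self, feat_dim=256, hidden=256):
        super().__init__()
        self.backbone = nn.Sequential(        # (B, 3, 512, 1024) -> (B, feat_dim, 1, 1024)
            nn.Conv2d(3, 64, 3, stride=(2, 1), padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=(2, 1), padding=1), nn.ReLU(),
            nn.Conv2d(128, feat_dim, 3, stride=(2, 1), padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, 1024)),
        )
        self.rnn = nn.LSTM(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 3)  # roof y, ground y, corner probability

    def forward(self, panorama):
        feats = self.backbone(panorama).squeeze(2)   # (B, feat_dim, 1024)
        feats = feats.permute(0, 2, 1)               # (B, 1024, feat_dim) column sequence
        seq, _ = self.rnn(feats)                     # (B, 1024, 2*hidden)
        out = self.head(seq)                         # (B, 1024, 3)
        roof = torch.sigmoid(out[..., 0])            # normalized roof boundary per column
        ground = torch.sigmoid(out[..., 1])          # normalized ground boundary per column
        corner_prob = torch.sigmoid(out[..., 2])     # wall-wall intersection probability
        return roof, ground, corner_prob

model = ColumnLayoutNet()
dummy = torch.randn(1, 3, 512, 1024)                 # one 1024x512 RGB panorama
roof, ground, corners = model(dummy)
print(roof.shape, ground.shape, corners.shape)       # each: torch.Size([1, 1024])
```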
In an alternative implementation, in order to determine the structural information of the room more accurately, the panoramic image acquired with the panoramic camera may be treated as an initial panoramic image and preprocessed, and the preprocessed initial panoramic image may then be input into the machine learning layout model as the panoramic image.
The above-mentioned preprocessing may include orientation correction. Specifically, the orientation (i.e., the shooting viewpoint) of the initial panoramic image may deviate, for example it may not be horizontal; in that case the orientation can be corrected.
In implementation, the orientation correction may be performed using straight lines detected in the panoramic image. That is, after a straight line in the panoramic image is detected, the orientation of the panoramic image is corrected using the angle between that straight line and the horizontal.
As an example, the panoramic image may be projected onto a polyhedron, line segments may be detected on the polyhedron, the detected line segments may be re-projected back onto the sphere, and the straight lines in the panoramic image are thereby obtained. Alternatively, the following method can be adopted: after edge detection is performed on the panoramic image, all edges in the panoramic image are determined; the edges are clustered and grouped; the grouped edges are projected into three-dimensional space; and, based on the principle that a straight line and a point not on it determine a unique plane, the straight line in the panoramic image is determined from the plane formed by the points on the line segment and the circle center (the center of the camera).
In step S230, a house type graph of the room is generated using the structural information of the room. As an example, after the roof edge information, the ground edge information, and the intersection line information between intersecting walls have been determined, the frame of the room may be determined from this information; for example, the edge lines of the room may be determined from the projected positions of the roof edges, and finally the house type graph of the room may be generated from this structural information.
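As a concrete illustration of step S230, the following sketch projects the per-column ground boundary onto the ground plane to obtain a 2D footprint of the room. The equirectangular projection, a level camera, and a known camera height are assumptions made only for this example:

```python
import numpy as np

def floor_footprint(ground_rows, camera_height=1.6, width=1024, height=512):
    """Project the per-column ground boundary (row index of the wall-ground
    edge in the panorama) onto the ground plane to obtain a 2D footprint.
    Assumes an equirectangular panorama, a level camera, a known camera
    height in metres, and boundary rows below the horizon."""
    cols = np.arange(width)
    lon = (cols / width - 0.5) * 2 * np.pi        # horizontal angle per column
    lat = (0.5 - ground_rows / height) * np.pi    # negative below the horizon
    dist = camera_height / np.tan(-lat)           # horizontal distance to the wall
    xs, ys = dist * np.cos(lon), dist * np.sin(lon)
    return np.stack([xs, ys], axis=1)             # (width, 2) footprint points

# Synthetic example: a constant boundary row corresponds to a circular
# footprint, i.e. the camera is equidistant from the wall in every direction.
footprint = floor_footprint(np.full(1024, 384.0))
print(footprint.shape, np.round(np.hypot(footprint[:, 0], footprint[:, 1]).mean(), 2))
```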
In implementation, the shape of the room may be irregular; for example, the edge lines of the room may be curved. In this case, the generated house type graph may not match the room. Therefore, after the house type graph is generated, it may be determined whether it matches the room; as an example, this check may be performed manually.
If it is determined that the house type graph does not match the room, the method may further include generating a ground projection image of the room using a ground projection method, and then generating the house type graph of the room using the ground projection image.
In an alternative embodiment, the method may also correct the generated house type graph using wall-surface information obtained by engineering measurement (e.g., measurement with a laser distance meter), thereby completing a re-measurement (dimension verification) operation. Since the generated dimensions may be inaccurate, the user is allowed to measure the corresponding wall surfaces with a laser distance meter and record the data to complete the re-measurement step.
As an example, the user may input room parameters for the room, e.g., the respective side lengths of the room. The house type graph can then be corrected using these room parameters, thereby generating a corrected house type graph.
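A minimal sketch of this correction step, assuming the house type graph is represented as a polygon of corner points and the user supplies one measured wall length; uniform scaling is an assumption, since the text only states that the graph is corrected using the room parameters:

```python
import numpy as np

def rescale_floor_plan(polygon, wall_index, measured_length):
    """Rescale a house-type-graph polygon so that one wall matches a
    user-supplied measurement (e.g. from a laser distance meter).
    `polygon` is an (N, 2) array of ordered corner points; `wall_index`
    selects the wall between corner i and corner i+1. Uniform scaling of
    the whole polygon is an illustrative assumption."""
    p0, p1 = polygon[wall_index], polygon[(wall_index + 1) % len(polygon)]
    estimated = np.linalg.norm(p1 - p0)
    return polygon * (measured_length / estimated)

corners = np.array([[0.0, 0.0], [4.1, 0.0], [4.1, 3.2], [0.0, 3.2]])
corrected = rescale_floor_plan(corners, wall_index=0, measured_length=4.0)
print(corrected)   # the 4.1 m wall is rescaled to 4.0 m, the other walls follow
```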
As an example, the method may also detect a user's touch on the house type graph displayed on the electronic terminal. Specifically, when the house type graph generated in step S230 is displayed on the display unit, the user may perform a touch input, such as a drag, on the displayed house type graph. The method may then correct the house type graph according to the movement of the touch while the touch is maintained, thereby generating a corrected house type graph.
In implementation, the above steps may be performed for each room in the house to obtain a house type graph of each room. Subsequently, the house type graphs of the rooms can be stitched to generate an overall house type graph of the whole house.
By way of example, connected region prediction processing may be used to determine the connected region information among the plurality of house type graphs, and the plurality of house type graphs may be stitched to generate the overall house type graph corresponding to the plurality of rooms.
In practice, connected regions between the panoramic images may be identified, and the connected region information between the panoramic images is then determined. In short, two rooms are usually connected to each other through a connected region that allows a user to move between them; for example, the user can walk from one room to another through a door between the two rooms, or the two rooms may share a window.
As an example, connected regions in the panoramic image may be detected using an edge detection algorithm. Further, a corresponding panoramic depth image may be derived from the scene image and projected into three-dimensional space; the resulting three-dimensional point cloud is projected onto the ground, the farthest depth value in each image column is calculated, and these depth values are used to determine the connected regions in the panoramic image.
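The following sketch shows one hypothetical heuristic for turning such per-column depth values into connected-region candidates: columns whose farthest floor-projected depth reaches noticeably beyond the estimated wall distance are grouped into openings (e.g., doors). The threshold and minimum width are illustrative parameters, not values from the text:

```python
import numpy as np

def find_openings(column_depths, wall_dist, margin=0.3, min_cols=15):
    """Flag likely openings (doors or passages) in a room. A hypothetical
    heuristic: for every panorama column, compare the farthest floor-projected
    depth with the estimated wall distance; runs of columns that see
    noticeably past the wall are treated as connected regions."""
    past_wall = column_depths > wall_dist + margin       # per-column flag
    openings, start = [], None
    for i, flag in enumerate(past_wall):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_cols:
                openings.append((start, i))              # [start, end) column range
            start = None
    if start is not None and len(past_wall) - start >= min_cols:
        openings.append((start, len(past_wall)))
    return openings

# Synthetic example: the wall is 2.0 m away everywhere except a doorway
# spanning columns 400-460, where the depth reaches into the next room.
depths = np.full(1024, 2.0)
depths[400:460] = 4.5
walls = np.full(1024, 2.0)
print(find_openings(depths, walls))   # [(400, 460)]
```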
Finally, in order to present the house layout to the user more vividly, the planar house type graph generated above can be converted into a three-dimensional space model. In implementation, height information can be added to each pixel point of the planar house type graph to convert it into a three-dimensional space model.
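A minimal sketch of this conversion, assuming the planar house type graph is available as a footprint polygon and a single wall height is used for the whole room; the text only states that height information is added, so the mesh construction below is an illustrative choice:

```python
import numpy as np

def extrude_floor_plan(polygon, wall_height=2.8):
    """Turn a 2D footprint polygon into a simple 3D wall mesh by giving every
    footprint point a floor vertex (z = 0) and a ceiling vertex (z = height).
    The wall height is an assumed parameter."""
    n = len(polygon)
    floor = np.hstack([polygon, np.zeros((n, 1))])                # vertices at z = 0
    ceiling = np.hstack([polygon, np.full((n, 1), wall_height)])  # vertices at z = height
    vertices = np.vstack([floor, ceiling])
    # One rectangular wall per footprint edge, expressed as two indexed triangles.
    faces = []
    for i in range(n):
        j = (i + 1) % n
        faces.append([i, j, n + j])
        faces.append([i, n + j, n + i])
    return vertices, np.array(faces)

verts, faces = extrude_floor_plan(np.array([[0, 0], [4, 0], [4, 3], [0, 3]]), 2.8)
print(verts.shape, faces.shape)   # (8, 3) vertices, (8, 3) triangular faces
```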
In summary, according to the house type graph generation method of the present application, the house type graph of a room can be obtained from a panoramic image of the room, so that automatic generation of the house type graph is realized and no professional tool is needed to capture the room image multiple times.
In an exemplary embodiment, the above steps may be performed using a panoramic image acquired by an electronic terminal, so that a user can obtain the house type graph in real time using the electronic terminal. In the present application, the user may be any of various house-related personnel, such as a decorator, a designer, a property broker, or a house owner.
Further, the rooms in the house type graph generation method described above are not limited to rooms in houses used for living, for example rooms in villas, apartments, and family dormitories; the method may be applied to any room that forms an independent space.
As an example, the house type graph generation method of the exemplary embodiment of the present application may be applied to the industrial field. Specifically, a single panoramic image can be shot for each workshop in a factory building, the house type graph of each workshop is generated using the method above, and finally the house type graphs of the workshops are stitched to obtain the house type graph of the whole factory building, so that the whole factory building can be understood more intuitively and each workshop can be utilized effectively.
As an example, the house type graph generation method of the exemplary embodiment of the present application may be applied to the medical field. Specifically, a single panoramic image can be shot for each department of a hospital, the house type graph of each department is generated using the method above, and finally the house type graphs of the departments are stitched to obtain the house type graph of the whole hospital, so that the departments of the hospital can be planned more intuitively.
That is, the house type graph generation method of the exemplary embodiment of the present application may also be applied to buildings with specific purposes, which, in addition to the factories and hospitals given above, may be schools, hotels, warehouses, and the like.
The operations performed on the electronic terminal will be described in detail below with reference to fig. 3A to 3F.
As shown in fig. 3A, icons for launching various applications may be displayed on the main interface of the electronic device; when the user wants to run an application, its user interface can be accessed by touching the corresponding icon on the display unit. In fig. 3A, the user starts a house-related application by touching the icon 302 on the main interface of the electronic terminal with a finger.
Subsequently, the house-related application is executed, and a user interface corresponding to the house may be displayed on the display unit of the electronic device. In addition, an existing application may add a house-related module by way of an update; for example, a house type graph generation module may be added to an existing house-purchasing application. In this case, the user triggers the icon corresponding to the module, so that the electronic device starts the module and displays the user interface corresponding to the module on the display unit.
The user may trigger the house type graph generation event in a preset manner: as shown in fig. 3B, the event may be triggered by touching the icon 311, or, as shown in fig. 3C, it may be triggered by voice input after touching the icon 312.
In the present application, a house type graph generation event and its corresponding trigger operation may be set in advance; when the user performs the trigger operation, the house type graph generation event is triggered, and the user input for generating the house type graph can then be acquired. It should be noted that the trigger operations given above are merely exemplary.
Subsequently, information related to the house, for example the house location and the measurement data of each room, may be input as shown in fig. 3D. It should be noted that the order of the steps is not limited to this; for example, the user may be prompted to enter the house-related information after the house type graph has been generated.
The user may then provide the panoramic image through one or more controls displayed on the user interface, as shown in fig. 3E. When uploading a panorama, the user may select one already stored in the gallery, which may be a panoramic image taken by a panoramic camera and transmitted to the electronic device, or a panoramic image generated and stored by an application capable of producing panoramas. Alternatively, a direct shooting mode can be selected, in which the main camera and the secondary camera of the electronic terminal photograph the room simultaneously and the captured images are stitched into a panoramic image. The manner in which the panoramic image is acquired is exemplary only and not limiting.
Subsequently, the electronic terminal may send a house type graph generation request to the server based on the acquired panoramic image. Upon receiving the request, the server may generate the house type graph and/or the three-dimensional model of the room or house according to the house type graph generation method described above.
In addition, as shown in fig. 3F, the user may adjust the generated three-dimensional space model by touch. That is, the electronic terminal may detect a user touch on the house type graph, correct the house type graph according to the movement of the touch while the touch is maintained, and generate a corrected house type graph.
In addition, the electronic terminal can also receive user input for the house type graph, where the user input includes room parameters for correcting the house type graph, and correct the house type graph using the room parameters to generate a corrected house type graph.
In summary, the house type graph generation method according to the exemplary embodiment of the present application can generate a house type graph in real time using only an electronic terminal with a panoramic shooting function. On this basis, the user can also correct the house type graph directly. In addition, house type graphs of different precision can be generated for different requirements: in the field of house sales, for example, the user is not required to perform the re-measurement operation, whereas in the decoration industry the user is prompted to perform it.
Fig. 4 shows a block diagram of a house pattern generation apparatus of an exemplary embodiment of the present application. Referring to fig. 4, the apparatus includes, at a hardware level, a processor, an internal bus, and a computer-readable storage medium, wherein the computer-readable storage medium includes volatile memory and non-volatile memory. The processor reads the corresponding computer program from the non-volatile memory and then runs it. Of course, besides the software implementation, the present application does not exclude other implementations, such as logic devices or a combination of software and hardware, and the like, that is, the execution subject of the following processing flow is not limited to each logic unit, and may also be hardware or logic devices.
Specifically, the processor performs the following operations: acquiring a panoramic image corresponding to a room; acquiring structural information of the room using the panoramic image, wherein the structural information is used for structurally describing the room; and generating a house type graph of the room using the structural information of the room.
Optionally, when implementing the step of acquiring the structural information of the room using the panoramic image, the processor inputs the panoramic image into a trained machine learning layout model to acquire the structural information of the room, wherein the machine learning layout model is obtained by training with a pre-acquired training panoramic image set and the labeling information of the structural information for each training panoramic image.
Optionally, the processor may implement the step of obtaining a panoramic image corresponding to the room, including: acquiring an initial panoramic image corresponding to the room by using a panoramic camera; performing orientation correction processing on the initial panoramic image, and inputting the initial panoramic image after the orientation correction processing into the machine learning layout model as a panoramic image.
Optionally, the machine-learned layout model includes a horizontal network model coupled with a long-short term memory network using a neural network model.
Optionally, training the machine learning layout model with the pre-acquired training panoramic image set and the labeling information of the structural information for each training panoramic image includes: acquiring the training panoramic images forming the training panoramic image set and their labeling information for the structural information, wherein the structural information includes roof edge information, ground edge information, and intersection line information between intersecting walls of the training room corresponding to each training panoramic image; constructing the machine learning layout model and setting the full network parameters; and training the machine learning layout model using the correspondence between the training panoramic images and the labeling information, and adjusting the full network parameters until the machine learning layout model meets a preset requirement.
Optionally, the processor may further implement the steps of: generating a ground projection image of the room using a ground projection method when it is determined that the house type graph does not match the room; and generating the house type graph of the room using the ground projection image.
Optionally, when implementing the step of generating the house type graph of the room, the processor corrects the generated house type graph using the wall-surface information obtained by engineering measurement, so as to complete the re-measurement operation.
Optionally, the processor may further implement the steps of: acquiring a plurality of panoramic images corresponding to a plurality of rooms belonging to the same house; generating a plurality of house type graphs respectively corresponding to the plurality of panoramic images; and determining the connected region information among the plurality of house type graphs using connected region prediction processing, and splicing the plurality of house type graphs to generate an overall house type graph corresponding to the plurality of rooms.
Optionally, after the step of generating the overall house type graph corresponding to the plurality of rooms, the processor may further implement: converting the overall house type graph into a three-dimensional space model using the acquired height information.
The processor may also implement the steps of: in response to a house type graph generation request being triggered, acquiring a panoramic image corresponding to a room; and displaying a house type graph of the room on a display interface, wherein the house type graph is an image generated by inputting the panoramic image into a trained machine learning layout model.
The processor may also implement the steps of: and modifying the house type graph by using the room parameters extracted from the user input to generate a modified house type graph.
Furthermore, when the house type graph generation apparatus is an electronic device used on the user side, the processor may perform the steps of: acquiring a panoramic image corresponding to a room in response to a house type graph generation event being triggered; and displaying a house type graph of the room on a display interface, wherein the house type graph is an image generated using structural information acquired from the panoramic image.
Optionally, the processor may further perform the steps of: receiving user input for the house type graph, wherein the user input includes room parameters for correcting the house type graph; and correcting the house type graph using the room parameters to generate a corrected house type graph.
Optionally, the processor may further perform the steps of: detecting a user touch on the house type graph; and correcting the house type graph according to the movement of the touch while the touch is maintained, thereby generating a corrected house type graph.
Optionally, the processor, in the step of implementing, displaying the house type diagram of the room on the display interface includes: and displaying the corrected house type graph on a display interface.
Optionally, in the implementing step, acquiring the panoramic image corresponding to the room in response to the house type graph generation event being triggered includes: sensing that the house type graph generation event is triggered while a user interface is displayed; in response to the event being triggered, displaying one or more controls on the display interface; and starting an image acquisition unit to photograph the room by operating the one or more controls, thereby acquiring the panoramic image corresponding to the room.
In summary, the house type graph generating apparatus according to the exemplary embodiment of the present application may extract the structural information in the panorama corresponding to the room by using the machine learning layout model, and generate the house type graph of the room by using the structural information, thereby implementing the automated generation of the house type graph. Still further, pre-processing may be performed on the panoramic image, thereby improving the accuracy of the machine-learned layout model. Furthermore, for a special house type, a floor projection mode can be utilized to determine a house type graph of the room. Furthermore, the house type graph can be corrected by using the wall surface information measured by the laser ruler, so that the accuracy is improved. Furthermore, the connected region information can be determined, all the house type graphs can be spliced, automatic splicing is achieved, and therefore the whole house type graph of the whole house is generated. Further, a three-dimensional space model of the room can be generated, thereby enabling the user to more intuitively understand the overall situation of the house.
It should be noted that the execution subjects of the steps of the method provided in embodiment 1 may be the same device, or different devices may be used as the execution subjects of the method. For example, the execution subject of steps 21 and 22 may be device 1, and the execution subject of step 23 may be device 2; for another example, the execution subject of step 21 may be device 1, and the execution subjects of steps 22 and 23 may be device 2; and so on.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (17)

1. A house type graph generating method is characterized by comprising the following steps:
acquiring a panoramic image corresponding to a room;
acquiring structural information of the room by using the panoramic image, wherein the structural information is used for describing the room structurally;
and generating a house type graph of the room by using the structural information of the room.
2. The method of claim 1, wherein obtaining structural information of the room using the panoramic image comprises:
and inputting the panoramic image into a trained machine learning layout model to acquire the structural information of the room, wherein the machine learning layout model is a house type graph obtained by training the labeling information of the structural information by using a pre-acquired training panoramic image set and each training panoramic image.
3. The method of claim 2, wherein obtaining a panoramic image corresponding to a room comprises:
acquiring an initial panoramic image corresponding to the room by using a panoramic camera;
performing orientation correction processing on the initial panoramic image, the orientation-corrected initial panoramic image being input into the machine learning layout model as a panoramic image.
4. The method of claim 2, wherein the machine-learned layout model comprises a horizontal network model coupled with a long-short term memory network using a neural network model.
5. The method of claim 4, wherein training the machine learning layout model with the pre-acquired training panoramic image set and the labeling information of the structural information for each training panoramic image comprises:
acquiring training panoramic images forming the training panoramic image set and marking information of the training panoramic images on the structural information, wherein the structural information comprises roof edge information, ground edge information and intersecting line information between intersecting walls of a training room corresponding to the panoramic training images;
constructing the machine learning layout model, and setting full network parameters;
and training the machine learning layout model by utilizing the corresponding relation between the training panoramic image and the labeling information, and adjusting the parameters of the whole network until the machine learning layout model meets the preset requirements.
6. The method of claim 1, further comprising:
generating a ground projection image of the room by using a ground projection method when it is determined that the house type graph does not match the room;
generating the house type graph of the room using the ground projection image.
7. The method of claim 6, wherein generating the house type graph of the room comprises:
and correcting the generated house type graph by using the wall surface information of the engineering measurement so as to finish the operation of repeated ruler.
8. The method of claim 7, further comprising:
acquiring a plurality of panoramic images corresponding to a plurality of rooms belonging to the same house;
generating a plurality of house type graphs corresponding to the plurality of panoramic images respectively;
and determining the information of the connected regions among the plurality of house type graphs by utilizing the connected region prediction processing, and splicing the plurality of house type graphs to generate the integral house type graphs corresponding to the plurality of rooms.
9. The method of claim 8, further comprising, after generating the overall floor plan for the plurality of rooms:
and converting the integral house-type diagram into a three-dimensional space model by using the acquired height information.
10. The method of claim 1, wherein generating the house type graph of the room using the structure information of the room further comprises:
and modifying the house type graph by using the room parameters extracted from the user input to generate a modified house type graph.
11. A house type graph generating method is characterized by comprising the following steps:
acquiring a panoramic image corresponding to a room in response to a house type graph generation event being triggered;
displaying a house type graph of the room on a display interface, wherein the house type graph is an image generated by using structural information acquired from the panoramic image.
12. The method of claim 11, further comprising:
receiving user input for the house type graph, wherein the user input comprises room parameters for correcting the house type graph;
and correcting the house type graph by using the room parameters to generate a corrected house type graph.
13. The method of claim 11, further comprising:
detecting a user touch on the house type graph;
and correcting the house type graph according to the movement of the touch while the touch is maintained, and generating a corrected house type graph.
14. The method of claim 12 or 13, wherein displaying the house type graph of the room on a display interface comprises:
and displaying the corrected house type graph on a display interface.
15. The method of claim 11, wherein acquiring a panoramic image corresponding to a room triggered in response to a house map generation event comprises:
sensing that the house pattern generation event is triggered while displaying a user interface;
in response to the house type graph generation event being triggered, displaying one or more controls on the display interface;
and starting an image acquisition unit to shoot the room by operating the one or more controls, and acquiring a panoramic image corresponding to the room.
16. A house type graph generating apparatus, comprising:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the method of any of claims 1 to 15.
17. A computer readable storage medium having computer instructions stored thereon that, when executed, implement the method of any of claims 1 to 15.
CN202010209236.1A 2020-03-23 2020-03-23 House type graph generation method and device Pending CN113436311A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010209236.1A CN113436311A (en) 2020-03-23 2020-03-23 House type graph generation method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010209236.1A CN113436311A (en) 2020-03-23 2020-03-23 House type graph generation method and device

Publications (1)

Publication Number Publication Date
CN113436311A 2021-09-24

Family

ID=77753273

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010209236.1A Pending CN113436311A (en) 2020-03-23 2020-03-23 House type graph generation method and device

Country Status (1)

Country Link
CN (1) CN113436311A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114494486A (en) * 2021-12-30 2022-05-13 北京城市网邻信息技术有限公司 Home map generation method, device and storage medium
CN114511622A (en) * 2021-12-30 2022-05-17 北京城市网邻信息技术有限公司 Panoramic image acquisition method and device, electronic terminal and medium
CN114529621A (en) * 2021-12-30 2022-05-24 北京城市网邻信息技术有限公司 Household type graph generation method and device, electronic equipment and medium
CN114494486B (en) * 2021-12-30 2022-09-06 北京城市网邻信息技术有限公司 Method, device and storage medium for generating user type graph
CN114529621B (en) * 2021-12-30 2022-11-22 北京城市网邻信息技术有限公司 Household type graph generation method and device, electronic equipment and medium
CN114945090A (en) * 2022-04-12 2022-08-26 阿里巴巴达摩院(杭州)科技有限公司 Video generation method and device, computer readable storage medium and computer equipment
CN115761045A (en) * 2022-11-21 2023-03-07 北京城市网邻信息技术有限公司 Household type graph generation method, device, equipment and storage medium
CN115761045B (en) * 2022-11-21 2023-08-18 北京城市网邻信息技术有限公司 House pattern generation method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
US11165959B2 (en) Connecting and using building data acquired from mobile devices
WO2021036353A1 (en) Photographing-based 3d modeling system and method, and automatic 3d modeling apparatus and method
US11632516B2 (en) Capture, analysis and use of building data from mobile devices
CN113436311A (en) House type graph generation method and device
CN111127655B (en) House layout drawing construction method and device, and storage medium
JP6951595B2 (en) Housing data collection and model generation methods
US11557083B2 (en) Photography-based 3D modeling system and method, and automatic 3D modeling apparatus and method
US9595294B2 (en) Methods, systems and apparatuses for multi-directional still pictures and/or multi-directional motion pictures
CN110874818B (en) Image processing and virtual space construction method, device, system and storage medium
US11657085B1 (en) Optical devices and apparatuses for capturing, structuring, and using interlinked multi-directional still pictures and/or multi-directional motion pictures
CN108846899B (en) Method and system for improving area perception of user for each function in house source
CN114494487A (en) House type graph generation method, device and storage medium based on panorama semantic stitching
US20220189127A1 (en) Information processing system, information processing terminal device, server device, information processing method and program thereof
CN114529621B (en) Household type graph generation method and device, electronic equipment and medium
CA3069813A1 (en) Capturing, connecting and using building interior data from mobile devices
CN110662015A (en) Method and apparatus for displaying image
JP2016194784A (en) Image management system, communication terminal, communication system, image management method, and program
CN114494486B (en) Method, device and storage medium for generating user type graph
EP3882846B1 (en) Method and device for collecting images of a scene for generating virtual reality data
JP2019105876A (en) Information processing apparatus, information processing method and information processing program
CN113256822B (en) Spatial relationship prediction, data processing method, device and storage medium
CA3102860C (en) Photography-based 3d modeling system and method, and automatic 3d modeling apparatus and method
CN110276837B (en) Information processing method and electronic equipment
CN115830162A (en) Home map display method and device, electronic equipment and storage medium
CN117011439A (en) Image reconstruction method, image reconstruction device, computer equipment, storage medium and product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination