CN110598333A - Method and device for determining light source position and electronic equipment

Info

Publication number
CN110598333A
Authority
CN
China
Prior art keywords
furniture
information
house
light source
target house
Prior art date
Legal status
Granted
Application number
CN201910875146.3A
Other languages
Chinese (zh)
Other versions
CN110598333B (en)
Inventor
陈成
刘松松
Current Assignee
Guangdong 3vjia Information Technology Co Ltd
Original Assignee
Guangdong 3vjia Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong 3vjia Information Technology Co Ltd
Priority to CN201910875146.3A
Publication of CN110598333A
Application granted
Publication of CN110598333B
Legal status: Active

Abstract

The invention provides a method and a device for determining a light source position, and an electronic device, relating to the technical field of decoration design. The method comprises: acquiring house information of a target house; determining a target floor plan of the target house based on the house information, the target floor plan including one or more of furniture position information, furniture height information, and furniture shape and size information; and determining the light source position of the target house based on a deep learning algorithm and the target floor plan. The invention improves the computational efficiency of determining the light source position.

Description

Method and device for determining light source position and electronic equipment
Technical Field
The invention relates to the technical field of decoration design, and in particular to a method and a device for determining a light source position, and to an electronic device.
Background
In modern home decoration, the lighting effect after finishing largely determines the quality of the overall design, and the visible light sources play a crucial role in the final decoration effect, so finding correct and appropriate light source positions is very important. In the prior art, however, light source positions in a floor plan are mostly defined manually: a worker determines them by hand according to factors such as the distance to the walls, the division of the space, the furniture layout, and the spacing between light sources, which makes the calculation inefficient.
Disclosure of Invention
Embodiments of the invention aim to provide a method and a device for determining a light source position, and an electronic device, which improve the computational efficiency of determining the light source position.
In a first aspect, an embodiment of the present invention provides a method for determining a position of a light source, including: acquiring house information of a target house; determining a target floor plan of the target house based on the house information, the target floor plan including one or more of furniture position information, furniture height information, and furniture shape and size information; and determining the light source position of the target house based on a deep learning algorithm and the target floor plan.
In an optional embodiment, the house information includes a house outline and indoor layout information, and the step of determining the target floor plan of the target house based on the house information includes: determining furniture arrangement information of the target house based on the house outline and the indoor layout information, and determining furniture shape and size information and furniture position information of the target house according to the furniture arrangement information; determining a furniture graphic according to the furniture shape and size of the target house, and determining the color depth of the furniture graphic according to the furniture height in the target house; and determining the target floor plan of the target house based on the furniture position information, the furniture graphic, and the color depth of the furniture graphic.
In an optional embodiment, the step of determining the light source position in the target floor plan based on the deep learning algorithm and the target floor plan includes: inputting the target floor plan into a pre-trained DCGAN network model, and determining the light source position of the target house based on the DCGAN network model, the DCGAN network model being trained on known floor plans and the light source positions of the known floor plans.
In an optional embodiment, the method further comprises: determining the light source coordinates of the target floor plan according to the center point of the light source position of the target house.
In a second aspect, an embodiment of the present invention provides an apparatus for determining a position of a light source, including: an information acquisition module for acquiring house information of a target house; a floor plan determining module for determining a target floor plan of the target house based on the house information, the target floor plan including one or more of furniture position information, furniture height information, and furniture shape and size information; and a light source position determining module for determining the light source position of the target house based on a deep learning algorithm and the target floor plan.
In an optional embodiment, the house information includes a house outline and indoor layout information; the floor plan determining module is further configured to determine furniture arrangement information of the target house based on the house outline and the indoor layout information, determine furniture shape and size information and furniture position information of the target house according to the furniture arrangement information, determine a furniture graphic according to the furniture shape and size of the target house, determine the color depth of the furniture graphic according to the furniture height in the target house, and determine the target floor plan of the target house based on the furniture position information, the furniture graphic, and the color depth of the furniture graphic.
In an optional embodiment, the light source position determining module is further configured to input the target floor plan into a pre-trained DCGAN network model and determine the light source position of the target house based on the DCGAN network model, the DCGAN network model being trained on known floor plans and the light source positions of the known floor plans.
In an optional embodiment, the apparatus further comprises a light source coordinate determining module for determining the light source coordinates of the target floor plan according to the center point of the light source position of the target house.
In a third aspect, an embodiment of the present invention provides an electronic device, including a memory and a processor, where the memory stores a computer program operable on the processor, and the processor executes the computer program to implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present invention provide a computer-readable medium, wherein the computer-readable medium stores computer-executable instructions that, when invoked and executed by a processor, cause the processor to implement the method of the first aspect.
Embodiments of the invention provide a method and a device for determining the position of a light source, and an electronic device. The method first acquires the house information of a target house; then determines a target floor plan of the target house (including one or more of furniture position information, furniture height information, and furniture shape and size information) based on the house information; and finally determines the light source position of the target house based on a deep learning algorithm and the target floor plan. Because the factors that influence the light source position (furniture position, furniture height, and furniture shape and size) are fully taken into account, the light source position in the target house can be determined quickly from the deep learning algorithm and the target floor plan, which improves the computational efficiency of determining the light source position.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. The following drawings illustrate only some embodiments of the present invention, and other drawings can be derived from them by those skilled in the art without creative effort.
Fig. 1 is a flowchart of a method for determining a light source position according to an embodiment of the present invention;
Fig. 2 is a flowchart of another method for determining a light source position according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a floor plan according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of a two-dimensional target floor plan according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of a floor plan with light source positions according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of a floor plan with light source coordinates according to an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of an apparatus for determining a light source position according to an embodiment of the present invention;
Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the following embodiments. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
In view of the problem that existing light source position determination relies on manual calculation over many influencing factors and is therefore computationally inefficient, embodiments of the invention provide a method and an apparatus for determining a light source position, and an electronic device, which can be applied to determining light source positions during house decoration design.
To facilitate understanding of the present embodiment, the method for determining a light source position disclosed in the embodiment of the invention is described in detail first.
An embodiment of the present invention provides a method for determining a light source position. Referring to the flowchart shown in Fig. 1, the method is executed by a controller of an electronic device and includes the following steps S102 to S106:
step S102: and acquiring the house information of the target house.
The target house is a house for which the positions of interior light sources need to be determined during decoration design. The house information is the information extracted from the target house for determining the light source positions. In one embodiment, the house information may be obtained by recognizing a floor plan of the target house to obtain a complete floor plan scheme; the house information obtained from the floor plan scheme data includes wall information, door and window information, outline information of the floor plan, and the like. Indoor layout information of the house is then obtained from the house information of the target house, the indoor layout information including furniture position information, furniture height information, furniture shape and size information, and the like.
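By way of illustration only, the following Python sketch shows one way the extracted house information could be organized in memory; the field names and units are assumptions of this sketch and are not prescribed by the embodiment.

from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical containers for the house information described above.
# Field names (outline, walls, doors_windows, furniture, ...) are illustrative assumptions.

@dataclass
class Furniture:
    name: str                      # e.g. "bed" or "wardrobe"
    position: Tuple[float, float]  # placement relative to the wall outline, in metres
    size: Tuple[float, float]      # footprint width x depth, in metres
    height: float                  # furniture height, later mapped to a grey level

@dataclass
class HouseInfo:
    outline: List[Tuple[float, float]]                            # polygon of the house outline
    walls: List[Tuple[Tuple[float, float], Tuple[float, float]]]  # wall segments as point pairs
    doors_windows: List[Tuple[float, float]]                      # door and window anchor points
    furniture: List[Furniture] = field(default_factory=list)      # indoor layout information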
Step S104: determining a target floor plan of the target house based on the house information, where the target floor plan includes one or more of furniture position information, furniture height information, and furniture shape and size information.
The target floor plan may be a two-dimensional image reflecting the wall information, door and window information, outline information, furniture position information, furniture height information, and furniture shape and size information of the target house. The target floor plan of the target house is drawn based on the wall information, the door and window information, the outline information, and the indoor layout information contained in the house information.
Step S106: determining the light source position of the target house based on the deep learning algorithm and the target floor plan.
The target floor plan contains the wall information, door and window information, outline information, furniture position information, furniture height information, and furniture shape and size information of the target house. Determining the light source position based on the deep learning algorithm and this floor plan therefore fully considers, for the actual scene of the target house, the influence of the indoor furniture size, the furniture placement, and the furniture height on the light source position, so that the light source position obtained by the deep learning algorithm better fits the actual situation and the rationality of the determination is improved.
In the method for determining the light source position provided by the embodiment of the invention, on the basis of fully considering the factors that influence the light source position (furniture position information, furniture height information, and furniture shape and size information), the light source position in the target house can be determined quickly from the deep learning algorithm and the target floor plan of the target house, which improves the computational efficiency of determining the light source position.
Existing light source position determination techniques do not consider the furniture height information in the target house, so the lighting effect at the determined light source positions is poor.
To improve the rationality of the light source position determination, this embodiment provides a specific implementation of determining the floor plan of the target house based on the house information, where the house information includes the house outline and the indoor layout information; see steps (1) to (3) below:
step (1): and determining furniture arrangement information of the target house based on the house outline and the indoor layout information, and determining the furniture shape and size and the furniture position information of the target house according to the furniture arrangement information of the target house.
The furniture arrangement information can also be obtained from a floor plan of the target house and includes the placement position of each piece of indoor furniture, a furniture top view, and furniture size information. The placement position may be the position of the furniture relative to the wall outline, and the furniture shape and size and the furniture position information of the target house can be determined based on the house outline and the indoor layout information in the house information.
Step (2): determining a furniture graphic according to the furniture shape and size of the target house, and determining the color depth of the furniture graphic according to the furniture height in the target house.
In one embodiment, the furniture graphic may also be determined from the furniture top view and the furniture size information. The shape of the furniture graphic follows the shape of the top view: for example, the top views of a wardrobe and a bed are rectangular, so they are represented by rectangles of different sizes, and the actual furniture size can be scaled down by a preset ratio to obtain the size of the furniture graphic. To convert three-dimensional information in the target house (such as furniture height) into two-dimensional information, the color depth of the furniture graphic is determined according to the furniture height, so that furniture height information is represented by furniture graphics of different shades.
In another embodiment, according to the top views and position information of all pieces of furniture, a maximal rectangle that just encloses every part of a piece of furniture is drawn, so each piece of furniture is represented by a rectangular block. Since the precise furniture outline and texture have little influence on the calculation of the light source position, the furniture in the target house can be represented as rectangular blocks of different sizes; the aspect ratio of each rectangular block is determined by the aspect ratio of the actual furniture size, so the aspect ratios of the rectangular blocks of different pieces of furniture also differ.
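A minimal Python sketch of the two embodiments above is given below: it computes the maximal enclosing rectangle of a furniture top view, maps furniture height to a grey level, and rasterises the furniture onto the floor-plan image. The scale factor, the grey-level range, and the use of OpenCV for drawing are assumptions of this sketch, not details fixed by the embodiment.

import numpy as np
import cv2  # drawing backend is an assumption; any raster library would do

PIXELS_PER_METRE = 40          # assumed scale of the 2D floor plan
MAX_FURNITURE_HEIGHT = 2.5     # assumed upper bound on furniture height, in metres

def footprint_rectangle(top_view_points):
    """Maximal axis-aligned rectangle that just encloses all parts of the furniture."""
    pts = np.asarray(top_view_points, dtype=float)
    x_min, y_min = pts.min(axis=0)
    x_max, y_max = pts.max(axis=0)
    return x_min, y_min, x_max, y_max

def height_to_grey(height):
    """Darker grey means taller furniture (0 = black, 255 = white)."""
    ratio = min(height / MAX_FURNITURE_HEIGHT, 1.0)
    return int(round(255 * (1.0 - ratio)))

def draw_furniture(plan, top_view_points, height):
    """Rasterise one piece of furniture onto plan, an HxW uint8 image initialised to white."""
    x0, y0, x1, y1 = footprint_rectangle(top_view_points)
    p0 = (int(x0 * PIXELS_PER_METRE), int(y0 * PIXELS_PER_METRE))
    p1 = (int(x1 * PIXELS_PER_METRE), int(y1 * PIXELS_PER_METRE))
    cv2.rectangle(plan, p0, p1, color=height_to_grey(height), thickness=-1)  # filled grey block
    return plan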
Step (3): determining the target floor plan of the target house based on the furniture position information, the furniture graphics, and the color depth of the furniture graphics.
To make the calculated light source position more accurate, the target floor plan may include more factors that influence the light source position. For example, the furniture in the target house has a large influence on the light, including the furniture size, placement, and height; the target floor plan is therefore determined from the furniture position information, the furniture graphics, and the color depth of the furniture graphics, so that the floor plan contains the furniture position information, the furniture height information, and the furniture shape and size information.
To improve the accuracy of determining the light source position, this embodiment provides a specific implementation of determining the light source position in the target floor plan based on the deep learning algorithm and the target floor plan: the target floor plan is input into a pre-trained DCGAN (Deep Convolutional Generative Adversarial Network) model, and the light source position of the target house is determined by the DCGAN model; the DCGAN model is trained on known floor plans and the light source positions of those floor plans. First, the DCGAN model is trained with the known floor plans and their light source positions, where a known floor plan contains the wall information, door and window information, outline information, furniture position information, furniture height information, and furniture shape and size information of the known house. The known floor plans and their light source positions are input into the DCGAN model, which is trained to learn the mapping between the information in a known floor plan (wall, door and window, outline, furniture position, furniture height, and furniture shape and size information) and its light source positions. The target floor plan is then input into the trained DCGAN model, which determines the optimal light source positions in the target house according to the wall information, door and window information, outline information, furniture position information, furniture height information, and furniture shape and size information in the target floor plan. The DCGAN model obtains the light source positions of the target floor plan by tracing back from existing high-quality rendered design schemes. The output may be the vertex coordinates of the light source pixel regions, or the pixel regions of the light source positions drawn onto the target floor plan.
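As an illustration of the inference step only, the sketch below feeds a 640 x 640 single-channel floor-plan image to a trained generator and recovers an image in which the light-source regions are marked. The checkpoint name, tensor layout, and normalisation are assumptions of this sketch.

import numpy as np
import torch

def predict_light_source_map(generator, plan_image):
    """plan_image: HxW uint8 grayscale floor plan; returns an HxW uint8 light-source map."""
    generator.eval()
    x = torch.from_numpy(plan_image).float().div(127.5).sub(1.0)  # scale pixels to [-1, 1]
    x = x.unsqueeze(0).unsqueeze(0)                               # -> shape 1 x 1 x H x W
    with torch.no_grad():
        y = generator(x)                                          # generator output in [-1, 1]
    return ((y.squeeze().numpy() + 1.0) * 127.5).clip(0, 255).astype(np.uint8)

# Usage (file name and loading convention are assumptions):
# generator = torch.load("dcgan_generator.pt", map_location="cpu")
# light_map = predict_light_source_map(generator, plan_image)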
The DCGAN model combines a CNN (Convolutional Neural Network) with a GAN (Generative Adversarial Network): a convolutional network is introduced into the generative model for unsupervised training, and the strong feature-extraction capability of the convolutional network improves the learning effect of the generative network. In one embodiment, the DCGAN model in this application adopts the basic form of a GAN, which contains two opposing networks G and D. G is the generator, whose goal is to generate an image from the network parameters; D is the discriminator, whose goal is to judge whether an image is real or generated. The two networks compete: the more realistic the images generated by G, the higher the probability that D judges incorrectly. Because training ground truth is provided, the capability of D can be continuously improved, which in turn pushes G to generate higher-quality images, so that through this mutual competition G eventually generates data that is almost indistinguishable from the required real images. The convolutional neural network CNN is introduced mainly to better extract features of the target floor plan. The generator is therefore continuously optimized by combining the basic GAN with the CNN, so that the DCGAN model generates a floor plan with suitable light source positions.
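The following compact PyTorch sketch illustrates such an adversarial training loop, treating the task as image-to-image generation from a floor plan to a light-source map. The layer sizes, the conditional formulation, and the loss choice are assumptions made for illustration and are not specified by the embodiment.

import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a 1-channel 640x640 floor plan to a 1-channel light-source map."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.ReLU(True),            # 640 -> 320
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(True),           # 320 -> 160
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(True),  # 160 -> 320
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1), nn.Tanh(),       # 320 -> 640
        )

    def forward(self, plan):
        return self.net(plan)

class Discriminator(nn.Module):
    """Judges whether a (floor plan, light-source map) pair is real or generated."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2, True),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, True),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 1),
        )

    def forward(self, plan, light_map):
        return self.net(torch.cat([plan, light_map], dim=1))

def train_step(G, D, opt_G, opt_D, plan, real_light_map, bce=nn.BCEWithLogitsLoss()):
    """One adversarial step: D learns to separate real from fake, G learns to fool D."""
    fake = G(plan)
    ones = torch.ones(plan.size(0), 1)
    zeros = torch.zeros(plan.size(0), 1)

    opt_D.zero_grad()                                    # update discriminator
    loss_D = bce(D(plan, real_light_map), ones) + bce(D(plan, fake.detach()), zeros)
    loss_D.backward()
    opt_D.step()

    opt_G.zero_grad()                                    # update generator
    loss_G = bce(D(plan, fake), ones)
    loss_G.backward()
    opt_G.step()
    return loss_D.item(), loss_G.item()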
To improve the user experience and make the light source positions more intuitive, the method for determining the light source position provided by this embodiment further includes: determining the light source coordinates of the target floor plan according to the center point of each light source position of the target house. The calculated light source positions are several pixel matrices at specific locations in the target floor-plan image, with the surrounding pixel values being 0. To improve the accuracy of the calculated light source positions and to determine the specific lamp installation positions later, the center point of each light source pixel matrix is calculated and taken as a light source coordinate of the target floor plan. According to the light source coordinates in the floor plan, the coordinates can be applied to the decoration design: light sources are added at the corresponding positions in the 3D model of the house, and a complete house decoration design effect diagram is presented by rendering with these light sources.
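As an illustration of the center-point step, the sketch below labels the connected light-source pixel regions and takes each region's centroid as a light source coordinate; the use of scikit-image and the binarisation threshold are assumptions of this sketch.

from skimage.measure import label, regionprops  # library choice is an assumption

def light_source_coordinates(light_map, threshold=128):
    """light_map: HxW uint8 image in which light-source regions are bright and the rest is near 0."""
    mask = light_map >= threshold            # binarise the generated light-source map
    labeled = label(mask)                    # one label per connected pixel matrix
    coords = []
    for region in regionprops(labeled):
        cy, cx = region.centroid             # center point of the pixel region (row, column)
        coords.append((float(cx), float(cy)))
    return coords                            # light source coordinates in image pixels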
In practical applications, referring to the flowchart of the method for determining the light source position shown in Fig. 2, the light source positions of a floor plan can be generated with this method, specifically by executing the following steps S202 to S208:
step S202: and acquiring a two-dimensional target house type graph of the target house according to the house information of the target house.
Specifically, all wall and furniture information of the target house can be obtained by recognizing the floor plan of the target house, or the wall information of the target house can be described according to an existing parsed file of the house floor plan, so as to obtain a complete outline diagram with the structural layout of the house. The furniture of the target house is then drawn into the outline diagram: according to the top views and specific positions of all pieces of furniture, each piece of furniture is represented and replaced by a rectangular block that just encloses all its parts, so the furniture information is abstracted and the details are ignored, because the precise furniture outline and texture have little influence on the light source position. Although the detail features of the furniture can be abstracted away, the furniture height has a large influence on the light source position, so the heights of all pieces of furniture are converted into grey rectangular blocks of different shades; the darker the block, the taller the furniture. In this way the three-dimensional furniture information is converted into two-dimensional image information. Taking the floor plan of house A shown in Fig. 3 as an example, all wall and furniture information of the target house is obtained by recognizing the floor plan of house A, and the two-dimensional target floor plan of house A shown in Fig. 4 is then obtained from the furniture arrangement information and furniture size information of house A.
Step S204: training the DCGAN network model with training images.
A training image consists of a house layout picture and the known light source positions in that picture; the house layout picture contains the wall information, door and window information, outline information, furniture position information, furniture height information, and furniture shape and size information of a house. By unifying the specification of the training images, a layout picture without light sources (i.e., a two-dimensional target floor plan) and the corresponding layout picture with light sources can be merged into one training image (both pictures are obtained by data drawing, data supplementation, and data conversion); all training images can also be adjusted to the same size, for example a resolution of 640 x 640.
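A minimal preprocessing sketch along these lines, which resizes both images of a training pair to 640 x 640 and merges them side by side into one sample, is shown below; the side-by-side merging convention and the OpenCV calls are assumptions of this sketch.

import cv2
import numpy as np

TRAIN_SIZE = (640, 640)  # unified training resolution mentioned in this embodiment

def make_training_pair(plan_without_light, plan_with_light):
    """Resize a (floor plan, floor plan with light sources) pair and merge them into one image."""
    a = cv2.resize(plan_without_light, TRAIN_SIZE, interpolation=cv2.INTER_AREA)
    b = cv2.resize(plan_with_light, TRAIN_SIZE, interpolation=cv2.INTER_AREA)
    return np.concatenate([a, b], axis=1)    # one 640 x 1280 merged training sample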
Step S206: inputting the two-dimensional target floor plan into the pre-trained DCGAN network model and generating the floor plan with light source positions based on the DCGAN network model.
Taking the two-dimensional target floor plan of house A shown in Fig. 4 as an example, it is input into the trained DCGAN network model to generate the floor plan with light source positions of house A shown in Fig. 5; the grey dots in Fig. 5 (which may also be another colour, such as red) are the generated light source positions.
Step S208: determining the light source coordinates in the target floor plan based on the floor plan with light source positions.
The light source positions are extracted from the generated floor plan with light source positions. According to the colour channels, the image pixels of this floor plan are segmented by contrast with a threshold method, so that all light source positions can be obtained. For example, when the generated light sources are red dots, the furniture and house outlines are grey with different shades while the light sources are red, so threshold segmentation of the image pixels yields all light source positions. A light source position is in fact a pixel matrix at a specific location, and the light source coordinate, i.e., the specific coordinate of the lamp, is determined by calculating the center point of that pixel matrix. Taking the floor plan with light source positions of house A shown in Fig. 5 as an example, threshold segmentation of the image pixels yields all light source positions in Fig. 5, and the center points of the light source positions are then calculated to obtain the floor plan with light source coordinates of house A shown in Fig. 6.
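A minimal sketch of the colour-channel thresholding described above is given below, assuming red light-source dots drawn on a grey plan; the BGR channel order and the threshold values are assumptions of this sketch, and the resulting mask can be passed to the centroid step sketched earlier.

import numpy as np

def red_light_source_mask(plan_bgr, red_min=200, grey_max=120):
    """Segment red light-source dots from a BGR floor-plan image by channel contrast.
    Furniture and walls are grey (all channels similar); light sources are strongly red."""
    b = plan_bgr[:, :, 0].astype(int)
    g = plan_bgr[:, :, 1].astype(int)
    r = plan_bgr[:, :, 2].astype(int)
    mask = (r >= red_min) & (b <= grey_max) & (g <= grey_max)
    return mask.astype(np.uint8) * 255       # binary mask of light-source pixels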
In this method for determining the light source position, the three-dimensional information in the house information is converted into a two-dimensional target floor plan, and the influence of the size, height, and placement of the furniture in the target house on the light source position is fully considered, so the trained DCGAN network model and the light source position results obtained from it better fit the actual situation of the target house, improving the rationality and accuracy of the light source position determination.
Corresponding to the above method for determining the position of a light source, an embodiment of the present invention provides an apparatus for determining the position of a light source. Referring to the schematic structural diagram of the apparatus shown in Fig. 7, the apparatus includes:
and the information acquisition module 71 is configured to acquire the house information of the target house.
a floor plan determining module 72, configured to determine a target floor plan of the target house based on the house information, the target floor plan including one or more of furniture position information, furniture height information, and furniture shape and size information; and
a light source position determining module 73, configured to determine the light source position of the target house based on a deep learning algorithm and the target floor plan.
The apparatus provided by this embodiment can, on the basis of fully considering the factors that influence the lighting effect (furniture position information, furniture height information, and furniture shape and size information), quickly determine the light source position in the target house based on a deep learning algorithm and the target floor plan of the target house, thereby improving the computational efficiency of determining the light source position.
In one embodiment, the house information includes the house outline and indoor layout information; the floor plan determining module 72 is further configured to determine furniture arrangement information of the target house based on the house outline and the indoor layout information, determine the furniture shape and size information and furniture position information of the target house according to the furniture arrangement information, determine a furniture graphic according to the furniture shape and size of the target house, determine the color depth of the furniture graphic according to the furniture height in the target house, and determine the target floor plan of the target house based on the furniture position information, the furniture graphic, and the color depth of the furniture graphic.
In one embodiment, the light source position determining module 73 is further configured to input the target floor plan into a pre-trained DCGAN network model and determine the light source position of the target house based on the DCGAN network model; the DCGAN network model is trained on known floor plans and the light source positions of the known floor plans.
In one embodiment, the apparatus further comprises:
a light source coordinate determining module, configured to determine the light source coordinates of the target floor plan according to the center point of the light source position of the target house.
In this apparatus for determining the light source position, the three-dimensional information in the house information is converted into a two-dimensional target floor plan, and the influence of the size, height, and placement of the furniture in the target house on the light source position is fully considered, so the trained DCGAN network model and the light source position results obtained from it better fit the actual situation of the target house, improving the rationality and accuracy of the light source position determination.
The implementation principle and technical effects of the apparatus provided by this embodiment are the same as those of the foregoing method embodiment; for brevity, reference may be made to the corresponding contents of the method embodiment for anything not mentioned here.
An embodiment of the present invention provides an electronic device. As shown in the schematic structural diagram of Fig. 8, the electronic device includes a processor 81 and a memory 82; the memory stores a computer program operable on the processor, and the processor implements the steps of the method provided by the foregoing embodiments when executing the computer program.
Referring to Fig. 8, the electronic device further includes a bus 84 and a communication interface 83; the processor 81, the communication interface 83, and the memory 82 are connected by the bus 84. The processor 81 is configured to execute executable modules, such as computer programs, stored in the memory 82.
The memory 82 may include a high-speed Random Access Memory (RAM) and may also include a non-volatile memory, such as at least one disk memory. The communication connection between this network element of the system and at least one other network element is realized through at least one communication interface 83 (which may be wired or wireless), using the Internet, a wide area network, a local area network, a metropolitan area network, or the like.
The bus 84 may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one double-headed arrow is shown in Fig. 8, but this does not mean that there is only one bus or one type of bus.
The memory 82 is used for storing a program, and the processor 81 executes the program after receiving an execution instruction. The method executed by the apparatus defined by the flow disclosed in any of the foregoing embodiments of the present invention may be applied to the processor 81 or implemented by the processor 81.
The processor 81 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 81. The processor 81 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The methods, steps, and logic blocks disclosed in the embodiments of the present invention may be implemented or performed by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or a register. The storage medium is located in the memory 82, and the processor 81 reads the information in the memory 82 and performs the steps of the above method in combination with its hardware.
Embodiments of the present invention provide a computer-readable medium, wherein the computer-readable medium stores computer-executable instructions, which, when invoked and executed by a processor, cause the processor to implement the method of the above-mentioned embodiments.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention and not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced, and such modifications or substitutions do not cause the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A method for determining a position of a light source, comprising:
acquiring house information of a target house;
determining a target floor plan of the target house based on the house information, wherein the target floor plan comprises one or more of furniture position information, furniture height information, and furniture shape and size information; and
determining the light source position of the target house based on a deep learning algorithm and the target floor plan.
2. The method of claim 1, wherein the house information comprises a house outline and indoor layout information, and the step of determining the target floor plan of the target house based on the house information comprises:
determining furniture arrangement information of the target house based on the house outline and the indoor layout information, and determining furniture shape and size information and furniture position information of the target house according to the furniture arrangement information of the target house;
determining a furniture graphic according to the furniture shape and size of the target house, and determining a color depth of the furniture graphic according to the furniture height in the target house; and
determining the target floor plan of the target house based on the furniture position information, the furniture graphic, and the color depth of the furniture graphic.
3. The method of claim 1, wherein the step of determining the light source position based on the deep learning algorithm and the target floor plan comprises:
inputting the target floor plan into a pre-trained DCGAN network model, and determining the light source position of the target house based on the DCGAN network model, wherein the DCGAN network model is obtained by training based on known floor plans and the light source positions of the known floor plans.
4. The method of claim 1, further comprising:
determining light source coordinates of the target floor plan according to the center point of the light source position of the target house.
5. An apparatus for determining a position of a light source, comprising:
an information acquisition module, configured to acquire house information of a target house;
a floor plan determining module, configured to determine a target floor plan of the target house based on the house information, wherein the target floor plan comprises one or more of furniture position information, furniture height information, and furniture shape and size information; and
a light source position determining module, configured to determine the light source position of the target house based on a deep learning algorithm and the target floor plan.
6. The apparatus of claim 5, wherein the house information comprises a house outline and indoor layout information; and
the floor plan determining module is further configured to determine furniture arrangement information of the target house based on the house outline and the indoor layout information, determine furniture shape and size information and furniture position information of the target house according to the furniture arrangement information of the target house, determine a furniture graphic according to the furniture shape and size of the target house, determine a color depth of the furniture graphic according to the furniture height in the target house, and determine the target floor plan of the target house based on the furniture position information, the furniture graphic, and the color depth of the furniture graphic.
7. The apparatus of claim 5, wherein the light source position determining module is further configured to input the target floor plan into a pre-trained DCGAN network model and determine the light source position of the target house based on the DCGAN network model, wherein the DCGAN network model is obtained by training based on known floor plans and the light source positions of the known floor plans.
8. The apparatus of claim 5, further comprising:
a light source coordinate determining module, configured to determine light source coordinates of the target floor plan according to the center point of the light source position of the target house.
9. An electronic device comprising a memory and a processor, wherein the memory stores a computer program operable on the processor, and wherein the processor implements the method of any of claims 1-5 when executing the computer program.
10. A computer-readable medium having stored thereon computer-executable instructions that, when invoked and executed by a processor, cause the processor to implement the method of any of claims 1-5.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910875146.3A 2019-09-16 2019-09-16 Determination method and device for light source position and electronic equipment

Publications (2)

Publication Number Publication Date
CN110598333A 2019-12-20
CN110598333B 2023-05-16

Family ID: 68860331

Country Status (1)

Country Link
CN CN110598333B



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103778538A (en) * 2012-10-17 2014-05-07 李兴斌 Furniture simulation layout method and furniture simulation layout system
CN104778756A (en) * 2015-04-10 2015-07-15 北京明兰网络科技有限公司 Intelligent home decoration design system
WO2019169699A1 (en) * 2018-03-09 2019-09-12 平安科技(深圳)有限公司 House model rendering method and apparatus, terminal device, and medium
CN109815641A (en) * 2019-03-21 2019-05-28 河南工程学院 DESIGN OF INTERI OR LIGHT method
CN110059690A (en) * 2019-03-28 2019-07-26 广州智方信息科技有限公司 Floor plan semanteme automatic analysis method and system based on depth convolutional neural networks
CN110197225A (en) * 2019-05-28 2019-09-03 广东三维家信息科技有限公司 House type spatial match method and system based on deep learning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Giovanni Giacomo et al., "Sonar-to-Satellite Translation using Deep Learning", 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA) *
马天瑶, "Implementation of multi-label annotation of furniture in indoor images" (室内图像中家具多标签标注的实现), Computer Knowledge and Technology (电脑知识与技术) *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111402409A (en) * 2020-04-03 2020-07-10 湖北工业大学 Exhibition hall design illumination condition model system
CN112257131A (en) * 2020-09-18 2021-01-22 杭州群核信息技术有限公司 Method for realizing ceiling overlook effect diagram based on matrix transformation
CN112257131B (en) * 2020-09-18 2023-10-03 杭州群核信息技术有限公司 Method for realizing suspended ceiling overlooking effect graph based on matrix transformation
CN112257272A (en) * 2020-10-26 2021-01-22 南京国豪家装饰设计有限公司 Decoration design method for complete furniture
CN112257272B (en) * 2020-10-26 2024-01-23 南京国豪家装饰设计有限公司 Decoration design method of furniture set

Also Published As

Publication number Publication date
CN110598333B (en) 2023-05-16


Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant