WO2024011338A1 - Display method, display apparatus and electronic device - Google Patents

Display method, display apparatus and electronic device

Info

Publication number
WO2024011338A1
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional model
target object
target
information
building
Prior art date
Application number
PCT/CN2022/104821
Other languages
English (en)
Chinese (zh)
Inventor
马晨
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司 filed Critical 维沃移动通信有限公司
Priority to PCT/CN2022/104821 priority Critical patent/WO2024011338A1/fr
Publication of WO2024011338A1 publication Critical patent/WO2024011338A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management

Definitions

  • the present application relates to the field of computer technology, and specifically to a display method, a display device and an electronic device.
  • the architectural map display method in the related art may include a two-dimensional map and a three-dimensional map.
  • the two-dimensional map displays the building in layers, that is, a two-dimensional plan map is produced for each floor.
  • the two-dimensional map breaks up the integrity of cross-floor building structures, is inconvenient for displaying walls and wall attachments, and therefore has a poor display effect.
  • the three-dimensional maps in the related art usually only display the external surface shape of the building, and it is difficult to display occluded parts of the building. It can be seen that the display effect of the building map display methods in the related art is poor.
  • the purpose of the embodiments of the present application is to provide a display method, device and electronic equipment that can display the detailed structure inside the building and improve the display effect of the architectural map display method.
  • embodiments of the present application provide a display method, which method includes:
  • a second three-dimensional model of the target object and first prompt information are displayed, where the first prompt information includes architectural feature information of the target object.
  • a display device which includes:
  • the first acquisition module is used to acquire the location information of the electronic device
  • a first determination module configured to determine the target object in the first three-dimensional model according to the first three-dimensional model of the target building
  • the first display module is configured to display the second three-dimensional model of the target object and first prompt information according to the position information, where the first prompt information includes architectural feature information of the target object.
  • embodiments of the present application provide an electronic device.
  • the electronic device includes a processor and a memory.
  • the memory stores programs or instructions that can be run on the processor.
  • when the programs or instructions are executed by the processor, the steps of the method described in the first aspect are implemented.
  • embodiments of the present application provide a readable storage medium.
  • Programs or instructions are stored on the readable storage medium.
  • when the programs or instructions are executed by a processor, the steps of the method described in the first aspect are implemented.
  • embodiments of the present application provide a chip.
  • the chip includes a processor and a communication interface.
  • the communication interface is coupled to the processor.
  • the processor is used to run programs or instructions to implement the method described in the first aspect.
  • embodiments of the present application provide a computer program product, the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the method as described in the first aspect.
  • a seventh aspect provides a communication device configured to perform the steps of the method described in the first aspect.
  • In the embodiments of the present application, the position information of the electronic device is obtained; the target object in the first three-dimensional model is determined according to the first three-dimensional model of the target building; and the second three-dimensional model of the target object and first prompt information are displayed according to the position information, where the first prompt information includes architectural feature information of the target object.
  • In this way, based on a first three-dimensional model that can reflect the internal details of the target building, the three-dimensional model of target objects in the target building (such as rooms, elevators, fire escapes, and even structures hidden in walls and under the ground) and corresponding prompt information can be displayed on the electronic device, which improves the display effect of the detailed structures in the target building.
  • Figure 1 is a flow chart of a display method provided in an embodiment of the present application.
  • Figure 2a is one of the schematic diagrams of application scenario one of the embodiment of the present application.
  • Figure 2b is the second schematic diagram of application scenario one of the embodiment of the present application.
  • Figure 3a is one of the schematic diagrams of application scenario two of the embodiment of the present application.
  • Figure 3b is the second schematic diagram of application scenario two of the embodiment of the present application.
  • Figure 3c is the third schematic diagram of application scenario two of the embodiment of the present application.
  • Figure 4 is a schematic structural diagram of a display device provided in an embodiment of the present application.
  • Figure 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
  • FIG. 6 is a schematic diagram of the hardware structure of an electronic device provided in an embodiment of the present application.
  • The terms "first", "second", etc. in the description and claims of this application are used to distinguish similar objects and are not used to describe a specific order or sequence. It should be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the present application can be practiced in orders other than those illustrated or described herein. In addition, the objects distinguished by "first", "second", etc. are usually of one type, and the number of objects is not limited; for example, the first object can be one or multiple.
  • "and/or" in the description and claims indicates at least one of the connected objects, and the character "/" generally indicates that the related objects are in an "or" relationship.
  • a display method provided by an embodiment of the present application may include the following steps:
  • Step 101 Obtain the location information of the electronic device.
  • the location information of the electronic device may include at least one of the three-dimensional coordinate position of the electronic device, the floor where it is located, and the viewing direction.
  • the three-dimensional coordinate position of the electronic device can be determined by at least one of WiFi positioning, Global Positioning System (GPS) positioning, Beidou satellite positioning, indoor positioning, etc., or obtained according to user input on the electronic device;
  • the floor where the electronic device is located can be input by the user or detected through the positioning function;
  • the viewing direction of the electronic device can be the direction of the camera or the orientation detected by a compass function.
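  • As a minimal, non-authoritative sketch of Step 101 (not part of the publication; the field names and positioning sources are assumptions), the location information described above could be represented roughly as follows:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class LocationInfo:
    """Location information of the electronic device (Step 101)."""
    x: float               # plane coordinate, e.g. from WiFi / GPS / Beidou / indoor positioning
    y: float
    floor: Optional[int]   # entered by the user or detected via positioning
    heading_deg: float     # viewing direction: camera direction or compass bearing

def acquire_location(wifi_fix: Tuple[float, float],
                     user_floor: Optional[int],
                     compass_heading: float) -> LocationInfo:
    # Combine whichever sources are available: here WiFi supplies the plane
    # coordinates, the user supplies the floor, and the compass supplies the heading.
    return LocationInfo(x=wifi_fix[0], y=wifi_fix[1],
                        floor=user_floor, heading_deg=compass_heading)
```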
  • Step 102 Determine the target object in the first three-dimensional model according to the first three-dimensional model of the target building.
  • the target building can be a large building, such as a shopping mall or an office building, or the above-mentioned target building can also be any building under construction or to be constructed, which can be an above-ground building, an underground building, or even a concealed project; this is not specifically limited here.
  • the above-mentioned first three-dimensional model may be a three-dimensional model that can reflect the real structure and internal details of the target building.
  • the above-mentioned first three-dimensional model may be constructed based on the structural data of the target building, where the structural data may be digital data of architectural design drawings that can describe the target building, such as 3D MAX (3D Studio Max) drawings or Computer Aided Design (CAD) drawings.
  • the above-mentioned structural data may also include location information and orientation information of the target building, etc., enabling the first three-dimensional model to have a determined spatial position and orientation.
  • a first three-dimensional model having the same size and corresponding position as the target building may be constructed according to the structural data.
  • the structural data may include spatial coordinate information of the target building, so that the spatial coordinates of the three-dimensional model constructed based on the structural data correspond to the actual spatial coordinates of the target building, and the spatial coordinates of the three-dimensional model can thus be aligned with the positioning coordinates of the electronic device.
  • the spatial coordinates of the first three-dimensional model and the positioning coordinates of the electronic device can also be aligned based on other methods such as camera content of the electronic device, manual annotation, WiFi positioning, etc.
  • for example, the WiFi routers inside a public building can be calibrated to generate spatial coordinates.
  • the mobile phone will ask whether to enable WiFi positioning services.
  • when WiFi positioning services are enabled, the mobile phone can locate its plane coordinates through WiFi signals, and the elevation coordinate can be determined through the user's floor setting on the mobile phone.
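  • One simple way to align model coordinates with the device's positioning coordinates, sketched below, assumes a planar rotation and translation between the two frames has already been estimated (e.g. from calibrated WiFi anchors, manual annotation, or camera content); it is an illustration, not the publication's method:

```python
import numpy as np

def align_to_model(device_xy: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Map a device positioning coordinate into the first three-dimensional
    model's coordinate frame. R is a 2x2 rotation and t a 2-vector translation,
    both assumed to have been estimated beforehand (hypothetical calibration)."""
    return R @ device_xy + t

# usage sketch: the identity transform means both frames already coincide
aligned = align_to_model(np.array([12.3, 4.5]), np.eye(2), np.zeros(2))
```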
  • the three-dimensional models in related technologies usually only focus on the display of the surface shape of the building's exterior, without the internal details of the building, such as the parking lot and escape passages inside the building and other detailed structures.
  • the first three-dimensional model in the embodiment of the present application may include the detailed structure inside the target building.
  • through the three-dimensional rendering functions of mobile phones and XR devices, a first three-dimensional model of detailed structures such as each room, wall, passage, and public facility in the target building can be constructed, so as to display the internal structure and three-dimensional information of the target building.
  • the purpose and function of each room can also be marked according to the actual application situation to refine the first three-dimensional model.
  • the target object in this step may be a detailed structure inside the first three-dimensional model, such as a certain room, a certain passage, a certain pipe buried underground, etc.
  • determining the target object in the first three-dimensional model based on the first three-dimensional model of the target building may be based on the positional relationship between the first three-dimensional model of the target building and the electronic device, for example: the target object includes an object that is less than a preset distance from the electronic device and is located within the viewing angle range toward which the electronic device is facing; alternatively, the target object may be determined based on the first three-dimensional model of the target building and the search information input by the user, for example: when the user searches for the fire escape, the target object is determined to be the fire escape in the first three-dimensional model; or the target object may be determined based on the first three-dimensional model of the target building, the location information of the electronic device, and the search information input by the user, for example: determining that the target object is an object retrieved by the user that is also close to the electronic device. The examples are not exhaustive here.
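  • A rough, non-authoritative sketch of the distance-and-viewing-angle filtering described above (the thresholds, the bearing convention, and the object schema are assumptions made for illustration only; `device` is assumed to carry x, y and heading_deg as in the earlier sketch):

```python
import math

def select_target_objects(objects, device, max_distance: float = 30.0,
                          half_fov_deg: float = 35.0):
    """Pick objects that are within a preset distance of the electronic device
    and inside the viewing angle range it is facing. Each object is assumed to
    be a dict with a 'position' (x, y) in model coordinates."""
    targets = []
    for obj in objects:
        dx = obj["position"][0] - device.x
        dy = obj["position"][1] - device.y
        distance = math.hypot(dx, dy)
        # compass-style bearing of the object, measured clockwise from +y (north)
        bearing = math.degrees(math.atan2(dx, dy)) % 360
        # signed difference between that bearing and the device heading
        delta = (bearing - device.heading_deg + 180) % 360 - 180
        if distance < max_distance and abs(delta) <= half_fov_deg:
            targets.append(obj)
    return targets
```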
  • determining the target object in the first three-dimensional model according to the first three-dimensional model of the target building includes:
  • the preset condition includes that the distance between the electronic device and the first building is within a preset distance range.
  • in this way, the second three-dimensional model and the first prompt information of a nearby first building can be displayed on the electronic device.
  • the display method further includes:
  • Determining the target object in the first three-dimensional model according to the first three-dimensional model of the target building includes:
  • the first information matches the architectural feature information of the target object.
  • the above-mentioned first information can be understood as retrieval information.
  • for example, the first information includes architectural feature information used to retrieve target objects such as fire escapes, elevators, or underground pipes; the fire escape, elevator, or underground pipe corresponding to the retrieved feature information is then the target object.
  • the objects in the first three-dimensional model can also be divided in advance, such as dividing according to corresponding clients, dividing according to whether their functions are public buildings, etc.
  • the determined target objects only include objects corresponding to the client and objects corresponding to the public building.
  • a first building information list is displayed, and the buildings in the first building information list match the first user terminal, which is the user terminal corresponding to the electronic device;
  • Determining the target object in the first three-dimensional model includes:
  • the target object corresponds to the first target building information
  • the first building information list includes the first target building information
  • the client information may include at least one item of user identification information such as phone number, account number, etc.
  • the user identification information of the client may be obtained, and the private area associated with the user identification information may be determined, so that retrieval results matching the first information are retrieved from the private area and the public area associated with the user identification information.
  • the buildings in the first building information list matching the first user terminal may mean that the buildings in the first building information list belong to the private area corresponding to the first user terminal, for example: user A can only view the buildings in user A's residence, or user A can only view the buildings in user A's residence and buildings in public areas such as corridors and ventilation ducts.
  • a password can be configured for each building, and the client corresponding to the building can obtain the password. For example, when a user accesses a certain room, he or she needs to enter the password corresponding to the room to achieve access. Thus, the matching relationship between the building and the user terminal is determined through password verification.
  • the above-mentioned second input may be a selection operation for the building information displayed in the first building information list, for example: clicking on the first target building information, which is not specifically limited here.
  • for different users, their permissions can be configured so that they can only view objects in their respective associated areas and objects in the public area. For example, only the homeowners and residents of a room are allowed to view the facilities, pipes, and other objects in that room, while public facilities such as safety passages and elevators in the target building can be viewed by all users. In this way, object viewing permissions can be controlled and the privacy of the private area can be guaranteed.
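  • A minimal sketch of such permission filtering (the 'is_public' and 'owner_ids' fields are assumed, illustrative attributes rather than anything defined in the publication):

```python
def visible_objects(all_objects, user_id):
    """Return the objects a given client is allowed to view: objects in the
    public area plus objects in the private areas associated with that client."""
    return [obj for obj in all_objects
            if obj.get("is_public") or user_id in obj.get("owner_ids", ())]
```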
  • the display method further includes:
  • the first information includes public building information
  • the location of the building in the second building information list matches the location information of the electronic device
  • Determining the target object in the first three-dimensional model includes:
  • the target object corresponds to the second target building information
  • the second building information list includes the second target building information
  • the location of the building in the second building information list matches the location information of the electronic device.
  • the distance between the building and the electronic device in the second building information list may be less than the first distance.
  • the first distance may be a larger distance, for example 100 meters or 200 meters.
  • the user can thus obtain a larger selection range based on the search results within this larger range; for example, the buildings in the second building information list can include at least one type of public building information such as fire escapes, elevators, and corridors, so that when search results within a larger range are displayed, the user has more paths to choose from to reach the destination.
  • the above-mentioned first distance may be a smaller distance, such as 10 meters or 20 meters. In this way, users can obtain nearby search results, thereby providing users with more convenient search results.
  • in other words, the range within which search results are obtained can be determined according to the structure type of the search information, which can provide a more efficient search.
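  • As an illustration of choosing the search radius by structure type (the categories and radius values below are assumptions, not taken from the publication):

```python
# Assumed mapping from the structure type of the search information to a search radius (meters).
SEARCH_RADIUS_M = {
    "fire_escape": 200.0,       # escape routes: a wide area is useful
    "elevator": 100.0,
    "underground_pipe": 20.0,   # pipes: only the immediate vicinity matters
}

def search_radius(structure_type: str, default: float = 50.0) -> float:
    return SEARCH_RADIUS_M.get(structure_type, default)
```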
  • the building can be directly determined as the target object without receiving the above-mentioned second input.
  • the relative position of a building in the building information list and the electronic device meets the preset conditions, for example: the distance between the two is less than the preset distance, or the building is located within the viewing range of the electronic device, or the distance between the two is less than the preset distance and the building is within the viewing range of the electronic device, etc.
  • the preset distance may be a preset value or a distance determined based on the actual application scenario, such as 10 meters or 20 meters; the value and determination method of the preset distance are not specifically limited here.
  • Step 103 Display the second three-dimensional model of the target object and first prompt information according to the location information, where the first prompt information includes architectural feature information of the target object.
  • when the positioning information of the electronic device includes the three-dimensional coordinate position of the electronic device and the orientation information of the electronic device, the second three-dimensional model of the target object and the first prompt information are displayed according to the three-dimensional coordinate position and the orientation information.
  • the above-mentioned preset distance may be preset by the user, configured at the factory, or determined based on the actual scenario.
  • the target object located within the field of view of the electronic device can be displayed on the screen according to the position and orientation of the electronic device, as well as information such as the size of the target object and the distance to the electronic device.
  • the target object within the visual field distance of the electronic device can be updated in real time. At this time, the object in the image within the visual field distance of the electronic device is the target object.
  • the above-mentioned target objects can be any objects located in the three-dimensional model such as rooms, shops, public facilities, building structures, etc.
  • the size of the outline of the displayed target object can also be determined based on the near-large, far-small principle; for example, as the electronic device approaches the target object, the displayed outline of the target object can be gradually enlarged. In this way, the user can intuitively perceive the distance to the target object through the near-large, far-small size change.
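  • A toy sketch of such near-large, far-small scaling (the reference distance and clamping limits are purely illustrative assumptions):

```python
def outline_scale(distance_m: float, reference_m: float = 10.0,
                  min_scale: float = 0.2, max_scale: float = 3.0) -> float:
    """Scale factor for the displayed outline: larger when the electronic device
    is close to the target object, smaller when it is far (inverse-distance rule)."""
    scale = reference_m / max(distance_m, 1e-3)
    return min(max(scale, min_scale), max_scale)
```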
  • the second three-dimensional model of the target object may include the outline (or partial outline) of the target object and other three-dimensional model information.
  • the user can learn the position, orientation, size, etc. of the target object.
  • the first prompt information may include at least one item of architectural feature information such as material, usage, area, distance from the electronic device, etc.
  • the first prompt message can be used to display the outline of the fire escape, the length of the fire escape, the direction of the fire escape, and other building feature information.
  • the architectural feature information can include geometric information (such as the shape, plane coordinates, and burial depth of the pipeline), usage information (such as drinking water pipes, sewage pipes, heating pipes, natural gas pipelines, etc.) and material information (such as cement pipes, iron pipes, copper pipes, etc.).
  • the display method before displaying the second three-dimensional model and the first prompt information of the target object, the display method further includes:
  • the displaying the second three-dimensional model of the target object and the first prompt information includes:
  • the area corresponding to the viewing angle range of the electronic device does not include the second area.
  • the above-mentioned second prompt information can be used to assist the user in bringing the target object into the visual field of the electronic device. For example, when the electronic device searches for the target object and the target object is not located within the visual field of the electronic device, the second prompt information can be displayed to prompt the user to change the orientation of the electronic device until the target object is within the field of view, after which the outline of the target object is displayed. For example, as shown in Figure 2a, assume that the user is in area A and the target object is area B; if the user's mobile phone is facing area C at this time, a left-rotating arrow can be displayed on the mobile phone so that the user rotates the phone according to the arrow until its camera faces area B. In this way, the user can intuitively perceive the location of the target object through the second prompt information.
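  • A small sketch of deriving such a viewing-angle hint from the bearing difference (the half-field-of-view threshold and the bearing convention are assumptions consistent with the earlier sketch):

```python
def viewing_angle_hint(device_heading_deg: float, target_bearing_deg: float,
                       half_fov_deg: float = 35.0) -> str:
    """Return which way the device should rotate so the target object falls
    inside its viewing angle range, or report that it is already in view."""
    delta = (target_bearing_deg - device_heading_deg + 180) % 360 - 180
    if abs(delta) <= half_fov_deg:
        return "in view"
    return "rotate right" if delta > 0 else "rotate left"
```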
  • the electronic device that performs the display method provided by the embodiments of the present application may be a mobile terminal such as a mobile phone, an extended reality (XR) device, or a wearable device such as a smart watch, where XR devices include virtual reality (Virtual Reality, VR) devices, augmented reality (Augmented Reality, AR) devices, and mixed reality (Mixed Reality, MR) devices.
  • the XR device can obtain an image in the direction of its field of view in real time, display the image, and display the first prompt information suspended on the image.
  • the outline of the target object and the architectural feature information of the target object are displayed suspended on the real-life image.
  • the image captured by the XR device in real time can be displayed when the field of view of the XR device is toward the ground.
  • the outline of the underground pipe is displayed on the image, which is similar to the effect of seeing through the underground pipe. In this way, a display effect that better matches the actual scene can be provided.
  • the method further includes:
  • at least one progress indicator is displayed, the at least one progress indicator being used to indicate the construction progress of at least one object in the target building;
  • Determining the target object in the first three-dimensional model according to the first three-dimensional model of the target building includes:
  • the target object in the first three-dimensional model is determined according to the first three-dimensional model and the target construction progress, where the target progress indicator is used to indicate the target construction progress;
  • the target object matches the target construction progress.
  • the construction progress of at least one object in the target building can be used to reflect the construction timeline of the target building.
  • the construction time information of the target building may include the completion time information of each object that has completed construction, and/or the planned construction time or the planned construction time sequence of each object of the target building, etc. In this way, the construction progress of at least one object in the target building can reflect the time sequence of the construction of completed or unfinished facilities in the target building.
  • the above target object matches the target construction progress.
  • the target object may include a building that has completed construction when the construction progress of the target building reaches the target construction progress. For example, target objects that have completed construction at the target construction progress may be highlighted, or only those target objects may be displayed, without displaying other buildings that have not yet been completed.
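  • A minimal sketch of filtering the model by a selected construction progress (the 'built_at_step' field is an assumed schema, used only to illustrate matching objects to a progress step):

```python
def objects_at_progress(objects, target_progress: int):
    """Objects whose construction is complete at the selected target progress.
    Each object is assumed to carry 'built_at_step', the step of the construction
    sequence at which it is finished."""
    return [obj for obj in objects if obj["built_at_step"] <= target_progress]
```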
  • the fourth input may be a selection operation on the target progress indicator, for example, an operation of touching the target progress indicator, which is not specifically limited here.
  • a progress disk 31 can be configured for each construction node, and the progress disk 31 can include identification information of the construction sequence (such as the numerical labels and the sectors corresponding to the progress shown in Figure 3a).
  • in this way, the target building under each construction progress can be viewed by switching, and the construction sequence of each facility and the relative positional relationships between facilities can be perceived more clearly; especially for the construction of complex structures or hidden underground projects, this can provide construction-sequence guidance for construction personnel, thereby reducing the probability of construction errors.
  • Construction workers can visually perceive the workflow by browsing the adding order of each layer in the engineering drawing, for example, perceiving the time and logical sequence of construction steps. Through visual image perception, construction personnel can compare the standard design with the current construction status, which can reduce human errors in construction.
  • the determination of the above-mentioned target object can also be combined with the relative positional relationship between the electronic device and the target object, for example: the target object is a building that has completed construction at the target construction progress and whose distance from the electronic device is less than 50 meters. For example, assuming the electronic device is located outside the target building, the target objects can include all objects that have been completed at the target construction progress; when the electronic device is located inside the target building, the target objects can include objects that have been completed at the target construction progress and are close to the electronic device.
  • the construction characteristic information of the buried underground pipeline 32 can be displayed to prompt the construction personnel, so that the underground pipeline 32 is not damaged during subsequent construction.
  • the characteristic information of the completed objects can be displayed during the construction or renovation process, thereby reducing the risk of damage to the completed objects during the construction and renovation process.
  • this provides a prominent prompting effect.
  • the display method also includes:
  • the timeline of the first three-dimensional model is adjusted.
  • the construction progress of the target building can be updated to the actual construction progress of the target building through a third input.
  • the facilities or structures that have completed construction are updated simultaneously.
  • the outline of facilities or structures that have completed construction can be a solid line
  • the outline of facilities or structures that have not completed construction can be dotted lines.
  • the third input changes the dotted line corresponding to the facility to a solid line to update the construction progress of the target building.
  • the third input is the input operation of changing the dotted line to a solid line.
  • the third input can also be to receive other inputs such as the latest construction progress data, which is not specifically limited here.
  • in this way, construction personnel can add the completion status of each stage into the first three-dimensional model of the target building, and by updating the construction progress of the target building according to the third input, the construction progress recorded in the model can be kept consistent with the actual construction progress.
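  • A sketch of such a third-input update (the dotted/solid outline convention comes from the text above; the field names are assumptions about how a model might store it):

```python
def mark_completed(model: dict, object_id: str) -> None:
    """Mark an object as built and switch its outline from dotted
    (not yet built) to solid (built), updating the recorded progress."""
    obj = model["objects"][object_id]
    obj["completed"] = True
    obj["outline_style"] = "solid"   # was "dotted" while under construction
    model["current_progress"] = max(model["current_progress"], obj["built_at_step"])
```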
  • multiple clients can load the structural data of the same target building. For example, all construction personnel in the construction team separately load the CAD design drawings of the target building to be constructed.
  • the construction worker's mobile phone can directly or indirectly communicate with the mobile phones of other construction workers, so that the construction progress information received by all construction workers in the entire construction team is consistent.
  • only progress indicators that are greater than the actual construction progress can be displayed.
  • for example, after a construction step is completed, the progress indicator corresponding to that construction progress can be cancelled, and the numbers of the remaining progress indicators can each be decremented by 1, so that the number of the progress indicator for the next step to be constructed is adjusted to 1; this will not be described in detail here.
  • the displaying the second three-dimensional model of the target object and the first prompt information includes:
  • the second three-dimensional model of the target object and the first prompt information can be adjusted according to the near-large, far-small principle. For example, as the electronic device approaches the target object, the size of the displayed second three-dimensional model can be enlarged and the distance value displayed in the first prompt information can be reduced, so that the user can intuitively perceive changes in the distance to the target object from the changes in the outline size of the target object and in the first prompt information.
  • In the embodiments of the present application, the position information of the electronic device is obtained; the target object in the first three-dimensional model is determined according to the first three-dimensional model of the target building; and the second three-dimensional model of the target object and first prompt information are displayed according to the position information, where the first prompt information includes architectural feature information of the target object.
  • In this way, based on a first three-dimensional model that can reflect the internal details of the target building, the three-dimensional model of target objects in the target building (such as rooms, elevators, fire escapes, and even structures hidden in walls and under the ground) and corresponding prompt information can be displayed on the electronic device, which improves the display effect of the detailed structures in the target building.
  • the execution subject may be a display device.
  • a display device performing a display method is used as an example to illustrate the display device provided by the embodiment of the present application.
  • the display device 400 provided by the embodiment of the present application may include the following modules:
  • the first acquisition module 401 is used to acquire the location information of the electronic device
  • the first determination module 402 is used to determine the target object in the first three-dimensional model according to the first three-dimensional model of the target building;
  • the first display module 403 is configured to display the second three-dimensional model of the target object and first prompt information according to the location information, where the first prompt information includes architectural feature information of the target object.
  • the first determination module 402 is specifically used for:
  • the preset condition includes that the distance between the electronic device and the first building is within a preset distance range.
  • the display device 400 also includes:
  • the second acquisition module is used to acquire the architectural feature information of each object in the first three-dimensional model
  • the first determination module 402 includes:
  • a first input unit configured to receive a first input, the first input being used to input first information
  • a first determination unit configured to determine the target object in the first three-dimensional model according to the first three-dimensional model of the target building in response to the first input
  • the first information matches the architectural feature information of the target object.
  • the display device 400 also includes:
  • a second display module configured to display a first building information list in response to the first input when the first information is associated with the first user terminal, where the buildings in the first building information list match the first client, and the first client is the client corresponding to the electronic device;
  • the first determination module 402 is specifically used for:
  • the target object corresponds to the first target building information
  • the first building information list includes the first target building information
  • the display device 400 also includes:
  • a third display module configured to display a second building information list in response to the first input when the first information includes public building information, where the location of the buildings in the second building information list matches the location information of the electronic device;
  • the first determination module 402 is specifically used for:
  • the target object corresponds to the second target building information
  • the second building information list includes the second target building information
  • the location information of the electronic device includes a viewing angle range of the electronic device, and the target object is an object located within the viewing angle range of the electronic device.
  • the display device 400 also includes:
  • a fourth display module configured to display second prompt information corresponding to the target object when the target object is located in the second area, where the second prompt information includes viewing angle adjustment information;
  • the first display module 403 includes:
  • a second input unit configured to receive a third input for adjusting the viewing angle of the electronic device
  • a first display unit configured to display the second three-dimensional model of the target object and the first prompt information when the target object is located within the viewing angle range of the electronic device;
  • the area corresponding to the viewing angle range of the electronic device does not include the second area.
  • the display device 400 also includes:
  • a fourth display module used to display at least one progress indicator, the at least one progress indicator being used to indicate the construction progress of at least one object in the target building;
  • the first determination module 402 is specifically used to:
  • the target object matches the target construction progress.
  • the first display module 403 is specifically used for:
  • the display device in the embodiment of the present application may be an electronic device or a component in the electronic device, such as an integrated circuit or a chip.
  • the electronic device may be a terminal or other devices other than the terminal.
  • the electronic device can be a mobile phone, a tablet computer, a notebook computer, a handheld computer, a vehicle-mounted electronic device, a mobile Internet device (Mobile Internet Device, MID), an augmented reality (Augmented Reality, AR) device, a virtual reality (Virtual Reality, VR) device, a mixed reality (Mixed Reality, MR) device, a robot, a wearable device, an ultra-mobile personal computer (Ultra-Mobile Personal Computer, UMPC), a netbook or a personal digital assistant (Personal Digital Assistant, PDA), etc.; it can also be a server, a network attached storage (Network Attached Storage, NAS), a personal computer (Personal Computer, PC), a television (Television, TV), a teller machine or a self-service machine, etc., and is not specifically limited in the embodiments of this application.
  • the display device in the embodiment of the present application may be a device with an operating system.
  • the operating system can be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of this application.
  • the display device provided by the embodiments of the present application can implement various processes implemented by the method embodiments shown in Figures 1 to 3c. To avoid duplication, they will not be described again here.
  • this embodiment of the present application also provides an electronic device 500, including a processor 501 and a memory 502.
  • the memory 502 stores programs or instructions that can be run on the processor 501.
  • the program or instruction is executed by the processor 501, each step of the above display method embodiment is implemented, and the same technical effect can be achieved. To avoid repetition, the details will not be described here.
  • the electronic devices in the embodiments of the present application include the above-mentioned mobile electronic devices and non-mobile electronic devices.
  • FIG. 6 is a schematic diagram of the hardware structure of an electronic device implementing an embodiment of the present application.
  • the electronic device 600 includes but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and other components.
  • the electronic device 600 may also include a power supply (such as a battery) that supplies power to various components.
  • the power supply may be logically connected to the processor 610 through a power management system, thereby implementing functions such as charging management, discharging management, and power consumption management through the power management system.
  • the structure of the electronic device shown in Figure 6 does not constitute a limitation on the electronic device.
  • the electronic device may include more or fewer components than shown in the figure, combine certain components, or arrange the components differently, which will not be described again here.
  • the interface unit 608 or the input unit 604 or the network module 602 is used to obtain the structural data of the target building;
  • Processor 610 configured to obtain the location information of the electronic device, and determine the target object in the first three-dimensional model according to the first three-dimensional model of the target building;
  • the display unit 606 is configured to display the second three-dimensional model of the target object and first prompt information according to the position information, where the first prompt information includes architectural feature information of the target object.
  • the processor 610 executes the step of determining the target object in the first three-dimensional model according to the first three-dimensional model of the target building, including:
  • the preset condition includes that the distance between the electronic device and the first building is within a preset distance range.
  • the user input unit 607 or the interface unit 608 or the input unit 604 or the network module 602 is used to obtain the architectural feature information of each object in the first three-dimensional model
  • the processor 610 executes the step of determining the target object in the first three-dimensional model according to the first three-dimensional model of the target building, including:
  • the user input unit 607 is used to receive a first input, the first input being used to input first information;
  • Processor 610 configured to respond to the first input and determine the target object in the first three-dimensional model according to the first three-dimensional model of the target building;
  • the first information matches the architectural feature information of the target object.
  • the display unit 606 is also configured to display a first building information list in response to the first input when the first information is associated with the first user terminal, where the buildings in the first building information list match the first user terminal, and the first user terminal is the user terminal corresponding to the electronic device;
  • the determination of the target object in the first three-dimensional model performed by the processor 610 includes:
  • when the user input unit 607 receives the second input on the first target building information, the target object in the first three-dimensional model is determined;
  • the target object corresponds to the first target building information
  • the first building information list includes the first target building information
  • the display unit 606 is also configured to display a second building information list in response to the first input when the first information includes public building information, where the location of the buildings in the second building information list matches the location information of the electronic device;
  • the determination of the target object in the first three-dimensional model performed by the processor 610 includes:
  • when the user input unit 607 receives the second input on the second target building information, the target object in the first three-dimensional model is determined;
  • the target object corresponds to the second target building information
  • the second building information list includes the second target building information
  • the location information of the electronic device includes a viewing angle range of the electronic device, and the target object is an object located within the viewing angle range of the electronic device.
  • before displaying the second three-dimensional model of the target object and the first prompt information, the display unit 606 is also configured to display second prompt information corresponding to the target object when the target object is located in the second area, where the second prompt information includes viewing angle adjustment information;
  • the displaying of the second three-dimensional model of the target object and the first prompt information performed by the display unit 606 includes:
  • when the user input unit 607 receives the third input for adjusting the viewing angle of the electronic device, and the target object is located within the viewing angle range of the electronic device, the second three-dimensional model of the target object and the first prompt information are displayed;
  • the area corresponding to the viewing angle range of the electronic device does not include the second area.
  • the display unit 606 is also used to display at least one progress indicator, the at least one progress indicator being used to indicate the construction progress of at least one object in the target building;
  • the processor 610 executes the step of determining the target object in the first three-dimensional model according to the first three-dimensional model of the target building, including:
  • the target object in the first three-dimensional model is determined according to the first three-dimensional model and the target construction progress, where the target progress indicator is used to indicate the target construction progress;
  • the target object matches the target construction progress.
  • displaying the second three-dimensional model of the target object and the first prompt information performed by the display unit 606 includes:
  • the electronic device 600 provided by the embodiment of the present application can realize the functions performed by each module of the display device 400 shown in Figure 4, and can achieve the same beneficial effects; to avoid duplication, the details will not be described again.
  • the input unit 604 may include a graphics processor (Graphics Processing Unit, GPU) 6041 and a microphone 6042.
  • the graphics processor 6041 processes the image data of still pictures or videos obtained by an image capture device (such as a camera) in the video capture mode or the image capture mode.
  • the display unit 606 may include a display panel 6061, which may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the user input unit 607 includes a touch panel 6071 and at least one of other input devices 6072.
  • Touch panel 6071 also called touch screen.
  • the touch panel 6071 may include two parts: a touch detection device and a touch controller.
  • Other input devices 6072 may include but are not limited to physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which will not be described again here.
  • Memory 609 may be used to store software programs as well as various data.
  • the memory 609 may mainly include a first storage area for storing programs or instructions and a second storage area for storing data, where the first storage area may store an operating system, an application program or instructions required for at least one function (such as a sound playback function, an image playback function, etc.), and so on.
  • memory 609 may include volatile memory or non-volatile memory, or memory 609 may include both volatile and non-volatile memory.
  • non-volatile memory can be read-only memory (Read-Only Memory, ROM), programmable read-only memory (Programmable ROM, PROM), erasable programmable read-only memory (Erasable PROM, EPROM), electrically erasable programmable read-only memory (Electrically EPROM, EEPROM) or flash memory.
  • Volatile memory can be random access memory (Random Access Memory, RAM), static random access memory (Static RAM, SRAM), dynamic random access memory (Dynamic RAM, DRAM), synchronous dynamic random access memory (Synchronous DRAM, SDRAM), double data rate synchronous dynamic random access memory (Double Data Rate SDRAM, DDRSDRAM), enhanced synchronous dynamic random access memory (Enhanced SDRAM, ESDRAM), synchlink dynamic random access memory (Synchlink DRAM, SLDRAM) and direct Rambus random access memory (Direct Rambus RAM, DRRAM).
  • the processor 610 may include one or more processing units; optionally, the processor 610 integrates an application processor and a modem processor, where the application processor mainly handles operations related to the operating system, user interface, application programs, etc., Modem processors mainly process wireless communication signals, such as baseband processors. It can be understood that the above modem processor may not be integrated into the processor 610.
  • Embodiments of the present application also provide a readable storage medium.
  • Programs or instructions are stored on the readable storage medium.
  • when the program or instructions are executed by a processor, each process of the above display method embodiment is implemented and the same technical effects can be achieved; to avoid repetition, details will not be repeated here.
  • the processor is the processor in the electronic device described in the above embodiment.
  • the readable storage medium includes computer readable storage media, such as computer read-only memory ROM, random access memory RAM, magnetic disk or optical disk, etc.
  • An embodiment of the present application further provides a chip.
  • the chip includes a processor and a communication interface.
  • the communication interface is coupled to the processor.
  • the processor is used to run programs or instructions to implement each process of the above display method embodiments and can achieve the same technical effects; to avoid repetition, details will not be described again here.
  • the chips mentioned in the embodiments of this application may also be called system-level chips, system chips, chip systems, or system-on-chip chips, etc.
  • Embodiments of the present application provide a computer program product.
  • the program product is stored in a storage medium.
  • the program product is executed by at least one processor to implement each process of the above display method embodiment, and can achieve the same technical effect. To avoid repetition, they will not be repeated here.
  • An embodiment of the present application also provides an execution device configured to execute the method described above.
  • the methods of the above embodiments can be implemented by means of software plus the necessary general hardware platform; of course, they can also be implemented by hardware, but in many cases the former is the better implementation.
  • based on this understanding, the part of the technical solution of the present application that is essential or that contributes to the related technologies can be embodied in the form of a computer software product.
  • the computer software product is stored in a storage medium (such as ROM/RAM, disk, CD), including several instructions to cause a terminal (which can be a mobile phone, a computer, a server, or a network device, etc.) to execute the methods described in various embodiments of this application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Human Resources & Organizations (AREA)
  • Operations Research (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Data Mining & Analysis (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present application relates to the technical field of computers, and discloses a display method, a display apparatus, and an electronic device. The display method comprises the steps of: acquiring structural data of a target building and constructing a three-dimensional model of the target building according to the structural data; determining a target object in the three-dimensional model according to positioning information of an electronic device; and displaying feature information of the target object.
PCT/CN2022/104821 2022-07-11 2022-07-11 Procédé d'affichage, appareil d'affichage et dispositif électronique WO2024011338A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/104821 WO2024011338A1 (fr) 2022-07-11 2022-07-11 Procédé d'affichage, appareil d'affichage et dispositif électronique

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/104821 WO2024011338A1 (fr) 2022-07-11 2022-07-11 Procédé d'affichage, appareil d'affichage et dispositif électronique

Publications (1)

Publication Number Publication Date
WO2024011338A1 true WO2024011338A1 (fr) 2024-01-18

Family

ID=89535111

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/104821 WO2024011338A1 (fr) 2022-07-11 2022-07-11 Procédé d'affichage, appareil d'affichage et dispositif électronique

Country Status (1)

Country Link
WO (1) WO2024011338A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140327670A1 (en) * 2011-12-30 2014-11-06 Honeywell International Inc. Target aquisition in a three dimensional building display
CN110779479A (zh) * 2019-09-02 2020-02-11 腾讯科技(深圳)有限公司 Object processing method applied to an indoor map
CN111553659A (zh) * 2020-04-28 2020-08-18 深圳众维轨道交通科技发展有限公司 Whole-life building cycle monitoring, operation and maintenance method and system
CN112381946A (zh) * 2020-12-04 2021-02-19 久瓴(江苏)数字智能科技有限公司 Digital scene viewing method and apparatus, storage medium and computer device
CN113516331A (zh) * 2020-11-25 2021-10-19 腾讯科技(深圳)有限公司 Building data processing method and apparatus

Similar Documents

Publication Publication Date Title
US10854013B2 (en) Systems and methods for presenting building information
US10354452B2 (en) Directional and x-ray view techniques for navigation using a mobile device
US20210064216A1 (en) Automated Tools For Generating Mapping Information For Buildings
KR101354688B1 (ko) 공사현장 감리 시스템 및 감리 방법
US11836973B2 (en) Automated direction of capturing in-room information for use in usability assessment of buildings
Sankar et al. Capturing indoor scenes with smartphones
US11790648B2 (en) Automated usability assessment of buildings using visual data of captured in-room images
CA3058602A1 (fr) Production automatisee d`information cartographique a partir d`images interdependantes
TW201818293A (zh) 以三維資訊模型為基礎之綜合感知定位技術應用系統
EP2974509B1 (fr) Communicateur d'informations personnelles
US8941752B2 (en) Determining a location using an image
AU2017219142A1 (en) Location Based Augmented Reality Property Listing Method, Software and System
JP2019153274A (ja) 位置算出装置、位置算出プログラム、位置算出方法、及びコンテンツ付加システム
US20230196684A1 (en) Presenting Building Information Using Video And Building Models
WO2019164830A1 (fr) Appareil, systèmes et procédés de marquage d'éléments de construction dans un espace 3d
US11138811B2 (en) Using augmented reality markers for local positioning in a computing environment
EP3640895A1 (fr) Gestion de ville intelligente et outil de navigation
WO2023185547A1 (fr) Procédé et appareil d'affichage d'informations de listage de maisons, et dispositif électronique et support d'enregistrement lisible
WO2024011338A1 (fr) Procédé d'affichage, appareil d'affichage et dispositif électronique
US10916066B2 (en) Methods of virtual model modification
US11532065B2 (en) Systems and methods of indoor navigation for emergency services
Chen et al. Integration of Augmented Reality and indoor positioning technologies for on-site viewing of BIM information
KR20210039700A (ko) 증강현실 기반의 지하개발 빌딩정보모델링(bim) 시스템
US20240233260A1 (en) Automated Localization Using Beacon Transmitter Devices Of Data Acquired In Buildings
EP4397945A1 (fr) Localisation automatisée à l'aide de dispositifs émetteurs de balises de données acquises dans des bâtiments

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22950477

Country of ref document: EP

Kind code of ref document: A1