WO2020151255A1 - Display control system and method based on a mobile terminal - Google Patents

Display control system and method based on a mobile terminal

Info

Publication number
WO2020151255A1
Authority
WO
WIPO (PCT)
Prior art keywords
displayed
display
dimensional
model
mobile terminal
Prior art date
Application number
PCT/CN2019/109473
Other languages
English (en)
Chinese (zh)
Inventor
李新福
Original Assignee
广东康云科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广东康云科技有限公司 filed Critical 广东康云科技有限公司
Publication of WO2020151255A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units

Definitions

  • the invention relates to the field of control technology, in particular to a display control system and method based on a mobile terminal.
  • the purpose of the present invention is to provide a display control method based on a mobile terminal and a display control system based on a mobile terminal.
  • the first aspect of the technical solution of the present invention is a display control system based on a mobile terminal, including:
  • the 3D data acquisition module is used to acquire 3D data of the object to be displayed
  • the three-dimensional data processing module is used to process the acquired three-dimensional data of the object to be displayed to obtain the original three-dimensional model of the object to be displayed and the corresponding link;
  • the rendering compression optimization unit is used to perform rendering and compression optimization on the original 3D model of the object to be displayed to obtain the optimized 3D model of the object to be displayed;
  • the interactive display module is used for interactive display according to the obtained link or the compressed and optimized three-dimensional model of the object to be displayed;
  • the mobile terminal is used to obtain the obtained link or the compressed and optimized three-dimensional model of the object to be displayed, and generate a display control signal;
  • the display module is used to display according to the display control signal.
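
Read together, the modules above form a pipeline from acquisition to remote-controlled display. The following Python sketch is illustrative only and not part of the disclosure: the function and class names, the placeholder link, and the signal format are assumptions made to show how data might flow between the modules.

```python
from dataclasses import dataclass, field

@dataclass
class Model3D:
    """Minimal stand-in for a 3D model: vertices, faces, and metadata."""
    vertices: list
    faces: list
    meta: dict = field(default_factory=dict)

def acquire_3d_data(source: str) -> dict:
    # 3D data acquisition module: scans or imports raw data (point clouds, images).
    return {"source": source, "points": [(0.0, 0.0, 0.0)]}

def process_3d_data(raw: dict) -> tuple:
    # 3D data processing module: builds the original model and a shareable link.
    model = Model3D(vertices=raw["points"], faces=[], meta={"source": raw["source"]})
    link = f"https://example.invalid/models/{abs(hash(raw['source']))}"  # placeholder link
    return model, link

def render_compress_optimize(model: Model3D) -> Model3D:
    # Rendering/compression optimization unit: returns a lighter model for display.
    model.meta["compressed"] = True
    return model

def mobile_terminal_control(user_input: str) -> dict:
    # Mobile terminal: turns user input into a display control signal.
    return {"action": "rotate", "args": {"axis": "y", "degrees": 15}, "raw": user_input}

def display_module(signal: dict) -> None:
    # Display module: renders according to the received control signal.
    print(f"display -> {signal['action']} {signal['args']}")

if __name__ == "__main__":
    model, link = process_3d_data(acquire_3d_data("object_scanner"))
    optimized = render_compress_optimize(model)
    display_module(mobile_terminal_control("swipe_right"))
```

The point of the sketch is only the hand-off order: processed model and link feed the interactive display, while the mobile terminal's control signal drives the separate display module.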
  • the three-dimensional data acquisition module includes:
  • Aerial photography scanner used to scan the three-dimensional data of the area and perform local preprocessing
  • Indoor scanner used to scan 3D data of indoor environment and perform local preprocessing
  • Outdoor scanning module used to scan 3D data of outdoor environment and perform local preprocessing
  • Object scanner used to scan the three-dimensional data of the object and perform local preprocessing
  • Human body scanner used to scan the three-dimensional data of the human body and perform local preprocessing
  • the third-party 3D data acquisition module is used to acquire the 3D data of the object to be displayed from the third party.
  • the 3D data processing module includes a 3D cloud model library and a rendering compression optimization unit,
  • the 3D cloud model library is used to intelligently process the 3D data of the object to be displayed to obtain the original 3D model of the object to be displayed and the corresponding link.
  • the intelligent processing includes model repair, editing, cropping, surface reduction, model reduction, compression, material processing, texture processing, denoising, coloring, and lighting processing.
  • the three-dimensional cloud model library includes:
  • the access management unit is used to authorize the user with corresponding access rights according to the user's valid login credentials
  • Intelligent processing unit used for model repair, editing, cropping, surface reduction, model reduction, compression, material processing, texture processing, denoising, coloring and lighting processing, to obtain the original 3D model of the object to be displayed;
  • the link generating unit is used to generate the corresponding link according to the original three-dimensional model of the object to be displayed;
  • the database is used to store the user's behavior data when viewing interactive display content, the user's valid login credentials, the original three-dimensional model of the object to be displayed, and the corresponding link.
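
One way to picture the link generating unit and the database working together is sketched below; the URL pattern, token scheme, and SQLite storage are assumptions chosen only to make the idea concrete, not details from the disclosure.

```python
import secrets
import sqlite3

BASE_URL = "https://example.invalid/view/"  # assumed URL pattern, not from the patent

def init_db(path: str = ":memory:") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS models (token TEXT PRIMARY KEY, name TEXT, data BLOB)"
    )
    return conn

def store_model_and_generate_link(conn: sqlite3.Connection, name: str, data: bytes) -> str:
    # Link generating unit: bind an unguessable token to the stored original model,
    # so the corresponding link can be opened without installing an app.
    token = secrets.token_urlsafe(16)
    conn.execute("INSERT INTO models VALUES (?, ?, ?)", (token, name, data))
    conn.commit()
    return BASE_URL + token

if __name__ == "__main__":
    conn = init_db()
    link = store_model_and_generate_link(conn, "sneaker", b"...serialized 3D model...")
    print("shareable link:", link)
```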
  • the interactive display module is specifically configured to display the compressed and optimized three-dimensional model of the object to be displayed or the original three-dimensional model of the object to be displayed in the first smart display screen according to the interaction with the user.
  • the mobile terminal includes at least one of a mobile phone, an IPAD, a notebook, and a smart watch.
  • the mobile terminal includes:
  • the first communication unit is configured to receive the obtained link or the compressed and optimized three-dimensional model of the object to be displayed;
  • the display unit is used to display the obtained link or the compressed and optimized 3D model of the object to be displayed;
  • the control unit is used to control the three-dimensional model displayed by the display unit according to the input signal and generate a corresponding display control signal;
  • the second communication unit is used to send the display control signal to the display module.
  • the mobile terminal-based display control system further includes a communication system, and the communication system includes:
  • Voice message unit used to provide voice message service
  • Telephone customer service unit used to provide telephone customer service
  • Robot customer service unit used to provide robot voice customer service
  • the live video customer service unit is used to provide live video customer service.
  • the interactive display module includes a multi-dimensional visual real-shopping module
  • the multi-dimensional visual real-shopping module specifically includes:
  • 3D scene display unit used to provide automatic navigation service, scene switching service, 3D real scene service, space roaming service, free walking service, 2D floor plan service and 3D house plan service;
  • AI intelligent voice shopping guide unit used to provide multi-language voice commentary services, three-dimensional product display services, three-dimensional shopping guide services, AI smart voice answer services and three-dimensional robot commentary services.
  • the interactive display content is sent to the second smart display screen in the form of a signal.
  • the second aspect of the technical solution of the present invention is a display control method based on a mobile terminal, which includes the following steps:
  • the acquired three-dimensional data of the object to be displayed is processed to obtain the original three-dimensional model of the object to be displayed and the corresponding link;
  • the beneficial effects of the present invention are: the display control system and method based on a mobile terminal of the present invention acquire and process the three-dimensional data of the object to be displayed and perform rendering and compression optimization to obtain the compressed and optimized three-dimensional model of the object to be displayed; the mobile terminal then generates a display control signal according to the compressed and optimized three-dimensional model of the object to be displayed, and the display module displays according to the display control signal.
  • the additional mobile terminal is used to control the display of the display module, which solves the problem that it is difficult to control or operate during human-computer interaction when the screen of the display module is too large, installed far away, or suspended, and has the advantages of convenient operation and good user experience.
  • Fig. 1 is a block diagram of the overall structure of the mobile terminal-based display control system of the present invention;
  • Fig. 2 is a structural block diagram of a preferred embodiment of the mobile terminal-based display control system of the present invention;
  • Fig. 3 is a structural block diagram of another preferred embodiment of the mobile terminal-based display control system of the present invention;
  • Fig. 4 is a flowchart of a preferred embodiment of a display control method based on a mobile terminal of the present invention.
  • first, second, third, etc. may be used in this disclosure to describe various elements, but these elements should not be limited by these terms. These terms are only used to distinguish elements of the same type from each other.
  • the first element may also be referred to as the second element, and similarly, the second element may also be referred to as the first element.
  • the use of any and all examples or exemplary language ("such as", "for example", etc.) provided herein is only intended to better illustrate the embodiments of the present invention and, unless otherwise required, will not impose limitations on the scope of the present invention.
  • a display control system based on a mobile terminal of the present invention includes:
  • the three-dimensional data acquisition module 10 is used to acquire three-dimensional data of the object to be displayed;
  • the three-dimensional data processing module 11 is configured to process the acquired three-dimensional data of the object to be displayed to obtain the original three-dimensional model of the object to be displayed and the corresponding link;
  • the rendering compression optimization unit 12 is used to perform rendering and compression optimization on the original three-dimensional model of the object to be displayed, to obtain a compressed and optimized three-dimensional model of the object to be displayed;
  • the interactive display module 13 is used for interactive display based on the obtained link or the compressed and optimized three-dimensional model of the object to be displayed.
  • the interactive display module displays the compressed and optimized three-dimensional model of the object to be displayed or the original three-dimensional model of the object to be displayed on at least one of the first smart display screen, an AR device, a VR device, and a browser, based on interaction with the user.
  • the first smart display screen includes an air imaging device, an air screen, a PC computer screen, a tablet computer screen, and a smart screen.
  • the interaction with the user includes at least one of somatosensory control, eye tracking, gesture control, voice recognition, brain wave control, touch control, screen switching control, and face recognition;
  • the mobile terminal 18 is used to obtain the obtained link or the compressed and optimized three-dimensional model of the object to be displayed, and to generate a display control signal;
  • the display module 19 is used for displaying according to the display control signal
  • the big data analysis backend 14 is used to analyze and count the behavior of users when watching interactive display content.
  • the three-dimensional data processing module 11 includes a three-dimensional cloud model library.
  • the objects to be displayed include articles (such as commodities) and environments (such as the indoor environment of a museum).
  • the three-dimensional data of the object can be two-dimensional images, point cloud data of the object, etc., which can be collected by manual or automatic scanning equipment (such as cameras, aerial drones, automatic scanning robots, etc.), or directly imported through a third-party interface (such as an Internet interface).
  • the present invention includes at least two compression processes: the three-dimensional cloud model library includes at least one compression process, and the rendering compression optimization unit 12 will perform another compression before generating the compressed and optimized three-dimensional model of the object to be displayed.
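
The disclosure does not name a specific compression algorithm, so the sketch below shows only a generic example of the kind of lossy vertex quantization a compression pass might apply; the 14-bit precision and the function names are assumptions.

```python
import numpy as np

def quantize_vertices(vertices: np.ndarray, bits: int = 14):
    """Map float vertex coordinates to small integers within the model's bounding box.

    This is a generic mesh-compression idea (uniform quantization), shown only to
    illustrate a possible compression pass; it is not the patented method.
    """
    lo, hi = vertices.min(axis=0), vertices.max(axis=0)
    scale = (2 ** bits - 1) / np.maximum(hi - lo, 1e-12)
    q = np.round((vertices - lo) * scale).astype(np.uint16)
    return q, lo, scale

def dequantize_vertices(q: np.ndarray, lo: np.ndarray, scale: np.ndarray) -> np.ndarray:
    return q.astype(np.float64) / scale + lo

if __name__ == "__main__":
    verts = np.random.rand(1000, 3) * 10.0           # dummy vertex data
    q, lo, scale = quantize_vertices(verts)
    err = np.abs(dequantize_vertices(q, lo, scale) - verts).max()
    print(f"max reconstruction error: {err:.6f}")     # bounded by the quantization step
```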
  • the original three-dimensional model of the object to be displayed is displayed on the PC screen, tablet computer (such as IPAD) screen, and smart phone screen or browser to achieve interactive display through interaction with the user.
  • the user only needs the corresponding link (such as a URL) to access it, which saves the process of installing an app and is more efficient and convenient.
  • the air screen displays the compressed and optimized three-dimensional model of the object to be displayed through interaction with the user.
  • the present invention also supports the display of the compressed and optimized three-dimensional model of the object to be displayed in an air imaging manner (that is, through an air imaging device), which has richer functions and more striking visual effects.
  • the three-dimensional model of the object is a virtual model that can be browsed or viewed through 360 degrees without blind angles, and through the interaction of the first smart display screen, AR device, VR device, or browser with the user, the model can be enlarged, zoomed out, recolored, and switched between views, meeting the individual needs of different viewers.
  • the invention supports different intelligent terminals and devices such as AR equipment, VR equipment, PC computer screen, tablet computer (such as IPAD) screen, smart phone screen, air imaging device, air screen, etc. to access and display the three-dimensional model, with more abundant functions.
  • the mobile terminal 18 is configured to obtain the link obtained from the air screen or the rendering compression optimization unit or the compressed and optimized three-dimensional model of the object to be displayed, and generate a display control signal.
  • the display module can be a second smart display whose size is larger, or which is installed at a higher position (such as a suspended installation) or at a longer distance (for example, a person stands on one side of a pool while the screen is installed on the other side of the pool).
  • the mobile terminal 18 may include smart devices such as mobile phones, IPADs, and mobile computers.
  • the mobile terminal is also provided with a first communication unit 1801, a display unit 1802, a control unit 1803, and a second communication unit 1804.
  • the first communication unit 1801 is used to receive the obtained link or the compressed and optimized three-dimensional model of the object to be displayed;
  • the display unit 1802 is used to display the obtained link or the compressed and optimized three-dimensional model of the object to be displayed;
  • the control unit 1803 is used to control the three-dimensional model displayed by the display unit according to the input signal and to generate the corresponding display control signal;
  • the second communication unit 1804 is used to send the display control signal to the display module.
  • the input signal may be at least one of a touch screen input signal, a gesture signal, and a voice input signal.
  • the display control signal includes at least one control signal of a viewing angle control signal, a color control signal, and a zoom control signal of the three-dimensional model.
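
A concrete, though hypothetical, encoding of such a display control signal is a small JSON message; the field names below are assumptions chosen to illustrate viewing-angle, color, and zoom control, not a format from the disclosure.

```python
import json

def make_display_control_signal(kind: str, **params) -> str:
    """Serialize one of the control signal types named above into JSON."""
    allowed = {"view_angle", "color", "zoom"}
    if kind not in allowed:
        raise ValueError(f"unsupported control signal: {kind}")
    return json.dumps({"type": kind, "params": params})

# Example messages the mobile terminal might emit (illustrative only):
rotate_msg = make_display_control_signal("view_angle", yaw=30.0, pitch=-10.0)
color_msg = make_display_control_signal("color", body="black")
zoom_msg = make_display_control_signal("zoom", factor=1.5)
print(rotate_msg, color_msg, zoom_msg, sep="\n")
```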
  • the three-dimensional data acquisition module includes:
  • the aerial scanner 101 is used to scan three-dimensional data of an area (such as a certain city) and perform local preprocessing;
  • the indoor scanner 102 is used to scan 3D data of an indoor environment (such as inside a museum) and perform local preprocessing;
  • the outdoor scanner 103 is used to scan 3D data of an outdoor environment (such as an outdoor exhibition) and perform local preprocessing;
  • the object scanner 104 is used to scan 3D data of objects (such as sneakers, pens, etc.) and perform local preprocessing;
  • the human body scanner 105 is used to scan the three-dimensional data of the human body and perform local preprocessing
  • the third-party 3D data acquisition module 106 is used to acquire 3D data of the object to be displayed from a third party (such as a scan model provider).
  • the aerial scanner 101 may use a drone or other aerial scanning equipment with a camera, which can acquire scan data of a certain area (for example, a certain city) through aerial scanning.
  • the indoor scanner 102 may be a handheld scanning device (such as a camera with a support frame) or other automatic scanning devices (such as an automatic scanning robot).
  • the outdoor scanner 103 may be a handheld scanning device (such as a camera with a support frame) or other automatic scanning equipment (such as an automatic scanning robot).
  • the object scanner 104 may be a handheld scanning device (such as an RGB-D camera with a support frame, etc.).
  • the human body scanner 105 may be an existing human body scanner specifically for human body modeling.
  • the aerial scanner 101, the indoor scanner 102, the outdoor scanner 103, the object scanner 104 and the human body scanner 105 are all integrated with GPU chips, which can perform preliminary processing on the collected data, such as two-dimensional images, locally (for example, two-dimensional images are initially stitched according to depth information), which reduces the processing burden on the cloud.
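
A common form of the initial stitching according to depth information mentioned above is back-projecting each depth pixel into a 3D point with the camera intrinsics. The sketch below shows that standard pinhole-camera step; the intrinsic values are placeholders, and this is not claimed to be the patented preprocessing.

```python
import numpy as np

def depth_to_point_cloud(depth: np.ndarray, fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Back-project a depth image (in meters) into an N x 3 point cloud in camera coordinates."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop pixels with no depth reading

if __name__ == "__main__":
    depth = np.full((480, 640), 1.5)   # dummy flat depth map, 1.5 m everywhere
    cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
    print(cloud.shape)                 # (307200, 3)
```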
  • the three-dimensional cloud model library 11 includes:
  • the access management unit 111 is configured to authorize the user with corresponding access rights according to the user's valid login credentials
  • the intelligent processing unit 112 is used to perform model repair, editing, cropping, surface reduction, model reduction, compression, material processing, texture processing, denoising, coloring, and lighting processing to obtain the original three-dimensional model of the object to be displayed;
  • the link generating unit 113 is configured to generate a corresponding link according to the original three-dimensional model of the object to be displayed;
  • the database 114 is used to store the user's behavior data when viewing the interactive display content, the user's valid login credentials, the original three-dimensional model of the object to be displayed, and the corresponding link.
  • the user's valid login credential can be obtained by cooperating with the camera to perform face recognition.
  • the user can also register by inputting one or more personal details (such as name, age, location, gender, etc.) on the intelligent processing unit 112.
  • the user can be any user who has valid login credentials, and valid login credentials can be obtained by paying the fees for corresponding permissions.
  • the intelligent processing unit 112 integrates AI algorithms, and can automatically perform model repair, cropping, surface reduction, model reduction, compression, material processing, texture processing, and lighting processing on the three-dimensional data of the object to be displayed, with a high degree of intelligence.
  • the intelligent processing unit 112 may be an application program installed on a computing device.
  • the computing device may be, but not limited to, devices such as smart phones, tablet computers, notebook computers, smart watches, smart TVs, and computers.
  • it further includes a display cabinet, the air screen is installed on one surface of the display cabinet, and a booth for placing objects or real models of the objects is arranged in the display cabinet.
  • the display cabinet is composed of several planes and/or curved surfaces.
  • the display cabinet may be a rectangular parallelepiped and composed of six planes.
  • the display cabinet may be a variation of a rectangular parallelepiped in which one surface is a curved surface.
  • the display cabinet can be in the shape of other polyhedrons.
  • the space enclosed by the multiple surfaces of the display cabinet may be a closed space or an open space, such as a rectangular parallelepiped with an open top.
  • the display cabinet is equipped with a booth for placing objects or real models of objects.
  • the air screen is arranged on one surface of the display cabinet, and the air screen can be a transparent LCD touch screen or the like.
  • the air screen may be a flat screen or a curved screen.
  • the air screen can also be used to display two-dimensional images or text.
  • multiple sides of the display cabinet are air screens, which can display the three-dimensional model of the article in different directions.
  • part of the air screen can display information other than the three-dimensional model, such as two-dimensional images, text descriptions, product LOGO, and so on.
  • the air screen can also provide corresponding software buttons on the display interface for users to choose, such as QR code payment buttons, color switching buttons, and viewing angle switching buttons.
  • the air screen 13 is provided with a somatosensory sensor 131, a gesture sensor 132, an eye tracker 133, a camera 134, a touch screen 135, a voice collection module 136, a brain wave collection device 137, an ultrasonic sensor 138, and an infrared sensor 139.
  • the somatosensory sensor 131 is used to capture the user's somatosensory signal (such as the user's body motion signal when a car is driving in a simulation).
  • the gesture sensor 132 is used to capture the user's gesture signal (for example, a signal to switch the angle of view by grasping a certain commodity through the palm and thumb).
  • the eye tracker 133 is used to capture the user's eye movements. Eye tracking serves two purposes: first, to automatically switch the display content according to the focus of the eyeball; for example, if the user's gaze stays on one of multiple displayed three-dimensional models for several seconds, the air screen can automatically switch to the scene of that 3D model for further detailed display; second, to monitor the user's eye movements in real time.
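
The first purpose above, switching content when the gaze dwells on one of several displayed models, can be pictured as a dwell-time check over gaze samples. The sketch below is illustrative only; the region names and the 3-second threshold are invented for the example, not taken from the disclosure.

```python
import time

DWELL_SECONDS = 3.0          # assumed dwell threshold before switching the scene

def detect_dwell(gaze_samples, regions, dwell_seconds=DWELL_SECONDS):
    """Return the region the gaze has stayed in for at least `dwell_seconds`, else None.

    gaze_samples: list of (timestamp, x, y); regions: {name: (x0, y0, x1, y1)}.
    """
    current, since = None, None
    for t, x, y in gaze_samples:
        hit = next((n for n, (x0, y0, x1, y1) in regions.items()
                    if x0 <= x <= x1 and y0 <= y <= y1), None)
        if hit != current:
            current, since = hit, t          # gaze moved to a new region; restart the timer
        elif hit is not None and t - since >= dwell_seconds:
            return hit                       # dwell long enough: switch to this model's scene
    return None

if __name__ == "__main__":
    regions = {"car_model": (0, 0, 400, 400), "shoe_model": (400, 0, 800, 400)}
    t0 = time.time()
    samples = [(t0 + i * 0.5, 120, 150) for i in range(8)]   # 3.5 s inside "car_model"
    print(detect_dwell(samples, regions))                     # -> "car_model"
```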
  • the camera 134 is used to capture an image of the user.
  • the camera may adopt an RGB-D camera capable of simultaneously collecting two-dimensional face image information and depth information to obtain more accurate user images.
  • the touch screen 135 is used for the user to input touch instructions through touch.
  • the touch screen can be attached to the air screen or combined with the air screen into one.
  • the voice collection module 136 is used to collect the user's voice signal.
  • the user's voice signal can be the user's voice command signals such as "open the door", "turn on the air conditioner in the car", "change the body color to black", and so on.
  • the voice collection module may be a voice collection device such as a microphone.
  • the brain wave acquisition device 137 is used to collect the user's brain wave signal so as to recognize the user's ideas or thoughts, so as to perform corresponding operations on the three-dimensional model of the object displayed on the air screen, such as color switching, visual switching, and so on.
  • the ultrasonic sensor 138 is used to sense whether someone is approaching through ultrasonic waves, so that when someone approaches, the standby screen is switched to a three-dimensional model real-time display screen (that is, the air screen is called out), which is more intelligent.
  • the infrared sensor 139 is used to sense whether someone is approaching through infrared rays, so that when someone approaches, the standby screen is switched to a three-dimensional model real-time display screen (that is, the air screen is called out), and the degree of intelligence is higher.
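
Both proximity sensors serve the same wake-up purpose, which amounts to a threshold check; the wake distance and the function name below are placeholders for illustration only.

```python
WAKE_DISTANCE_M = 1.2      # assumed wake-up distance, not specified in the disclosure

def should_wake(ultrasonic_distance_m: float | None, infrared_detected: bool) -> bool:
    """Switch from the standby screen to the live 3D display when someone approaches."""
    near = ultrasonic_distance_m is not None and ultrasonic_distance_m < WAKE_DISTANCE_M
    return near or infrared_detected

# Example: an ultrasonic reading of 0.8 m, or any infrared detection, wakes the screen.
print(should_wake(0.8, False), should_wake(None, True), should_wake(None, False))
```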
  • the first communication unit 1801 of the mobile terminal 18 obtains the obtained link or the compressed and optimized three-dimensional model of the object to be displayed from the air screen 13, and the display unit 1802 displays the obtained link or the compressed and optimized three-dimensional model of the object to be displayed.
  • the control unit 1803 controls the three-dimensional model displayed by the display unit according to the input signal and generates a corresponding display control signal.
  • the second communication unit 1804 generates a display control signal and sends the signal to the display module 19, which may be the second smart display screen, so that the content displayed on the second smart display screen is synchronized with the content displayed on the mobile terminal 18.
  • the first communication unit 1801 of the mobile terminal 18 obtains the obtained link or the compressed and optimized three-dimensional model of the object to be displayed from the rendering compression optimization unit 12, and the display unit 1802 displays the obtained link or the compressed and optimized three-dimensional model of the object to be displayed.
  • the control unit 1803 controls the three-dimensional model displayed by the display unit according to the input signal, and generates the corresponding display control signal.
  • the second communication unit 1804 generates the display control signal and sends the signal to the display module 19, which may be a second smart display screen, so that the content displayed on the second smart display screen is synchronized with the content displayed on the mobile terminal 18.
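
One hypothetical way to realize this synchronization is to push each display control signal from the mobile terminal to the second smart display over a plain TCP socket and have the display apply the same action the terminal applies locally; the address, port, and line-delimited JSON framing below are assumptions, not details from the disclosure.

```python
import json
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 9001      # assumed address/port of the second smart display

def display_server() -> None:
    """Second smart display: apply every control signal it receives."""
    with socket.socket() as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn, conn.makefile() as lines:
            for line in lines:
                print("second display applies:", json.loads(line))  # stand-in for rendering

def mobile_terminal_send(signal: dict) -> None:
    """Mobile terminal: apply the signal locally, then forward it so both stay in sync."""
    print("mobile terminal applies:", signal)
    with socket.create_connection((HOST, PORT)) as conn:
        conn.sendall((json.dumps(signal) + "\n").encode())

if __name__ == "__main__":
    threading.Thread(target=display_server, daemon=True).start()
    time.sleep(0.2)                                 # give the server a moment to start
    mobile_terminal_send({"type": "zoom", "params": {"factor": 2.0}})
    time.sleep(0.2)                                 # let the display print before exit
```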
  • the communication system 15 is further included, and the communication system 15 includes:
  • the voice message unit 151 is used to provide voice message service
  • the telephone customer service unit 152 is used to provide telephone customer service
  • the robot customer service unit 153 is used to provide robot voice customer service
  • the live video customer service unit 154 is used to provide live video customer service.
  • the present invention can also provide robot voice customer service (with models pre-trained through self-learning) and live video customer service (with models obtained by scanning and modeling real people, which can be customized), making the service modes more flexible.
  • the big data analysis backend 14 includes:
  • the newly added user statistics unit 141 is used to count the number of newly added users
  • the user retention statistics unit 142 is used to count the number of retained users
  • the activity analysis unit 143 is used to analyze the activity of the user
  • the user information analysis unit 144 is used to analyze the user's gender, age, registration information, and IP distribution;
  • the hot spot analysis unit 145 is used to analyze the user's viewing hot spot and generate a corresponding heat map
  • the user viewing behavior analysis unit 146 is used to perform user viewing behavior analysis.
  • the user viewing behavior analysis includes at least one of eye tracking, facial expression analysis, somatosensory motion analysis, mouse browsing trajectory analysis and video recording, gesture analysis, and brain wave analysis;
  • the information sharing unit 147 is used to share and publish the analysis results of the big data analysis background.
  • tracking the viewer's eye movements is mainly intended to identify the focus of the viewer's attention (when seeing a part of interest, the user's eyes behave differently than when not seeing it, for example the gaze dwells for a different length of time), so that the air screen can directly switch the three-dimensional model of the object to the focus or details the viewer is paying attention to (for example, if it is recognized that the user is paying attention to the body of a car, it directly switches to the three-dimensional model of the car body).
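
The viewing hot spots handled by the hot spot analysis unit 145 can be pictured as a 2D histogram of gaze points over the screen, which is then rendered as a heat map; the grid size and screen resolution below are arbitrary choices for illustration, not details from the disclosure.

```python
import numpy as np

def gaze_heatmap(gaze_points, screen_w=1920, screen_h=1080, grid=(27, 48)):
    """Accumulate (x, y) gaze points into a coarse grid; hotter cells mean more attention."""
    heat = np.zeros(grid)
    rows, cols = grid
    for x, y in gaze_points:
        r = int(np.clip(y / screen_h * rows, 0, rows - 1))
        c = int(np.clip(x / screen_w * cols, 0, cols - 1))
        heat[r, c] += 1
    return heat / max(heat.max(), 1)      # normalize to [0, 1] for rendering as a heat map

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    points = rng.normal([960, 540], [200, 120], size=(5000, 2))   # gaze clustered mid-screen
    print(gaze_heatmap(points).round(2))
```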
  • the facial expression recognition of the user is mainly used to recognize the viewer's current mood (such as happy, unhappy, etc.).
  • the information sharing unit 147 can share and publish the results of big data analysis to existing general social media, such as WeChat, Weibo, and blogs.
  • Fig. 2 as a further preferred embodiment, it further includes:
  • the air screen remote control module 16 is used to remotely control the display content of the air screen.
  • the remote control includes the production and playback of advertisements, the update of display content, the layout of the display content, and the scheduling of the display content.
  • the present invention also adds an air screen remote control module 16, which can remotely control the display content of the air screen, meets the personalized customization requirements of different users, and allows the air screen to be operated remotely anytime and anywhere, realizing cross-space data sharing and greater convenience.
  • the interactive display module includes a multi-dimensional visual real-scene shopping module 17, and the multi-dimensional visual real-scene shopping module 17 specifically includes:
  • the three-dimensional scene display unit 171 is used to provide automatic navigation services, scene switching services, three-dimensional real-world services, space roaming services, free walking services, two-dimensional floor plan services and three-dimensional house plan services;
  • the AI intelligent voice shopping guide unit 172 is used to provide multi-language voice commentary services, three-dimensional product display services, three-dimensional shopping guide services (guides obtained by scanning real people, customizable), AI smart voice answer services, and three-dimensional robot commentary services (pre-trained models).
  • a display control method based on a mobile terminal in this embodiment includes the following steps:
  • the three-dimensional cloud model library 11 performs intelligent processing on the three-dimensional data of the object to be displayed to obtain the original three-dimensional model of the object to be displayed and the corresponding link.
  • the intelligent processing includes model repair, editing, cropping, surface reduction, model reduction, compression, material processing, texture processing, denoising, coloring, and lighting processing;
  • the rendering compression optimization unit 12 performs rendering and compression optimization on the original three-dimensional model of the object to be displayed, to obtain a compressed and optimized three-dimensional model of the object to be displayed;
  • the air screen 13 interacts with the user, and interactively displays the compressed and optimized three-dimensional model of the object according to the result of the interaction; the specific display content of the air screen 13 can also be controlled by the air screen remote control module 16;
  • the big data analysis backend 14 analyzes and counts the behavior of users when viewing interactive display content
  • the 3D cloud model library 11 sends the link corresponding to the original 3D model of the object to be displayed to the first smart display or AR/VR device, so that the user can access the original 3D model of the object to be displayed by opening the link;
  • the mobile terminal 18 obtains the obtained link or the compressed and optimized three-dimensional model of the object to be displayed from the rendering compression optimization unit 12, and generates a display control signal;
  • the second smart display 19 performs display according to the display control signal.
  • the control unit 1803 of the mobile terminal controls the three-dimensional model displayed by the display unit 1802 and generates a control signal, which is sent to the second smart display 19 via the second communication unit 1804, so that the content displayed on the second smart display 19 is the same as that displayed on the mobile terminal 18, realizing display synchronization between the second smart display 19 and the mobile terminal 18.
  • the embodiments of the present invention can be realized or implemented by computer hardware, a combination of hardware and software, or by computer instructions stored in a non-transitory computer-readable memory.
  • the method can be implemented in a computer program using standard programming techniques, including a non-transitory computer-readable storage medium configured with a computer program, where the storage medium so configured causes the computer to operate in a specific and predefined manner, in accordance with the methods and figures described in the specific embodiments.
  • each program can be implemented in a high-level procedural or object-oriented programming language to communicate with the computer system. However, if necessary, the program can be implemented in assembly or machine language. In any case, the language can be a compiled or interpreted language. Furthermore, the program can be run on an application-specific integrated circuit programmed for this purpose.
  • the processes (or variants and/or combinations thereof) described herein can be executed under the control of one or more computer systems configured with executable instructions, and can be implemented as code (such as executable instructions, one or more computer programs, or one or more applications) executed collectively on one or more processors, by hardware, or by a combination thereof.
  • the computer program includes a plurality of instructions executable by one or more processors.
  • the method can be implemented in any suitable, operably connected type of computing platform, including but not limited to personal computers, minicomputers, mainframes, workstations, networked or distributed computing environments, separate or integrated computer platforms, or platforms in communication with charged-particle tools or other imaging devices, and so on.
  • Aspects of the present invention can be implemented by machine-readable codes stored on non-transitory storage media or devices, whether removable or integrated into computing platforms, such as hard disks, optical reading and/or writing storage media, RAM, ROM, etc., so that they can be read by a programmable computer, and when the storage medium or device is read by the computer, it can be used to configure and operate the computer to perform the processes described herein.
  • machine-readable code or part thereof, can be transmitted through a wired or wireless network.
  • machine-readable media include instructions or programs that implement the steps described above in combination with a microprocessor or other data processors
  • the invention described herein includes these and other different types of non-transitory computer-readable storage media.
  • the present invention also includes the computer itself.
  • a computer program can be applied to input data to perform the functions described herein, thereby transforming the input data to generate output data that is stored in non-volatile memory.
  • the output information can also be applied to one or more output devices such as displays.
  • the converted data represents physical and tangible objects, including specific visual depictions of physical and tangible objects generated on the display.

Abstract

The invention relates to a display control system and method based on a mobile terminal. Acquisition, processing, and rendering compression optimization are performed to obtain a three-dimensional model of an object to be displayed after compression optimization; a display control signal is then generated by the mobile terminal according to the compression-optimized three-dimensional model of the object to be displayed; a display module performs display according to the display control signal, and the display module is display-controlled by means of the additionally provided mobile terminal. This solves the problem that control or operation is not easy during human-computer interaction when the screen of the display module is too large, the mounting distance is long, or the module is suspended, and provides the advantages of convenient operation and a good user experience. The method can be widely applied in the technical field of display.
PCT/CN2019/109473 2019-01-21 2019-09-30 Système et procédé de commande d'affichage basés sur un terminal mobile WO2020151255A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910052164.1A CN109918005A (zh) 2019-01-21 2019-01-21 一种基于移动终端的展示控制系统及方法
CN201910052164.1 2019-01-21

Publications (1)

Publication Number Publication Date
WO2020151255A1 (fr)

Family

ID=66960428

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/109473 WO2020151255A1 (fr) 2019-01-21 2019-09-30 Système et procédé de commande d'affichage basés sur un terminal mobile

Country Status (2)

Country Link
CN (1) CN109918005A (fr)
WO (1) WO2020151255A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109918005A (zh) * 2019-01-21 2019-06-21 广东康云科技有限公司 一种基于移动终端的展示控制系统及方法
CN110298918A (zh) * 2019-08-02 2019-10-01 湖南海诚宇信信息技术有限公司 一种基于gpu实时三维建模显示装置及三维建模显示方法
CN111080799A (zh) * 2019-12-04 2020-04-28 广东康云科技有限公司 基于三维建模的场景漫游方法、系统、装置和存储介质
CN112286437A (zh) * 2020-10-27 2021-01-29 四川日报网络传媒发展有限公司 展示柜的交互展示设备和方法和装置、展示柜及计算机可读存储介质
CN112364403A (zh) * 2020-11-23 2021-02-12 四川大学 一种鞋类移动端3d智能搭配展示系统及方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104731343A (zh) * 2015-04-14 2015-06-24 上海云富网络科技有限公司 一种基于移动终端的虚拟现实人机交互儿童教育体验系统
US20160132962A1 (en) * 2013-06-17 2016-05-12 Spreadtrum Communications (Shanghai) Co. Ltd. Three-dimensional shopping platform displaying system
CN109085966A (zh) * 2018-06-15 2018-12-25 广东康云多维视觉智能科技有限公司 一种基于云计算的三维展示系统及方法
CN109918005A (zh) * 2019-01-21 2019-06-21 广东康云科技有限公司 一种基于移动终端的展示控制系统及方法

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106408620A (zh) * 2016-09-08 2017-02-15 成都希盟泰克科技发展有限公司 基于压缩感知的三维网格模型数据处理方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160132962A1 (en) * 2013-06-17 2016-05-12 Spreadtrum Communications (Shanghai) Co. Ltd. Three-dimensional shopping platform displaying system
CN104731343A (zh) * 2015-04-14 2015-06-24 上海云富网络科技有限公司 一种基于移动终端的虚拟现实人机交互儿童教育体验系统
CN109085966A (zh) * 2018-06-15 2018-12-25 广东康云多维视觉智能科技有限公司 一种基于云计算的三维展示系统及方法
CN109918005A (zh) * 2019-01-21 2019-06-21 广东康云科技有限公司 一种基于移动终端的展示控制系统及方法

Also Published As

Publication number Publication date
CN109918005A (zh) 2019-06-21

Similar Documents

Publication Publication Date Title
CN109085966B (zh) 一种基于云计算的三维展示系统及方法
WO2020151255A1 (fr) Système et procédé de commande d'affichage basés sur un terminal mobile
US10474336B2 (en) Providing a user experience with virtual reality content and user-selected, real world objects
US9479736B1 (en) Rendered audiovisual communication
CN108933892A (zh) 便携式电子设备及其控制方法
CN111541907B (zh) 物品显示方法、装置、设备及存储介质
CN106683197A (zh) 一种融合vr和ar技术的楼盘展示系统及其方法
CN107358007A (zh) 控制智能家居系统的方法、装置和计算可读存储介质
Cucchiara et al. Visions for augmented cultural heritage experience
US20220159178A1 (en) Automated eyewear device sharing system
CN107077750A (zh) 化身选择机制
JP2017536715A (ja) 立体空間の物理的な対話の発現
CN112199016B (zh) 图像处理方法、装置、电子设备及计算机可读存储介质
WO2020151432A1 (fr) Procédé et système de traitement de données pour visualisation de maison intelligente
CN103400543A (zh) 3d互动展示系统及其展示方法
WO2020151428A1 (fr) Système et procédé de surveillance visuelle intelligente 3d d'actions en direct
WO2020151425A1 (fr) Procédé et système d'affichage à commutation permettant une surveillance visuelle de scène réelle 3d
CN108594999A (zh) 用于全景图像展示系统的控制方法和装置
TWI795762B (zh) 用於在現實場景中疊加直播人物影像的方法和電子設備
CN108701355A (zh) Gpu优化和在线基于单高斯的皮肤似然估计
WO2020151430A1 (fr) Système d'imagerie dans l'air et son procédé de mise en œuvre
CN105472358A (zh) 一种关于视频图像处理的智能终端
CN113453027A (zh) 直播视频、虚拟上妆的图像处理方法、装置及电子设备
CN117292097B (zh) Ar试穿互动体验方法及系统
CN113014960B (zh) 一种在线制作视频的方法、装置及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19911537

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19911537

Country of ref document: EP

Kind code of ref document: A1
