NL2027188B1 - Augmented reality and artificial intelligence integrated interactive display platform - Google Patents

Augmented reality and artificial intelligence integrated interactive display platform

Info

Publication number
NL2027188B1
NL2027188B1
Authority
NL
Netherlands
Prior art keywords
screen
head
mounted device
computer system
augmented reality
Prior art date
Application number
NL2027188A
Other languages
Dutch (nl)
Other versions
NL2027188A (en)
Inventor
Wang Chang-Song
Lai Yen-Ting
Chang Jen-Cheng
Su Kai-Hung
Chi Pai-Hung
Original Assignee
Adat Tech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Adat Tech Co Ltd
Publication of NL2027188A
Application granted
Publication of NL2027188B1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16 - ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16M - FRAMES, CASINGS OR BEDS OF ENGINES, MACHINES OR APPARATUS, NOT SPECIFIC TO ENGINES, MACHINES OR APPARATUS PROVIDED FOR ELSEWHERE; STANDS; SUPPORTS
    • F16M11/00 - Stands or trestles as supports for apparatus or articles placed thereon; Stands for scientific apparatus such as gravitational force meters
    • F16M11/02 - Heads
    • F16M11/04 - Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand
    • F16M11/043 - Allowing translations
    • F16M11/046 - Allowing translations adapted to upward-downward translation movement
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16 - ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16M - FRAMES, CASINGS OR BEDS OF ENGINES, MACHINES OR APPARATUS, NOT SPECIFIC TO ENGINES, MACHINES OR APPARATUS PROVIDED FOR ELSEWHERE; STANDS; SUPPORTS
    • F16M11/00 - Stands or trestles as supports for apparatus or articles placed thereon; Stands for scientific apparatus such as gravitational force meters
    • F16M11/02 - Heads
    • F16M11/04 - Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand
    • F16M11/06 - Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand allowing pivoting
    • F16M11/08 - Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand allowing pivoting around a vertical axis, e.g. panoramic heads
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16 - ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16M - FRAMES, CASINGS OR BEDS OF ENGINES, MACHINES OR APPARATUS, NOT SPECIFIC TO ENGINES, MACHINES OR APPARATUS PROVIDED FOR ELSEWHERE; STANDS; SUPPORTS
    • F16M11/00 - Stands or trestles as supports for apparatus or articles placed thereon; Stands for scientific apparatus such as gravitational force meters
    • F16M11/02 - Heads
    • F16M11/18 - Heads with mechanism for moving the apparatus relatively to the stand
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16 - ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16M - FRAMES, CASINGS OR BEDS OF ENGINES, MACHINES OR APPARATUS, NOT SPECIFIC TO ENGINES, MACHINES OR APPARATUS PROVIDED FOR ELSEWHERE; STANDS; SUPPORTS
    • F16M11/00 - Stands or trestles as supports for apparatus or articles placed thereon; Stands for scientific apparatus such as gravitational force meters
    • F16M11/20 - Undercarriages with or without wheels
    • F16M11/24 - Undercarriages with or without wheels changeable in height or length of legs, also for transport only, e.g. by means of tubes screwed into each other
    • F16M11/26 - Undercarriages with or without wheels changeable in height or length of legs, also for transport only, e.g. by means of tubes screwed into each other by telescoping, with or without folding
    • F16M11/28 - Undercarriages for supports with one single telescoping pillar
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/017 - Head mounted
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 - Constructional details or arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 - Head tracking input arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 - Structure of client; Structure of client peripherals
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 - Structure of client; Structure of client peripherals
    • H04N21/4104 - Peripherals receiving signals from specially adapted client devices
    • H04N21/4122 - Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436 - Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363 - Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436 - Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363 - Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/43637 - Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442 - Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213 - Monitoring of end-user related data
    • H04N21/44218 - Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 - Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 - Monomedia components thereof
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 - Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 - Monomedia components thereof
    • H04N21/816 - Monomedia components thereof involving special video data, e.g. 3D video
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/64 - Constructional details of receivers, e.g. cabinets or dust covers
    • H04N5/655 - Construction or mounting of chassis, e.g. for varying the elevation of the tube

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Social Psychology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A platform includes a head-mounted device, a multifunctional base, a screen and a computer system. A user wears the head-mounted device which is configured to provide augmented reality content. The screen is disposed on the multifunctional base. The computer system is configured to provide video streaming to the screen. The screen rotates with respect to the user’s position. The user can experience augmented reality through this platform at arbitrary locations.

Description

-1-
AUGMENTED REALITY AND ARTIFICIAL INTELLIGENCE INTEGRATED INTERACTIVE DISPLAY PLATFORM
BACKGROUND
Field of Invention
[01] The present disclosure is related to an augmented reality (AR) and artificial intelligence (AI) integrated interactive display platform for users to experience both AI and AR.
Description of Related Art
[02] Augmented reality (AR) and artificial intelligence (AI) experience devices on the present market have only a limited range of movement, and thus users can only experience AR at a fixed location. For example, if the AR and AI functions are designed for a huge machine which cannot be moved around, the users can only experience the AR and AI functions near the machine.
[03] CN 109309873 A discloses a screen angle adjusting method applied to an intelligent terminal. The method comprises the following steps: acquiring image data captured by a camera device; determining a position of a target object included in the image data; determining user visual angle data according to the position of the target object; and adjusting the screen angle according to the user visual angle data.
[04] US 2018/220110 A1 discloses a device comprising a housing comprising a processor disposed within the housing; a vapor projection component disposed within the center of the housing and communicatively coupled to the processor, the vapor projection component comprising: a water tank situated on the interior side of the vapor projection component, a set of tubes positioned on the exterior side of the vapor projection component, a transducer configured to nebulize water in the water tank, a movable belt comprising a plurality of apertures movable respective to the set of tubes, a variable speed fan configured to exert positive pressure on the nebulized water while the transducer nebulizes the water in the water tank and force the nebulized water through the apertures and the set of tubes, generating a set of nebulized vapor jets extending outward from the surface of the vapor projection component; and a projection device configured to display an image on the nebulized vapor jets.
[05] US 2013/314406 A1 discloses a method for creating a naked-eye effect, and particularly relates to a method for creating a naked-eye effect without requiring a display hologram, special optical film, or 3D glasses. This method includes the following steps: (1) detecting a rotating angle or moving position of a portable device by a detecting unit; (2) creating a new image of an object shown in a display according to the rotating angle or moving position of the portable device by an image processing unit; and (3) displaying the new image of the object in the display instead of the original image of the object. By this method, a different image of the same object with different visual angles is displayed at different times, which leads the brain of a person to consider that the image of the object is a 3D image. Therefore, a naked-eye effect can be created.
[06] US 2007/215776 A1 discloses a display including a body and an elevation adjusting base is provided. The elevation adjusting base is secured to the body. The elevation adjusting base includes a base, an elevation adjusting mechanism, an outer bushing, an inner bushing and a friction ring. The elevation adjusting mechanism is connected to the base. The outer bushing is connected to the base and mounted on the elevation adjusting mechanism. The inner bushing is embedded inside the outer bushing. The friction ring is disposed between the outer bushing and the inner bushing. After the elevation of the elevation adjusting mechanism is adjusted, the friction ring provides a friction force between the outer bushing and the inner bushing to fix the elevation of the elevation adjusting mechanism. Users can accurately adjust the elevation of the display panel according to the view-angle and personal preferences.
SUMMARY
[07] Embodiments of the present disclosure provide an augmented reality and artificial intelligence integrated interactive display platform including a head-mounted device, a multifunctional base, a screen, and a computer system. The head-mounted device is configured to provide augmented reality content. The multifunctional base includes a screen controlling unit. The screen is disposed on the multifunctional base and controlled by the screen controlling unit. The computer system is communicatively connected to the head-mounted device, the screen, and the multifunctional base, and is configured to receive current position information from the head-mounted device, obtain a view angle of the head-mounted device according to the current position information, determine the content of a video streaming according to the view angle, and transmit the video streaming to the screen. The computer system is also configured to transmit a control signal to the screen controlling unit according to the view angle to rotate the screen such that a rotation angle of the screen is in response to the view angle. The video streaming includes a 3D object, and the computer system is configured to rotate the 3D object according to the view angle. The multifunctional base includes removable hollow pillars which are connected in series. One end of the series of removable hollow pillars is connected to the screen, and the other end is connected to the screen controlling unit. A power cable of the screen is disposed in the removable hollow pillars and is electrically connected to a power supply of the multifunctional base through a brush. The multifunctional base further includes removable hollow protection shells which are connected in series and surround the removable hollow pillars. A height of the removable hollow protection shells is lower than that of the removable hollow pillars. The multifunctional base further includes a case, and a length of each of the removable hollow pillars and a length of each of the removable hollow protection shells are shorter than a length of the case.
[08] In some embodiments, the computer system is further configured to display a marker on the screen, and the head-mounted device includes an image sensor to capture an image of the marker. The computer system is configured to recognize the marker in the image to calculate a position of the head-mounted device relative to the screen.
[09] In some embodiments, the position of the head-mounted device relative to the screen includes an initial vector. The current position information includes a three-dimensional coordinate of the head-mounted device, and the computer system is configured to calculate a difference between the three-dimensional coordinate of the head-mounted device and a three-dimensional coordinate of the screen to obtain a current vector, and to calculate the view angle according to the initial vector and the current vector.
[10] In some embodiments, the video streaming includes multiple frames of an object, and each of the frames corresponds to a different angle of the object. The computer system is configured to determine which of the frames to show on the screen according to the view angle.
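Read together, paragraphs [07] to [10] describe a repeating server-side sequence: receive the head-mounted device's coordinates, derive the view angle, rotate the screen, and pick the content that matches the angle. The following Python sketch only illustrates that sequence; the callback names (receive_position, compute_view_angle, send_rotation_command, send_frame) are hypothetical placeholders for transport and geometry details that the disclosure leaves open, and are not part of the claimed platform.

```python
def server_iteration(screen_xyz, initial_vec, num_frames,
                     receive_position, compute_view_angle,
                     send_rotation_command, send_frame):
    """One illustrative pass of the computer system's control loop (assumed structure)."""
    hmd_xyz = receive_position()                                # current position information from the HMD
    current_vec = [h - s for h, s in zip(hmd_xyz, screen_xyz)]  # current vector Vb = HMD minus screen
    theta = compute_view_angle(initial_vec, current_vec)        # view angle from Va and Vb, in degrees
    send_rotation_command(theta)                                # base rotates the screen in response to theta
    frame_index = int(num_frames * (theta % 360) / 360) % num_frames
    send_frame(frame_index)                                     # stream the frame matching the view angle
    return theta, frame_index
```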
BRIEF DESCRIPTION OF THE DRAWINGS
[11] The invention can be more fully understood by reading the following detailed description of the embodiment, with reference made to the accompanying drawings as follows.
[12] FIG. 1 is a diagram illustrating an augmented reality and artificial intelligence integrated interactive display platform in accordance with some embodiments.
[13] FIG. 2 is a schematic diagram of the AIR integrated interactive display platform in accordance with some embodiments.
[14] FIG. 3 is a block diagram of the head-mounted device in accordance with some embodiments.
[15] FIG. 4A and FIG. 4B are diagrams of rotating the screen according to the view angle in accordance with some embodiments.
[16] FIG. 5 is a diagram illustrating frames of the video streaming having different angles in accordance with an embodiment.
[17] FIG. 6 is a front view of the AIR integrated interactive display platform in accordance with an embodiment.
DETAILED DESCRIPTION
[18] Specific embodiments of the present invention are further described in detail below with reference to the accompanying drawings; however, the embodiments described are not intended to limit the present invention, and the description of operations is not intended to limit the order of implementation. Moreover, any device with equivalent functions that is produced from a structure formed by a recombination of elements shall fall within the scope of the present invention. Additionally, the drawings are only illustrative and are not drawn to actual size.
[19] The terms “first”, “second”, “third”, etc. in the specification should be understood as identifying units or data described by the same terminology, and do not refer to a particular order or sequence.
[20] FIG. 1 is a diagram illustrating an augmented reality and artificial intelligence integrated interactive display platform in accordance with some embodiments. Referring to FIG. 1, the platform includes a head-mounted device 110, a screen 120, a computer system 130, and a multifunctional base 140. A user can wear the head-mounted device (HMD) 110 to experience augmented reality (AR) and artificial intelligence (AI) functions. Therefore, the computer system 130 may be referred to as an AI and AR (AIR) server.
[21] FIG. 2 is a schematic diagram of the AIR integrated interactive display platform in accordance with some embodiments. The multifunctional base 140 includes a screen controlling unit 141 (e.g., including a motor 141a and a motor controller 141b) and a communication module 142. The screen 120 may be a liquid crystal screen or an organic light-emitting diode (OLED) screen. In some embodiments, the screen 120 is a touch screen. A Wi-Fi wireless communication device is mounted on the screen 120, or the screen 120 itself supports Wi-Fi, for receiving a video streaming from the computer system 130. The screen 120 is disposed on the multifunctional base 140 and controlled by the screen controlling unit 141 such that the screen 120 can rotate 360 degrees on the multifunctional base 140. The multifunctional base 140 is also electrically connected to the screen 120 for providing power. The computer system 130 is communicatively connected to the head-mounted device 110 and the screen 120 by wireless communication means such as Wi-Fi. In some embodiments, the communication module 142 is a Wi-Fi module for communicatively connecting to the computer system 130 and the head-mounted device 110. In some embodiments, the communication module 142 is a wired communication interface such as RS232 for communicatively connecting to the computer system 130.
[22] FIG. 3 is a block diagram of the head-mounted device in accordance with some embodiments. Referring to FIG. 3, the head-mounted device 110 includes a processor 310, a display 320, an image sensor 330, an inertial measurement unit (IMU) 340, a depth sensor 350, and a communication module 360. The processor 310 may be a central processing unit, a microcontroller, a microprocessor, or an application specific integrated circuit. The display 320 may be a transparent display for the user to see both the real scene and virtual objects on the display 320. The image sensor 330 may include charge-coupled device (CCD) sensors, complementary metal-oxide-semiconductor (CMOS) sensors, or other suitable light sensors. The inertial measurement unit 340 may include accelerometers, angular velocity sensors, magnetometers, etc. The depth sensor 350 may include an infrared emitter, an infrared sensor, dual cameras, a structured light sensing device, or any suitable device which is capable of sensing the depth of a scene. The communication module 360 may include a Wi-Fi module and/or a Bluetooth module.
[23] The head-mounted device 110 provides augmented reality content which may be related to a standard operating procedure (SOP), for example, to check or repair mechanical equipment. The head-mounted device 110 may also have navigation functions. For example, the head-mounted device 110 may perform simultaneous localization and mapping (SLAM) algorithms using the image sensor 330 and the depth sensor 350, or calculate its own direction and location by the inertial measurement unit 340.
[24] The video streaming shown on the screen 120 is, for example, related to a machine. When the user moves around, the screen 120 rotates correspondingly, and thus the user sees the screen 120 as a simulated machine to experience AR. In particular, the image sensor 330 captures images of the simulated machine to perform AI modules such as object detection and object recognition. Accordingly, the user can also experience AI at the same time.
[25] FIG. 4A and FIG. 4B are diagrams of rotating the screen according to the view angle in accordance with some embodiments. In the initial stage, the computer system 130 shows a marker 410 on the screen 120. The marker 410 has a particular shape, color, texture, or wording. Prompt messages may be shown on the screen 120 to ask the user to look at the marker 410. Accordingly, the image sensor 330 of the head-mounted device 110 captures an image of the marker 410, and the head-mounted device 110 transmits the image to the computer system 130, which recognizes the marker in the image. The computer system 130 can calculate the position of the head-mounted device 110 relative to the screen 120 because the size and the shape of the marker 410 on the screen 120 are predetermined. The relative position is represented as a vector Va. For example, the computer system 130 can calculate the distance between the screen 120 and the head-mounted device 110 according to the size of the marker in the image, and a view angle is set to be zero because the head-mounted device 110 should face toward the screen 120. In some embodiments, the computer system 130 can calibrate the view angle according to the shape variation of the marker in the image. In some embodiments, the vector Va can be used to calculate absolute coordinates. For example, the three-dimensional (3D) coordinates of the screen 120 can be set to predetermined values, and the X coordinate and Y coordinate of the head-mounted device 110 can be calculated according to the vector Va and the 3D coordinates of the screen 120. The Z coordinate of the head-mounted device 110 can be calculated according to the height of the screen 120 and the shape variation of the marker in the image. In some embodiments, the head-mounted device 110 can calculate its own 3D coordinates based on SLAM technology, and add the 3D coordinates to the vector Va to obtain the 3D coordinates of the screen 120. In some embodiments, the recognition of the marker in the image is performed by the head-mounted device 110. In this case, the head-mounted device 110 calculates the distance between the screen 120 and the head-mounted device 110 according to the size of the marker in the image, and the view angle is set to be zero.
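The distance estimate in paragraph [25] can be illustrated with the usual pinhole-camera relation: if the physical width of the marker and the camera's focal length in pixels are known, the marker's apparent width in the captured image gives the range. This is only a sketch of one way to obtain the vector Va; the focal-length value, the axis convention, and the function names are assumptions for illustration, not part of the disclosure.

```python
def estimate_distance_m(marker_width_m, marker_width_px, focal_length_px):
    """Pinhole-camera range estimate: distance = f * W / w (lengths in metres and pixels)."""
    return focal_length_px * marker_width_m / marker_width_px

def initial_vector(marker_width_m, marker_width_px, focal_length_px):
    """Initial vector Va from the screen to the head-mounted device.
    The view angle is defined as zero here because the user is asked to face the marker,
    so Va points along the screen normal (taken as the +X axis by assumption)."""
    d = estimate_distance_m(marker_width_m, marker_width_px, focal_length_px)
    return (d, 0.0, 0.0)

# Example: a 0.30 m wide marker seen as 600 px by a camera with f = 1000 px
# gives an estimated distance of 0.5 m.
print(initial_vector(0.30, 600, 1000.0))   # (0.5, 0.0, 0.0)
```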
[26] When the user moves, referring to FIG. 4B, the head-mounted device 110 transmits its current position information to the computer system 130. In some embodiments, the current position information includes the 3D coordinates of the head-mounted device 110. The computer system 130 calculates the difference between the 3D coordinates of the head-mounted device 110 and the 3D coordinates of the screen 120 to obtain a current vector Vb. The computer system 130 calculates a view angle θ according to the vector Va and the vector Vb. In some embodiments, the head-mounted device 110 calculates the view angle θ, which is included in the current position information. Herein, “the computer system 130 obtains the view angle according to the current position information” includes the aforementioned embodiments. Next, the computer system 130 transmits a control signal based on the view angle θ to the screen controlling unit 141 to rotate the screen 120 such that the rotation angle of the screen 120 is in response to the view angle θ. For example, the screen 120 can rotate θ degrees so that the user still faces toward the screen 120.
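Paragraph [26] derives the view angle θ from the initial vector Va and the current vector Vb. A minimal way to do this, assuming the screen rotates about a vertical axis so that only the horizontal components matter, is the signed angle between the two vectors' X-Y projections; the sign convention (counter-clockwise positive) below is an assumption for illustration rather than something the disclosure specifies.

```python
import math

def view_angle_deg(va, vb):
    """Signed horizontal angle (degrees) from the initial vector Va to the current vector Vb.
    Positive means the user walked counter-clockwise around the screen (assumed convention)."""
    ax, ay = va[0], va[1]
    bx, by = vb[0], vb[1]
    return math.degrees(math.atan2(ax * by - ay * bx,   # 2D cross product -> sine term
                                   ax * bx + ay * by))  # dot product      -> cosine term

# Calibration put the user 0.5 m in front of the screen: Va = (0.5, 0, 0).
# After moving, the reported coordinates give Vb = (0.5, 0.5, 0).
print(view_angle_deg((0.5, 0.0, 0.0), (0.5, 0.5, 0.0)))   # 45.0
```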
[27] In particular, the computer system 130 determines the content of the video streaming according to the view angle θ, and transmits the video streaming to the screen 120. The video streaming is, for example, related to a 3D object in a simulation scene. The 3D object may be a tool machine, a traffic machine, a military weapon, a building, or furniture, which is not limited in the disclosure. The computer system 130 rotates the 3D object by -θ degrees, and thus the user would feel like watching the real object. In some embodiments, the user can take the screen 120 as the machine to perform an interactive SOP such as checking or repairing the machine. Since the screen 120 can rotate 360 degrees, the user can walk around the machine to perform the SOP from any angle. In some embodiments, the computer system 130 can provide AI modules for object detection, object recognition, and abnormality detection. The images sensed by the image sensor 330 of the head-mounted device 110 are transmitted to the computer system 130 to perform the AI modules, and the result is transmitted back to the head-mounted device 110. For example, the AI module can detect an abnormality of the machine, and the SOP for repairing the machine is shown on the head-mounted device 110. In some embodiments, some SOP options are shown on the screen 120, and thus the user can select (by touching) one of the options to determine whether he/she is familiar with the SOP.
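In paragraph [27] the computer system counter-rotates the rendered 3D object by -θ so that, combined with the physical rotation of the screen, the user keeps seeing the object from the matching side. Below is a minimal sketch of such a counter-rotation about the vertical axis; applying it vertex by vertex before rendering is an assumption about the rendering pipeline, not something the disclosure prescribes.

```python
import math

def rotate_about_z(vertex, angle_deg):
    """Rotate a single (x, y, z) vertex about the vertical Z axis by angle_deg degrees."""
    a = math.radians(angle_deg)
    x, y, z = vertex
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

def counter_rotate_model(vertices, view_angle_deg):
    """Counter-rotate the 3D object by -theta, as described for the video streaming content."""
    return [rotate_about_z(v, -view_angle_deg) for v in vertices]

# A vertex on the +X side of the model, after the user has walked 90 degrees around the screen:
print(counter_rotate_model([(1.0, 0.0, 0.0)], 90.0))   # [(0.0, -1.0, 0.0)] up to rounding
```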
[28] FIG. 5 is a diagram illustrating frames of the video streaming having different angles in accordance with an embodiment. In the embodiment of FIG. 5, the aforementioned video streaming includes frames 510 of an object 520. Each frame 510 corresponds to a different angle of the object 520. Note that the object 520 is merely for description, and the user only sees the frames 510 shown on the screen 120. The computer system 130 determines which of the frames 510 to show on the screen 120 according to the view angle θ. For example, there are N frames 510, where N is a positive integer. The computer system 130 can select the (N × θ/360)-th frame 510 to show on the screen 120.
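The frame choice in paragraph [28] is a direct index computation. A small sketch follows, with the wrap-around and integer truncation made explicit; the exact rounding rule is an assumption, since the disclosure only gives the N × θ/360 relation.

```python
def frame_index(num_frames, view_angle_deg):
    """Pick which of the N pre-rendered frames to show for a view angle of theta degrees."""
    theta = view_angle_deg % 360.0            # keep the angle in [0, 360)
    return int(num_frames * theta / 360.0) % num_frames

# With 36 frames (one per 10 degrees), a view angle of 95 degrees selects frame 9.
print(frame_index(36, 95.0))   # 9
```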
[29] In some embodiments, the head-mounted device 110, the screen 120, the computer system 130, and the multifunctional base 140 are packed as a flight case, and thus the user can carry this flight case to demonstrate the combination of AI and AR anywhere. FIG. 6 is a front view of the AIR integrated interactive display platform in accordance with an embodiment. Referring to FIG. 6, the multifunctional base 140 includes removable hollow pillars 611 and 612 which are connected in series. One end of the hollow pillars 611 and 612 is connected to the screen 120, and the other end is connected to the screen controlling unit 141 through gears 613 and 614. When the screen controlling unit 141 rotates the gear 614, the gear 613 and the screen 120 also rotate. In addition, a power cable 615 of the screen 120 is disposed in the hollow pillars 611 and 612 and is electrically connected to a power supply 616 through a brush. As a result, the screen 120 can rotate 360 degrees. Note that the screen 120 cannot receive the video streaming through a cable because the screen 120 rotates. As described above, the screen 120 receives the video streaming through Wi-Fi. The multifunctional base 140 also includes removable hollow protection shells 621 and 622 which are connected in series and surround the hollow pillars 611 and 612 to stabilize the screen 120. In some embodiments, the height of the serially connected hollow protection shells 621 and 622 is lower than that of the hollow pillars 611 and 612. The hollow protection shells 621, 622 and the hollow pillars 611, 612 can be individually disassembled and stored in a case 630. Therefore, a length L1 of each of the hollow pillars 611 and 612 and a length L2 of each of the hollow protection shells 621 and 622 are shorter than a length L3 of the case 630. In some embodiments, the head-mounted device 110, the screen 120, and the computer system 130 can also be stored in the case 630, and therefore the user can merely carry the case 630 to demonstrate AR and AI anywhere.
[30] The disclosed platform provides users with a 360-degree AR experience of a real object together with AI functions. At the same time, SOP simulation is also integrated into this platform. The platform introduces a 360-degree rotating screen so that users can walk around the object.
[31] Although the present invention has been described in considerable detail with reference to certain embodiments thereof, other embodiments are possible. It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the present invention without departing from the scope of the invention. In view of the foregoing, it is intended that the present invention covers modifications and variations of this invention provided they fall within the scope of the following claims.

Claims (4)

1. An augmented reality and artificial intelligence integrated interactive display platform comprising: a head-mounted device (110) configured to provide augmented reality content; a multifunctional base (140) comprising a screen controlling unit (141); a screen (120) disposed on the multifunctional base (140) and controlled by the screen controlling unit (141); and a computer system (130) communicatively connected to the head-mounted device (110), the screen (120), and the multifunctional base (140), and configured to receive current position information from the head-mounted device (110), obtain a view angle of the head-mounted device (110) according to the current position information, rotate a 3D object of a video streaming according to the view angle, and transmit the video streaming to the screen (120), wherein the computer system (130) is further configured to transmit a control signal to the screen controlling unit (141) according to the view angle to rotate the screen (120) such that a rotation angle of the screen (120) is in response to the view angle, wherein the multifunctional base (140) further comprises: a plurality of removable hollow pillars (611, 612) connected in series between the screen (120) and the screen controlling unit (141), wherein a power cable (615) of the screen is disposed in the removable hollow pillars (611, 612) and is electrically connected to a power supply (616) of the multifunctional base (140) through a brush; a plurality of removable hollow protection shells (621, 622) connected in series and surrounding the removable hollow pillars (611, 612), wherein a height of the removable hollow protection shells (621, 622) is lower than that of the removable hollow pillars (611, 612); and a case (630), wherein a length of each of the removable hollow pillars (611, 612) and a length of each of the removable hollow protection shells (621, 622) are shorter than a length of the case (630).
2. The augmented reality and artificial intelligence integrated interactive display platform of claim 1, wherein the computer system (130) is further configured to display a marker (410) on the screen (120), and the head-mounted device (110) comprises an image sensor to capture an image of the marker (410), wherein the computer system (130) is configured to recognize the marker (410) in the image to calculate a position of the head-mounted device (110) relative to the screen (120).
3. The augmented reality and artificial intelligence integrated interactive display platform of claim 2, wherein the position of the head-mounted device (110) relative to the screen comprises an initial vector (Va), wherein the current position information comprises a three-dimensional coordinate of the head-mounted device (110), and the computer system (130) is configured to calculate a difference between the three-dimensional coordinate of the head-mounted device and a three-dimensional coordinate of the screen (120) to obtain a current vector (Vb), and to calculate the view angle according to the initial vector (Va) and the current vector (Vb).
4. The augmented reality and artificial intelligence integrated interactive display platform of claim 3, wherein the video streaming comprises a plurality of frames (510) of an object, and each of the frames (510) corresponds to a different angle of the object, wherein the computer system (130) is configured to determine which of the frames to show on the screen (120) according to the view angle.
NL2027188A 2019-12-31 2020-12-21 Augmented reality and artificial intelligence integrated interactive display platform NL2027188B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962955450P 2019-12-31 2019-12-31
TW108217529U TWM596351U (en) 2019-12-31 2019-12-31 Augmented reality and artificial intelligence integrated interactive display platform

Publications (2)

Publication Number Publication Date
NL2027188A NL2027188A (en) 2021-08-23
NL2027188B1 true NL2027188B1 (en) 2021-10-28

Family

ID=72177131

Family Applications (1)

Application Number Title Priority Date Filing Date
NL2027188A NL2027188B1 (en) 2019-12-31 2020-12-21 Augmented reality and artificial intelligence integrated interactive display platform

Country Status (3)

Country Link
GB (1) GB2594121B (en)
NL (1) NL2027188B1 (en)
TW (1) TWM596351U (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10809173B2 (en) 2017-12-15 2020-10-20 Analog Devices, Inc. Smoke detector chamber boundary surfaces
USD920825S1 (en) 2018-11-06 2021-06-01 Analog Devices, Inc. Smoke detector chamber
USD918756S1 (en) 2018-11-06 2021-05-11 Analog Devices, Inc. Smoke detector boundary
US10921367B2 (en) 2019-03-06 2021-02-16 Analog Devices, Inc. Stable measurement of sensors methods and systems
US11796445B2 (en) 2019-05-15 2023-10-24 Analog Devices, Inc. Optical improvements to compact smoke detectors, systems and apparatus

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI327657B (en) * 2006-03-16 2010-07-21 Qisda Corp Display panel and elevation adjusting base of the same
TW201349844A (en) * 2012-05-23 2013-12-01 Univ Nat Taiwan Method for creating a naked-eye 3D effect
US10477168B2 (en) * 2017-01-27 2019-11-12 Otoy, Inc. Headphone based modular VR/AR platform with vapor display
CN109309873B (en) * 2017-07-28 2021-05-25 腾讯科技(深圳)有限公司 Screen angle adjusting method and device and storage medium

Also Published As

Publication number Publication date
TWM596351U (en) 2020-06-01
GB2594121A (en) 2021-10-20
NL2027188A (en) 2021-08-23
GB202020687D0 (en) 2021-02-10
GB2594121B (en) 2022-05-25

Similar Documents

Publication Publication Date Title
NL2027188B1 (en) Augmented reality and artificial intelligence integrated interactive display platform
US11557134B2 (en) Methods and systems for training an object detection algorithm using synthetic images
US20240037880A1 (en) Artificial Reality System with Varifocal Display of Artificial Reality Content
CN107408314B (en) Mixed reality system
US10878285B2 (en) Methods and systems for shape based training for an object detection algorithm
US9858707B2 (en) 3D video reconstruction system
US10884576B2 (en) Mediated reality
JP5869712B1 (en) Head-mounted display system and computer program for presenting a user's surrounding environment in an immersive virtual space
EP2847616B1 (en) Surveying apparatus having a range camera
EP3118722A1 (en) Mediated reality
EP2279469A1 (en) Display of 3-dimensional objects
JP2021060627A (en) Information processing apparatus, information processing method, and program
US10564801B2 (en) Method for communicating via virtual space and information processing apparatus for executing the method
US20200209953A1 (en) Positioning system
US10559131B2 (en) Mediated reality
US9679352B2 (en) Method for operating a display device and system with a display device
CN108446012B (en) Virtual reality moving picture display method and virtual reality device
US10885319B2 (en) Posture control system
CN114253389A (en) Augmented reality system and augmented reality display method integrating motion sensor
EP3115969A1 (en) Mediated reality
KR20170125617A (en) Virtual reality apparatus and method for thereof