KR20170101703A - System and apparatus for providing virtual experience - Google Patents

System and apparatus for providing virtual experience

Info

Publication number
KR20170101703A
Authority
KR
South Korea
Prior art keywords
data
sensory information
virtual experience
information
image
Prior art date
Application number
KR1020160024605A
Other languages
Korean (ko)
Inventor
조은식
Original Assignee
주식회사 엔큐브
Priority date
Filing date
Publication date
Application filed by 주식회사 엔큐브
Priority to KR1020160024605A
Publication of KR20170101703A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B20/00Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B20/10Digital recording or reproducing
    • H04N13/0007
    • H04N13/0048
    • H04N13/0429
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/2257
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • Economics (AREA)
  • Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A virtual experience providing system and apparatus are provided. The virtual experience providing system includes a data generating device that acquires a plurality of different kinds of sensory information by sensing the surrounding environment and generates virtual experience data by combining the acquired sensory information, a data management device that extracts the sensory information for each reproduction point from the virtual experience data and transmits it, and a data reproduction device that receives and reproduces the sensory information for each reproduction point.

Description

TECHNICAL FIELD

The present invention relates to a virtual experience providing system and apparatus.

Virtual reality is a field of information technology that allows people, through interaction with the human senses in a virtual space constructed using a computer, to indirectly experience situations that cannot be experienced directly in the real world due to spatial and physical constraints. The ultimate goal of virtual reality is to provide a more realistic environment for communicating with a computer by employing a variety of input and output methods that enhance human-computer interaction.

By wearing a device that realizes virtual reality, the user can receive the various kinds of sensory information the device provides. For example, a device may provide visual, auditory, and tactile information.

Korean Patent Publication No. 10-2014-0013356

A problem to be solved by the present invention is to generate virtual experience data simultaneously with image data, and to provide a virtual experience providing apparatus that reproduces the generated virtual experience data.

The objects of the present invention are not limited to those mentioned above; other objects not mentioned here will be clearly understood by those skilled in the art from the following description.

In order to achieve the above object, one aspect of the virtual experience providing system of the present invention includes a data generating device that acquires a plurality of different kinds of sensory information by sensing the surrounding environment and generates virtual experience data by combining the acquired sensory information, a data management device that extracts the sensory information for each reproduction point from the virtual experience data and transmits it, and a data reproduction device that receives and reproduces the sensory information for each reproduction point.

The sensory information includes at least one of visual information, auditory information, and tactile information.

The data generation device generates the virtual experience data by synchronizing the plurality of sensory information.

The data generation device generates the virtual experience data either by matching the sensory information to each image frame or by matching the time stamps assigned to the plurality of sensory information.

The data management apparatus extracts and transmits the sensory information for each image frame, or extracts and transmits the sensory information for the same point in time by referring to the time stamps assigned to the plurality of sensory information.

The data management apparatus transmits the sensory information in a broadcast manner.

The data reproducing apparatus includes at least one of a video reproducing apparatus, an audio reproducing apparatus, a motion providing apparatus, a tactile providing apparatus, and a smell providing apparatus.

One aspect of the virtual experience providing apparatus of the present invention includes an image generating unit that photographs the surrounding environment and generates an image, an environment sensing unit that senses the surrounding environment during the generation of the image to acquire a plurality of different kinds of sensory information, and a data generation unit that generates virtual experience data by combining the image and the plurality of sensory information.

The data generating unit generates the virtual experience data by synchronizing each of the plurality of sensory information with the image.

The data generation unit generates the virtual experience data by matching sensory information to each image frame constituting the image, or by matching the time stamps assigned to the plurality of sensory information.

The sensory information includes at least one of auditory information and tactile information.

Another aspect of the virtual experience providing apparatus of the present invention includes a posture adjusting unit having a base plate and at least two moving supports whose heights are independently adjustable with respect to the base plate, a support plate supported by the at least two moving supports, and a control unit that controls the posture adjusting unit to adjust the posture of the support plate.

The posture adjusting unit adjusts the height of the at least two moving supports by the power of the battery.

The posture adjusting unit further includes a fixed support that supports the support plate at a fixed height with respect to the base plate.

The details of other embodiments are included in the detailed description and drawings.

FIG. 1 is a diagram illustrating a virtual experience providing system according to an embodiment of the present invention.
FIG. 2 is a block diagram showing a data generating apparatus according to an embodiment of the present invention.
FIG. 3 is a block diagram illustrating a data management apparatus according to an embodiment of the present invention.
FIG. 4 is a view showing the kinds of data reproducing apparatuses according to an embodiment of the present invention.
FIG. 5 is a diagram illustrating virtual experience data according to an embodiment of the present invention.
FIG. 6 is an exploded perspective view of a motion providing apparatus according to an embodiment of the present invention.
FIG. 7 is a plan view of a posture adjusting unit according to an embodiment of the present invention.
FIG. 8 is a view illustrating height adjustment of the moving supports according to an embodiment of the present invention.
FIG. 9 is a view showing a head mount apparatus according to an embodiment of the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The advantages and features of the present invention, and the manner of achieving them, will become apparent from the embodiments described in detail below together with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art; the invention is defined only by the scope of the claims. Like reference numerals refer to like elements throughout the specification.

Unless defined otherwise, all terms (including technical and scientific terms) used herein have the meanings commonly understood by one of ordinary skill in the art to which this invention belongs. Commonly used terms defined in dictionaries are not to be interpreted ideally or excessively unless explicitly defined otherwise.

FIG. 1 is a diagram illustrating a virtual experience providing system according to an embodiment of the present invention.

Referring to FIG. 1, the virtual experience providing system 10 includes a data generating apparatus 100, a data managing apparatus 200, and a data reproducing apparatus 300.

The data generation apparatus 100 acquires a plurality of different sensory information by sensing the surrounding environment, and combines the acquired sensory information to generate virtual experience data.

In the present invention, the sensory information includes at least one of visual information, auditory information, motion information, and tactile information. The visual information may include an image, and the auditory information may include sound. In order to acquire the visual information and auditory information, the data generating apparatus 100 may include image acquiring means such as a camera and voice acquiring means such as a microphone.

The motion information includes information about motion applied to the data generating apparatus 100. For example, information on an external force, such as force or vibration, applied to the data generating apparatus 100 may be included in the motion information. In order to acquire motion information, the data generating apparatus 100 may include an acceleration sensor, a geomagnetic sensor, a vibration sensor, and the like.

The tactile information includes information on external influences exerted on the surface of the data generating apparatus 100. For example, the temperature, humidity, wind, smell, and the like sensed at the surface of the data generating apparatus 100 may be included in the tactile information. The data generating apparatus 100 may include a thermometer, a hygrometer, a weather vane, and a gas analyzer in order to acquire the tactile information.

The user can carry the data generating apparatus 100, or mount it on a specific apparatus, to generate virtual experience data. For example, the user can board an amusement ride while carrying the data generating apparatus 100. The data generating apparatus 100 can then sense the same stimuli that the user senses while the ride operates, acquire various sensory information from the sensed results, and generate virtual experience data using that sensory information.

In the present invention, virtual experience data can be understood as the result of synthesizing a plurality of sensory information in synchronization with each other. That is, the data generation apparatus 100 acquires each piece of sensory information over time and generates virtual experience data by synchronizing the plurality of sensory information in time. In particular, the data generation apparatus 100 can generate the virtual experience data by matching sensory information to each image frame: it matches the sensory information to each of the image frames arranged in chronological order. To this end, the data generating apparatus 100 may acquire the other sensory information at the same time that it photographs the surroundings.

Alternatively, the data generation apparatus 100 may generate synchronized virtual experience data by matching the time stamps assigned to the plurality of sensory information. When generating the sensory information, the data generation apparatus 100 assigns a time stamp to each piece of sensory information at the moment it is acquired, and generates the virtual experience data by aligning the sensory information according to those time stamps.
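The time-stamp alignment described above can be illustrated with a minimal sketch. This is not the patent's implementation; the stream names, sampling times, and nearest-sample policy are assumptions made for illustration only.

```python
# Illustrative sketch: synchronize several sensory streams to video frames
# by time stamp. Each stream is a list of (timestamp_ms, value) samples.
from bisect import bisect_left

def nearest_sample(stream, t):
    """Return the sample in `stream` whose timestamp is closest to t."""
    times = [ts for ts, _ in stream]
    i = bisect_left(times, t)
    candidates = stream[max(i - 1, 0):i + 1]   # neighbors straddling t
    return min(candidates, key=lambda s: abs(s[0] - t))

def synchronize(video_frames, *sensor_streams):
    """Attach to each (timestamp, frame) the nearest sample of every stream."""
    synced = []
    for t, frame in video_frames:
        samples = tuple(nearest_sample(s, t)[1] for s in sensor_streams)
        synced.append((t, frame, samples))
    return synced

# Hypothetical capture: 30 fps video plus audio and motion samples.
frames = [(0, "f0"), (33, "f1"), (66, "f2")]
audio  = [(0, "a0"), (30, "a1"), (70, "a2")]
motion = [(10, "m0"), (40, "m1"), (65, "m2")]
data = synchronize(frames, audio, motion)
```

Each entry of `data` then bundles one image frame with the sensory samples closest to it in time, which is one way the time points of the respective time stamps could be brought into agreement.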

The data management apparatus 200 manages the virtual experience data generated by the data generation apparatus 100. For example, the data management apparatus 200 may store the virtual experience data. Further, it can extract sensory information for each reproduction point from the virtual experience data and transmit the extracted sensory information to the data reproducing apparatus 300. The data reproducing apparatus 300 may include a plurality of detailed reproduction devices, each reproducing one kind of sensory information. Accordingly, the data management apparatus 200 can extract, for each reproduction point, the sensory information reproducible by each detailed reproduction device and transmit it to the corresponding device.

As described above, the virtual experience data is generated by synchronizing a plurality of sensory information. Accordingly, when the data management apparatus 200 extracts the sensory information in chronological order, it can be extracted for each reproduction time point.

On the other hand, the sensory information may be stored in an unsynchronized state. For example, the data generating apparatus 100 may assign time stamps to the sensory information but generate the virtual experience data without aligning the time stamps. In this case, the data management apparatus 200 can synchronize the virtual experience data using the time stamps before extracting the sensory information. For example, when sensory information for a specific point in time is extracted and transmitted, the data management apparatus 200 simultaneously extracts and transmits the other sensory information corresponding to that point in time.

The extracted sensory information can be transmitted in a broadcast manner, so that any data reproducing apparatus 300 in the vicinity can perform reproduction according to the received sensory information. However, the transmission method of the data management apparatus 200 is not limited to broadcasting; the data management apparatus 200 may instead transmit the sensory information over an individual network established with each data reproducing apparatus 300.
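Datagram transmission of per-reproduction-point sensory information can be sketched as follows. This is an assumption-laden illustration, not the disclosed protocol: the JSON payload format is invented, and a loopback address stands in for a real subnet broadcast address so the example is self-contained.

```python
# Illustrative sketch: send one reproduction point's sensory information as a
# UDP datagram. A real broadcast would target the subnet broadcast address;
# 127.0.0.1 is used here so the sketch runs in one process.
import json
import socket

def make_datagram(playback_point, sensory):
    """Serialize one playback point's sensory information (invented format)."""
    return json.dumps({"t": playback_point, "sense": sensory}).encode("utf-8")

recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))              # let the OS pick a free port
recv.settimeout(2.0)
port = recv.getsockname()[1]

send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)  # allow broadcast

payload = make_datagram(33, {"audio": "a1", "motion": "m1"})
send.sendto(payload, ("127.0.0.1", port))

data, _ = recv.recvfrom(4096)
msg = json.loads(data)
send.close()
recv.close()
```

Because UDP is connectionless, the same `sendto` call with a broadcast destination would reach every nearby reproduction device listening on the port, which is the appeal of the broadcast method; a per-device network would instead use one socket per detailed reproduction device.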

The data reproducing apparatus 300 receives and reproduces the sensory information for each reproduction point sent from the data management apparatus 200. Reproduction, in the present invention, can be understood as an operation that implements the environment sensed by the data generation apparatus 100 so that the user can perceive it. For example, when an image of a subject is reproduced by the data reproducing apparatus 300, the user can visually check the image of the subject.

In the present invention, the data reproducing apparatus 300 may include at least one of an image reproducing apparatus, a sound reproducing apparatus, a motion providing apparatus, a tactile providing apparatus, and a smell providing apparatus. Accordingly, the data reproducing apparatus 300 can reproduce the various kinds of sensory information obtained by the data generating apparatus 100, enabling the user to indirectly experience the environment sensed by the data generating apparatus 100.

The data management apparatus 200 can transmit the sensory information by communicating with each of the detailed reproduction devices individually.

On the other hand, two or more detailed reproduction devices may be integrated into one unit. For example, the image reproducing apparatus and the audio reproducing apparatus can be embodied as a single body. In this case, the integrated device may have one communication means shared by its detailed reproduction devices. The communication means can then receive both the image sensory information and the voice sensory information from the data management apparatus 200 and pass them to the image reproducing apparatus and the audio reproducing apparatus, respectively.

FIG. 2 is a block diagram showing a data generating apparatus according to an embodiment of the present invention.

Referring to FIG. 2, the data generation apparatus 100 includes an image generation unit 110, an environment sensing unit 120, a storage unit 130, a control unit 140, a data generation unit 150, and a communication unit 160.

The image generating unit 110 photographs the surrounding environment and generates an image. A camera may serve as the image generating unit 110. The camera receives an analog image signal through an image pickup device. A charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) sensor may be used as the image pickup device, but the present invention is not limited thereto.

Also, in the present invention, the image may include not only the visible light band image but also the infrared light band image. For this purpose, the image generating unit 110 may include a camera for generating an image of a visible light band and a camera for generating an image of an infrared band.

The environment sensing unit 120 senses the surrounding environment while the image generating unit 110 generates the image, acquiring a plurality of different kinds of sensory information. That is, the environment sensing unit 120 acquires the sensory information at the same time that the image generating unit 110 generates the image. Here, the sensory information may include at least one of auditory information, motion information, and tactile information. In order to acquire such sensory information, the environment sensing unit 120 may include voice acquiring means, an acceleration sensor, a geomagnetic sensor, a vibration sensor, a thermometer, a hygrometer, a weather vane, and a gas analyzer. In addition, the environment sensing unit 120 may include position sensing means such as a Global Positioning System (GPS) receiver. The position information obtained by the position sensing means may be used for data editing by the data management apparatus 200 or for data reproduction by the data reproducing apparatus 300.

The storage unit 130 stores the image generated by the image generation unit 110 and the sensory information acquired by the environment sensing unit 120. In addition, the storage unit 130 stores the virtual experience data generated by the data generation unit 150, which will be described later.

The data generation unit 150 generates virtual experience data by combining the image generated by the image generation unit 110 and the plurality of sensory information acquired by the environment sensing unit 120. At this time, the data generation unit 150 may generate the virtual experience data by synchronizing each piece of sensory information with the image. The virtual experience data thus includes the image and the various sensory information, and each piece of sensory information is synchronized with the image so that it is reproduced at the reproduction time of the corresponding scene.

The image may include a plurality of image frames. Accordingly, the data generation unit 150 may generate the virtual experience data by matching sensory information to each image frame constituting the image, so that the sensory information matched to a specific image frame is reproduced while that frame is reproduced. Alternatively, the data generation unit 150 may generate the virtual experience data by matching the time stamps assigned to the plurality of sensory information. However, in the present invention, the method of synchronizing the sensory information is not limited to per-frame matching or time stamps; the data generation unit 150 may synchronize the sensory information in various other ways to generate the virtual experience data.
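The per-frame matching variant can be sketched separately from the time-stamp variant. The sketch below assumes fixed, known capture rates for both the video and a sensor stream; the rates and stream contents are illustrative, not taken from the disclosure.

```python
# Illustrative sketch: match sensory samples to image frames by index,
# assuming both were captured at fixed, known rates.
def match_per_frame(frames, samples, frame_rate, sample_rate):
    """Attach to each frame the sensory sample captured closest to it."""
    matched = []
    for i, frame in enumerate(frames):
        j = round(i * sample_rate / frame_rate)   # sample index nearest frame i
        j = min(j, len(samples) - 1)              # clamp at end of stream
        matched.append((frame, samples[j]))
    return matched

frames = ["f0", "f1", "f2"]                    # assumed 30 fps video
vibration = [0.0, 0.2, 0.1, 0.4, 0.3, 0.5]     # assumed 60 Hz vibration sensor
pairs = match_per_frame(frames, vibration, frame_rate=30, sample_rate=60)
```

Here each image frame carries exactly one matched sample, so reproducing a frame directly identifies the sensory information to reproduce alongside it, with no time-stamp lookup needed at playback time.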

The communication unit 160 transmits the virtual experience data to the data management apparatus 200. For this purpose, a wireless or wired communication channel may be established between the communication unit 160 and the data management apparatus 200. The communication unit 160 may transmit the virtual experience data after the data generation unit 150 has completed generating it, or may transmit the virtual experience data as a stream while the data generation unit 150 is still generating it.

The control unit 140 performs overall control of the image generation unit 110, the environment sensing unit 120, the storage unit 130, the data generation unit 150, and the communication unit 160.

FIG. 3 is a block diagram illustrating a data management apparatus according to an embodiment of the present invention.

Referring to FIG. 3, the data management apparatus 200 includes an input unit 210, a storage unit 220, a control unit 230, a function data generation unit 240, and a communication unit 250.

The input unit 210 receives user commands. For example, the input unit 210 may receive a virtual experience data transmission command from a user. In addition, the input unit 210 can receive an edit command for the virtual experience data. The edit command may include a modify command, a delete command, and an add command.

The modify command instructs modification of at least a part of the detailed data constituting the virtual experience data. The delete command instructs deletion of at least a part of the detailed data constituting the virtual experience data. The add command instructs inclusion of additional detailed data not yet contained in the virtual experience data. The operation corresponding to an edit command is performed by the control unit 230: when an edit command is input, the control unit 230 performs the corresponding operation and edits the virtual experience data.

The storage unit 220 stores the virtual experience data. The virtual experience data received through the communication unit 250 may be stored in the storage unit 220, and the control unit 230 may edit the virtual experience data stored there.

The function data generation unit 240 generates function data by extracting the sensory information for each playback point from the virtual experience data. Following the reproduction sequence of the virtual experience data, it extracts the sensory information to be reproduced at each playback point and generates function data from it. For example, when a specific image frame is to be reproduced, at least one piece of sensory information corresponding to that frame is extracted, and the extracted sensory information and the image frame are combined to generate the function data. Alternatively, the function data generation unit 240 may generate function data by extracting the sensory information for the same point in time with reference to the time stamps assigned to the plurality of sensory information.
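One way to picture function data generation is grouping, at each playback point, the sensory information destined for the same (possibly integrated) reproduction device. The following is a hedged sketch; the data layout and the device groupings are assumptions for illustration, not the patented structure.

```python
# Illustrative sketch: build "function data" by grouping each playback point's
# sensory information per destination device (integrated devices share a group).
def build_function_data(virtual_experience, device_groups):
    """virtual_experience: {playback_point: {stream_name: value}}.
    device_groups: tuples of stream names sharing one communication means.
    Returns (playback_point, {stream: value}) function data in playback order."""
    function_data = []
    for t in sorted(virtual_experience):
        senses = virtual_experience[t]
        for group in device_groups:
            bundle = {name: senses[name] for name in group if name in senses}
            if bundle:                        # skip groups with nothing to send
                function_data.append((t, bundle))
    return function_data

experience = {
    0:  {"image": "f0", "audio": "a0", "motion": "m0"},
    33: {"image": "f1", "audio": "a1", "motion": "m1"},
}
# Assumed grouping: an integrated device handles image + audio; motion is
# a separate device with its own communication means.
groups = [("image", "audio"), ("motion",)]
fd = build_function_data(experience, groups)
```

With this grouping, the image and audio for a playback point travel as one function datum to the integrated device, while the motion data is sent separately, mirroring how bundling reduces the number of transmissions an integrated device must receive.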

In the present invention, function data can be understood as data including at least one piece of sensory information. As described above, at least two of the detailed reproduction devices constituting the data reproducing apparatus 300 may be integrated into a single unit with one communication means. Transmitting, in one function datum, the plural sensory information corresponding to the plural detailed reproduction devices of the integrated unit simplifies the network configuration.

On the other hand, when the detailed reproduction devices constituting the data reproducing apparatus 300 are all provided individually, each with its own communication means, the function data generation unit 240 can generate function data that includes only one piece of sensory information. In either case, the function data generation unit 240 generates the function data according to the reproduction sequence of the virtual experience data.

The communication unit 250 receives the virtual experience data from the data generating apparatus 100; for this purpose, a wireless or wired communication channel may be established between the communication unit 250 and the data generating apparatus 100. The communication unit 250 also transmits the virtual experience data to the data reproducing apparatus 300. More specifically, the communication unit 250 can transmit the function data to the detailed reproduction devices constituting the data reproducing apparatus 300, in the order in which the function data generation unit 240 generates it.

In transmitting the function data, the communication unit 250 may use a broadcast method, so that data reproducing apparatuses 300 in the vicinity of the communication unit 250 can perform reproduction according to the received sensory information. However, the transmission method of the communication unit 250 is not limited to broadcasting; the communication unit 250 may instead transmit the sensory information over an individual network established with each data reproducing apparatus 300.

As described above, the control unit 230 can perform an editing operation on the virtual experience data. The control unit 230 performs overall control of the input unit 210, the storage unit 220, the function data generation unit 240, and the communication unit 250.

FIG. 4 is a view showing the kinds of data reproducing apparatuses according to an embodiment of the present invention.

Referring to FIG. 4, the data reproducing apparatus 300 includes an image reproducing apparatus 310, a sound reproducing apparatus 320, a motion providing device 330, a tactile providing device 340, and an olfactory providing device 350.

The image reproducing apparatus 310 can reproduce an image. As the image data included in the function data is input to the image reproducing apparatus 310, the image reproducing apparatus 310 can reproduce the image. To this end, the image reproducing apparatus 310 may include display means (not shown). For example, a smartphone may function as the image reproducing apparatus 310; however, the image reproducing apparatus 310 of the present invention is not limited to a smartphone, and various apparatuses including display means and image processing means can play this role.

The voice reproducing apparatus 320 can reproduce the voice. As the voice data included in the function data is input to the voice reproduction apparatus 320, the voice reproduction apparatus 320 can reproduce the voice. For this purpose, the audio reproducing apparatus 320 may include audio output means (not shown) such as a speaker.

The motion providing device 330 can provide motion. As the motion data contained in the function data is input to the motion providing device 330, the motion providing device 330 can reproduce the motion. In the present invention, the motion provided by the motion providing device 330 includes motion provided to a user who looks at, wears, or contacts the motion providing device 330. For example, the motion providing device 330 may perform a specific motion and provide the user with the visual effect of that motion, or it may perform a specific motion that applies a particular force to a user who wears or contacts it. If the virtual experience data was recorded on an amusement ride, the motion provided by the motion providing device 330 may be similar to the force the ride applies to its occupant. A detailed description of the motion providing device 330 is given later with reference to FIGS. 6 to 8.
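The kind of motion such a device can produce follows from simple geometry: driving the moving supports to different heights tilts the support plate. The sketch below is an illustrative simplification; the support spacing, axes, and two-support arrangement are assumptions, not the geometry of the patent's figures.

```python
# Illustrative sketch: tilt of a plate resting on two supports driven to
# different heights (spacing and heights in meters, result in degrees).
import math

def plate_tilt_deg(height_a, height_b, spacing):
    """Tilt angle of a plate on two supports `spacing` apart."""
    return math.degrees(math.atan2(height_b - height_a, spacing))

# Raising one support by 10 cm on supports 1 m apart tilts the plate slightly.
angle = plate_tilt_deg(0.0, 0.10, 1.0)
```

With three or more independently height-adjustable supports, both pitch and roll of the support plate could be set this way, which is how commanded motion data could be turned into a posture of the plate.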

The tactile providing device 340 may provide tactile stimulation. As the tactile data included in the function data is input to the tactile providing device 340, the device can reproduce the tactile stimulus. For example, tactile stimulation such as heat, cold air, moisture, and wind may be provided by the tactile providing device 340. To this end, the tactile providing device 340 may include heat generating means (not shown), cold air generating means (not shown), a sprayer (not shown), a propeller (not shown), and the like.

The olfactory providing device 350 may provide an olfactory stimulus. As the olfactory data contained in the function data is input to the olfactory providing device 350, the device can reproduce the olfactory stimulus. To provide the olfactory stimulus, the olfactory providing device 350 may be equipped with various odorant agents; it may spray the agents corresponding to the olfactory data, or synthesize a plurality of agents to provide specific odors.

Various detailed playback apparatuses have been described above. The data playback apparatus 300 may include any combination of these detailed playback apparatuses. Further, by combining two or more detailed playback apparatuses, various integrated devices may be realized.

FIG. 5 is a diagram illustrating virtual experience data according to an embodiment of the present invention.

Referring to FIG. 5, the virtual experience data 400 may include image data 410 and a plurality of sensory information 420. In particular, the image data 410 may be divided into a plurality of image frames 411, and the sensory information 420 may be included in the virtual experience data 400 by being matched to those image frames.

The data management apparatus 200 extracts and transmits the image frames 411 and the sensory information 420 in time order. That is, the data management apparatus 200 can extract and transmit a specific image frame together with the sensory information matched to it. Accordingly, reproduction of that image frame and reproduction of the corresponding sensory information can be performed simultaneously.
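The frame-matched structure and time-ordered extraction described above can be sketched as follows; the class names and field layout are illustrative assumptions rather than the patent's actual data format:

```python
from dataclasses import dataclass, field

@dataclass
class SensoryInfo:
    kind: str      # e.g. "motion", "tactile", "olfactory"
    payload: dict  # device-specific parameters

@dataclass
class VirtualExperienceData:
    # frame index -> (image frame bytes, sensory info matched to that frame)
    frames: dict = field(default_factory=dict)

    def add_frame(self, index, image, sensory):
        self.frames[index] = (image, list(sensory))

    def stream(self):
        # Yield frames and their matched sensory information in time order,
        # so image playback and sensory playback can run simultaneously.
        for index in sorted(self.frames):
            yield self.frames[index]

# hypothetical usage: frames added out of order still stream in time order
ved = VirtualExperienceData()
ved.add_frame(1, b"frame-1", [SensoryInfo("motion", {"tilt": 5})])
ved.add_frame(0, b"frame-0", [SensoryInfo("tactile", {"wind": 0.3})])
ordered = [img for img, _ in ved.stream()]
```

Because each sensory record travels with its frame, a receiver never has to re-synchronize the two streams.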

FIG. 7 is a plan view of the posture adjusting unit according to an embodiment of the present invention. FIG. 8 is a perspective view illustrating a state in which the height of the moving support unit is adjusted according to an embodiment of the present invention.

Referring to FIGS. 6 to 8, the motion providing apparatus 330 includes a base plate 510, a posture adjusting unit 520, a guide unit 540, a support plate 550, a control module 560, and a battery 570.

The base plate 510 serves to receive the posture control unit 520, the battery 570, the control module 560, and the communication module. The posture control unit 520, the battery 570, the control module 560, and the communication module may be fixed on the base plate 510. Also, although not shown, a cover may be provided along the edge of the base plate 510 so that the base plate 510 and the cover may form a single housing.

The posture adjusting unit 520 controls the posture of the support plate 550 with respect to the base plate 510. For example, the support plate 550 may lie parallel to the base plate 510, or may take an oblique posture according to the adjustment of the posture adjusting unit 520. One side of the support plate 550 may be referred to as the front side. The posture adjusting unit 520 may tilt the support plate 550 toward the front, the rear, the left, or the right, and likewise toward the left front, the left rear, the right rear, or the right front. The posture adjusting unit 520 may also raise or lower the support plate 550 while maintaining its angle with respect to the base plate 510, and may cause the support plate 550 to vibrate.

In order to perform such operations, the posture adjusting unit 520 may include at least two height adjusting portions 530. A height adjusting portion 530 supports a specific point of the support plate 550 and can adjust the height of that support point with respect to the base plate 510. The posture adjusting unit 520 may further include a fixed support portion (not shown) that supports the support plate 550 at a fixed height relative to the base plate 510. With one point of the support plate 550 held by the fixed support portion, the posture of the support plate 550 can be adjusted by adjusting the heights of the two height adjusting portions 530.

Alternatively, the fixed support portion may be implemented as another height adjusting portion 530. FIGS. 6 to 8 illustrate a posture adjusting unit 520 composed of three height adjusting portions 530. Hereinafter, a posture adjusting unit 520 including three height adjusting portions 530 will be described.

As shown in FIG. 7, the three height adjusting portions 530 may be provided at different positions on the base plate 510. Accordingly, each of the height adjusting portions 530 can exert a force on three different points of the support plate 550.

The height adjusting portion 530 includes a driving motor 531, a gear portion 532, and a moving support portion 533. The driving motor 531 generates a driving force, by which the moving support portion 533 can be raised or lowered. The driving motor 531 may be a stepper motor for fine angle adjustment, but is not limited thereto.

The gear portion 532 transmits the rotational force of the driving motor 531 to the moving support portion 533 and may include at least one gear. The gear portion 532 converts the rotational force of the driving motor 531 into vertical movement of the moving support portion 533. For this purpose, a rotational force transmitting portion 534 may be provided outside the gear portion 532; as the rotational force transmitting portion 534 rotates, the height of the moving support portion 533 relative to the base plate 510 is adjusted.
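The conversion from motor rotation to vertical travel of the moving support portion can be sketched as a simple ratio computation. The steps-per-revolution and travel-per-revolution figures below are illustrative assumptions, not parameters from the patent:

```python
def support_height_mm(motor_steps, steps_per_rev=200, mm_per_rev=2.0):
    """Vertical travel of the moving support portion for a given number
    of stepper-motor steps, assuming (illustratively) that the gear
    portion converts each full motor revolution into a fixed 2.0 mm
    of vertical movement."""
    return motor_steps / steps_per_rev * mm_per_rev

# one full revolution (200 steps) -> 2.0 mm of travel
one_rev = support_height_mm(200)
half_rev = support_height_mm(100)
```

A stepper motor makes this mapping deterministic, which is why the text notes its suitability for fine angle adjustment.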

The moving support portion 533 supports the support plate 550. The moving support portion 533 may be rotatably coupled to the support plate 550 to allow free movement between the two; in particular, it can rotate about two different axes with respect to the support plate 550. Accordingly, a free posture of the moving support portion 533 relative to the support plate 550 can be realized.

Each of the height adjusting portions 530 applies force to one of three different points of the support plate 550 to adjust the height of that point. Because the heights of the three points are adjusted independently by the three height adjusting portions 530, a free posture of the support plate 550 can be realized.
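The "free posture" obtained from three independently adjusted heights can be illustrated geometrically: the three contact points determine a plane, whose normal vector gives the plate's tilt. This is a sketch of the geometry, not code from the patent; the coordinates are illustrative:

```python
def plate_normal(p1, p2, p3):
    """Normal vector of the support plate spanned by the three contact
    points of the height adjusters; each point is (x, y, z) in mm."""
    ux, uy, uz = (p2[0] - p1[0], p2[1] - p1[1], p2[2] - p1[2])
    vx, vy, vz = (p3[0] - p1[0], p3[1] - p1[1], p3[2] - p1[2])
    # cross product of two in-plane vectors is perpendicular to the plate
    return (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)

# equal heights -> level plate, normal points straight up (z only)
level = plate_normal((0, 0, 10), (300, 0, 10), (0, 300, 10))
# raising one point tilts the plate, normal gains a lateral component
tilted = plate_normal((0, 0, 10), (300, 0, 40), (0, 300, 10))
```

Driving the three heights independently therefore spans every tilt direction the description enumerates (front, rear, left, right, and the diagonals).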

The guide portion 540 guides the movement of the moving support portion 533. The moving support portion 533 may pass through a hole 541 provided in the guide portion 540. Therefore, the moving support portion 533 can move within a range corresponding to the shape of the hole 541, and the height of the distal end of the moving support portion 533 is adjusted based on the position of the hole 541.

FIG. 8 shows the height of the moving support portion 533 being adjusted by the rotation of the rotational force transmitting portion 534. One end of the moving support portion 533 is rotatably connected to the rotational force transmitting portion 534, and the other end is rotatably connected to the support plate 550. Further, a guide portion 540 restricting the movement of the moving support portion 533 may be provided between the rotational force transmitting portion 534 and the support plate 550. Accordingly, the height of the moving support portion 533 can be adjusted by the rotation of the rotational force transmitting portion 534.

The control module 560 receives the sensory information and controls the posture adjusting unit 520 based on it. For example, when sensory information to raise the support plate 550 is received, the control module 560 may control all of the height adjusting portions 530 to raise the support plate 550. Alternatively, when sensory information for changing the posture of the support plate 550 is received, the control module 560 may control one or more height adjusting portions 530 to change that posture. Alternatively, when sensory information calling for vibration is received, the control module 560 may control one or more height adjusting portions 530 to apply vibration to the support plate 550.
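The three control cases above (raise, change posture, vibrate) can be sketched as a small dispatch routine. The message layout and field names are illustrative assumptions, not the patent's actual command format:

```python
def handle_sensory(info, adjusters):
    """Dispatch one piece of motion sensory information to the height
    adjusters (each modeled as a dict with a 'height' in mm)."""
    if info["type"] == "raise":
        # move every adjuster by the same amount: plate rises, angle kept
        for a in adjusters:
            a["height"] += info["delta"]
    elif info["type"] == "tilt":
        # move only the named adjusters: plate posture changes
        for i in info["indices"]:
            adjusters[i]["height"] += info["delta"]
    elif info["type"] == "vibrate":
        # apply and remove a small offset on every adjuster (one cycle)
        for a in adjusters:
            a["height"] += info["amplitude"]
            a["height"] -= info["amplitude"]
    return adjusters

adjusters = [{"height": 0.0} for _ in range(3)]
handle_sensory({"type": "raise", "delta": 5.0}, adjusters)
handle_sensory({"type": "tilt", "indices": [0], "delta": 2.0}, adjusters)
```

Raising all three points preserves the plate's angle, while offsetting a subset tilts it, mirroring the two cases the paragraph distinguishes.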

The battery 570 serves to supply power to the control module 560 and the posture control unit 520. That is, the motion providing apparatus 330 according to the embodiment of the present invention can operate with the power of the battery 570. Accordingly, the user can be provided with the motion by the motion providing device 330 even in a place where the power supply is restricted.

When commercial power is input, that power may be used to operate the control module 560 and the posture adjusting unit 520, and may also be used to charge the battery 570.

FIG. 9 is a view showing a head mount apparatus according to an embodiment of the present invention.

The head mount apparatus 600 according to an embodiment of the present invention is a device that can be worn on the head of a user and can provide various sensory information by applying stimuli to the user's head. The above-described tactile sense providing device 340 and olfactory providing apparatus 350 may be combined to implement the head mount apparatus 600.

Referring to FIG. 9, the head mount apparatus 600 may include a lens 610, a propeller 620, a sprayer 630, a sound providing unit 640, and an odor providing unit 650.

The lens 610 adjusts the focal distance to correspond to the surface of the display means. The head mount apparatus 600 may be equipped with display means, such as a smartphone, for reproducing an image. The lens 610 may be provided between the user's eyes and the display means so that the user can view the image more clearly.

The propeller 620 serves to provide wind. The wind provided by the propeller 620 may be applied to the user's face. The propeller 620 can rotate at different speeds in response to input sensory information to provide winds of different intensities.
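The mapping from wind sensory data to propeller speed can be sketched as a clamped linear scaling. The normalized intensity range and the maximum motor speed are illustrative assumptions, not figures from the patent:

```python
MAX_RPM = 3000  # assumed top speed of the propeller motor

def propeller_rpm(wind_intensity):
    """Scale a normalized wind-intensity value (0.0-1.0) from the
    sensory data to a motor speed; out-of-range values are clamped so
    malformed data cannot overdrive or reverse the motor."""
    w = min(max(wind_intensity, 0.0), 1.0)
    return w * MAX_RPM
```

Different rotation speeds then produce the winds of different intensities the paragraph describes.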

The sprayer 630 serves to spray water. The water sprayed by the sprayer 630 may be applied to the face of the user. The size of the water particles sprayed by the sprayer 630 may vary depending on the sensory information. For example, the sprayer 630 may spray water particles larger than a certain size so that the user perceives them as water droplets. Alternatively, the sprayer 630 may spray water particles smaller than that size so that the user perceives a rise in humidity without recognizing individual droplets.
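The particle-size distinction above can be sketched as a threshold check; the cutoff value is an illustrative assumption, not a figure from the patent:

```python
DROPLET_THRESHOLD_UM = 100.0  # assumed cutoff in micrometers

def spray_mode(particle_size_um):
    """Classify the sprayer's perceptual effect: particles at or above
    the cutoff are felt as individual water droplets, smaller particles
    are perceived only as a rise in humidity."""
    if particle_size_um >= DROPLET_THRESHOLD_UM:
        return "raindrop"
    return "humidity"
```

The sensory data would then select a particle size on one side of this cutoff depending on whether rain or mist is being simulated.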

The sound providing unit 640 serves to provide sound. The sound providing unit 640 may take the shape of a speaker or an earphone. FIG. 9 shows two sound providing units 640; however, three or more sound providing units 640 may be provided at various positions of the head mount apparatus 600.

The odor providing unit 650 serves to provide an odor. The odor providing unit 650 corresponds to the olfactory providing apparatus 350 described above. Accordingly, the odor providing unit 650 may be loaded with various odor-producing agents, and can generate an odor by combining the agents corresponding to the input sensory information.

Although the propeller 620, the sprayer 630, and the odor providing unit 650 have been described as separate modules, they may be provided as a single unit. For example, an integrated spray device (not shown) may generate wind, spray water, or generate an odor. Alternatively, separate spray devices (not shown) combining the functions of the propeller 620, the sprayer 630, and the odor providing unit 650 may be provided, and a plurality of spray devices may operate together to perform one function. For example, a spray device A generating wind and a spray device B generating an odor may operate simultaneously so that the odor is carried to the user on the wind.

While the present invention has been described in connection with what is presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, but is intended to cover various modifications and equivalent arrangements within the spirit and scope of the appended claims. The above-described embodiments are therefore illustrative in all aspects and not restrictive.

100: Data generating device
200: Data management device
300: Data reproducing device
120: Environmental sensing unit
240: Function data generation unit

Claims (14)

A data generating device for acquiring a plurality of different sensory information by sensing the surrounding environment, and combining the acquired plurality of sensory information to generate virtual experience data;
A data management device for extracting sensory information for each playback point in the virtual experience data and transmitting the extracted sensory information; And
And a data reproduction device for receiving and reproducing the sensory information for each playback point.
The method according to claim 1,
Wherein the sensory information includes at least one of visual information, auditory information, motion information, and tactile information.
The method according to claim 1,
Wherein the data generation apparatus generates the virtual experience data by synchronizing the plurality of sensory information.
The method of claim 3,
The data generation device includes:
The virtual experience data is generated by matching sensory information for each image frame,
And the virtual experience data is generated by matching time stamps given to the plurality of sensory information.
The method according to claim 1,
The data management apparatus comprising:
Extracts sensory information for each image frame,
Extracting and transmitting sensory information at the same time with reference to a time stamp given to the plurality of sensory information.
The method according to claim 1,
And the data management apparatus transmits the sensory information in a broadcast manner.
The method according to claim 1,
Wherein the data reproducing apparatus comprises:
A virtual experience providing system comprising at least one of an image reproducing apparatus, a sound reproducing apparatus, a motion providing apparatus, a tactile providing apparatus, and a smell providing apparatus.
An image generation unit for generating an image by photographing a surrounding environment;
An environment sensing unit for sensing the surrounding environment during the generation of the image to acquire a plurality of different sensory information; And
And a data generating unit for generating virtual experience data by combining the image and the plurality of sensory information.
9. The method of claim 8,
Wherein the data generation unit generates the virtual experience data by synchronizing each of the plurality of sensory information with the image.
10. The method of claim 9,
Wherein the data generating unit comprises:
Generating virtual experience data by matching sensory information for each image frame constituting the image,
And the virtual experience data is generated by matching time stamps assigned to the plurality of sensory information.
9. The method of claim 8,
Wherein the sensory information includes at least one of auditory information, motion information, and tactile information.
A base plate;
An attitude adjusting unit having at least two moving supports whose heights are independently adjustable with respect to the base plate; And
And a control module for controlling the posture adjusting unit according to the received control command to adjust the posture of the support plate supported by the at least two moving support units.
13. The method of claim 12,
Wherein the posture adjusting unit adjusts the height of the at least two moving support units with the power of the battery.
13. The method of claim 12,
Wherein the posture adjusting portion includes a fixed support portion for supporting the support plate in a state that the height of the base plate is fixed.
KR1020160024605A 2016-02-29 2016-02-29 System and apparatus for providing virtual experience KR20170101703A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160024605A KR20170101703A (en) 2016-02-29 2016-02-29 System and apparatus for providing virtual experience

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160024605A KR20170101703A (en) 2016-02-29 2016-02-29 System and apparatus for providing virtual experience

Publications (1)

Publication Number Publication Date
KR20170101703A true KR20170101703A (en) 2017-09-06

Family

ID=59925340

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160024605A KR20170101703A (en) 2016-02-29 2016-02-29 System and apparatus for providing virtual experience

Country Status (1)

Country Link
KR (1) KR20170101703A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102385834B1 (en) * 2021-12-15 2022-04-12 박준 Five senses hardware transmission method
KR102398828B1 (en) * 2021-12-15 2022-05-16 박준 Artificial olfactory perception and transmission method
KR102404728B1 (en) * 2021-12-15 2022-06-02 박준 Method for implementing and transmitting artificial flavors of video and image
KR102404690B1 * 2021-12-15 2022-06-02 박준 Method for generating and transmitting artificial flavors using electric current, temperature and frequency
KR102424787B1 (en) * 2021-12-15 2022-07-25 박준 Method of artificial wind and temperature and air resistance transfer in environments in virtual, augmented and mixed reality



Legal Events

Date Code Title Description
AMND Amendment
AMND Amendment