CN110543230A - Stage lighting element design method and system based on virtual reality - Google Patents
- Publication number
- CN110543230A CN110543230A CN201810522548.0A CN201810522548A CN110543230A CN 110543230 A CN110543230 A CN 110543230A CN 201810522548 A CN201810522548 A CN 201810522548A CN 110543230 A CN110543230 A CN 110543230A
- Authority
- CN
- China
- Prior art keywords
- virtual
- stage
- user
- scene
- virtual reality
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/506—Illumination models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Abstract
The invention provides a virtual-reality-based design method and system for stage lighting elements. The method comprises the following steps: displaying a virtual stage scene to a user through a virtual reality wearable device, the virtual stage scene including a virtual user interface established through augmented reality technology; capturing, through the virtual reality wearable device, an interface instruction issued when the user operates the virtual user interface within the virtual stage scene, and laying out or adjusting the position and posture of a stage lighting element in the virtual stage scene according to the interface instruction; and displaying the virtual stage scene and the positions and postures of its stage lighting elements in real time through a display device. The invention spares the stage designer repeated rounds of conception and debugging, reduces the labor intensity and working time of scene-arrangement personnel, and greatly improves the layout efficiency of stage scenes and the resulting enterprise benefit.
Description
Technical Field
The invention relates to the field of stage modeling software, and in particular to a virtual-reality-based design method and system for stage lighting elements.
Background
In a traditional stage lighting operation system, the stage lighting rendering effect is configured manually from experience: stage elements are arranged in stage lighting software with a mouse or touch screen, interface command information is then sent to a computer control center, and the control center issues instructions to the physical stage lighting elements to move them and adjust them to the commanded positions and postures. This conventional stage layout technology has existed for many years and increasingly fails to meet the application requirements of modern stage layout. Because the traditional approach gives the designer no first-hand perception of three-dimensional space, stage layout can only be imagined on a two-dimensional plane, making the sense of space and the real-time rendering effect difficult to grasp during design.
Disclosure of Invention
In view of the above shortcomings of the prior art, an object of the present invention is to provide a virtual-reality-based method and system for designing stage lighting elements that solve the above problems, allow a designer to perceive the three-dimensional effect of a virtual stage in real time, and give the designer an immersive stage lighting design process.
To achieve the above and other related objects, the present invention provides a virtual-reality-based method for designing stage lighting elements, applied to a terminal device that is communicatively connected to a virtual reality wearable device and a display device. The method comprises the following steps: displaying a virtual stage scene to a user through the virtual reality wearable device, the virtual stage scene including a virtual user interface established through augmented reality technology; capturing, through the virtual reality wearable device, an interface instruction issued when the user operates the virtual user interface within the virtual stage scene, and laying out or adjusting the position and posture of a stage lighting element in the virtual stage scene according to the interface instruction; and displaying the virtual stage scene and the positions and postures of its stage lighting elements in real time through the display device.
In an embodiment of the present invention, the method further includes: displaying each pre-stored stage lighting element through the virtual user interface; capturing first gesture action information of a user through the virtual reality wearable device, and accordingly identifying a target stage lighting element selected by the user; and acquiring voice input information through the virtual reality wearable equipment, and modifying the parameter value of the target stage lighting element according to the voice input information after a preset time interval.
In an embodiment of the present invention, the method further includes: capturing second gesture action information of the user through the virtual reality wearing equipment, and arranging the target stage lighting elements at corresponding positions of the virtual stage scene.
in an embodiment of the present invention, the method further includes: capturing third gesture action information of a user through the virtual reality wearable device to display a laser ray in the virtual stage scene, wherein one end of the laser ray is intersected with the target stage lighting element, and meanwhile, the target stage lighting element is highlighted; capturing fourth gesture action information of the user through the virtual reality wearable device to move the position of the target stage lighting element or rotate the angle of the target stage lighting element.
In an embodiment of the present invention, the method further includes: capturing head movements of a user through the virtual reality wearable device so as to synchronously change the visual angle state of the virtual stage scene; capturing user hand actions through the virtual reality wearing equipment so as to synchronously change the moving state of the user in the virtual stage scene.
To achieve the above and other related objects, the present invention provides a virtual-reality-based stage lighting element design system, applied to a terminal device that is communicatively connected to a virtual reality wearable device and a display device. The system comprises: a virtual scene display module for displaying a virtual stage scene to a user through the virtual reality wearable device, the virtual stage scene including a virtual user interface established through augmented reality technology; a lighting element design module for capturing, through the virtual reality wearable device, an interface instruction issued when the user operates the virtual user interface within the virtual stage scene, and for laying out or adjusting the position and posture of a stage lighting element in the virtual stage scene according to the interface instruction; and a lighting element display module for displaying the virtual stage scene and the positions and postures of its stage lighting elements in real time through the display device.
In an embodiment of the present invention, the virtual scene display module is further configured to display each pre-stored stage lighting element through the virtual user interface, and the lighting element design module is further configured to: capture first gesture action information of the user through the virtual reality wearable device so as to identify the target stage lighting element selected by the user; and acquire voice input information through the virtual reality wearable device and modify a parameter value of the target stage lighting element according to the voice input information after a preset time interval.
In an embodiment of the present invention, the lighting element design module is further configured to capture second gesture action information of the user through the virtual reality wearable device and lay the target stage lighting element at the corresponding position of the virtual stage scene.
In an embodiment of the present invention, the lighting element design module is further configured to: capture third gesture action information of the user through the virtual reality wearable device so as to display a laser ray in the virtual stage scene, one end of which intersects the target stage lighting element while the element is highlighted; and capture fourth gesture action information of the user so as to move the position of the target stage lighting element or rotate its angle.
In an embodiment of the present invention, the virtual scene display module is further configured to capture the user's head movements through the virtual reality wearable device so as to synchronously change the viewing angle of the virtual stage scene, and to capture the user's hand actions so as to synchronously change the user's movement state within the virtual stage scene.
To achieve the above and other related objects, the present invention provides a terminal device comprising a processor and a memory, wherein the memory is for storing a computer program and the processor is for loading and executing the computer program so that the terminal device performs the virtual-reality-based design method for stage lighting elements.
To achieve the above and other related objects, the present invention provides a stage lighting element design system comprising the terminal device together with the virtual reality wearable device and the display device communicatively connected to it.
As described above, the virtual-reality-based stage lighting element design method and system bring the user into a virtual three-dimensional world through equipment such as VR glasses; the user's movement within the virtual scene is driven by gesture actions, the user's operations on the UI interface in the virtual scene are captured through gestures, and the position and posture of the stage lighting elements in the three-dimensional scene are adjusted accordingly. Stage-element scenes can thus be placed while the stage lighting rendering is perceived in real time, which greatly reduces the designer's labor intensity and helps improve the completeness of the stage scene design.
Drawings
Fig. 1 is a schematic structural diagram of a stage lighting element design system according to an embodiment of the present invention.
Fig. 2 is a schematic diagram illustrating a design method of stage lighting elements based on virtual reality according to an embodiment of the present invention.
fig. 3A is a schematic diagram illustrating an interaction effect between a user and a virtual scene according to an embodiment of the present invention.
fig. 3B is a schematic diagram illustrating an effect of a user interacting with a virtual scene according to another embodiment of the present invention.
Fig. 4 is a schematic diagram illustrating a design system of stage lighting elements based on virtual reality according to an embodiment of the present invention.
Fig. 5 is a schematic view illustrating an imaging of a virtual stage scene of a virtual reality wearable device according to an embodiment of the invention.
Detailed Description
The embodiments of the present invention are described below with reference to specific examples; those skilled in the art will readily understand other advantages and effects of the invention from the disclosure of this specification. The invention may also be implemented or applied through other, different embodiments, and the details of this specification may be modified or changed in various respects without departing from the spirit and scope of the invention. It should be noted that, where they do not conflict, the features of the following embodiments and examples may be combined with one another.
It should also be noted that the drawings provided with the following embodiments only illustrate the basic idea of the invention: they show the components related to the invention rather than the actual number, shape, and size of components in an implementation, where the type, quantity, and proportion of the components may vary freely and the layout may be more complicated.
The invention provides a novel virtual-reality-based stage lighting element design method, design system, and terminal device. The invention uses VR (Virtual Reality) and AR (Augmented Reality) technology to adjust the position and posture of stage lighting elements within a virtual stage scene, and thereby to design the simulated rendering effect of the stage lighting.
The present invention will be described in detail below with reference to examples and the accompanying drawings.
Fig. 1 shows a stage element design system. The system mainly comprises a terminal device 1 (such as a desktop computer, portable computer, tablet computer, smartphone, or network cloud), a virtual reality wearable device 2 (such as a VR helmet, VR glasses, VR gloves, or VR handles), and a display device 3 (such as an LED display screen). The virtual reality wearable device 2 and the display device 3 are each connected to the terminal device 1 by, for example, a wired or wireless communication connection.
As shown in fig. 2, the method for designing stage lighting elements based on virtual reality according to the present embodiment is applied to the terminal device 1 shown in fig. 1, and mainly includes the following steps:
S21: a virtual stage scene is presented to a user through the virtual reality wearable device 2 shown in fig. 1, wherein the virtual stage scene includes a virtual user interface established through an augmented reality technology.
In detail, after putting on the virtual reality wearable device 2, the user "enters" a pre-established virtual stage scene in which a virtual user interface (UI interface) is provided for real-time interaction between the user and the scene. The terminal device 1 stores a number of predefined gesture actions together with the instruction each represents; when a gesture action captured by the virtual reality wearable device 2 is successfully matched against a predefined gesture action, the terminal device 1 recognizes the operation the captured gesture is meant to execute.
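The gesture-matching step described above (comparing a captured gesture against the stored library of predefined gestures) can be sketched as a nearest-neighbor lookup. The gesture names, the feature encoding, and the distance threshold below are illustrative assumptions, not details from the patent:

```python
# Hypothetical sketch of matching a captured gesture against a library
# of predefined gestures stored on the terminal device. Features here
# are an abstract tuple (e.g. normalized joint angles); real systems
# would use richer hand-tracking data.
from dataclasses import dataclass


@dataclass
class Gesture:
    name: str
    features: tuple  # assumed abstract feature vector


PREDEFINED = {
    "pop_up_ui": Gesture("pop_up_ui", (0.9, 0.1, 0.1)),
    "hide_ui": Gesture("hide_ui", (0.1, 0.9, 0.1)),
    "select_element": Gesture("select_element", (0.1, 0.1, 0.9)),
}


def match_gesture(captured: tuple, threshold: float = 0.2):
    """Return the name of the closest predefined gesture, or None
    if no gesture lies within the distance threshold."""
    best_name, best_dist = None, float("inf")
    for name, g in PREDEFINED.items():
        dist = sum((a - b) ** 2 for a, b in zip(captured, g.features)) ** 0.5
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None
```

A match below the threshold triggers the associated instruction; anything else is ignored, which is what lets unrelated hand motion pass without side effects.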
To keep the virtual user interface from interfering with the overall display of the virtual stage scene, in one embodiment the interface pops up when the virtual reality wearable device 2 captures one gesture action of the user and hides when another is captured. For example: extending the right hand to the right front of the head and making a door-knocking motion three times pops up the UI interface; turning the palm downward and extending five fingers pops up the UI interface; extending the left hand to the left front of the head and making the door-knocking motion three times hides the UI interface.
The virtual reality wearable device 2 captures the user's head and hand motions and transmits the motion information to the terminal device 1. The terminal device 1 processes this information and synchronously presents the changed virtual stage scene back to the user through the wearable device, giving the user an immersive spatial experience.
For example, when a sensor of the VR helmet captures left-right rotation or up-and-down movement of the user's head, the virtual stage scene rotates and moves up and down synchronously with the head. Arm and hand postures can likewise be mapped to movement, for example:
- both hands extended horizontally in front of the chest: move the scene away; both hands brought forward from the sides of the body: bring the scene closer;
- the right hand extended forward at 90° to the body: walk forward along the current facing direction; both hands extended forward at 90° to the body: walk forward quickly;
- the right arm extended to the right at 90° to the body: walk to the right; both hands in that posture: walk to the right quickly; the left arm extended to the left at 90° to the body: walk to the left quickly when both hands are used;
- an arm extended to the front-left, at 45° to the straight-ahead direction and 90° to the body: rotate to the left; both hands in that posture: rotate to the left quickly;
- an arm extended to the front-right, at 45° to the straight-ahead direction and 90° to the body: rotate to the right; both hands in that posture: rotate to the right quickly; and so on.
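The posture-to-locomotion mapping described above can be pictured as a simple lookup table in which using both hands instead of one selects the fast variant of a command. The (limb, direction, angle) encoding is a hypothetical simplification of the patent's examples:

```python
# Illustrative lookup from captured arm postures to locomotion commands.
# The encoding and command strings are assumptions for the sketch only.
POSTURE_COMMANDS = {
    ("right", "front", 90): "walk forward",
    ("right", "side", 90): "walk right",
    ("left", "side", 90): "walk left",
    ("left", "front-left", 45): "rotate left",
    ("right", "front-right", 45): "rotate right",
}


def interpret_posture(limb, direction, angle, two_handed=False):
    """Map a captured arm posture to a movement command, or None if the
    posture is not one of the predefined ones."""
    cmd = POSTURE_COMMANDS.get((limb, direction, angle))
    if cmd is None:
        return None
    # using both hands selects the fast variant of the same command
    return f"fast {cmd}" if two_handed else cmd
```

A table-driven design like this keeps the posture vocabulary easy to extend without touching the recognition code.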
S22: an interface instruction issued by a user operating the virtual user interface in the virtual stage scene is captured through the virtual reality wearable device 2 shown in fig. 1, and the position and the posture of the stage lighting element in the virtual stage scene are laid or adjusted according to the interface instruction. The method comprises the following steps of capturing an interface instruction issued by a user operating the virtual user interface in the virtual stage scene, wherein two different implementation modes can be adopted: captured through VR gloves, captured through VR handles.
Specific implementations of capture by VR gloves, capture by VR handles are set forth below, respectively.
For capturing user instructions through VR gloves:
Referring to fig. 3A, the virtual user interface displays to the user the pre-stored stage lighting elements and the operation function keys required during design. When the virtual reality wearable device 2 captures the user's first gesture action information, for example moving the right hand to the virtual user interface and clicking a stage lighting element with the index finger, the terminal device 1 regards that element as the target stage lighting element selected by the user. The user may then modify the parameter values of the target element: the virtual reality wearable device 2 obtains voice input information and, after a preset time interval (for example 1.5 seconds), modifies the parameter value of the target stage lighting element according to that input.
It should be noted that, because the voice input used to modify parameter values is mostly numeric, continuous speech may make it difficult for the terminal device 1 to tell whether the input modifies one parameter value or several. This embodiment therefore separates the spoken digits with a preset time interval, allowing the terminal device 1 to recognize each modified value accurately and preventing the modification items from being confused.
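The pause-based segmentation just described might look like the following sketch, where spoken digits arrive as (timestamp, token) pairs and any silence of at least the preset interval closes one parameter value and starts the next. The input format is an assumption made for illustration:

```python
# Sketch of splitting a stream of spoken digits into parameter values
# by silence gaps, as the embodiment describes. Each token is a single
# recognized digit with the time it was spoken.
def segment_values(tokens, interval=1.5):
    """Group (timestamp, digit) pairs into integer parameter values,
    splitting wherever the gap between consecutive tokens >= interval."""
    values, current, last_t = [], "", None
    for t, digit in tokens:
        if last_t is not None and t - last_t >= interval:
            # long pause: the previous value is complete
            values.append(int(current))
            current = ""
        current += digit
        last_t = t
    if current:
        values.append(int(current))
    return values
```

With a 1.5-second interval, saying "one two", pausing, then "nine zero" yields the two distinct values 12 and 90 rather than the single value 1290.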
After the target stage lighting element selected by the user is identified, if second gesture action information of the user is captured through the virtual reality wearable device 2, for example the right hand grasping the element in the virtual stage scene and dragging it to a position, the terminal device 1 lays the target stage lighting element at the corresponding position of the virtual stage scene. Stage lighting elements may also be added at a preset default position or at a position selected in the UI interface.
To let the user intuitively see where they are touching the virtual user interface, a virtual mouse is displayed on the interface and moves with the user's limb. For example, when the right hand hovers over an operation function key during movement and, after hovering for 1.5 seconds, the index finger performs a tapping action, the key is clicked and the corresponding operation function is executed.
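The hover-then-tap behavior of the virtual mouse can be modeled as a small dwell timer: a function key becomes armed only after the cursor has hovered over it for the dwell time (1.5 seconds in the description), and a tap then triggers it. The class and method names are hypothetical:

```python
# Illustrative dwell-timer model of the hover-then-tap click described
# above. "Armed" means the key has been hovered long enough that a tap
# will count as a click.
class HoverButton:
    def __init__(self, dwell: float = 1.5):
        self.dwell = dwell
        self.hover_start = None  # time when hovering began, or None

    def update(self, hovering: bool, now: float) -> bool:
        """Track hover time; return True once the button is armed."""
        if not hovering:
            self.hover_start = None
            return False
        if self.hover_start is None:
            self.hover_start = now
        return now - self.hover_start >= self.dwell

    def tap(self, hovering: bool, now: float) -> bool:
        """A tap clicks the key only if it is armed at tap time."""
        return self.update(hovering, now)
```

The dwell requirement filters out accidental passes of the hand over a key, so only a deliberate hover followed by a tap fires the function.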
After the terminal device 1 identifies the target stage lighting element selected by the user, if the virtual reality wearable device 2 captures third gesture action information of the user, the terminal device 1 displays in the virtual stage scene, through the wearable device, a laser ray pointing at the target stage lighting element, which is then highlighted. Subsequently, if the wearable device captures fourth gesture action information, the terminal device 1 shows the adjusted posture of the target stage lighting element in the virtual stage scene, including its position after movement, its angle after rotation, and so on. Further, if the virtual reality wearable device 2 captures fifth gesture action information of the user (for example, extending the little finger after selection), the whole stage enters the rendering state.
For example: extending the right hand forward causes a point at the palm center to emit a laser ray; when the ray intersects a stage lighting element, that element is selected and highlighted. Clenching the right fist and dragging moves the selected element left, right, forward, or backward to a suitable position; extending the right hand again releases the selection. As another example: extending the left hand forward emits the laser ray, selecting and highlighting the intersected element; placing the right hand on a Leap Motion somatosensory controller and making a rotating action rotates the selected element to a suitable angular position; extending the left hand releases the selection. As yet another example: with a stage lighting element selected, swinging the arm left, right, up, or down moves the element in the corresponding direction; extending only the middle finger after selection rotates the element to the right; extending only the thumb after selection rotates it to the left.
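The laser-ray selection above amounts to a ray cast from the palm against the stage lighting elements. A common way to sketch this is a ray-versus-bounding-sphere test, with the nearest hit becoming the highlighted target; the sphere-based test and the element data layout are illustrative assumptions, not the patent's actual geometry:

```python
# Hypothetical ray cast for the "laser ray" selection: a ray from the
# palm is tested against each element's bounding sphere, and the
# nearest hit becomes the selected (highlighted) target.
import math


def ray_sphere_hit(origin, direction, center, radius):
    """Distance along a unit-direction ray to a bounding sphere,
    or None if the ray misses the sphere."""
    ox, oy, oz = (c - o for c, o in zip(center, origin))
    # project the center offset onto the ray direction
    t = ox * direction[0] + oy * direction[1] + oz * direction[2]
    if t < 0:
        return None  # sphere is behind the palm
    # squared distance from the sphere center to the ray
    d2 = (ox ** 2 + oy ** 2 + oz ** 2) - t * t
    return t - math.sqrt(radius ** 2 - d2) if d2 <= radius ** 2 else None


def pick_element(origin, direction, elements):
    """Return the id of the nearest element hit by the laser ray,
    or None if the ray hits nothing."""
    hits = []
    for eid, (center, radius) in elements.items():
        t = ray_sphere_hit(origin, direction, center, radius)
        if t is not None:
            hits.append((t, eid))
    return min(hits)[1] if hits else None
```

Picking the nearest hit matters on a crowded stage, where one ray may pass through the bounding volumes of several lighting elements.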
For capturing user instructions through the VR handle:
The controls of a VR handle typically include a modify key, a track pad, a trigger key, and a side key. Referring to fig. 3B, unlike capture through the VR glove, when the user presses a button of the VR handle it emits a beam of light whose color indicates the handle's interaction mode, for example orange for the standard interaction mode, green for the selection mode, and yellow for the movement mode. Clicking the track pad shows or hides the opened window. The virtual user interface is displayed in a floating state, and pressing a key of the VR handle confirms selection of an operation function key.
For a typical VR handle: pressing the side key moves the controller through the world scene, with the sensation of grasping the world and pushing or pulling it by hand; pressing the side key while aiming the controller and pulling the trigger key moves the user to the position the controller points at; pressing the side keys on both controllers and moving them alternately rotates the world scene around the user, as if grasping it with both hands and turning it; pressing the side keys on both controllers and moving them toward or away from each other shrinks or enlarges the world scene; and so on.
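The two-controller zoom gesture above can be sketched as scaling the world by the ratio of the current to the initial distance between the controllers while both side keys are held. The function signatures are hypothetical:

```python
# Illustrative two-controller world scaling: while both side keys are
# held, the world scale is multiplied by how much the controllers have
# moved apart (>1) or together (<1) since the grab began.
import math


def update_world_scale(scale, start_positions, current_positions):
    """Return the new world scale given the controller positions at
    the start of the gesture and now (each a pair of 3D points)."""
    d0 = math.dist(*start_positions)
    d1 = math.dist(*current_positions)
    if d0 == 0:
        return scale  # degenerate start pose: leave the scale unchanged
    return scale * (d1 / d0)
```

Using the ratio of distances rather than an absolute step makes the gesture feel like stretching the scene directly between the hands.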
It goes without saying that the correspondence between keys or key combinations and the first to fifth gesture action information can be set by those skilled in the art in view of existing controller technology, and will not be expanded in detail here.
S23: the position and the posture of the virtual stage scene and the stage lighting elements thereof are shown in real time by the display device 3 shown in fig. 1.
Every action of the user interacting with the UI interface in the virtual stage scene in real time, and the change each action makes to the scene, is shown on the display.
Fig. 4 shows a virtual-reality-based design system 400 for stage lighting elements. The system 400 is implemented as software installed in the terminal device 1 of fig. 1 which, when run, executes the virtual-reality-based design method for stage lighting elements of the foregoing embodiments. Since its principle is the same as that of the method embodiment, identical technical details are not repeated.
The design system 400 for stage lighting elements based on virtual reality of the embodiment includes: a virtual scene display module 401, a light element design module 402, and a light element display module 403.
The virtual scene display module 401 displays a virtual stage scene to a user through the virtual reality wearable device, wherein the virtual stage scene includes a virtual user interface established by an augmented reality technology. In an embodiment, the virtual scene display module 401 captures head movements of the user through the virtual reality wearable device, so as to synchronously change the view angle state of the virtual stage scene; capturing user hand actions through the virtual reality wearing equipment so as to synchronously change the moving state of the user in the virtual stage scene.
the lighting element design module 402 captures an interface instruction issued by a user operating the virtual user interface in the virtual stage scene through the virtual reality wearable device, and lays or adjusts the position and the posture of the stage lighting element in the virtual stage scene according to the interface instruction. In an embodiment, the lighting element designing module 402 captures second gesture motion information of the user through the virtual reality wearable device, and lays the target stage lighting element at a corresponding position of the virtual stage scene. In an embodiment, the light element design module 402 captures third gesture motion information of the user through the virtual reality wearable device to display a laser ray in the virtual stage scene, wherein one end of the laser ray intersects with the target stage light element, and the target stage light element is highlighted; capturing fourth gesture action information of the user through the virtual reality wearable device to move the position of the target stage lighting element or rotate the angle of the target stage lighting element.
In an embodiment, the virtual scene display module 401 displays the pre-stored stage lighting elements through the virtual user interface. The lighting element design module 402 captures first gesture action information of the user through the virtual reality wearable device, so as to identify a target stage lighting element selected by the user; it then acquires voice input information through the virtual reality wearable device and, after a preset time interval, modifies a parameter value of the target stage lighting element according to the voice input information.
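The "preset time interval" before a voice command takes effect behaves like a debounce: the change is committed only once the recognized input has been stable for the interval, so partial utterances do not trigger premature updates. A minimal sketch under that assumption (class and method names are hypothetical):

```python
import time

class VoiceParameterEditor:
    """Applies a spoken parameter change only after a quiet interval has
    elapsed since the last recognized input."""

    def __init__(self, element_params, interval=1.0, clock=time.monotonic):
        self.params = element_params      # parameter dict of the target element
        self.interval = interval          # preset time interval, in seconds
        self.clock = clock                # injectable clock, for testing
        self._pending = None
        self._last_input = None

    def on_voice(self, key, value):
        """Record the latest recognized (parameter, value) pair."""
        self._pending = (key, value)
        self._last_input = self.clock()

    def tick(self):
        """Commit the pending change once the interval has elapsed.
        Returns True if a change was applied on this tick."""
        if self._pending and self.clock() - self._last_input >= self.interval:
            key, value = self._pending
            self.params[key] = value
            self._pending = None
            return True
        return False
```

Calling `tick()` each frame commits, for example, a spoken "brightness 80" only after the user has stopped speaking for the preset interval.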
The lighting element display module 403 displays, in real time through the display device, the virtual stage scene and the positions and postures of its stage lighting elements.
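The three-module split above can be sketched as plain classes (all class, method, and field names are illustrative; the patent does not prescribe an implementation):

```python
from dataclasses import dataclass, field

@dataclass
class StageLightElement:
    """A stage lighting element placed in the virtual scene."""
    name: str
    position: tuple = (0.0, 0.0, 0.0)   # (x, y, z) in scene coordinates
    rotation: tuple = (0.0, 0.0, 0.0)   # Euler angles, degrees
    params: dict = field(default_factory=dict)  # e.g. brightness, color

class VirtualSceneDisplayModule:
    """Module 401: holds the virtual stage scene shown in the headset."""
    def __init__(self):
        self.elements = []
    def show(self):
        return [(e.name, e.position, e.rotation) for e in self.elements]

class LightElementDesignModule:
    """Module 402: lays out or adjusts elements from interface instructions."""
    def __init__(self, scene):
        self.scene = scene
    def place(self, element, position):
        element.position = position
        self.scene.elements.append(element)
    def rotate(self, element, rotation):
        element.rotation = rotation

class LightElementDisplayModule:
    """Module 403: reports scene state in real time to the external display."""
    def __init__(self, scene):
        self.scene = scene
    def snapshot(self):
        return self.scene.show()
```

A typical flow would be: the design module places and rotates an element in response to captured gestures, while the display module snapshots the same scene for the external monitor.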
In addition, the present invention further provides a storage medium and a terminal device. The technical features of the foregoing embodiments apply equally to the storage medium embodiment and the terminal device embodiment, so repeated descriptions are omitted.
The storage medium may be any medium capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disk. A computer program is stored thereon; when the computer program is loaded and executed by a processor, it implements all or part of the steps of the virtual-reality-based design method for stage lighting elements in the foregoing embodiments.
The terminal device comprises a processor (CPU/MCU/SoC), a memory (ROM/RAM), a communication module (wired/wireless network), and a display module, and is preferably a desktop computer. In particular, the memory stores a computer program, and when the processor loads and executes the computer program, the terminal device implements all or part of the steps of the virtual-reality-based design method for stage lighting elements in the foregoing embodiments.
In summary, the virtual-reality-based stage lighting element design method and system of the present invention let the user perceive and adjust the three-dimensionally rendered scene while immersed in the virtual stage scene. As shown in Fig. 5, the user selects a stage lighting element through gestures; once an element is selected, a UI interface pops up; the UI interface captures operation commands to adjust the posture of the stage lighting element in the virtual stage scene; a display shows the adjustment process in real time; and the designed stage lighting rendering effect is output. This avoids the repeated conception and debugging otherwise required of stage designers, reduces the labor intensity and working time of scene-arrangement personnel, greatly improves the layout efficiency of stage scenes and the resulting enterprise benefit, and makes the stage lighting rendering effect more intuitive than in the traditional stage design mode. The invention thus effectively overcomes various defects in the prior art and has high industrial utilization value.
The foregoing embodiments are merely illustrative of the principles and utilities of the present invention and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed herein shall be covered by the claims of the present invention.
Claims (12)
1. A design method for stage lighting elements based on virtual reality, characterized in that the method is applied to a terminal device, the terminal device being in communication connection with a virtual reality wearable device and a display device; the method comprises the following steps:
displaying a virtual stage scene to a user through the virtual reality wearable device, wherein the virtual stage scene comprises a virtual user interface established through an augmented reality technology;
capturing, through the virtual reality wearable device, an interface instruction issued by the user operating the virtual user interface in the virtual stage scene, and laying out or adjusting the position and posture of a stage lighting element in the virtual stage scene according to the interface instruction; and
displaying the virtual stage scene and the positions and postures of its stage lighting elements in real time through the display device.
2. The method of claim 1, further comprising:
displaying each pre-stored stage lighting element through the virtual user interface;
capturing first gesture action information of the user through the virtual reality wearable device, so as to identify a target stage lighting element selected by the user; and
acquiring voice input information through the virtual reality wearable device, and modifying a parameter value of the target stage lighting element according to the voice input information after a preset time interval.
3. The method of claim 2, further comprising: capturing second gesture action information of the user through the virtual reality wearable device, and laying the target stage lighting element at a corresponding position in the virtual stage scene.
4. The method of claim 2, further comprising:
capturing third gesture action information of the user through the virtual reality wearable device to display a laser ray in the virtual stage scene, wherein one end of the laser ray intersects the target stage lighting element and the target stage lighting element is highlighted; and
capturing fourth gesture action information of the user through the virtual reality wearable device to move the position of the target stage lighting element or rotate its angle.
5. The method of claim 1, further comprising:
capturing head movements of the user through the virtual reality wearable device, so as to synchronously change the viewing angle of the virtual stage scene; and
capturing hand actions of the user through the virtual reality wearable device, so as to synchronously change the user's movement state in the virtual stage scene.
6. A design system for stage lighting elements based on virtual reality, characterized in that the system is applied to a terminal device, the terminal device being in communication connection with a virtual reality wearable device and a display device; the system comprises:
a virtual scene display module, configured to display a virtual stage scene to a user through the virtual reality wearable device, wherein the virtual stage scene comprises a virtual user interface established through an augmented reality technology;
a lighting element design module, configured to capture, through the virtual reality wearable device, an interface instruction issued by the user operating the virtual user interface in the virtual stage scene, and to lay out or adjust the position and posture of a stage lighting element in the virtual stage scene according to the interface instruction; and
a lighting element display module, configured to display the virtual stage scene and the positions and postures of its stage lighting elements in real time through the display device.
7. The system of claim 6, wherein:
the virtual scene display module is further configured to display each pre-stored stage lighting element through the virtual user interface; and
the lighting element design module is further configured to capture first gesture action information of the user through the virtual reality wearable device, so as to identify a target stage lighting element selected by the user, and to acquire voice input information through the virtual reality wearable device and modify a parameter value of the target stage lighting element according to the voice input information after a preset time interval.
8. The system of claim 7, wherein the lighting element design module is further configured to: capture second gesture action information of the user through the virtual reality wearable device, and lay the target stage lighting element at a corresponding position in the virtual stage scene.
9. The system of claim 7, wherein the lighting element design module is further configured to: capture third gesture action information of the user through the virtual reality wearable device to display a laser ray in the virtual stage scene, wherein one end of the laser ray intersects the target stage lighting element and the target stage lighting element is highlighted; and capture fourth gesture action information of the user through the virtual reality wearable device to move the position of the target stage lighting element or rotate its angle.
10. The system of claim 6, wherein the virtual scene display module is further configured to: capture head movements of the user through the virtual reality wearable device, so as to synchronously change the viewing angle of the virtual stage scene; and capture hand actions of the user through the virtual reality wearable device, so as to synchronously change the user's movement state in the virtual stage scene.
11. A terminal device, comprising: a processor, and a memory; wherein,
The memory is used for storing a computer program;
The processor is configured to load and execute the computer program, so that the terminal device performs the design method for stage lighting elements based on virtual reality according to any one of claims 1 to 5.
12. A stage lighting element design system, comprising: the terminal device of claim 11, and a virtual reality wearable device and a display device in communication connection with the terminal device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810522548.0A CN110543230A (en) | 2018-05-28 | 2018-05-28 | Stage lighting element design method and system based on virtual reality |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110543230A true CN110543230A (en) | 2019-12-06 |
Family
ID=68700651
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810522548.0A Pending CN110543230A (en) | 2018-05-28 | 2018-05-28 | Stage lighting element design method and system based on virtual reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110543230A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5307295A (en) * | 1991-01-14 | 1994-04-26 | Vari-Lite, Inc. | Creating and controlling lighting designs |
CN104412068A (en) * | 2012-07-06 | 2015-03-11 | 奥迪股份公司 | Method and control system for operating a motor vehicle |
CN105912110A (en) * | 2016-04-06 | 2016-08-31 | 北京锤子数码科技有限公司 | Method, device and system for performing target selection in virtual reality space |
CN107027014A (en) * | 2017-03-23 | 2017-08-08 | 广景视睿科技(深圳)有限公司 | A kind of intelligent optical projection system of trend and its method |
JP6278546B1 (en) * | 2017-06-02 | 2018-02-14 | 株式会社コロプラ | Information processing method, apparatus, and program for causing computer to execute information processing method |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111756956A (en) * | 2020-06-23 | 2020-10-09 | 网易(杭州)网络有限公司 | Virtual light control method and device, medium and equipment in virtual studio |
CN111818705A (en) * | 2020-07-17 | 2020-10-23 | 广州彩熠灯光股份有限公司 | Lamp selection method and system based on 3D simulation, storage medium and light console |
CN111867210A (en) * | 2020-08-03 | 2020-10-30 | 广州彩熠灯光股份有限公司 | Visualized lighting control method and electronic device based on 3D simulation system |
CN111867210B (en) * | 2020-08-03 | 2022-10-21 | 广州彩熠灯光股份有限公司 | Visualized lighting control method and electronic device based on 3D simulation system |
CN112214272A (en) * | 2020-10-12 | 2021-01-12 | 广州彩熠灯光股份有限公司 | Display method and medium of stage lighting console and stage lighting console |
US11995230B2 (en) | 2021-02-11 | 2024-05-28 | Apple Inc. | Methods for presenting and sharing content in an environment |
WO2023049705A1 (en) * | 2021-09-23 | 2023-03-30 | Apple Inc. | Devices, methods, and graphical user interfaces for content applications |
US12124673B2 (en) | 2021-09-23 | 2024-10-22 | Apple Inc. | Devices, methods, and graphical user interfaces for content applications |
CN118244937A (en) * | 2024-01-16 | 2024-06-25 | 北京悉见科技有限公司 | Interaction method and device for augmented reality content deployment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110543230A (en) | Stage lighting element design method and system based on virtual reality | |
US11461955B2 (en) | Holographic palm raycasting for targeting virtual objects | |
US10394334B2 (en) | Gesture-based control system | |
Wacker et al. | Arpen: Mid-air object manipulation techniques for a bimanual ar system with pen & smartphone | |
Wang et al. | Real-time hand-tracking with a color glove | |
Billinghurst et al. | Hands in space: Gesture interaction with augmented-reality interfaces | |
CN116324703A (en) | Method for interacting with virtual controls and/or affordances for moving virtual objects in a virtual environment | |
US20120113223A1 (en) | User Interaction in Augmented Reality | |
EP3262505B1 (en) | Interactive system control apparatus and method | |
KR20220062410A (en) | Projection Casting in Virtual Environments | |
TW202105129A (en) | Artificial reality systems with personal assistant element for gating user interface elements | |
WO2018196552A1 (en) | Method and apparatus for hand-type display for use in virtual reality scene | |
CN106023308A (en) | Somatosensory interaction rapid three-dimensional modeling auxiliary system and method thereof | |
CN113892075A (en) | Corner recognition gesture-driven user interface element gating for artificial reality systems | |
CN106873767A (en) | The progress control method and device of a kind of virtual reality applications | |
Smith et al. | Digital foam interaction techniques for 3D modeling | |
CN109960403A (en) | Visual presentation and interaction methods for medical images in an immersive environment | |
CN104914993A (en) | Experience type design method for controlling civil aircraft passenger cabin seat adjustment by gestures | |
Sun et al. | Phonecursor: Improving 3d selection performance with mobile device in ar | |
CN105975158A (en) | Virtual reality interaction method and device | |
JP5665396B2 (en) | Information processing apparatus and control method thereof | |
JP6801138B1 (en) | Terminal device, virtual object operation method, and virtual object operation program | |
GB2535730A (en) | Interactive system control apparatus and method | |
Yang et al. | An intuitive human-computer interface for large display virtual reality applications | |
CN109308741B (en) | Meta 2-based natural interaction handicraft creative design system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | Address after: 510540 No. 8, Kexing Road, Guangzhou Private Science and Technology Park, 1633 North Baiyun Road, Baiyun District, Guangdong. Applicant after: Guangzhou Colourful Lighting Co.,Ltd. Address before: Baiyun District of Guangzhou City, Guangdong Province 510540, North Tai Road No. 1633, Guangzhou Private Science and Technology Park, Branch Road No. 8. Applicant before: GUANGZHOU FINEART LIGHTING Co.,Ltd. |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20191206 |