CN206892844U - An LED display for three-dimensional imaging - Google Patents

An LED display for three-dimensional imaging

Info

Publication number
CN206892844U
CN206892844U
Authority
CN
China
Prior art keywords
virtual
display
user
main panel
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201621096230.3U
Other languages
Chinese (zh)
Inventor
刘耀
孙兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dahooo New Media Technology Co ltd
Beijing Dahooo Technology Co Ltd
Original Assignee
Beijing Leyard Video Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Leyard Video Technology Co Ltd
Priority to CN201621096230.3U
Application granted
Publication of CN206892844U
Legal status: Active
Anticipated expiration


Landscapes

  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The utility model discloses an LED display for three-dimensional imaging. The LED display includes a main control screen and multiple slave display screens, where the main control screen includes: a tracking module, a large-screen display module, a virtual scene module, and a large-screen splicing device. The coordinate tracker tracks the user's position and sends it to the main control screen. The main control screen processes the user position parameters with the tracking module, obtains the user's coordinate position data in the real environment, and converts it into virtual spatial position data in the virtual three-dimensional space. The virtual scene module renders and outputs the model of the virtual three-dimensional space; the large-screen display module outputs the virtual space content to be displayed; the large-screen splicing device receives the virtual space content and matches it across the main control screen and the multiple slave display screens. Through the utility model, the LED display changes from a passive display mode into a variable, active display mode, realizing real-time stereoscopic display on LED.

Description

An LED display for three-dimensional imaging
Technical field
The utility model relates to the technical field of image processing, and in particular to an LED three-dimensional imaging display screen.
Background technology
The display mode of current LED screens has always been planar, passive display: no matter how the screens are assembled and arranged, they cannot show an accurate three-dimensional stereoscopic image, and the displayed image does not change in real time with the viewer's position, so an accurate three-dimensional spatial environment cannot be simulated. Large LED screens have always been used only as a display unit or medium, and a related active screen display technology has not appeared in the LED screen industry.
Three-dimensional imaging in the prior art mainly involves the following technologies:
LED screen display technology. The display mode of LED screens has always been planar, passive display: however the screens are assembled and arranged, they cannot show an accurate three-dimensional stereoscopic image, the displayed image does not follow the viewer's position in real time, and an accurate three-dimensional spatial environment cannot be simulated. Large LED screens have always been used only as a display unit or medium; a related active screen display technology has not appeared in the LED screen industry.
Spatial coordinate fitting and matching technology. Current applications of spatial coordinate fitting and matching focus mainly on virtual reality; applications that reference virtual-world coordinates to the real environment are rare, and applications that reference virtual-world coordinates to multiple views of real-environment objects do not exist.
Patent document 1, publication number: CN103941851A
Patent document 2, publication number: CN103365572A
Patent document 3, publication number: CN105159522A
Patent document 4, publication number: CN102508546A
Patent document 5, publication number: CN103744518A
Patent document 1 discloses a method and system for virtual touch calibration, comprising the following scheme:
Create a virtual calibration menu;
Establish a first coordinate system whose x-axis and y-axis lie in the plane in which the virtual calibration menu is displayed;
Establish a second coordinate system, and express the user gesture position in the coordinates of the second coordinate system;
Calculate the correspondence between the first coordinate system and the second coordinate system;
According to the correspondence, express the user gesture position coordinates given in the second coordinate system in the coordinates of the first coordinate system;
According to the user gesture position coordinates expressed in the first coordinate system, correct the correspondence between the user gesture and the virtual calibration menu.
In this prior art, the interactive user interface and implementation method for 3D virtual projection and virtual touch include a depth detector, a binocular image disparity calculation module, a binocular image processing module, a 3D display device, a gesture recognition module, a camera, and a virtual touch controller. Patent document 1 achieves the following effect: when the position of the depth detector changes or the user's inter-pupillary distance changes, the user clicks the calibration points of the virtual calibration menu so that the user gesture is recalibrated against the virtual projection picture. This effectively solves the prior-art problem of disordered and inconsistent responses to gesture clicks after such changes occur, so that interaction accuracy is maintained even when those changes happen.
Patent document 2 discloses a remote control method for an electronic device, and the electronic device. The method is applied between a first electronic device and a second electronic device, where the first electronic device is wirelessly connected to the second electronic device, the first electronic device includes an image acquisition device and a touch display unit, and the second electronic device includes a display unit. The method includes:
The first electronic device obtains, through the image acquisition device, a real-time image containing the first display content shown by the display unit, and displays the real-time image on the touch display unit;
Establish a first display-coordinate transformation relation between the display coordinates of the real-time image and the display coordinates of the first display content;
Detect the touch operation information received by the touch display unit, and determine from it whether the touch-point coordinates of the touch operation correspond to the first display content contained in the real-time image; if so, convert the touch-point coordinates of the touch operation into second coordinates on the display unit according to the first display-coordinate transformation relation, and send the touch command in the touch operation information to the second electronic device, so that the second electronic device carries out the operation at the second coordinate position through the touch command.
Patent document 2 achieves the following effect: an electronic device equipped with a zoom camera and a touch screen controls another electronic device. After the user captures the content shown on the other device's display unit through the camera, it is displayed on the touch screen, and the user can manipulate the other device's desktop through the touch screen from a certain distance. Touch control of an electronic device that has no touch display screen can thereby be realized.
Patent document 3 discloses a method for a virtual reality display device to respond to the operation of a peripheral apparatus, specifically including:
The virtual reality display device includes two display screens, each corresponding to a part of the whole interactive region, and the method includes:
Obtain the current position coordinates of the peripheral apparatus;
Convert the current position coordinates using a conversion scheme corresponding to a predetermined condition, to obtain response position coordinates within a specified range, where the specified range is the interactive region corresponding to a specified one of the two display screens;
Carry out position interaction according to the response position coordinates.
By converting the acquired current position coordinates of the peripheral apparatus into response position coordinates within a specified range, patent document 3 enables the virtual reality display device to respond to the operation of the peripheral apparatus and to interact with its response position coordinates. Moreover, limiting the converted response position coordinates to the specified range of the interactive region prevents the response position coordinates of the 2D input from jumping within the stereoscopic image of the display device during interaction, overcoming the discomfort that such jumps bring to the user's virtual reality experience.
Patent document 4 discloses an interactive user interface for realizing 3D virtual projection and virtual touch on a display device, and its implementation method, specifically including the following components:
Depth detector: detects the distance information between the user's head and hands and the 3D display device;
Binocular image disparity calculation module: according to the received distance information, calculates the binocular image disparity needed to project the interactive user interface, through the 3D display, to within arm's length of the user's head;
Binocular image processing module: processes the images shown to the left and right eyes so that they reach the binocular image disparity calculated by the disparity calculation module, then sends the processed images to the 3D display device;
3D display device: displays the binocular disparity images processed by the binocular image processing module, so that the user sees the interactive interface as a 3D virtual projection within arm's length of the user's head;
Gesture recognition module: captures the motion trajectory of the user's hand with the camera and, combined with the distance information between the user's hand and the 3D display device obtained by the depth detector, recognizes the gesture;
Camera: captures the motion trajectory of the user's hand;
Virtual touch controller: receives the information from the gesture recognition module and makes the corresponding reaction;
Here, the output of the depth detector is connected to the input of the binocular image disparity calculation module, whose output is connected to the input of the binocular image processing module; the output of the binocular image processing module is connected to the 3D display device; the input of the gesture recognition module is connected to the depth detector and the camera respectively, and the output of the gesture recognition module is connected to the virtual touch controller.
The technical scheme of patent document 4 combines depth detection, 3D display, and gesture recognition technologies to create a brand-new 3D virtual touch interaction mode, overcoming the current problems that touch cannot leave the screen and that gestures cannot be made at a distance from the interaction object. The user can not only perform touch operations on a virtual screen but also realize 3D virtual projection, gaining a 3D user interface with feedback, virtual projection, and virtual touch, as well as a convenient, brand-new interactive experience.
Patent document 5 discloses a stereoscopic interaction method and its display device and system. The method includes: carrying out stereoscopic interaction with the object to be operated, stereoscopically displayed on the screen of the stereoscopic display device, through a stereoscopic interaction wand; obtaining the position information of the viewer and, according to changes in that position information, adjusting the stereoscopic display based on motion parallax during the stereoscopic interaction.
In patent document 5, by combining the motion-parallax display technique during the stereoscopic interaction with the object via the wand, when a situation occurs such as the user's line of sight being blocked by the wand or a hand, the user only needs to change viewing position and the screen's display is adjusted based on the parallax change; the user can then watch the previously blocked part of the image from another angle and conveniently complete the stereoscopic interaction with the object without interrupting the operation.
It can be seen that in the prior art the display mode of LED screens has always been planar, passive display: no matter how the screens are assembled and arranged, they cannot show an accurate three-dimensional stereoscopic image, the displayed image does not change in real time with the viewer's position, and an accurate three-dimensional space cannot be simulated. LED screens have always been used only as a display unit or medium, and a related active screen display technology has not appeared in the LED screen industry. The other kind of virtual space display requires wearing VR glasses, which is very inconvenient, blocks the view of the surrounding real environment, and is unsafe to use.
Utility model content
To solve the above technical problems, the utility model provides an LED display for three-dimensional imaging. The LED display is used to show virtual three-dimensional images and includes a main control screen and multiple slave display screens, where the main control screen includes: a tracking module, a large-screen display module, a virtual scene module, and a large-screen splicing device;
The main control screen is connected to a tracking sensor and a coordinate tracker respectively;
The coordinate tracker tracks the position of the user and sends the user position parameters to the main control screen;
The main control screen processes the user position parameters with the tracking module, obtains the user's coordinate position data in the real environment, and converts it into virtual spatial position data in the virtual three-dimensional space; the virtual scene module renders and outputs the model of the virtual three-dimensional space; the large-screen display module outputs, according to the virtual spatial position data, the virtual space content to be displayed; the large-screen splicing device receives the virtual space content and matches it across the main control screen and the multiple slave display screens.
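As an illustration of the pipeline just described — a tracked position in the real environment goes in, virtual spatial position data comes out — the conversion could be sketched as follows. The uniform scale factor and the virtual origin are assumptions made for illustration; the patent states only that such a conversion takes place, not its formula:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Vec3:
    x: float
    y: float
    z: float

def real_to_virtual(real_pos: Vec3, scale: float, virtual_origin: Vec3) -> Vec3:
    """Convert a tracked real-environment coordinate into virtual spatial
    position data by uniform scaling plus translation. Hypothetical scheme:
    the patent does not specify the conversion formula."""
    return Vec3(
        virtual_origin.x + scale * real_pos.x,
        virtual_origin.y + scale * real_pos.y,
        virtual_origin.z + scale * real_pos.z,
    )

# A user 1 m right and 2 m forward of the room origin, mapped into a virtual
# space twice as large whose origin sits at (10, 0, 10):
user_virtual = real_to_virtual(Vec3(1.0, 0.0, 2.0), 2.0, Vec3(10.0, 0.0, 10.0))
```

The virtual scene module would then render from `user_virtual`, and the splicing device would distribute the result across the screens.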
Preferably, the content shown on the LED display is updated in real time according to the collected user position parameters.
Preferably, the main control screen and the multiple slave display screens comprise at least: a front display screen facing the user, left and right display screens on the user's left and right sides, a top display screen above the user, and a bottom display screen beneath the user.
Preferably, the virtual space content is adjusted and adapted according to the position, number, and area of the LED display screens before being shown on the multiple display screens.
Preferably, the virtual space content to be displayed includes images and/or video.
Preferably, the infrared coordinate tracker is head-mounted.
The technical solution of the utility model achieves the following technical effects:
The LED screen changes from a passive display mode into a variable, active display mode: it shows images and video from different angles as the person's position changes, so that the content shown on the screen follows the person's coordinates and changes in real time, realizing real-time stereoscopic display on LED. Compared with VR glasses, this virtual-space stereoscopic display means the client needs no bulky helmet and no annoying cables, and can be immersed in the virtual space with ease.
Brief description of the drawings
Fig. 1 is a block diagram of the LED three-dimensional imaging system
Fig. 2 is a spatial-region diagram of the utility model
Fig. 3 is a space schematic diagram of the utility model
Fig. 4 is a system position schematic diagram of the utility model
Fig. 5 is a schematic diagram of matching display content to viewing-angle changes in the utility model
Embodiment
The utility model here uses an infrared positioning system as the positioning system in the real environment; positioning may also be done by GPS, visual positioning, laser positioning, ultrasonic positioning, and similar means. The system matches the coordinates and area of the real LED display, reads the person's real position coordinates, refers them into the virtual three-dimensional space for calculation, and maps the image or video at the corresponding coordinates of the virtual three-dimensional space, after that calculation, onto the LED screen at the real coordinates for display. The content shown on the screen can thus follow the person's coordinates and change in real time, realizing real-time stereoscopic display on LED. The infrared positioning system includes an infrared transmitter for sending infrared light to the infrared coordinate tracking sensor; the coordinates and area of the tracked object are obtained through the infrared coordinate tracking sensor and sent to the server for processing.
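The real-time tracking loop described above could be sketched as follows. The movement threshold used to suppress redundant display updates is a hypothetical detail; the patent says only that the displayed content follows the person's coordinates in real time:

```python
import math

def filtered_updates(samples, min_move=0.05):
    """Collapse a stream of (x, y, z) tracker samples into display updates,
    emitting a position only when the user has moved more than `min_move`
    metres since the last emitted position. The threshold is an assumed
    debouncing detail, not something the patent specifies."""
    updates = []
    last = None
    for p in samples:
        if last is None or math.dist(p, last) > min_move:
            updates.append(p)
            last = p
    return updates
```

Each emitted position would trigger a re-render of the virtual scene and a refresh of the LED screens.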
Referring to Fig. 1, the LED three-dimensional imaging system of the utility model includes: an LED main display screen, LED slave display screens, a tracking sensor, a coordinate tracker, and a large-screen splicing device, where the LED main display screen includes three modules: a tracking module, a large-screen display module, and a virtual scene module. The tracking module processes the person's coordinate position data in the real environment and obtains, after positioning, the person's coordinates for use by the virtual space; the large-screen display module displays the virtual space content, including the content shown on the front, left, right, top, and bottom LED screens; the large-screen splicing device matches the display content of the multiple large screens; and the virtual scene module renders and outputs the model of the virtual three-dimensional space.
Referring to Figs. 2-5, the utility model involves three large spatial coordinate systems: (1) virtual-space coordinates, (2) real LED-screen coordinates, and (3) person-position coordinates. Their size relationship is that the virtual space is larger than the real LED-screen space, i.e. the whole LED-screen space lies within the coordinates of the virtual space, and the person is within the LED-screen space. Through the positioning system in the real environment (composed of the tracking sensor and the coordinate tracker), the real coordinates of the LED screens and of the person are read and mapped into the virtual three-dimensional space environment for position-coordinate matching; the image or video at the corresponding coordinates of the virtual three-dimensional space is then mapped onto the LED screens at the real coordinates for display. The content shown on the LED screens can thus follow the person's coordinates and change in real time, realizing real-time stereoscopic spatial display on the LED screens.
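The nesting of the three coordinate systems — the person inside the LED-screen space, which in turn lies inside the larger virtual space — can be sketched with axis-aligned boxes. The linear remapping below is a hypothetical fit; the patent states only that the coordinate systems are position-matched:

```python
def box_contains(outer, inner):
    """True if box `inner` lies entirely inside box `outer`; each box is
    ((min_x, min_y, min_z), (max_x, max_y, max_z))."""
    (omin, omax), (imin, imax) = outer, inner
    return all(o <= i for o, i in zip(omin, imin)) and \
           all(i <= o for i, o in zip(imax, omax))

def remap(p, src, dst):
    """Linearly remap point `p` from source box `src` into destination box
    `dst`, axis by axis. A hypothetical fit standing in for the
    position-coordinate matching the patent describes."""
    (smin, smax), (dmin, dmax) = src, dst
    return tuple(
        d0 + (x - s0) * (d1 - d0) / (s1 - s0)
        for x, s0, s1, d0, d1 in zip(p, smin, smax, dmin, dmax)
    )

# A 3 m x 2 m x 3 m LED-screen space sitting inside a 20 m virtual space:
LED_SPACE = ((0.0, 0.0, 0.0), (3.0, 2.0, 3.0))
VIRTUAL_SPACE = ((-10.0, -10.0, -10.0), (10.0, 10.0, 10.0))
```

With this layout, a person standing at the centre of the LED-screen space maps to the centre of the virtual space.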
The following is only an example and does not limit the specific embodiments of the utility model. A head-mounted infrared coordinate tracker is worn by the user; head-mounting is a preferred form, but other forms convenient for the user to carry, such as a wrist-mounted form, are not excluded. Referring to Figs. 4-5, the infrared tracking sensor fixed in the real environment emits infrared light to the infrared coordinate tracker, and the infrared coordinate tracker locates the user's position coordinates in the real environment by infrared means; the collected user coordinates in the real environment are then sent to the server. The infrared coordinate tracker and the infrared tracking sensor together form the positioning system that locates the user's position. The tracking module in the server processes the received coordinate position data of the person in the real environment and positions the user within the virtual space. The user's position in the virtual space can be adjusted as needed: for example, the user's initial position in the virtual space may be at the centre, left side, or right side of the virtual space, after which the user's position in the virtual space is updated in real time according to the change of the user's position in the real environment. The large-screen display module displays the virtual space content, including the person mapped into the virtual space and the content shown on the front, left, right, top, and bottom LED display screens; the large-screen splicing device matches the display content of the multiple large screens. The virtual scene module renders and outputs the panoramic model of the virtual three-dimensional space, i.e. the virtual three-dimensional scene as shown from each viewing angle; this virtual three-dimensional space model is designed in advance by the user in 3D design software (such as 3DMAX) and imported into the server.
The virtual scene module uses the orthographic-camera imaging principle to design an orthographic camera that simulates the user's vision at the user's position in the real environment. The orthographic camera is placed at the position to which the user is located in the virtual three-dimensional space from the real-environment position, with its viewfinder facing the LED patch in the corresponding direction, so that the scene of the virtual three-dimensional space is obtained from the user's viewing angle; the orthographic camera's position in the three-dimensional virtual space thus corresponds to the user's position in the real environment. The captured virtual-space scene content (including images and/or video) is output to the large-screen display module, which outputs it to the large-screen splicing device; it is finally matched and output to the real multiple LED display screens, so that the virtual space content updates with the movement of the real user's position.
Specific embodiment
The server used in the system example is an HP Z440; tracking uses the HTC VIVE Lighthouse infrared strobe tracking system, which includes the infrared coordinate tracker and the infrared tracking sensor; the LED large screen is a Leyard P1.9-pitch screen, three metres long and two metres high; the large-screen splicing device is a Leyard MVC-2-203; and the virtual real-time rendering engine is Unity 5.40.
The specific implementation flow: in the first step, the three-dimensional model required by the client is completed in the design software 3DMAX; this model is imported into UNITY for secondary real-time editing, and the corresponding patches are set up for the large-screen areas and coordinates of the real environment — the area and coordinates of the LED display screens are configured in the software by the user in advance, i.e. the patches in the three-dimensional model stand in for the real LED display screens. Using the orthographic-camera imaging principle, an orthographic framing camera is set up (designed in the three-dimensional model by the software, standing in for the real user's vision of the three-dimensional model) facing the LED patch in the corresponding direction; the captured virtual-space picture is output to the large-screen splicing device, which adjusts and adapts the picture to the multiple LED large screens of different resolutions and finally sends it to the real screens for display. The tracking module receives the position parameters from the coordinate tracker and sends them to the server; after the server receives the data, the virtual scene makes the corresponding movement, and the orthographic camera outputs the captured picture in real time to the real-environment LED large screens, so the simulated display produces a lifelike sense of virtual space.
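A minimal sketch of the splicing device's adaptation step — dividing one rendered frame among panels of different pixel widths — under the assumption of a simple horizontal layout (the real MVC-2-203 mapping is configured on the device itself):

```python
def splice(frame_w, panel_widths):
    """Divide one rendered frame horizontally among LED panels in proportion
    to their pixel widths, returning an (x_offset, width) slice per panel.
    A simplified stand-in for the resolution adaptation the splicing device
    performs in hardware."""
    total = sum(panel_widths)
    regions, x = [], 0
    for w in panel_widths:
        px = round(frame_w * w / total)
        regions.append((x, px))
        x += px
    return regions
```

For example, two equal-width panels each receive half of a 1920-pixel-wide frame.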
The server can be provided in the LED display itself, i.e. the LED display can include multiple display screens with the server function set in one of the display screens, which serves as the main control screen of the LED display. Specifically, the utility model provides an LED display for three-dimensional imaging, used to show virtual three-dimensional images; the LED display includes a main control screen and multiple slave display screens, where the main control screen includes: a tracking module, a large-screen display module, a virtual scene module, and a large-screen splicing device;
The main control screen is connected to the tracking sensor and the coordinate tracker respectively;
The coordinate tracker tracks the position of the user and sends the user position parameters to the main control screen;
The main control screen processes the user position parameters with the tracking module, obtains the user's coordinate position data in the real environment, and converts it into virtual spatial position data in the virtual three-dimensional space; the virtual scene module renders and outputs the model of the virtual three-dimensional space; the large-screen display module outputs, according to the virtual spatial position data, the virtual space content to be displayed; the large-screen splicing device receives the virtual space content and matches it across the main control screen and the multiple slave display screens.
The foregoing are only preferred embodiments of the utility model and are not intended to limit its scope of protection. Any modification, equivalent substitution, or improvement made within the spirit and principles of the utility model shall fall within the scope of protection of the utility model.

Claims (5)

1. An LED display for three-dimensional imaging, the LED display being used to show virtual three-dimensional images, characterised in that: the LED display includes a main control screen and multiple slave display screens;
The main control screen is connected to a coordinate tracker and obtains the user position;
The main control screen uses the user position parameters to obtain virtual spatial position data in the virtual three-dimensional space, and outputs, according to the virtual spatial position data, the virtual space content to be displayed;
A large-screen splicing device is connected to the main control screen and the multiple slave display screens; the large-screen splicing device receives the virtual space content, based on the user's virtual spatial position, output by the main control screen, and matches it across the main control screen and the multiple slave display screens.
2. The LED display according to claim 1, characterised in that the content shown on the LED display is updated in real time according to the collected user position parameters.
3. The LED display according to claim 1, characterised in that the main control screen and the multiple slave display screens comprise at least: a front display screen facing the user, left and right display screens on the user's left and right sides, a top display screen above the user, and a bottom display screen beneath the user.
4. The LED display according to claim 1, characterised in that the virtual space content is adjusted and adapted to the main control screen and the multiple slave display screens according to the position, number, and area of the LED display screens.
5. The LED display according to claim 1, characterised in that the virtual space content to be displayed includes images and/or video.
CN201621096230.3U 2016-09-29 2016-09-29 An LED display for three-dimensional imaging Active CN206892844U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201621096230.3U CN206892844U (en) 2016-09-29 2016-09-29 An LED display for three-dimensional imaging


Publications (1)

Publication Number Publication Date
CN206892844U 2018-01-16

Family

ID=61327427

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201621096230.3U Active An LED display for three-dimensional imaging

Country Status (1)

Country Link
CN (1) CN206892844U (en)

Similar Documents

Publication Publication Date Title
CN106454311B (en) A kind of LED 3-D imaging system and method
US6753828B2 (en) System and method for calibrating a stereo optical see-through head-mounted display system for augmented reality
CN106951074B (en) method and system for realizing virtual touch calibration
US4884219A (en) Method and apparatus for the perception of computer-generated imagery
US20230269358A1 (en) Methods and systems for multiple access to a single hardware data stream
US20020105484A1 (en) System and method for calibrating a monocular optical see-through head-mounted display system for augmented reality
US20130063560A1 (en) Combined stereo camera and stereo display interaction
US20060250392A1 (en) Three dimensional horizontal perspective workstation
US20160249043A1 (en) Three dimensional (3d) glasses, 3d display system and 3d display method
KR101822471B1 (en) Virtual Reality System using of Mixed reality, and thereof implementation method
US10235806B2 (en) Depth and chroma information based coalescence of real world and virtual world images
CN104536579A (en) Interactive three-dimensional scenery and digital image high-speed fusing processing system and method
KR102147430B1 (en) virtual multi-touch interaction apparatus and method
CN104765156B (en) A kind of three-dimensional display apparatus and 3 D displaying method
EP3413165A1 (en) Wearable system gesture control method and wearable system
CN104714646A (en) 3D virtual touch control man-machine interaction method based on stereoscopic vision
CN104216533B (en) A kind of wear-type virtual reality display based on DirectX9
CN104637080A (en) Three-dimensional drawing system and three-dimensional drawing method based on human-computer interaction
CN211349296U (en) Interactive installation is caught in location based on CAVE projection
US10296098B2 (en) Input/output device, input/output program, and input/output method
CN106919928A (en) gesture recognition system, method and display device
CN206892844U (en) An LED display for three-dimensional imaging
CN114898440A (en) Driving method of liquid crystal grating, display device and display method thereof
US20170302904A1 (en) Input/output device, input/output program, and input/output method
JPH0628452A (en) Three-dimensional image processor

Legal Events

Date Code Title Description
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 101117 No. 11-1-C6 Tonghu Street, Tongzhou District, Beijing

Patentee after: BEIJING DEHUO TECHNOLOGY Co.,Ltd.

Address before: 101117 No. 11-1-C6 Tonghu Street, Tongzhou District, Beijing

Patentee before: DAHOOO NEW MEDIA TECHNOLOGY CO.,LTD.

Address after: 101117 No. 11-1-C6 Tonghu Street, Tongzhou District, Beijing

Patentee after: DAHOOO NEW MEDIA TECHNOLOGY CO.,LTD.

Address before: 101117 No. 11-1-C6 Tonghu Street, Tongzhou District, Beijing

Patentee before: BEIJING LEYARD VIDEO TECHNOLOGY Co.,Ltd.

CP01 Change in the name or title of a patent holder