CN106408667B - Customized reality method based on optical labels - Google Patents

Customized reality method based on optical labels

Info

Publication number
CN106408667B
CN106408667B (application number CN201610789865.XA)
Authority
CN
China
Prior art keywords
optical label
background image
frame
customization
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610789865.XA
Other languages
Chinese (zh)
Other versions
CN106408667A (en)
Inventor
王晓东
方俊
李江亮
苏爱民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Foreign Information Technology Co., Ltd.
Original Assignee
Xi'an Small Photon Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an Small Photon Network Technology Co Ltd filed Critical Xi'an Small Photon Network Technology Co Ltd
Priority to CN201610789865.XA priority Critical patent/CN106408667B/en
Publication of CN106408667A publication Critical patent/CN106408667A/en
Application granted granted Critical
Publication of CN106408667B publication Critical patent/CN106408667B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Abstract

The present invention provides a customized reality method based on optical labels, which can enhance a user's understanding of an optical label or build the real environment the user prefers, customizing the real-world image of the scene containing the optical label to form a personalized VR effect. The method comprises the following steps: step 1, scan the scene to be customized and locate the calibrated optical label in the scene from the difference between any two adjacent frames; step 2, parse the content transmitted by the calibrated optical label, and determine the extent of the background image, the entity objects contained in the background image, and the coordinate position of the calibrated optical label within the background image; step 3, according to the coordinate position of the calibrated optical label in the background image, redraw and/or replace with animation the background image and the entity objects respectively, then composite them to obtain the customized reality scene. The method is simple, efficient, accurate, fast, highly extensible, and highly accessible.

Description

Customized reality method based on optical labels
Technical field
The present invention relates to methods of customizing reality, and in particular to a customized reality method based on optical labels.
Background art
An optical label is arranged in a physical space and combined with its physical background. On the one hand, this binding relationship reflects the correlation between the two; on the other hand, it also shapes the user's understanding of the optical label. For the same optical label in a scene, different users may have different understandings, expectations, or habits of use, and would prefer to see the optical label placed in a background they like or are accustomed to. For example, for the optical labels in a scene, adults may prefer to have them displayed as a classified index, children may prefer to have them shown in the form of a cartoon world, and game enthusiasts may wish the scene to take the form of an animation they know, and so on. With the rapid development of augmented reality (AR), customizing the reality of a scene for a user according to the information released by an optical label has become both possible and necessary. However, the methods of customizing reality in the prior art are very complex and require expensive equipment, so they cannot be widely adopted.
Summary of the invention
In view of the problems existing in the prior art, the present invention provides a customized reality method based on optical labels, which can enhance a user's understanding of an optical label or build the real environment the user prefers, customizing the real-world image of the scene containing the optical label to form a personalized VR effect.
The present invention is achieved through the following technical solution:
A customized reality method based on optical labels comprises the following steps:
Step 1: scan the scene to be customized and locate the calibrated optical label in the scene from the difference between any two adjacent frames;
Step 2: parse the content transmitted by the calibrated optical label, and determine the extent of the background image, the entity objects contained in the background image, and the coordinate position of the calibrated optical label within the background image;
Step 3: according to the coordinate position of the calibrated optical label in the background image, redraw and/or replace with animation the background image and the entity objects respectively, and composite the results to obtain the customized reality scene.
Preferably, the specific steps of step 1 are as follows: continuously capture multiple frames of the scene to be customized, with the capture interval between adjacent frames no less than the time interval between two flashes of the dynamic positioning identifier on the calibrated optical label; subtract any two adjacent frames to obtain a difference image, and find the positioning identifier of the calibrated optical label in the difference image.
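By way of illustration only, a minimal Python/OpenCV sketch of this frame-differencing step is given below; the threshold, the minimum blob area, and the helper name find_positioning_identifiers are assumptions made for the example, not values specified by the patent.

```python
import cv2

def find_positioning_identifiers(frames, diff_threshold=40, min_area=50):
    """Subtract adjacent frames so that the static background cancels out and
    only the flashing positioning identifiers remain; return the bounding
    boxes of the changed regions as (x, y, w, h) tuples."""
    boxes = []
    for prev, curr in zip(frames, frames[1:]):
        diff = cv2.absdiff(curr, prev)                        # difference image
        _, mask = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)
        # [-2] selects the contour list in both OpenCV 3.x and 4.x
        contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                    cv2.CHAIN_APPROX_SIMPLE)[-2]
        for c in contours:
            if cv2.contourArea(c) >= min_area:                # ignore small noise
                boxes.append(cv2.boundingRect(c))
    return boxes

# Usage sketch: capture frames no faster than the identifier's flash interval,
# convert them to grayscale, then look for the three large flashing rectangles.
# gray_frames = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in captured_frames]
# candidates = find_positioning_identifiers(gray_frames)
```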
Further, the specific steps of step 2 are as follows:
Decode and analyze, frame by frame, the data displayed by the calibrated optical label to obtain the calibrated optical label content sequence, and obtain the extent of the background image, the entity objects contained in the background image, and the coordinate position of the calibrated optical label within the background image.
Further, decoding and analyzing, frame by frame, the data displayed by the calibrated optical label comprises the following steps:
Adjust the capture interval between frames to equal the reciprocal of the flicker frequency of the calibrated optical label; capture the frame images of one complete cycle of the calibrated optical label; obtain the signal cell group from the positioning identifier found earlier; read the digital signal value of each cell in the signal cell group to obtain the calibrated optical label data sequence of the corresponding frame; arrange all the data sequences in frame order to obtain the content array of all frame images within one cycle; the content array also contains the coordinate position of the calibrated optical label.
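A minimal sketch of this per-frame decoding follows, assuming the cell region has already been located from the positioning identifiers, a 5 x 5 layout, and a simple brightness threshold; these parameters and the helper names are illustrative assumptions, not the patent's prescribed values.

```python
def read_cell_grid(frame, grid_box, rows=5, cols=5, bright_threshold=128):
    """Split the signal-cell region (a 2-D grayscale numpy array slice) into
    rows x cols cells and read each cell as a bit: 1 if its mean brightness
    exceeds the threshold, otherwise 0."""
    x, y, w, h = grid_box
    region = frame[y:y + h, x:x + w]
    bits = []
    for r in range(rows):
        for c in range(cols):
            cell = region[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            bits.append(1 if cell.mean() > bright_threshold else 0)
    return bits

def build_content_array(cycle_frames, grid_box, label_position):
    """Decode every frame of one full label cycle and attach the preset
    coordinate position P, giving the content array C = {P, X0, ..., Xr}."""
    sequences = [read_cell_grid(f, grid_box) for f in cycle_frames]
    return {"P": label_position, "X": sequences}
```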
Preferably, the coordinate position of the optical label is measured when the optical label is installed and preset into the content transmitted by the calibrated optical label.
Preferably, in step 3, redrawing the background image and the entity objects respectively comprises the following steps:
According to the coordinate position of the calibrated optical label in the background image, match the background image's location in software that provides real-scene imagery; after obtaining the matching images of the background image and the entity objects, apply edge processing, color rendering, and lighting effects respectively to obtain the redrawn background image and entity objects.
Preferably, in step 3, replacing the background image and the entity objects with animation respectively comprises the following steps:
According to the annotation definitions of the background image and the entity objects, perform similarity matching in an animation material library to obtain the animation model with the highest similarity, and replace the background image and the entity objects with it; then composite the replaced background image and entity objects according to the coordinate position of the calibrated optical label in the background image.
Further, the similarity matching is carried out according to an animation matching similarity SIM(mi, C), wherein:
the content array of the calibrated optical label is C = {P, X0, …, Xi, …, Xr}, where P is the coordinate position of the calibrated optical label and Xi is the calibrated optical label data sequence of the i-th frame;
the animation material library is M = {m0, …, mi, …}, each animation material is mi = {l0, …, lj, …, lk}, where j is a natural number and lj is a keyword code of the animation material;
sim(Xi, lj) is the similarity value between the calibrated optical label data sequence Xi and the keyword code lj of the animation material.
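Because the similarity formula itself appears only as an image in the original publication, the sketch below assumes one plausible aggregation, summing over frames the best per-keyword similarity, and stubs sim(Xi, lj) as a simple position-wise overlap ratio; both choices are assumptions made for illustration rather than the patent's exact definitions.

```python
def sim(x_i, l_j):
    """Illustrative per-pair similarity: fraction of matching positions
    between a frame data sequence Xi and a keyword code lj."""
    matches = sum(1 for a, b in zip(x_i, l_j) if a == b)
    return matches / max(len(x_i), 1)

def SIM(animation_material, content_array):
    """Assumed aggregation: for each frame sequence Xi, take the best-matching
    keyword code lj in the material mi = {l0, ..., lk}, then sum over frames."""
    return sum(max(sim(x_i, l_j) for l_j in animation_material)
               for x_i in content_array["X"])

def best_animation(library, content_array):
    """Pick the animation material mi in the library M with the highest SIM(mi, C)."""
    return max(library, key=lambda m: SIM(m, content_array))
```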
Compared with the prior art, the present invention has the following beneficial technical effects:
The present invention exploits the large data-transmission capacity of optical labels: the calibrated scene can be digitally read and matched and, configured through the server of the corresponding virtual scene, the background image and the entity objects can be converted into a self-customized virtual reality, which meets the needs of different people for information extraction and comprehension. Within the VR image display space positioned by the optical label, the street view the user desires can be generated according to the user's preferences and habits, thereby improving the user's understanding and acceptance of the optical label and its surroundings, making it easier for the user to understand and use the content of the service carried by the optical label, or realizing virtualized access to the real environment based on the optical label. The functions and effects achieved are beyond the reach of other existing customized reality methods: by implementing this method of customizing a reality scene based on optical labels, the optical label scene image observed by each person is different and more personalized, and the method is simple, efficient, accurate, fast, highly extensible, and highly accessible.
Brief description of the drawings
Fig. 1 shows a sample optical label according to an example of the present invention.
Fig. 2 is a schematic diagram of a scene of the customized reality method based on optical labels according to an example of the present invention.
Fig. 3a shows an image before customization according to an example of the present invention.
Fig. 3b shows an image after customization according to an example of the present invention.
Fig. 4 is a flowchart of the customized reality method based on optical labels according to an example of the present invention.
In the figures: 1 is the optical label identification device, 2 is the real scene where the optical label is located, 3 is the calibrated optical label, 4 is the customized reality processing server, 5 is the reality scene after customization, and 6 is the object content inserted into the scene.
Detailed description of the embodiments
The present invention is described in further detail below with reference to specific embodiments, which illustrate the invention and do not limit it.
A sample optical label is shown in Fig. 1. The optical label consists of two parts: a signal cell group (also called the "data bits") and positioning identifiers (also called the "flag bits"). The positioning identifiers are the three larger rectangular frames in Fig. 1 (these three rectangular frames are called "a group of positioning identifiers"). In the working state the positioning identifiers flash synchronously at a certain frequency, so they can be quickly obtained and detected by a camera device through image differencing; the positions of the signal cells can then be determined from the positioning identifiers for data identification and reading. The signal cells are the black-and-white rectangles between the positioning identifiers; multiple signal cells form a group, typically arranged as a 5 × 5 array (though not limited to this). Each signal cell represents a "0" or a "1" of the digital signal, and the matrix formed by the whole signal cell group constitutes the signal data sequence of one frame (here the side length of a positioning identifier is twice the side length of a data bit, which makes positioning easier). To increase the data space the signal cells can express, each signal cell can also flash according to a predetermined program in the working state, so that more signal content is displayed over multiple frames. In that case a start/end identification frame must be provided among the frames to mark the beginning/end of one complete multi-frame cycle; this frame is set to a special data combination, for example all 0s, all 1s, or any specific combination different from the information that may actually be expressed.
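As an illustration of the start/end identification frame convention just described, the following sketch (assuming the all-zeros or all-ones marker variant and hypothetical helper names) scans a list of decoded frame bit sequences and returns the frames of one complete broadcast cycle.

```python
def is_marker_frame(bits):
    """Treat a frame whose cells are all 0 or all 1 as the start/end
    identification frame (one of the special combinations mentioned above)."""
    return all(b == 0 for b in bits) or all(b == 1 for b in bits)

def extract_one_cycle(decoded_frames):
    """Return the data frames between two successive marker frames,
    i.e. one complete broadcast cycle of the label."""
    markers = [i for i, bits in enumerate(decoded_frames) if is_marker_frame(bits)]
    if len(markers) < 2:
        return []                      # no complete cycle captured yet
    start, end = markers[0], markers[1]
    return decoded_frames[start + 1:end]
```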
As shown in Fig. 4, the flow of the above customized reality method based on optical labels is as follows:
Step 1: scan for optical labels and find the calibrated optical label in the target scene. The scanning method is as follows: the optical label identification device 1 continuously captures multiple frames of label images, denoted f0, f1, …, fr, where the capture interval between adjacent frames is no less than the time interval between two flashes of the dynamic positioning identifier; any two adjacent frames are subtracted to obtain a difference image, and the positioning identifier of the calibrated optical label 3 is found in the difference image; all image regions other than the position identified as the optical label image are regarded as the background image, denoted B; the coordinate position of the calibrated optical label 3 in the whole image is then obtained.
Step 2: interpret the content of the calibrated optical label 3 by decoding and analyzing, frame by frame, the data it displays, to obtain the optical label content array. The frame-by-frame decoding and analysis is as follows: the capture frequency of the optical label identification device 1 is raised to the optical label data acquisition state, i.e. the capture interval between frames equals the reciprocal of the flicker frequency of the optical label; the calibrated optical label 3 broadcasts its content cyclically with period r. Suppose the current time is t1; the r frames of label images of one whole cycle are captured, denoted f0', f1', …, fr'; the signal cell group is obtained using the positioning identifier obtained above; the signal cell group of any frame fi' (0 ≤ i ≤ r) is divided into cells and the digital signal value of each cell is read, giving the calibrated optical label data sequence of that frame, denoted Xi. Proceeding in this way through the data displayed by f0', f1', …, fr', the complete broadcast content of the calibrated optical label 3 is assembled as C = {P, X0, …, Xi, …, Xr}, where P is the coordinate position of the calibrated optical label and Xi is the calibrated optical label data sequence of the i-th frame; the coordinate position P was measured when the calibrated optical label 3 was installed and preset into the calibrated optical label 3.
Step 3: carry out the reality customization according to the content of the calibrated optical label. The customization methods include the following.
Method one: redraw the street view. First, based on the coordinate position P, locate the corresponding geographical location of the calibrated optical label 3 in software providing a street view service (the street view software here may be Google or Baidu), and search the street view image database by geographical location to find the street view S where the calibrated optical label 3 is located. The street view is then redrawn according to the user's preferences; the redrawing methods include the following (a minimal sketch follows this list):
Edge processing: obtain the edges of the street view S using any image-processing method, and weaken or strengthen the edges;
Color rendering: render the street view S with various colors;
Lighting effect: add lighting effects to the street view S;
The redrawn street view S' then replaces the image B.
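Below is a minimal sketch of the three redraw operations using OpenCV and NumPy; the particular edge filter, color tint, and brightness gradient are illustrative choices made for the example, not the specific algorithms claimed by the patent.

```python
import cv2
import numpy as np

def redraw_street_view(street_view_bgr):
    """Apply edge processing, color rendering, and a lighting effect to the
    matched street view S, producing the replacement image S'."""
    # Edge processing: strengthen edges by blending a Canny edge map back in.
    gray = cv2.cvtColor(street_view_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)
    edges_bgr = cv2.cvtColor(edges, cv2.COLOR_GRAY2BGR)
    out = cv2.addWeighted(street_view_bgr, 0.8, edges_bgr, 0.2, 0)

    # Color rendering: apply a warm tint by scaling the B, G, R channels.
    tint = np.array([0.9, 1.0, 1.1])
    out = np.clip(out.astype(np.float32) * tint, 0, 255).astype(np.uint8)

    # Lighting effect: a simple vertical brightness gradient.
    h, w = out.shape[:2]
    light = np.linspace(1.1, 0.9, h).reshape(h, 1, 1)
    out = np.clip(out.astype(np.float32) * light, 0, 255).astype(np.uint8)
    return out
```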
Method two: render the calibrated optical label 3. First interpret the content C and perform animation matching in the animation material library according to the content to obtain the best animation. Here the animation material library is denoted M, M = {m0, …, mi, …}, where mi is any animation material; mi = {l0, …, lj, …, lk}, where j is a natural number and lj is a keyword code of the animation material, i.e. a code translated from the content keywords given by the designer when the animation was created. The animation matching similarity SIM(M, C) is calculated from the per-frame similarity values sim(Xi, lj), where sim(Xi, lj) is the similarity value between the calibrated optical label data sequence Xi and the keyword code lj of the animation material. The calculation yields the animation in the animation material library M most similar to the content array C of the calibrated optical label, and thus the corresponding animated image L for the calibrated optical label 3.
Step 4: add the corresponding animated image L onto B according to the position coordinates of the background image and the calibrated optical label; the reality customization is then complete. As shown in Fig. 3, the real street view is replaced with a cartoon environment, which a child user accepts more easily.
Within the VR image display space positioned by the optical label, the present invention can generate, according to the user's preferences and habits, the street view the user desires, thereby improving the user's understanding and acceptance of the optical label and its surroundings, making it easier for the user to understand and use the content of the service carried by the optical label, or to realize virtualized access to the real environment based on the optical label. This problem is not addressed by methods based on other current labels (such as two-dimensional codes or barcodes).
The following requirements can be realized in practical applications.
Embodiment 1
User U roams a shopping street via optical label technology. To make optical labels stand out in the VR device used (e.g. Google Glass), the optical labels need reality enhancement, so this method is invoked. First, user U captures the image in front of them with an image capture device; an optical label is found in front by optical label scanning and identified as belonging to a fast-food restaurant. After identification, this method analyzes the content displayed by the optical label frame by frame and finds that the content transmitted by the optical label advertises a set meal newly launched by the fast-food restaurant, including a hamburger, French fries, cola, ice cream, and so on. By matching in the animation material library, an animated video of a cartoon character enjoying the set meal is found; the video is then inserted into the user's display video, so the user immediately understands the content pushed by the optical label and happily goes to make a purchase.
Embodiment 2
User U roams a shopping street via optical label technology. U is a fan of battle games, and to give user U the feeling of being inside a game in real life, the street view is customized using this method. First, user U captures the image in front of them with an image capture device; an optical label is found in front by optical label scanning, marking a parking lot. After identification, this method analyzes the content displayed by the optical label frame by frame and finds that the content transmitted by the optical label is the internal structure diagram of the parking lot. This method processes the internal structure diagram into a reasonable CS-style game map; user U then observes through VR glasses, finds that the surroundings are a game scene, and can play a battle game with companions while walking forward, fully experiencing the thrill of combat within reality.

Claims (5)

1. A customized reality method based on optical labels, characterized in that it comprises the following steps:
Step 1: scan the scene to be customized and locate the calibrated optical label in the scene from the difference between any two adjacent frames; specifically, continuously capture multiple frames of the scene to be customized, with the capture interval between adjacent frames no less than the time interval between two flashes of the dynamic positioning identifier on the calibrated optical label; subtract any two adjacent frames to obtain a difference image, and find the positioning identifier of the calibrated optical label in the difference image;
Step 2: parse the content transmitted by the calibrated optical label, and determine the extent of the background image, the entity objects contained in the background image, and the coordinate position of the calibrated optical label within the background image;
Step 3: according to the coordinate position of the calibrated optical label in the background image, redraw and/or replace with animation the background image and the entity objects respectively, and composite the results to obtain the customized reality scene;
in step 3, replacing the background image and the entity objects with animation respectively comprises the following steps:
according to the annotation definitions of the background image and the entity objects, perform similarity matching in an animation material library to obtain the animation model with the highest similarity, and replace the background image and the entity objects with it; composite the replaced background image and entity objects according to the coordinate position of the calibrated optical label in the background image;
the similarity matching is carried out according to the following formula,
wherein SIM(mi, C) is the animation matching similarity, the content array of the calibrated optical label is C = {P, X0, …, Xi, …, Xr}, P is the coordinate position of the calibrated optical label, and Xi is the calibrated optical label data sequence of the i-th frame;
the animation material library is M = {m0, …, mi, …}, the animation material mi = {l0, …, lj, …, lk}, j is a natural number, and lj is a keyword code of the animation material;
sim(Xi, lj) is the similarity value between the calibrated optical label data sequence Xi and the keyword code lj of the animation material.
2. The customized reality method based on optical labels according to claim 1, characterized in that the specific steps of step 2 are as follows:
decode and analyze, frame by frame, the data displayed by the calibrated optical label to obtain the calibrated optical label content sequence, and obtain the extent of the background image, the entity objects contained in the background image, and the coordinate position of the calibrated optical label within the background image.
3. The customized reality method based on optical labels according to claim 2, characterized in that decoding and analyzing, frame by frame, the data displayed by the calibrated optical label comprises the following steps:
adjust the capture interval between frames to equal the reciprocal of the flicker frequency of the calibrated optical label; capture the frame images of one complete cycle of the calibrated optical label; obtain the signal cell group from the positioning identifier found earlier; read the digital signal value of each cell in the signal cell group to obtain the calibrated optical label data sequence of the corresponding frame; arrange all the data sequences in frame order to obtain the content array of all frame images within one cycle; the content array also contains the coordinate position of the calibrated optical label.
4. The customized reality method based on optical labels according to claim 1, characterized in that the coordinate position of the optical label is measured when the optical label is installed and preset into the content transmitted by the calibrated optical label.
5. The customized reality method based on optical labels according to claim 1, characterized in that, in step 3, redrawing the background image and the entity objects respectively comprises the following steps:
according to the coordinate position of the calibrated optical label in the background image, match the background image's location in software that provides real-scene imagery; after obtaining the matching images of the background image and the entity objects, apply edge processing, color rendering, and lighting effects respectively to obtain the redrawn background image and entity objects.
CN201610789865.XA 2016-08-30 2016-08-30 Customized reality method based on optical labels Active CN106408667B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610789865.XA CN106408667B (en) 2016-08-30 2016-08-30 Customized reality method based on optical labels

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610789865.XA CN106408667B (en) 2016-08-30 2016-08-30 Customized reality method based on optical labels

Publications (2)

Publication Number Publication Date
CN106408667A CN106408667A (en) 2017-02-15
CN106408667B true CN106408667B (en) 2019-03-05

Family

ID=58000732

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610789865.XA Active CN106408667B (en) 2016-08-30 2016-08-30 Customized reality method based on optical labels

Country Status (1)

Country Link
CN (1) CN106408667B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106980847B (en) * 2017-05-05 2023-09-01 武汉虚世科技有限公司 AR game and activity method and system based on ARMark generation and sharing
CN107885838B (en) * 2017-11-09 2021-12-21 陕西外号信息技术有限公司 Optical label fault detection and positioning method and system based on user data
CN107886017B (en) * 2017-11-09 2021-02-19 陕西外号信息技术有限公司 Method and device for reading optical label sequence
CN110471580B (en) * 2018-05-09 2021-06-15 北京外号信息技术有限公司 Information equipment interaction method and system based on optical labels
CN112561953A (en) * 2019-09-26 2021-03-26 北京外号信息技术有限公司 Method and system for target recognition and tracking in real scenes
CN112561952A (en) * 2019-09-26 2021-03-26 北京外号信息技术有限公司 Method and system for setting renderable virtual objects for a target
TWI785332B (en) * 2020-05-14 2022-12-01 光時代科技有限公司 Three-dimensional reconstruction system based on optical label

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103049729A (en) * 2012-12-30 2013-04-17 成都理想境界科技有限公司 Method, system and terminal for augmenting reality based on two-dimension code
CN103049728A (en) * 2012-12-30 2013-04-17 成都理想境界科技有限公司 Method, system and terminal for augmenting reality based on two-dimension code
CN103366610A (en) * 2013-07-03 2013-10-23 熊剑明 Augmented-reality-based three-dimensional interactive learning system and method
CN103530594A (en) * 2013-11-05 2014-01-22 深圳市幻实科技有限公司 Method, system and terminal for providing augmented reality
CN104436634A (en) * 2014-11-19 2015-03-25 重庆邮电大学 Real person shooting game system adopting immersion type virtual reality technology and implementation method of real person shooting game system
CN104778654A (en) * 2015-03-10 2015-07-15 湖北大学 Intangible cultural heritage digital display system and method thereof
CN105718840A (en) * 2016-01-27 2016-06-29 西安小光子网络科技有限公司 Optical label based information interaction system and method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100542370B1 (en) * 2004-07-30 2006-01-11 한양대학교 산학협력단 Vision-based augmented reality system using invisible marker
US20120320216A1 (en) * 2011-06-14 2012-12-20 Disney Enterprises, Inc. Method and System for Object Recognition, Authentication, and Tracking with Infrared Distortion Caused by Objects for Augmented Reality

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103049729A (en) * 2012-12-30 2013-04-17 成都理想境界科技有限公司 Method, system and terminal for augmenting reality based on two-dimension code
CN103049728A (en) * 2012-12-30 2013-04-17 成都理想境界科技有限公司 Method, system and terminal for augmenting reality based on two-dimension code
CN103366610A (en) * 2013-07-03 2013-10-23 熊剑明 Augmented-reality-based three-dimensional interactive learning system and method
CN103530594A (en) * 2013-11-05 2014-01-22 深圳市幻实科技有限公司 Method, system and terminal for providing augmented reality
CN104436634A (en) * 2014-11-19 2015-03-25 重庆邮电大学 Real person shooting game system adopting immersion type virtual reality technology and implementation method of real person shooting game system
CN104778654A (en) * 2015-03-10 2015-07-15 湖北大学 Intangible cultural heritage digital display system and method thereof
CN105718840A (en) * 2016-01-27 2016-06-29 西安小光子网络科技有限公司 Optical label based information interaction system and method

Also Published As

Publication number Publication date
CN106408667A (en) 2017-02-15

Similar Documents

Publication Publication Date Title
CN106408667B (en) Customized reality method based on optical labels
KR102555443B1 (en) Matching content to a spatial 3d environment
CN103415849B (en) For marking the Computerized method and equipment of at least one feature of view image
Grasset et al. Image-driven view management for augmented reality browsers
KR102118000B1 (en) Target object display method and device
JP4032776B2 (en) Mixed reality display apparatus and method, storage medium, and computer program
CN110019600B (en) Map processing method, map processing device and storage medium
Simonetti Ibañez et al. Vuforia v1.5 SDK: Analysis and evaluation of capabilities
US8493380B2 (en) Method and system for constructing virtual space
CN111638796A (en) Virtual object display method and device, computer equipment and storage medium
US20080074424A1 (en) Digitally-augmented reality video system
CN103530594A (en) Method, system and terminal for providing augmented reality
AU2013273829A1 (en) Time constrained augmented reality
JP2015001875A (en) Image processing apparatus, image processing method, program, print medium, and print-media set
KR20150104167A (en) Personal information communicator
Guedes et al. Enhancing interaction and accessibility in museums and exhibitions with augmented reality and screen readers
CN106130886A (en) The methods of exhibiting of extension information and device
KR102262521B1 (en) Integrated rendering method for various extended reality modes and device having thereof
CN104837065B (en) Two-dimensional barcode information sharing method and system between television terminal and mobile terminal
Fan et al. HiFi: hi de and fi nd digital content associated with physical objects via coded light
CN109302523A (en) A kind of mobile phone games stage division
KR20190028046A (en) Historical studies method and system using augmented reality
KR102175519B1 (en) Apparatus for providing virtual contents to augment usability of real object and method using the same
CN113709584A (en) Video dividing method, device, server, terminal and storage medium
KR102443049B1 (en) Electric apparatus and operation method thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20190408

Address after: Room 801, No. 2 Building, 15 Ronghua South Road, Beijing Economic and Technological Development Zone, 100176

Patentee after: Beijing Foreign Information Technology Co., Ltd.

Address before: 710075 Room 301, Block A, Innovation Information Building, Xi'an Software Park, No. 2 Science and Technology Road, Xi'an High-tech Zone, Shaanxi Province

Patentee before: XI'AN SMALL PHOTON NETWORK TECHNOLOGY CO., LTD.