CN110955326B - Information data communication system and method thereof - Google Patents


Info

Publication number
CN110955326B
CN110955326B CN201811125155.2A
Authority
CN
China
Prior art keywords
information data
user
content
file
feedback
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811125155.2A
Other languages
Chinese (zh)
Other versions
CN110955326A (en)
Inventor
陈铭毅
张益嘉
汤一雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yichi Jingcaizitong Co ltd
Zhang Yijia
Original Assignee
Yichi Jingcaizitong Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yichi Jingcaizitong Co ltd filed Critical Yichi Jingcaizitong Co ltd
Priority to CN202310472854.9A priority Critical patent/CN116560502A/en
Priority to CN201811125155.2A priority patent/CN110955326B/en
Publication of CN110955326A publication Critical patent/CN110955326A/en
Application granted granted Critical
Publication of CN110955326B publication Critical patent/CN110955326B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

One embodiment of the present disclosure provides an information data communication system, the communication system comprising: a processing device; a content transformation (content metamorphosis) device; and a terminal device. When the terminal device detects that a user has entered the range of the communication system, the communication system is configured to: acquire a plurality of files from an external network according to the subscribed information data and the biological characteristics of the user; add a plurality of content tags to each of the plurality of files; compose a script for the information data according to those content tags of each file that are related to the information data, wherein the script comprises a plurality of feedback tags and each feedback tag comprises a plurality of corresponding operations and a plurality of user feedback levels; instruct the terminal device, according to the script, to present at least one of the files to the user; and instruct the terminal device to sense the user's feedback and, in response to that feedback, perform at least one of the corresponding operations and/or determine at least one of the user feedback levels.

Description

Information data communication system and method thereof
Technical Field
The present disclosure relates generally to a communication system and method, and more particularly to a communication system and method that communicate specific information content in conjunction with a plurality of devices and use feedback received from a user to guide the user toward receiving that information content.
Background
Electronic display devices (including televisions, computer displays, electronic billboards, mobile device screens, etc.) and audio playback devices are widely used in everyday life to communicate information. Electronic display devices and audio playback apparatus serving as billboards appear in workplaces, homes and residences, commercial facilities, and outdoor locations, including large signs, stadiums, and public areas.
Electronic display devices and audio players used as billboards today usually display fixed content or loop through previously stored content in a set sequence. They cannot change the displayed content according to a user's feedback, and cannot combine with other devices (such as a 3D projector device, a 3D printing device, a projector device, an audio playing device, an audio receiving device, a biosensor device, a camera device, a light control device, an environmental parameter sensor device, an odor control device, and an environmental control device) to help the user receive information more easily.
Accordingly, there is a need for a communication system and method that can cooperate with a variety of devices and that, based on the user's feedback, lets the user more easily receive a particular item of information data content.
Disclosure of Invention
One embodiment of the present disclosure provides an information data communication system, the communication system comprising: a processing device; a content transformation (content metamorphosis) device; and a terminal device. When the terminal device detects that a user has entered the range of the communication system, the communication system is configured to: acquire a plurality of files from an external network according to the subscribed information data and the biological characteristics of the user; add a plurality of content tags to each of the plurality of files; compose a script for the information data according to those content tags of each file that are related to the information data, wherein the script comprises a plurality of feedback tags and each feedback tag comprises a plurality of corresponding operations and a plurality of user feedback levels; instruct the terminal device, according to the script, to present at least one of the files to the user; and instruct the terminal device to sense the user's feedback and, in response to that feedback, perform at least one of the corresponding operations and/or determine at least one of the user feedback levels.
Another embodiment of the present disclosure provides a method of communicating information data content, the method comprising: acquiring a plurality of files from an external network according to the subscribed information data and the biological characteristics of a user; adding a plurality of content tags, associated with the information data, to each of the plurality of files; composing a script for the information data according to the content tags of each file, the script comprising a plurality of feedback tags, each feedback tag comprising a plurality of corresponding operations and a plurality of user feedback levels; instructing a terminal device, according to the script, to present at least one of the files to the user; and instructing the terminal device to sense the user's feedback and, according to that feedback, perform at least one of the corresponding operations and determine at least one of the user feedback levels.
Yet another embodiment of the present disclosure provides an information data communication system, the communication system comprising: a processing device; a content transformation device; and a terminal device. The processing device is configured to: acquire a plurality of files from an external network according to the subscribed information data; add a plurality of content tags, associated with the information data, to each of the plurality of files; compose a script for the information data according to the content tags of each file; and transmit the plurality of files, the content tags, and the script to the content transformation device. The content transformation device is configured to: convert the file among the plurality of files that is associated with the terminal device into a first transition file corresponding to the terminal device; label the first transition file with a corresponding first atmosphere tag; and transmit the first transition file and the first atmosphere tag to the terminal device to instruct the terminal device to present the transition file to a user.
Drawings
The following detailed description of the present disclosure may be better understood when read in conjunction with the accompanying drawings. It should be noted that the features in the drawings may not be drawn to scale according to standard practice within the industry. In fact, the dimensions of the features in the drawings may be arbitrarily increased or reduced to improve the understanding of the present disclosure.
Fig. 1 is a schematic diagram of a communication system according to some embodiments of the present disclosure.
Fig. 2 is a schematic diagram of an information data communication system according to some embodiments of the present disclosure.
Fig. 3 is a schematic diagram of an architectural model of an information data communication system according to some embodiments of the present disclosure.
Fig. 4 is a schematic diagram of an information data communication system according to some embodiments of the present disclosure.
Fig. 5 is a diagram of a network topology of an information data communication system according to some embodiments of the present disclosure.
Fig. 6 is a diagram of a message packet according to some embodiments of the present disclosure.
FIG. 7 is a schematic diagram of a data record and content tag according to some embodiments of the present disclosure.
FIG. 8 is a schematic diagram of an electronic file according to some embodiments of the present disclosure.
Fig. 9 is a schematic diagram of a network topology according to some embodiments of the present disclosure.
Fig. 10 is a schematic diagram of a network topology according to some embodiments of the present disclosure.
Fig. 11 is a schematic diagram of a network topology according to some embodiments of the present disclosure.
Fig. 12 is a schematic diagram of a feedback tag according to some embodiments of the present disclosure.
Fig. 13 is an algorithm state diagram according to some embodiments of the present disclosure.
Fig. 14 is an algorithm state diagram according to some embodiments of the present disclosure.
Fig. 15 is an algorithm state diagram according to some embodiments of the present disclosure.
Fig. 16 is a schematic diagram of an application of an information data communication system in accordance with certain embodiments of the present disclosure.
Fig. 17 is a schematic diagram of an application of an information data communication system according to some embodiments of the present disclosure.
Fig. 18 is a schematic diagram of an application of an information data communication system in accordance with certain embodiments of the present disclosure.
Fig. 19 is a schematic diagram of a processing device, a content transformation device, or a terminal device in an information data communication system according to some embodiments of the present disclosure.
FIG. 20 is a flow chart of a data record digitizing process according to some embodiments of the present disclosure.
FIG. 21 is a flow chart according to some embodiments of the present disclosure.
Fig. 22 is a state diagram of an information data communication system according to some embodiments of the present disclosure.
FIG. 23 is a data processing flow diagram according to some embodiments of the present disclosure.
FIG. 24 is an atmosphere tag encoding scheme according to some embodiments of the present disclosure.
FIG. 25 is a flow chart of a process for transferring an electronic file according to some embodiments of the present disclosure.
FIG. 26 is a diagram of data flow in different modes of an application layer according to some embodiments of the present disclosure.
Fig. 27 is a schematic diagram of an internet of things according to some embodiments of the present disclosure.
Detailed Description
The same reference symbols are used throughout the drawings, which depict various exemplary embodiments of the disclosure, to indicate identical components for ease of reading.
The present disclosure provides a system and method for communicating information data that can cooperate with a variety of devices and make it easier for a user to receive information based on feedback from the user.
The present disclosure provides an information data communication system that includes a cloud device in the internet of things with high computing capability, which can digitize various information contents into data files of different formats or collect data files of different formats about various information contents from an external or internal network. Feature values, keywords, and the like in such data files may be labeled as content tags. The processed data files, in their different formats, can be transmitted over the internet of things to a gateway with routing capability and then forwarded to a plurality of terminal devices with different functions. An information data communication system includes a processing device, a content transformation (content metamorphosis) device, and one or more terminal devices. The information data communication system may recognize images of living beings (e.g., a user) entering the unit and identify their biological characteristics, which may include skin color, pupil color, height, weight, gender, age, and the like. The information data communication system plays content according to the subscribed information data and the biological characteristics of the user. The information data communication system may also sense feedback from the living being (e.g., the user) to change what is played in the next time segment. If the information data communication system has no playable content, its content transformation device can acquire the corresponding content from other information data communication systems over the network and play it.
Fig. 1 is a schematic diagram of a communication system according to some embodiments of the present disclosure. Information content 101 comprises the derivatives humans have created over the course of their development, such as literature, art, science, and religion; these items are recorded in symbols, characters, sound, pictures, or some combination thereof and passed down to the world. Information content is thus information generated in human life or over the course of human development. FIG. 1 shows various electronic storage devices and formats in wide use today. With the spread of electronic storage devices, various information contents 101 can be converted into electronic signals and stored on recording media such as a magnetic tape 102, a hard disk 103, a solid state disk 104, or an optical disk 105, and the contents of these recording media can be reproduced by various playback devices. In addition, in the internet of things, the contents may be stored in a content cloud (or server) 106 and transmitted via the internet 107 (or mobile communication and wireless broadcasting stations) to a specific playing device, such as a mobile device 108 (e.g., mobile phone, tablet computer), a computer 109, a projection device 110, a video/audio device 111, a television wall 112, and the like.
Fig. 2 is a schematic diagram of an information data communication system according to some embodiments of the present disclosure. FIG. 2 illustrates an information data communication system 200 in which information content 101 may be available from a variety of sources or expressed in a variety of ways, such as data records in various formats covering physical properties, size, pictures, composition, manufacturing process, owner history, multiple languages, and the like. In the information data communication system shown in fig. 2, a processing device 203 (such as a high-computing-power processing cloud or a server) on a network (such as the internet of things) digitizes data records of different formats into electronic files 202 of different formats. In the digitizing process, the various data records 201 are converted into a plurality of electronic files 202 in different formats, such as images, engineering drawings, and voice. The feature values and keywords (e.g., climate, taste, location, element, person name, etc.) in each electronic file 202 are labeled as content tags 204.
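The keyword-labeling step can be sketched minimally in Python; the keyword set and the `tag_file` helper are invented for illustration and are not part of the patent:

```python
# Hypothetical sketch: labeling keywords found in a digitized record as
# content tags. KEYWORDS is an invented example vocabulary.
KEYWORDS = {"climate", "taste", "location", "element"}

def tag_file(text: str) -> list[str]:
    """Return the sorted content tags found in a digitized record's text."""
    words = {w.strip(".,").lower() for w in text.split()}
    return sorted(words & KEYWORDS)

print(tag_file("The location has a humid climate."))  # → ['climate', 'location']
```

A real system would presumably use feature extraction rather than exact word matching, but the output in both cases is a tag list attached to the electronic file.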
The electronic files 202, containing the information data content and the content tags 204, are transmitted over the network to a specific content transformation device 205, which is further connected, by wired or wireless means, to the terminal devices 210 to 221. The terminal devices 210 to 221 include an environment control device 210 (such as a temperature and humidity control device), an odor control device 211 (such as a fragrance or fragrance-diffusing device), an environmental parameter sensing device 212 (such as one sensing temperature and humidity), a light control device 213, an image capturing apparatus 214, a bio-sensing device 215, an audio receiving device 216, an audio playing device 217, a projecting device 218, a display apparatus 219, a 3D printing device 220, a 3D projecting device 221, a human-machine input interface (not shown), a position detecting device (not shown), and the like. The terminal devices 210-221 may be used to present the electronic files 202 in a particular configuration; for example, they may present specific electronic files 202 on a schedule. For example, when a living being 222 (e.g., a user) enters the information data communication system 200, the image capturing device 214 can obtain images (e.g., eye and face images) to identify the living being 222; biological characteristics (e.g., fingerprint, DNA, iris, voiceprint) can be obtained via the bio-sensing device 215; and passwords and user input can be obtained via the human-machine input interface (HID). Other terminal devices may obtain environmental parameters such as GPS coordinates, temperature, humidity, and air quality to further adjust the environment control device 210. Further, the 3D projection device 221 may present a virtual entity, and the 3D printing device 220 may produce a physical instance for the living being 222 to view or touch.
The projecting device 218 and the display apparatus 219 can display multi-screen video and image descriptions of the data. Other controlled devices, such as the audio playing device 217, the light control device 213, the environment control device 210, and the odor control device 211, may further shape the feeling of being in the scene. The living being 222 may also provide feedback through the terminal devices; for example, the image capturing device 214 and the audio receiving device 216 may capture feedback in the form of gestures, body postures, and voice. Through the content tags 204 in the electronic files 202, a terminal device can retrieve related electronic files 202 for playing. If the terminal device cannot retrieve the required electronic file 202 from the internal network, the content transformation device 205 can obtain the required electronic file 202 from the processing device 203 via the network for playing.
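The local-first retrieval with network fallback described above might look like the following minimal sketch; `local_files`, `remote_files`, and the caching behavior of `retrieve` are illustrative assumptions, not the patent's protocol:

```python
# Hypothetical tag-to-file lookup: try the internal network first, then ask
# the content transformation device to fetch from the processing device.
local_files = {"tea": "tea_intro.mp4"}                       # files cached locally
remote_files = {"tea": "tea_intro.mp4", "pottery": "pottery_3d.stl"}  # processing device

def retrieve(tag: str):
    """Return (file, origin) for a content tag, caching network fetches."""
    if tag in local_files:
        return local_files[tag], "local"
    if tag in remote_files:
        local_files[tag] = remote_files[tag]  # cache for future playback
        return remote_files[tag], "network"
    return None, "missing"

print(retrieve("pottery"))  # → ('pottery_3d.stl', 'network')
```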
Fig. 3 discloses an architecture model of the information data communication system of the present disclosure. Table 1 provides a further description of an architectural model of the information data communication system of the present disclosure.
As shown in fig. 3, the architecture model of the information data communication system of the present disclosure is an 8-layer architecture comprising an information data layer 301, a digital layer 302, an editing layer 303, an application layer 304, a session layer 305, a network layer 306, an architecture layer 307, and a physical layer 308; the connections between the layers are shown in fig. 3. The physical layer 308, the architecture layer 307, and the network layer 306 correspond, respectively, to the hardware device architecture and the network topology of the processing device 203, the content transformation device 205, and the terminal devices 210 to 221 in the information data communication system 200.
TABLE 1
The information data layer 301, the digital layer 302, and the editing layer 303 illustrate how the various data records 201 are digitized into electronic files 202 in a plurality of data formats. Content tags 204 are added to the electronic files 202 for indexing and for subsequent retrieval, analysis, and application. The collected data records and/or electronic files are organized into a script by means of the content tags 204. The script provides information, such as summaries and schemas, about the electronic files 202 associated with a given item of information data, so that the files can be used more correctly and efficiently when transferred across a network.
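As a hedged sketch of how content tags could organize electronic files into a script carrying feedback tags, using assumed names (`FeedbackTag`, `ScriptEntry`, `build_script`) that do not come from the patent:

```python
# Illustrative data structures for a script assembled from content tags.
from dataclasses import dataclass

@dataclass
class FeedbackTag:
    trigger: str            # sensed feedback, e.g. "smile" or "walk_away"
    operations: list[str]   # corresponding operations to perform
    feedback_level: int     # user feedback level assigned on this trigger

@dataclass
class ScriptEntry:
    file_id: str                      # electronic file to present
    content_tags: list[str]           # tags tying the file to the information data
    feedback_tags: list[FeedbackTag]  # possible reactions and responses

def build_script(files: dict, topic: str) -> list[ScriptEntry]:
    """Collect, in order, the files whose content tags mention the topic."""
    return [ScriptEntry(fid, tags, []) for fid, tags in files.items() if topic in tags]

files = {
    "vase.jpg": ["ceramics", "ming-dynasty", "blue"],
    "song.mp3": ["folk", "ceramics"],
    "doc.txt": ["weather"],
}
script = build_script(files, "ceramics")
print([entry.file_id for entry in script])  # → ['vase.jpg', 'song.mp3']
```

The real script would also encode a playback flow and per-entry feedback tags; the sketch only shows the tag-driven selection step.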
The session layer 305 and the application layer 304 illustrate the process of communicating information data and the scene modes of applications in an information data communication system.
As shown in fig. 3, the 8-layer model includes a physical layer 308 corresponding to the various hardware devices that implement the information data communication system in the network. The hardware devices may be divided into processing devices, content transformation devices, and terminal devices, all of which have computing, memory, sensing, control, and networking capabilities.
Table 2 provides a functional comparison of the terminal device, the processing device, the content transformation device.
TABLE 2
As shown in Table 2, the functions of the terminal device, the processing device, and the content transformation device differ. For example, the processing device has a high-performance computing processor and high-capacity memory and is responsible for digitizing the various data records 201 into electronic files 202 of various formats, marking keywords in the digitized electronic files as content tags 204, and organizing a script according to an item of information data and the content tags 204 in the electronic files 202, in order to manage, store, delete, and spread the electronic files 202 of that information data.
The content transformation device can transform the script from the processing device into an electronic file format that the different terminal devices can play. An electronic file converted for playback on a specific terminal device is called a transition file. The content tag of the electronic file is likewise converted into a format recognizable by the terminal device, called an atmosphere tag. The content transformation device can generate and control the flows of the different terminal devices according to the playback flow in the script; such a flow is one method of communicating information data. The content transformation device can adjust the information data content being communicated in response to the user's feedback, guiding the user to receive the specific information data content to be communicated. The content transformation device has heterogeneous networking capabilities: long-distance networking capabilities (such as GSM, Wi-Fi, Cat-M1, and NB-IoT) for connecting with the processing device; short-distance communication capabilities (wireless, including Wi-Fi, Bluetooth, ZigBee, RFID, RF, IR, ultrasonic, optical carrier, and the like, or wired, including LAN, RS485, RS232, I2C, SPI, GPIO, and the like) for connecting with a plurality of terminal devices; and a computing processor for managing the applications in an information data communication system and the method of communicating information data. The terminal device has computing, memory, and networking capabilities, as well as control capabilities for presenting electronic files and sensing capabilities for sensing environmental parameters and biological characteristics.
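The conversion of content tags into device-specific atmosphere tags might be sketched as below; the capability table and the tag vocabulary are invented for illustration and are not the patent's actual encoding:

```python
# Hypothetical per-device capability table mapping generic content tags to
# the vocabulary each terminal device type understands.
CAPABILITY_MAP = {
    "light_control": {"sunset": "warm_dim", "noon": "bright_white"},
    "odor_control": {"forest": "pine_scent", "ocean": "salt_scent"},
}

def to_atmosphere_tags(content_tags: list[str], device_type: str) -> list[str]:
    """Keep only the tags this device type understands, renamed to its vocabulary."""
    table = CAPABILITY_MAP.get(device_type, {})
    return [table[t] for t in content_tags if t in table]

print(to_atmosphere_tags(["sunset", "forest"], "light_control"))  # → ['warm_dim']
print(to_atmosphere_tags(["sunset", "forest"], "odor_control"))   # → ['pine_scent']
```

The point of the mapping is that a single content tag fans out differently per device: a "forest" scene becomes a scent for the odor controller but is ignored by the light controller.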
The terminal device's capability to sense environmental parameters and biological presence and characteristics may be implemented by input devices (e.g., touch pad, infrared detection, audio receiving device, image capturing device, HMI human-machine interface), environmental parameter detecting devices (e.g., detecting brightness, UV light, humidity, temperature), or bio-sensing devices (e.g., identifying physiological characteristics such as fingerprint, voiceprint, retina, DNA, and face).
The control capability shown in Table 2 covers controlling the odor control device, electronic paper, display apparatus, audio playing device, light control device, 3D printing device, 3D laser developing device, temperature and humidity control device, and the like to present electronic files. Several terminal devices may be deployed in a predetermined area, such as a building, machine, vehicle, or cell, to form an interaction area. The terminal devices detect environmental parameters (such as light, temperature, humidity, and fragrance) and further control other terminal devices to present the visual, auditory, tactile, and olfactory content in the electronic file to living beings (such as humans, animals, and plants) in the predetermined area.
As shown in fig. 3, the 8-layer model includes an architecture layer 307, which describes how the processing device, the content transformation device, and the terminal device are assembled into an information data communication system.
Fig. 4 is a schematic diagram of an information data communication system according to some embodiments of the present disclosure. As shown in fig. 4, the data records 201 are digitized and processed by the processing device 401 into electronic files 202 containing content tags 204 and a script. The electronic files 202, including the content tags 204 and the script, are transmitted from the processing device 401 to the content transformation device 402 via a network (e.g., the internet of things). The content transformation device 402 is connected to the terminal devices 403-1 to 403-n, where the terminal devices 403-1 to 403-n comprise sensing devices or control devices. The electronic files 202 have different formats, each playable on a particular terminal device. Fig. 4 illustrates that the basic components of an information data communication system may include a processing device 401, a content transformation device 402, and a plurality of terminal devices 403-1 through 403-n.
A content transformation device 402 and a plurality of terminal devices 403-1 to 403-n may be configured as a communication unit 410 in a building, a moving vehicle, or any predetermined area. The communication unit 410, together with environmental parameters 421 and living beings (e.g., humans or animals) 422, forms an interaction space 420. The processing device 401 may obtain various environmental parameters and biometric data via the sensing devices among the terminal devices 403-1 to 403-n. The information data communication system presents different electronic files in response to changes in the environment and to different living beings. For example, the information data communication system can determine which electronic file to present, and how to present it, according to the height, sex, or age of the living being 422 and the different environmental parameters.
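A minimal sketch of choosing the presented electronic file from one sensed biological characteristic and one environmental parameter; the age and brightness thresholds and the file names are invented for illustration:

```python
# Hypothetical selection rule: pick a file by viewer age and ambient light.
def choose_file(age: int, lux: float) -> str:
    """Pick a presentation file from sensed age and ambient brightness."""
    if age < 12:
        return "cartoon_version.mp4"
    # Bright rooms favor the display apparatus; dim rooms favor the projector.
    return "bright_slides.mp4" if lux > 500 else "projector_video.mp4"

print(choose_file(8, 300))   # → cartoon_version.mp4
print(choose_file(35, 800))  # → bright_slides.mp4
```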
As shown in fig. 3, the 8-layer model includes a network layer 306, which describes the network topology of the devices in the information data communication system. That topology is described with reference to fig. 5.
Fig. 5 discloses an exemplary network topology of an information data communication system of the present disclosure. The topology includes processing devices 401-1 through 401-3, content transformation devices 402-1 through 402-3, terminal devices 403-1 through 403-7, and devices 404-1 and 404-2 outside the system. As shown in fig. 5, the processing devices 401, content transformation devices 402, and terminal devices 403 may be connected to one another, for example by wireless or wired connections, as long as they are within connection range and capability of one another.
As described above with respect to fig. 4, the basic components of an information data communication system include a processing device, a content transformation device, and a plurality of terminal devices. However, each of the processing devices 401-1 to 401-3 may be connected to a plurality of content transformation devices 402, so each may support a plurality of different information data communication systems. For example, processing device 401-1 in FIG. 5 may be connected to content transformation devices 402-1 and 402-2 and form an information data communication system with each of them.
Each of the content transformation devices 402-1 to 402-3 may be connected to a plurality of processing devices 401, and one content transformation device 402 may receive data from several processing devices 401; for example, content transformation device 402-2 in fig. 5 is connected to processing devices 401-1 and 401-2. Different content transformation devices may also be connected to each other to extend the communication distance; for example, content transformation device 402-1 in fig. 5 connects to content transformation devices 402-2 and 402-3.
Each of the terminal devices 403-1 through 403-7 may connect to one or more content transformation devices to join multiple information data communication systems; for example, terminal device 403-2 in fig. 5 may connect to content transformation devices 402-1 and 402-2 simultaneously. Each of the terminal devices 403-1 to 403-7 may also connect to any of the processing devices 401-1 to 401-3, as terminal device 403-7 in fig. 5 connects to processing device 401-3, and may directly serve as a sensing device or a control device of any one of the processing devices 401-1 to 401-3. The terminal devices 403-1 through 403-7 may be interconnected with one another to operate in a harmonic mode (harmony mode); in fig. 5, terminal device 403-5 connects with terminal device 403-6. In some embodiments, each of the processing devices 401-1 through 401-3 may send a script to any of the terminal devices 403-1 through 403-7 via a content transformation device. In some embodiments, content transformation devices 402-1 and 402-2 may send a script to another processing device or another content transformation device. In some embodiments, each of the terminal devices 403-1 through 403-7 may send a script to another terminal device.
Fig. 6 is a schematic diagram of a message packet according to some embodiments of the present disclosure, disclosing an exemplary message packet 600 for use in an information data communication system. The packet 600 includes an access code 610, a header 620, a guard segment 630, a payload 640, and a suffix 650. The access code 610 includes a preamble 611, a synchronization word 612, and a trailer 613.
The header 620 includes six identifiers: identifiers 621 through 623 identify the destination, and identifiers 624 through 626 identify the source. Together, identifiers 621 through 626 identify the hardware that receives the packet and the hardware that sent it. Specifically, the header 620 includes a destination domain identifier 621, a destination content transformation device identifier 622, a destination terminal device identifier 623, a source domain identifier 624, a source content transformation device identifier 625, a source terminal device identifier 626, and a link control message 627. The link control message 627 carries information such as the packet type code, the traffic handle, the acknowledgement indication code for the sequence numbers that order data in packet transmission, and header error checking.
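The packet layout described above can be sketched as a simple data structure. This is a minimal illustration only: the field types and the sample byte values are assumptions, and the disclosure does not specify field widths or encodings.

```python
from dataclasses import dataclass

@dataclass
class Header:
    # Destination identifiers (621-623)
    dst_domain: int           # destination domain identifier 621
    dst_content_device: int   # destination content transformation device identifier 622
    dst_terminal: int         # destination terminal device identifier 623
    # Source identifiers (624-626)
    src_domain: int           # source domain identifier 624
    src_content_device: int   # source content transformation device identifier 625
    src_terminal: int         # source terminal device identifier 626
    link_control: int         # link control message 627 (packet type, sequence, ACK, ...)

@dataclass
class MessagePacket:
    access_code: bytes  # preamble + synchronization word + trailer
    header: Header
    guard: bytes        # guard segment 630
    payload: bytes      # payload 640
    suffix: bytes       # suffix 650

# Example packet from processing-device domain 1 to terminal 3 in area 2.
pkt = MessagePacket(
    access_code=b"\xaa\x55",
    header=Header(1, 2, 3, 1, 2, 7, link_control=0),
    guard=b"\x00",
    payload=b"hello",
    suffix=b"\xff",
)
```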
Each processing device is responsible for domain management of an information data communication system. Each processing device has a network domain identifier.
Each content transformation device is responsible for managing an area under the information data communication system, and each content transformation device has an area identifier.
Each terminal device has a device identifier that, combined with the area identifier of the upper-layer routing device (gateway) to which it is networked, forms a device identifier that is unique within the network domain of the same processing device.
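Combining the three identifier levels can be sketched as packing them into one value; the 8-bit field widths below are an illustrative assumption, not specified by the disclosure.

```python
def full_device_id(domain_id: int, area_id: int, device_id: int) -> int:
    """Combine the processing device's domain identifier, the gateway's
    area identifier, and the terminal's own device identifier into one
    value that is unique within the domain. The 8-bit widths are an
    assumption for illustration only."""
    for v in (domain_id, area_id, device_id):
        if not 0 <= v < 256:
            raise ValueError("identifier out of range")
    return (domain_id << 16) | (area_id << 8) | device_id

print(hex(full_device_id(1, 2, 3)))  # 0x10203
```

Two terminals with the same device identifier in different areas thus still receive distinct combined identifiers within the same domain.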
As shown in fig. 3, the 8-layer model includes an information data layer 301. Information data is a means of presenting human information. Information is a derivative of what humans have created throughout history, such as literature, art, science, religion, politics, and history. These derivatives of human information are stored by means of symbols, words, sounds, physical objects, and so on. One item of information data may be a combination of a plurality of data records. Historically, various recording materials have been used to store human information, and the resulting data records can be broadly classified as follows:
3D: real object, 3D map, model …;
2D: pictures, 2D drawings, photographs …;
video and audio: magnetic tape, audio tape, spoken language, score …;
Engineering: building map, mold map, equation, mathematical formula …;
smell: recipe, process ….
As shown in fig. 3, the 8-layer model includes a digital layer 302. The digital layer digitally processes a data record into an electronic file that includes content tags. The data record 201 may be a film, a picture, text, an engineering drawing, an equation, and so on.
FIG. 7 is a schematic diagram of a data record and content tags according to some embodiments of the present disclosure. As shown in FIG. 7, the data record 201 is digitized into electronic files 202 of different formats, which may include video, audio, images, engineering drawings, and documents. Video formats include AVI, FLV, WMV, MOV, MPEG, etc.; audio formats include MP3, WAV, AAC, BWF, etc.; image formats include GIF, PNG, X3D, JPEG, RAW, etc.; engineering drawing formats include building drawings, structural drawings, mold drawings, mechanical drawings, electrical drawings, piping drawings, procedures, equations, chemical formulas, and the like; and document formats include DOC, PDF, TIFF, INI, RSS, etc.
Each format has a corresponding terminal device that can play it. If a data record is a physical object, it can be digitized into an electronic file 202 after 3D scanning and photographing. Keywords in the electronic file 202 (e.g., specific meanings, features, and values) may be labeled as content tags 204. In some embodiments, a content tag 204 may mark: a time position and pixel position in a video file, a time position in an audio file, a pixel position in an image file, a page and line number in a document, or a pixel position, page and line number in an engineering drawing file, together with the meaning, features, and values at that position. Each content tag may include parameters 704-1 through 704-n, which may be light parameters, environmental parameters, odor parameters, terrain parameters, climate parameters, contextual parameters, tool parameters, or biometric parameters. Parameters 704-1 through 704-n provide a content summary of the electronic file 202 for composing a script.
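A content tag as described above can be sketched as a small record holding a location inside the file, the meaning at that location, and its parameters. The field names and sample values here are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class ContentTag:
    # Location of the keyword inside the electronic file: a time position
    # for video/audio, a pixel position for images, or a page/line number
    # for documents and engineering drawings.
    location: dict
    meaning: str  # the meaning/feature/value marked at that location
    # Parameters 704-1 ... 704-n: light, environmental, odor, terrain,
    # climate, contextual, tool, or biometric parameters.
    parameters: dict = field(default_factory=dict)

# Hypothetical tag marking merchandise visible at 12.5 s in a video file.
tag = ContentTag(
    location={"time_s": 12.5, "pixel": (320, 240)},
    meaning="sports shoes",
    parameters={"light": "warm", "context": "retail"},
)
```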
FIG. 8 is a schematic diagram of an electronic file according to some embodiments of the present disclosure. FIG. 8 discloses that the data record 201 is digitized into p electronic files 202-1 through 202-p, where the electronic file 202-1 has n content tags 204-11 through 204-1n, the electronic file 202-2 has m content tags 204-21 through 204-2m, and the electronic file 202-p has q content tags 204-p1 through 204-pq. The content tags 204-11 through 204-pq may be considered a two-dimensional data structure.
As shown in fig. 3, an editing layer 303 is included in the 8-layer model. After the data record 201 is digitized into the electronic files 202 with the content tags 204, each item of information data needs a script describing its summary, outline, and so on. The summary in the script is a collection of the content tags 204 in the electronic files 202; content tags in different electronic files may share the same keyword attributes. A script may instruct the terminal devices in the communication unit 410 to play at least one electronic file in each time segment; the functions and applications of the script will be described later. One function of the editing layer is to sort the content tags into n-dimensional (e.g., 2-dimensional) data arrays. The script of an item of information data allows the electronic files in that information data to be searched, classified, retrieved, and identified.
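The two-dimensional tag structure of fig. 8 and the search function of the editing layer can be sketched as follows; the tag keywords and the helper function `find_files` are illustrative assumptions, not part of the disclosure.

```python
# Each electronic file carries its own list of content-tag keywords,
# forming the two-dimensional data structure of Fig. 8
# (file index x tag index).
files = [
    ["sunset", "beach", "music"],   # content tags of electronic file 202-1
    ["beach", "volleyball"],        # content tags of electronic file 202-2
    ["music", "concert", "beach"],  # content tags of electronic file 202-p
]

def find_files(tag_array, keyword):
    """Return the indices of files whose content tags contain the keyword,
    i.e. a simple search over the 2-D tag array."""
    return [i for i, tags in enumerate(tag_array) if keyword in tags]

print(find_files(files, "beach"))  # every file tagged "beach"
```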
As shown in fig. 3, a session layer 305 is included in the 8-layer model. The session layer handles interaction management during information data communication. The processing device can set the scheduling of the electronic files in the information data according to the interaction space 420. The terminal devices can detect biological features in the interaction space 420; once a biological feature is recognized, the electronic files in the information data are transmitted and played, and playback stops when the biological feature leaves the interaction space 420. A living being (e.g., a user) can provide feedback through a terminal device (e.g., an input/output device or a sensing device) to control the playing of the electronic files in the communication unit 410. The living being can also input an electronic file from a terminal device, which broadcasts the input electronic file and its content tags to the content transformation device and terminal devices of the communication unit 410 for presentation of the relevant information data.
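The presence-triggered playback behavior of the session layer can be sketched as a tiny decision function; the function name and return values are assumptions for illustration.

```python
def session_step(biometric_detected: bool, playing: bool) -> str:
    """Decide the session-layer action: start playback when a recognized
    biological feature enters the interaction space, stop when it leaves,
    otherwise leave the current state unchanged."""
    if biometric_detected and not playing:
        return "start"
    if not biometric_detected and playing:
        return "stop"
    return "idle"

print(session_step(True, False))  # a user has just entered: "start"
```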
As shown in fig. 3, an application layer 304 is included in the 8-layer model. The application layer initiates an application mode of communication broadcasting for the communication unit 410. The application modes include a collaboration mode, a cooperative mode (synergy mode), and a harmonic mode (harmony mode).
Fig. 9 is a schematic diagram of a network topology according to some embodiments of the present disclosure, disclosing a collaboration mode of an information data communication system. In the collaboration mode, the scheduling and management of the broadcasting of the electronic files 202 on the content transformation device 402 and the terminal devices 403-1 through 403-n in the communication unit 410 are performed by the processing device 401.
Fig. 10 is a schematic diagram of a network topology according to some embodiments of the present disclosure, disclosing a cooperative mode of an information data communication system. In a network, communication may break or the connection quality may degrade (e.g., delays may occur). In the cooperative mode, if the processing device 401 is disconnected, the content transformation device 402 takes over the broadcast management of the electronic files. In addition, as shown in fig. 10, the content transformation device 402 may broadcast an electronic file received from the terminal device 403-1. Furthermore, in some embodiments, if the content transformation device 402 receives an electronic file (or transformation file) input from the terminal device 403-1, it may translate the input into relevant content tags, edit the electronic file (or transformation file) into a selected item of information data in its database, and broadcast it to the other terminal devices (e.g., 403-2, 403-3, etc.) after translating it into transformation files suitable for those devices.
Fig. 11 is a schematic diagram of a network topology according to some embodiments of the present disclosure, illustrating a harmonic mode of an information data communication system. In the harmonic mode, when the content transformation device 402 in the communication unit 410 is disconnected, a terminal device 403-1 that receives an input electronic file can add content tags to it and broadcast it to the other terminal devices (e.g., 403-2, 403-3, etc.). The other terminal devices then present the corresponding electronic files from their own storage space to carry out the harmonic broadcast. After disconnecting from the content transformation device 402, the terminal device 403-1 may transmit and receive signals via an infrared sensor, broadcasting and receiving data in the space where the information data is communicated via a carrier-modulated signal, and may notify the other terminal devices (e.g., 403-2, 403-3, etc.) to enter the local harmonic mode. A microphone and a loudspeaker can likewise serve as transceivers for an ultrasonic signal carrier: the acoustic wave is modulated onto the carrier to transmit data, and the ultrasonic carrier notifies the terminal devices to enter the local harmonic mode. A lamp controller and a photosensitive sensor can serve as carriers of light-wave data; for example, a surround-sound loudspeaker equipped with a microphone, a photoresistor, and other sensing components can receive data on both ultrasonic and optical carriers. Even when wired and wireless networking are disconnected, a terminal device can thus still communicate with other terminal devices through these heterogeneous networking capabilities and carry out the local harmonic mode.
Fig. 12 is a schematic diagram of a feedback tag according to some embodiments of the present disclosure. The present disclosure provides a method for providing feedback to an information data communication system by a living being (e.g., a user).
The sensing devices in the information data communication system can detect the gestures and physiological parameters of a living being (e.g., a user), thereby obtaining the being's intuitive responses. As shown in fig. 12, the user's reactions include facial expressions, pupil changes, gestures, postures, voice, physiology, and so on. In detail, facial expression reactions include happiness, anger, sorrow, joy, etc.; pupil reactions include dilation, constriction, eye closure, etc.; gesture reactions include waving, a thumbs-up, arms crossed over the chest, etc.; posture reactions include standing, walking, head deflection, sitting, etc.; physiological reactions include sweating, heart rate changes, blood pressure changes, etc.; and voice reactions include spoken language, volume changes, commands, etc. These user reactions express the user's intent and provide feedback to the information data communication system.
User feedback in a time segment depends heavily on the electronic file that the communication system presents during that segment. For example, a V sign made with the fingers may represent victory, the letter V, or the number 2. Thus, the same user reaction may express different intentions in different time segments.
Assuming the conveyed party (the user) responds rationally, user feedback may first be tested to establish expected reactions. In the communication of information data, a scenario includes playing a plurality of different electronic files, and each electronic file to be played in each time segment has a feedback tag that includes, for example, a user feedback table. The intention expressed by the user's reaction in a time segment can be determined by querying the user feedback table in that segment's feedback tag. The same reaction may represent different intentions in different situations: for example, in a happy situation a clapping reaction has a user feedback level of +1, but in a sad situation the same clapping reaction has a user feedback level of -1.
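The context-dependent lookup described above can be sketched as a user feedback table keyed by (situation, reaction). The values follow the clapping example in the text; the dict layout and function name are assumptions for illustration.

```python
# User feedback table of a feedback tag: the same reaction maps to
# different user feedback levels in different situations.
feedback_table = {
    ("happy situation", "clapping"): +1,
    ("sad situation", "clapping"): -1,
}

def user_feedback_level(situation, reaction, table, default=0):
    """Look up the feedback level for a reaction in a given situation;
    reactions not listed in the feedback tag fall back to a default."""
    return table.get((situation, reaction), default)

print(user_feedback_level("happy situation", "clapping", feedback_table))  # 1
```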
After an electronic file is played in a time segment, the sensing function of the terminal device 403 obtains the user's reaction, and the processing device 401 compares it against the script of the information data to obtain the user feedback level for that segment. The user feedback levels of all time segments are summed to obtain the user's satisfaction with the information data. In addition, by comparing the user reaction of each time segment against the script, the electronic files to be presented in the next time segment and their playing modes can be changed. Further, in the cooperative mode, the content transformation device 402 may be responsible for comparing the user reactions with the information data script; in the harmonic mode, the terminal device 403 may be responsible for this comparison.
A scenario may instruct the terminal devices in the communication unit 410 to play at least one electronic file in each time segment. The script also includes a plurality of feedback tags, with at least one feedback tag per time segment. Each feedback tag includes a plurality of corresponding operations and a plurality of user feedback levels. Regarding the corresponding operations, for example, a feedback tag may indicate: if the user feedback in this time segment is the facial expression reaction "happy", one corresponding operation applies; if the user feedback is the gesture reaction "arms crossed over the chest", another corresponding operation applies; and if the user feedback is not listed in the feedback tag, yet another corresponding operation applies. A corresponding operation includes changing the files to be played in the next time segment or their playing modes. In some embodiments, the corresponding operations include changing, for the next time segment, the playback time position and playback pixel position of a video file, the playback time position of an audio file, the playback pixel position of an image file, the page and line numbers of a document, or the playback pixel position, page and line numbers of an engineering drawing file.
Regarding the user feedback levels, for example, a feedback tag may indicate: if the user feedback in this time segment is the facial expression reaction "happy", one user feedback level applies (e.g., +10); if the user feedback is the gesture reaction "arms crossed over the chest", another user feedback level applies (e.g., -5); and if the user feedback is not listed in the feedback tag, yet another user feedback level applies (e.g., +0). Each presentation of information data may yield a plurality of user feedback levels. The information data communication system of the present disclosure determines a user feedback level in response to user feedback, continuously updates a user satisfaction based on the user feedback level determined in each time segment, and obtains an end user satisfaction after the electronic files have been played. In some embodiments, the end user satisfaction is obtained by summing the plurality of user feedback levels.
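A feedback tag with its corresponding operations and levels, and the running satisfaction sum, can be sketched as follows. The operation names are hypothetical; the +10/-5/+0 values follow the examples in the text.

```python
# Feedback tag: reaction -> (corresponding operation, user feedback level).
feedback_tag = {
    "happy":     ("keep_schedule", +10),
    "chest_hug": ("switch_file",    -5),
}
DEFAULT = ("keep_schedule", 0)  # reaction not listed in the feedback tag

def handle_feedback(reaction, tag):
    """Return the corresponding operation and user feedback level for a
    reaction observed in one time segment."""
    return tag.get(reaction, DEFAULT)

# Three time segments: the levels are summed into the end user satisfaction.
levels = []
for reaction in ["happy", "unknown", "chest_hug"]:
    operation, level = handle_feedback(reaction, feedback_tag)
    levels.append(level)

satisfaction = sum(levels)
print(satisfaction)  # +10 + 0 - 5 = 5
```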
Depending on the purpose, the propagation of information data may operate in the following four modes: "information data broadcasting", "information data modeling", "interest acquisition", and "information data satisfaction".
"Information data broadcasting" refers to broadcasting the electronic files of specific information data to general or specific users, for example to different user groups such as users of a specific gender, specific age, or specific height, guests, or staff. In the "information data broadcasting" mode, the information data to be transmitted are scheduled according to the biological features of the users (the conveyed parties).
"Information data modeling" refers to playing specific electronic files to the users in a communication unit, creating an atmosphere through the control devices of the terminal devices, obtaining feedback from the users through the sensing devices of the terminal devices, and adjusting the played electronic files in real time so that the users can be immersed in the information data. For example, in the "information data modeling" mode, at least one of the corresponding operations is performed in response to feedback from the user (the conveyed party) so that the user satisfaction is maintained within a predetermined range. In other words, the corresponding operations to be performed are selected with the aim of keeping the user satisfaction within the predetermined range.
"Interest acquisition" refers to playing specific electronic files for a user, acquiring the user's reactions to specific feedback tags and the resulting user feedback levels, and obtaining the user's various interest orientations after statistical analysis. For example, in the "interest acquisition" mode, the user feedback levels determined in response to feedback from the user (the conveyed party) may be communicated to a processing device, content transformation device, or terminal device, which statistically analyzes them to obtain the user's various interest orientations.
"Information data satisfaction" refers to playing specific electronic files for the user, obtaining the user's feedback and user feedback levels, and then adjusting the played electronic files to increase the user satisfaction. For example, in the "information data satisfaction" mode, at least one of the corresponding operations is performed in response to feedback from the user (the conveyed party) to maximize the user satisfaction. In other words, the corresponding operations to be performed are selected with the aim of maximizing the user satisfaction.
In the present disclosure, the four information data propagation modes can be achieved by the same algorithm, with different conditions for the playing of the electronic files and for environmental control. In some embodiments, the four modes may instead be achieved with different algorithms.
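The idea of one algorithm with mode-specific conditions can be sketched as a selection function that picks a corresponding operation differently per mode. The candidate operations, predicted satisfaction deltas, and range bounds are hypothetical values, not from the disclosure.

```python
def choose_operation(candidates, satisfaction, mode, lo=0, hi=20):
    """Pick one corresponding operation from (name, predicted_delta) pairs.
    'shaping' keeps satisfaction within [lo, hi]; 'satisfying' maximizes
    it; other modes (broadcasting, interest acquisition) follow the script."""
    if mode == "shaping":
        # distance of the predicted satisfaction from the predetermined range
        def distance(c):
            s = satisfaction + c[1]
            return 0 if lo <= s <= hi else min(abs(s - lo), abs(s - hi))
        return min(candidates, key=distance)[0]
    if mode == "satisfying":
        return max(candidates, key=lambda c: c[1])[0]
    return candidates[0][0]  # play the scheduled file unchanged

candidates = [("soft_music", 3), ("loud_music", 30)]
print(choose_operation(candidates, 10, "shaping"))     # stays in range
print(choose_operation(candidates, 10, "satisfying"))  # maximizes satisfaction
```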
Fig. 13 is an algorithm state diagram according to some embodiments of the present disclosure.
The propagation of information data can be divided into a plurality of time segments according to the playing time. The script provides the type and manner of presentation of the electronic file 202 to be presented in each time segment.
The information data propagation in fig. 13 is divided into m time segments. Each time segment is the smallest algorithm unit and includes an S state, an A state, and a G state. The S state is the play state; the A state obtains the environmental conditions and the user's reaction and determines the corresponding operation; and the G state determines the user feedback level.
S1, S2, S3, ..., Sm are sequentially arranged on the time axis; A1, A2, A3, ..., Am are sequentially arranged on the time axis; and G1, G2, G3, ..., Gm-1 are sequentially arranged on the time axis.
In the S1 state, the terminal devices are initialized according to the scenario configured by the processing device, and then the electronic files are played. The subsequent S2 through Sm states adjust the electronic files to be played in the next time segment, and their presentation modes, according to the information data propagation mode in the script. After a terminal device starts playing an electronic file in the S1 state, the sensing device in the terminal device enters the A state to obtain the user's reaction to a feedback tag and the environmental parameters of the interaction space 420, and determines the corresponding operation as the reference for playing the electronic files in the S2 state. In the G state, the user feedback level is determined by comparing the user reaction against the corresponding feedback tag in the scenario.
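The S → A → G sequence of fig. 13 can be sketched as a loop over time segments. The segment dict layout ("play", "sense", "tag") is an illustrative assumption, not the patented implementation, and the broadcast-mode flag anticipates the G-less variant of fig. 14.

```python
def run_scenario(segments, mode="shaping"):
    """Walk the S, A, and G states of each time segment.
    Each segment supplies a play function (S state), a sensing function
    (A state), and a feedback tag mapping reaction -> (operation, level)
    (G state)."""
    satisfaction = 0
    operation = None  # corresponding operation carried into the next S state
    for seg in segments:
        seg["play"](operation)          # S: play the scheduled electronic file
        reaction = seg["sense"]()       # A: obtain user reaction and environment
        operation, level = seg["tag"].get(reaction, ("keep", 0))
        if mode != "broadcast":         # G: skipped in the broadcasting mode
            satisfaction += level
    return satisfaction

segments = [
    {"play": lambda op: None, "sense": lambda: "happy",
     "tag": {"happy": ("keep", 10)}},
    {"play": lambda op: None, "sense": lambda: "chest_hug",
     "tag": {"chest_hug": ("switch_file", -5)}},
]
print(run_scenario(segments))               # 10 - 5 = 5
print(run_scenario(segments, "broadcast"))  # no G state: 0
```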
Fig. 14 is an algorithm state diagram according to some embodiments of the present disclosure. When operating in the "information data broadcasting" mode, the system operates as shown in fig. 14. After the electronic files set by the scenario begin playing in S1, in the A state of each time segment the sensing devices in the terminal devices confirm the environmental parameters and the audience, for example whether the temperature and light are correct and whether the users have changed. After confirming any changed conditions, the corresponding operation is determined to adjust the electronic files played in the next S state. In the information data broadcasting mode, user feedback levels are not determined from the users' reactions, so the G state is never entered; the obvious difference between fig. 14 and fig. 13 is that fig. 14 contains no G state.
Fig. 15 is an algorithm state diagram according to some embodiments of the present disclosure. The system's algorithm states when operating in the "information data modeling", "interest acquisition", and "information data satisfaction" modes are shown in fig. 15. All three modes determine user feedback levels from the users' reactions, so the G state is computed. Based on the user feedback level, and according to the conditions of the particular mode, the played electronic files are adjusted in the next S state, as are the environmental parameters and user reactions that the sensing devices must obtain in the next A state.
The "information data modeling" mode aims to make the user feel the scene and atmosphere of the information data. For example, if the information data is meant to feel comfortable, the electronic files indicated in the scenario may be soft music with soft, warm, fresh lighting. After the user's reaction is obtained in the A state and compared against the feedback tag in the G state to determine the user feedback level, the terminal devices can be adjusted in the next S state when the user feedback level for a specific feedback tag is low. For example, if the user feels too hot or finds the environment too noisy, the air conditioning or the musical content may be adjusted. Operating in the information data modeling mode lets the user experience the situation that the information data is meant to shape.
The "interest acquisition" mode refers to communicating particular information data and acquiring the user's feedback levels for particular feedback tags. For example, a series of electronic files is played for the user, and the user feedback levels for their feedback tags are recorded. In the interest acquisition mode, the schedule of electronic files played in the S state is the same as in the information data broadcasting mode: the electronic files set by the scenario are played in each time segment. However, the interest acquisition mode adds the G state to acquire the user's feedback levels for specific feedback tags for further statistical analysis. The interest acquisition mode may typically be applied to market research or opinion research: while the user receives the information data, the user's reactions to the played electronic files are collected for statistical analysis. In addition, the interest acquisition mode may further include an interest acquisition confirmation state, in which, after the user's feedback levels for specific feedback tags have been acquired, electronic files with content tags similar to those of the higher-scoring feedback tags are presented and their user feedback levels acquired, so as to confirm which electronic files with similar content tags yield the highest user feedback level.
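The statistical analysis step of the interest acquisition mode can be sketched as averaging feedback levels per content keyword. The averaging method and the sample records are assumptions; the disclosure does not fix a particular statistical method.

```python
from collections import defaultdict

def interest_orientation(records):
    """Average the user feedback levels per content keyword to estimate
    the user's interest orientations from (keyword, level) records."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for keyword, level in records:
        totals[keyword] += level
        counts[keyword] += 1
    return {k: totals[k] / counts[k] for k in totals}

# Hypothetical feedback records collected across G states.
records = [("sports shoes", 10), ("hat", -5), ("sports shoes", 6)]
print(interest_orientation(records))  # {'sports shoes': 8.0, 'hat': -5.0}
```

Keywords with the highest averages would then be candidates for the interest acquisition confirmation state described above.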
The "information data satisfaction" mode shapes the scene of the information data to be communicated and makes the user satisfied with that information data. It is similar to the interest acquisition mode in that specific information data is communicated to the user, but it aims to raise the user's positive satisfaction, such as interest in and liking of the information data. For example, in a gymnasium, when the information data set for a user is to increase the calories burned during exercise, the electronic files related to that information data are played, the related environmental control devices are adjusted, and the user's physiological condition is sensed, so that the calorie consumption reaches the condition set by the original information data script. In short, the information data satisfaction mode seeks to satisfy the user with respect to specific information data, whereas the interest acquisition mode presents information data with multiple test purposes to obtain the user's interest orientations.
Fig. 16 is a schematic diagram of an application of an information data communication system in accordance with certain embodiments of the present disclosure. Fig. 16 discloses information data 1601, a processing device 1602, a content transformation device 1603, an environmental control device (air conditioner) 1604, a light control device 1605, an image capturing apparatus 1606, an audio playing device 1607, a display apparatus 1608, an odor control device (fragrance expanding device) 1609, a 3D projection device 1610, a projection device 1611, and a user 1612. The environment control device 1604, the light control device 1605, the image capturing apparatus 1606, the audio playing device 1607, the display apparatus 1608, the odor control device 1609, the 3D projection device 1610, and the projection device 1611 are all terminal devices.
In this embodiment, the user 1612 is a potential customer walking a dog. The content transformation device 1603, the environment control device 1604, the light control device 1605, the image capturing device 1606, the audio playing device 1607, the display device 1608, the odor control device 1609, the 3D projection device 1610, the projection device 1611, and the user 1612 form an interaction space 1600.
Fig. 16 discloses a scenario in which the present disclosure is applied to product sales: an interaction space 1600 that models the information data scenario for product sales. An interaction space 1600 with suitable temperature, light, music, and smell is formed by the environment control device 1604, the light control device 1605, the audio playing device 1607, and the odor control device 1609. The user's interest in a feedback tag is obtained by the image capturing device 1606; for example, the image capturing device 1606 detects that the pupils of the user 1612 are fixed on the sports shoes or hat merchandise in the file being played on the display device 1608. The 3D projection device 1610 and the projection device 1611 then present further information about the sports shoes or hats to the user 1612 according to the script, thereby increasing the purchase intent of the user 1612.
Fig. 17 is a schematic diagram of an application of an information data communication system according to some embodiments of the present disclosure. Fig. 17 discloses information data 1701, a processing means 1702, a content transformation means 1703, an environment control means (air conditioner) 1704, a light control means 1705, an image capturing device 1706, audio playing means 1707 and 1713, a display device 1708, odor control means (fragrance expanding means) 1709 and 1714, a production device sensing means 1710, an optical scanning means 1711, a projection means 1712, a 3D printing means 1715, and users 1716 and 1717.
The environment control device 1704, the light control device 1705, the image capturing apparatus 1706, the audio playing devices 1707 and 1713, the display device 1708, the odor control devices 1709 and 1714, the production equipment sensing device 1710, the optical scanning device 1711, the projection device 1712 and the 3D printing device 1715 are all terminal devices.
In this embodiment, the user 1716 is an operator of the production line and the user 1717 is a manager in the control room. The content transformation device 1703, the environment control device 1704, the light control device 1705, the image capturing device 1706, the audio playing device 1707, the display device 1708, the smell control device 1709, the production device sensing device 1710 and the optical scanning device 1711 form an interactive space 1700 of the production line. The content transformation device 1703, the light control device 1705, the audio playing device 1713, the smell control device 1714, the projection device 1712 and the 3D printing device 1715 form an interaction space 1750 in the control room.
FIG. 17 discloses a scenario in which the present disclosure is applied to a production line and a control room in a factory. In the production-line interaction space 1700, the present disclosure shapes a work environment that satisfies the operator 1716. The operator's mental state is obtained by the image capturing device 1706, and the quality conditions of the production equipment and products are obtained by the production equipment sensing device 1710 and the optical scanning device 1711. By analyzing the data acquired by the image capturing device 1706, the production equipment sensing device 1710, and the optical scanning device 1711, the operator's current reactions and feedback (e.g., work efficiency) can be derived. When the operator shows reduced work quality or efficiency, the display device 1708 may display the SOP job description for the product being produced to provide assistance. The interaction space 1750 of the control room can also interact with the interaction space 1700.
Fig. 18 is a schematic diagram of an application of an information data communication system in accordance with certain embodiments of the present disclosure. Fig. 18 discloses display apparatuses 1801 and 1802, light control 1803, image capturing apparatus 1804, traffic sign control 1805, electronic billboards 1806, 1807 and 1808, audio broadcasting apparatus 1809, user 1810, small vehicle 1811 and large vehicle 1812. The display devices 1801 and 1802, the light control device 1803, the image capturing device 1804, the traffic signal control device 1805, the electronic billboards 1806, 1807 and 1808, and the audio playing device (broadcasting device) 1809 are terminal devices.
The user 1810 is a member of the public receiving evacuation instructions, and the small vehicle 1811 and the large vehicle 1812 are a small vehicle and a large vehicle receiving evacuation instructions, respectively. Fig. 18 shows an application of the present disclosure to outdoor public service, more specifically, to outdoor evacuation instructions. Fig. 18 shows an interaction space 1800 at a street corner. The display devices 1801 and 1802 play electronic files, and the light control device 1803, the traffic signal control device 1805, and the electronic billboards 1806 to 1808 provide further information to the user 1810, the small vehicle 1811, and the large vehicle 1812. For example, the image capturing device 1804 detects that a pedestrian intends to cross the intersection, and the traffic signal control device 1805 switches the traffic signal to let the pedestrian pass; the image capturing device 1804 also detects that the road the large vehicle 1812 is about to enter allows only pedestrians, so the electronic billboard 1806 displays a no-entry sign for vehicles, and the electronic billboards 1807 and 1808 further display the directions for pedestrians and vehicles.
In the embodiment of fig. 18, the information data is a tsunami evacuation notice providing evacuation directions for the user 1810, the small vehicle 1811 and the large vehicle 1812. The embodiment of fig. 18 operates in an information data satisfaction mode. The directions of movement of the user 1810, the small vehicle 1811 and the large vehicle 1812, acquired by the image capturing device 1804, can be regarded as the interests of the user 1810, the small vehicle 1811 and the large vehicle 1812. Via the light control device 1803, the traffic signal control device 1805 and the audio playing device 1809, those directions of movement are guided toward the default direction of movement specified in the information data.
Further, the embodiment of FIG. 18 may operate in either the collaboration mode or the harmonic mode. In the embodiment of fig. 18, the display devices 1801 and 1802, the light control device 1803, the image capturing device 1804, the traffic signal control device 1805, the electronic billboards 1806, 1807 and 1808, and the audio playing device (broadcasting device) 1809 all have a computing processor, memory and a communication interface that provide processing, storage and networking capabilities. In the case of a major emergency, the interactive space 1800 may lose its connection to the processing device, or the terminal devices may lose their connection to the content transformation device. In such cases, the interactive space 1800 may continue to operate on power available within the interactive space 1800, in either the collaboration mode (when the content transformation device can still be reached) or the harmonic mode (when it cannot). The collaboration mode and the harmonic mode prevent the information data communication system from failing when the processing device or the content transformation device cannot be reached due to network disconnection or power interruption. The 8-layer model of the information data communication system of the present disclosure is thus a specific application of the concepts of edge computing and fog computing: in case of network disconnection, each interaction space or each terminal device can still continue to communicate information data.
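The fallback among operating modes described above can be sketched as a simple selection on link availability. The function and mode names below are an illustrative reading of the disclosure, not a prescribed implementation:

```python
def select_operation_mode(processing_reachable: bool, transform_reachable: bool) -> str:
    """Pick an application-layer mode based on which upstream links survive."""
    if processing_reachable and transform_reachable:
        return "normal"         # full pipeline: processing -> transformation -> terminals
    if transform_reachable:
        return "collaboration"  # content transformation device leads the terminals
    return "harmonic"           # terminal devices coordinate among themselves

# During a major emergency with the processing device unreachable:
print(select_operation_mode(False, True))   # collaboration
print(select_operation_mode(False, False))  # harmonic
```

Each device can re-evaluate this selection whenever a link state changes, which is how the edge/fog behavior of the 8-layer model survives disconnection.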
Fig. 19 is a schematic diagram of a processing device, a content transformation device, or a terminal device in an information data communication system according to some embodiments of the present disclosure. FIG. 19 shows an exemplary block diagram of a processing device, content transformation device, or terminal device in accordance with certain embodiments of the present disclosure. According to table 2 of the present disclosure, one of the processing device, the content transformation device, and the terminal device may include a control device 1910 and a sensing device 1930, wherein the control device 1910 includes a haptic integration device 1911, a 3D projection device 1912, a 3D printing device 1913, virtual reality glasses 1914, a light control device 1915, a display apparatus 1916, an audio playing device 1917, an odor control device (e.g., a fragrance expanding device) 1918, an infrared input output device 1919, and an environmental control device (e.g., an air conditioner) 1920. The sensing device 1930 includes a global positioning system chip 1931, a human input interface 1932, an image capturing device 1933, an audio input device 1934, an odor sensing device (e.g., an electronic nose) 1935, a biometric sensing device 1936, an environmental parameter sensing device (e.g., a temperature and humidity sensing device) 1937, a touch sensing device 1938, an infrared input and output device 1919, and a scanning device 1939.
According to table 2 of the present disclosure, one of the processing device, the content transformation device, and the terminal device may include an i/o driving interface device 1940. The i/o drive interface device 1940 integrates the data from the sensing device 1930 and transmits the integrated data to the central processing unit 1951, and the i/o drive interface device 1940 transmits the instructions from the central processing unit 1951 to the control device 1910, respectively. According to table 2 of the present disclosure, one of the processing device, the content transformation device and the terminal device may include a central processing device 1951, a memory 1952, a power management control device 1953, a cellular network modem device 1954, a wireless network modem device 1955 and a wired network modem device 1956.
The cellular network modem 1954 communicates with a cellular network base station 1960, while the wireless network modem 1955 and the wired network modem 1956 communicate with a network 1970. In some embodiments, the cellular network modem 1954 is connected to the wide area cellular network through the cellular network base station 1960. The central processing unit 1951 is connected to a memory 1952, the memory 1952 storing computer programs, code and databases for execution by the processing device. The central processing unit 1951 is also connected to a power management control device 1953, which implements a power-saving schedule by timing the powering on and off of the processing device.
The central processing unit 1951 is connected to the i/o driving interface device 1940, and the i/o driving interface device 1940 transmits data and instructions of the central processing unit 1951 to the control device 1910 to control the control device 1910 to play the electronic file. The I/O drive interface device 1940 transmits the data obtained from the sensing device 1930 to the CPU 1951 for analysis of the data.
The global positioning system chip 1931 in the sensing device 1930 can obtain geographic coordinates. The man-machine input interface 1932 may accept data input from a keyboard, a tablet, a touchpad, or a recording medium such as a solid-state memory, an optical disk, or a hard disk. Image capture device 1933 may perform image recognition.
The audio input device 1934 may record ambient sounds and speech. The odor sensing device 1935 may obtain ambient air quality indicators such as CO, combustible gas, and smoke. The biometric sensing device 1936 may obtain data such as fingerprints, voiceprints, DNA, electrocardiograms, and pupil patterns. The touch sensing device 1938 may sense the roughness of an object. The infrared input and output device 1919 may transmit and receive signals to detect the presence and distance of objects in the environment, and may transmit and receive modulated signals to broadcast and receive data in the space in which the information data is conveyed. The scanning device 1939 may be a 3D scanning device or a 2D scanning device to obtain 3D stereo data or 2D plane data of an object.
The haptic integration device 1911 in the control device 1910 may be a haptic brush that provides a tactile sensation. The 3D projection device 1912 may provide 3D projection. The 3D printing device 1913 may print out a 3D model. The virtual reality glasses 1914 may provide a virtual reality experience. The light control device 1915 may provide various light sources to create different light scenes within the interactive space. The display apparatus 1916 includes a projector, a panel, a television wall, and the like, and may be a multi-screen display. The audio playing device 1917 may be a mono speaker or a high-quality stereo surround speaker, or may be an ultrasonic speaker to transmit modulated acoustic signals. The odor control device 1918 may provide an odor atmosphere. The environmental control device 1920 may provide on-site temperature and humidity control.
FIG. 20 is a flow chart of a data record digitizing process according to some embodiments of the present disclosure. FIG. 20 illustrates an exemplary process for generating a scenario based on a plurality of data records in the information data communication system model of the present disclosure. An information data may be expressed by various data records, for example, by data records having text 2037, image 2038, smell 2039, audio-visual 2040, and model (or object) 2041. The information data layer 2030 contains records for these original data.
Depending on the nature of the data recording, the processing device may decide to digitize it using different sensing devices. For example, the model 2041 may use a 3D holographic device 2035 or a 3D scanning device 2036 (e.g., using X-rays and laser light) to capture images in multiple layers. Audio video 2040 may be passed through analog to digital conversion 2034. The scent 2039 may be sampled via a scent sensing device (e.g., electronic nose) 2033. The image 2038 may be sampled via a 2D scanning device 2031 or a digital camera 2032.
These data records are first sampled and then processed by multi-format digitizing 2026. The control devices of the terminal devices have different capabilities: for example, display devices include projectors, televisions, and tablets with different resolutions, and each control device also has its own optimal playback format. The data record is therefore processed by multi-format digitizing 2026 into multiple renditions. The smell 2039 sampled by the odor sensing device 2033 is digitized through composition formula analysis 2027.
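The multi-format digitizing step can be pictured as producing one rendition of a record per target device capability. The device names, formats, and field names below are illustrative assumptions; real transcoding would replace the placeholder renditions:

```python
def multi_format_digitize(record_id: str, device_profiles: dict) -> dict:
    """Produce one rendition of a digitized data record per terminal capability.

    device_profiles maps a device name to its preferred (format, resolution);
    the returned dictionaries stand in for real transcoding output.
    """
    return {
        device: {"source": record_id, "format": fmt, "resolution": res}
        for device, (fmt, res) in device_profiles.items()
    }

renditions = multi_format_digitize(
    "image-2038",
    {"projector": ("jpeg", "1920x1080"), "tablet": ("png", "1024x768")},
)
```

One digitized record thus yields a rendition per device class, which is what lets the same information data play optimally on heterogeneous terminals.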
The text 2037 and the image 2038 may be digitized as a 2D electronic file 2022, the smell may be digitized as an atmosphere electronic file 2023, the audio-visual 2040 may be digitized as an audio-visual electronic file 2024, and the model 2041 may be digitized as a 3D electronic file 2025. The digitized electronic file has different attributes, such as 3D, audio-visual, atmosphere, image, description, touch and the like. Electronic files of these different attributes may be used to describe the same or different information data, so the content of the electronic files may or may not have an association. After analysis by the processing device or human verification, the content tag is labeled 2021 for the features in the content of each electronic file. Each electronic file contains a plurality of content tags. The above procedure is the digital layer 2020 in the 8-layer model.
Then, a plurality of electronic files containing a plurality of content tags are compiled 2012 according to the information data to be transmitted and based on the content tags related to the information data, and the script 2011 is completed. More specifically, the scenario compiling 2012 confirms the content tag and the parameter related to the information data according to the information data to be transmitted, and compiles a plurality of electronic files having the content tag related to the information data into the scenario 2011 based on the content tag and the parameter related to the information data. The above process occurs at the editing layer 2010 in the 8-layer model of the present disclosure.
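The compiling step above, which is selecting the electronic files whose content tags relate to the information data to be transmitted, can be sketched minimally as a tag-intersection filter. The file ids and tags are illustrative, not from the disclosure:

```python
def compile_scenario(files_with_tags: dict, info_tags: set) -> list:
    """Pick the electronic files whose content tags relate to the information data.

    files_with_tags maps a file id to its set of content tags; a file joins
    the scenario when it shares at least one tag with the information data.
    """
    return sorted(fid for fid, tags in files_with_tags.items() if tags & info_tags)

files = {
    "2d-file": {"evacuation", "map"},
    "audio-file": {"evacuation", "siren"},
    "unrelated": {"advertisement"},
}
print(compile_scenario(files, {"evacuation"}))  # ['2d-file', 'audio-file']
```

A real editing layer would also order the selected files into a playing program, but the selection criterion is as above.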
The processing device and the content transformation device also have internet networking capability, and besides the method of locally digitizing the data records to generate electronic files as shown in fig. 20, the data records can also be collected and digitized for use via the internet.
FIG. 21 is a flow chart according to some embodiments of the present disclosure. Fig. 21 discloses further details of the editing layer 2010 of fig. 20. One piece of information data may be expressed via a plurality of electronic files (e.g., electronic files 2101-1 through 2101-n), each having a plurality of content tags 2102. Content tags 2102 that recur across the electronic files with high repeatability can be labeled as important content tags (A, B and C in fig. 21). Editing of the content tag index set 2110 may be performed at the processing device based on the important content tags. The content tag index set 2110 includes content tag indexes 2110-1 to 2110-m; in the embodiment of fig. 21, 2110-1 to 2110-3 may be the indexes of content tags A, B and C.
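Identifying important content tags by their repetition across electronic files reduces to a frequency count. The threshold and tag names below are illustrative assumptions:

```python
from collections import Counter

def build_content_tag_index_set(electronic_files: dict, min_repeat: int = 2) -> list:
    """Collect content tags that recur across multiple electronic files.

    electronic_files maps a file id to that file's set of content tags.
    Tags appearing in at least `min_repeat` files are treated as important
    and become entries of the content tag index set.
    """
    counts = Counter(tag for tags in electronic_files.values() for tag in set(tags))
    return sorted(tag for tag, n in counts.items() if n >= min_repeat)

files = {
    "file-1": {"A", "B", "C", "D"},
    "file-2": {"A", "B", "C", "E"},
    "file-3": {"A", "B", "C"},
}
print(build_content_tag_index_set(files))  # ['A', 'B', 'C']
```

Tags D and E occur only once, so only the recurring tags A, B and C enter the index set, matching the important-tag selection of fig. 21.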
Editing of the information data summary and the content configuration 2120 is performed based on the content tag index set 2110. Editing of the electronic file playing program 2130 is performed according to the content tag index set 2110. Editing of feedback tab set 2140 is performed based on content tab index set 2110. Editing of the user feedback hierarchy set 2150 and corresponding operation set 2160 is performed according to the content tab index set 2110. Feedback tag set 2140 includes feedback tags 2140-1 through 2140-p, a feedback tag including a user feedback level set 2150 and a corresponding action set 2160. A set of user feedback levels 2150 includes user feedback levels 2150-1 through 2150-q, and a corresponding set of operations 2160 includes corresponding operations 2160-1 through 2160-m.
The processing device edits the scenario 2170 according to the content tag index set 2110, the information data summary and content configuration 2120, the electronic file playing program 2130, the feedback tag set 2140, the user feedback level set 2150 and the corresponding operation set 2160. Based on the information data summary and content configuration 2120, the processing device edits the script summary 2171 and the electronic file configuration 2172 of the scenario 2170, wherein the electronic file configuration 2172 specifies which kind of control device of the terminal device the format of each electronic file is suitable for, and the script summary 2171 can be written automatically by the processing device or entered manually. In some embodiments, the content tags and their parameters associated with the information data are compiled into the content tag index set 2110 based on the information data to be communicated, and a particular portion of the scenario 2170 is compiled based on the content tag index set 2110, wherein the parameters are used to provide content summaries of the plurality of files composing the scenario.
The electronic file playing program 2130 in the scenario 2170 includes a playing relationship of each electronic file, so that the electronic files can be synchronized when played. Scenario 2170 further includes electronic files 2101-1 through 2101-n, content tag indexes 2110-1 through 2110-m, and feedback tags 2140-1 through 2140-p. The operations shown in FIG. 21 are defined in the semantic layer of the 8-layer model of the present disclosure. The scenario 2170 is transferred from the processing device to the content transformation device, and the content transformation device interprets the scenario 2170 and completes the setting for each terminal device in the communication unit.
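The scenario structure described above (summary, file configuration, playing program, content tag index, and feedback tags carrying feedback levels with corresponding operations) can be sketched as a data structure. All field names and example values here are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class FeedbackTag:
    """One feedback tag: user feedback levels and their corresponding operations."""
    levels: List[str]
    operations: Dict[str, str]            # feedback level -> corresponding operation

@dataclass
class Scenario:
    """Sketch of the scenario 2170 structure."""
    summary: str                          # script summary 2171
    file_configuration: Dict[str, str]    # file id -> suited terminal control device
    play_program: List[Tuple[int, str]]   # (start offset, file id) for synchronized play
    content_tag_index: List[str]          # content tag index set 2110
    feedback_tags: Dict[str, FeedbackTag] # feedback tag set 2140

scenario = Scenario(
    summary="tsunami evacuation notice",
    file_configuration={"map-file": "electronic billboard"},
    play_program=[(0, "map-file")],
    content_tag_index=["evacuation", "route"],
    feedback_tags={
        "moving-wrong-way": FeedbackTag(
            levels=["minor", "severe"],
            operations={"minor": "repeat announcement",
                        "severe": "change signal phase"},
        )
    },
)
```

Keeping the feedback levels and operations inside the scenario is what lets a content transformation device resolve feedback locally in the collaboration mode, without consulting the processing device.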
It should be appreciated that the operations shown in fig. 20 and 21 are performed at the processing device in some embodiments, the scenario 2170 is transferred from the processing device to the content transformation device, and the content transformation device interprets the scenario 2170 and completes the setup for each terminal device within the communication unit. However, in embodiments where the connection to the processing device is disabled, the content transformation device and the terminal device may operate in a collaborative mode, and the content transformation device may complete the operations shown in fig. 20 and 21, interpret scenario 2170 and complete the setup for each terminal device within the communication unit. In addition, in embodiments where the connection to the processing device and the content transformation device is disabled, the terminal device may be operated in the harmonic mode, the terminal device may perform the operations shown in fig. 20 and 21, and interpret scenario 2170 and complete the setup for each terminal device within the communication unit.
Fig. 22 is a state diagram of an information data communication system according to some embodiments of the present disclosure. The processing device initial state is the P0 state. In the P0 state, the processing device is on standby. The processing device enters a P1 state, gathers data records and processes the data records into a script. The processing device returns to the P0 state and then enters the P2 state, and the processing device transmits the script to the content transition device in the P2 state. After the processing device completes the processing and transmission of the script, the processing device returns to the standby state of P0.
The initial state of the content transformation device is the R0 state. In the R0 state, the content transformation device is on standby. After receiving the scenario transmitted by the processing device in the P2 state, the content transformation device enters the R1 state, interprets the scenario, distributes the electronic files to each terminal device of the communication unit, and returns to the R0 state.
When the content transformation device is at R0, each terminal device is notified to play the electronic file according to the electronic file playing program (e.g. the electronic file playing program 2130 in fig. 21). In the collaboration mode of the application layer in the 8-layer model of the present disclosure, the script may be transmitted by the processing device, and the information data may be transmitted according to the program and the condition after being distributed to the terminal device by the content transformation device.
When the terminal device plays the electronic file in the T0 state, living beings (such as users) can give feedback by means of gestures, a man-machine interface, voice, and the like. When the terminal device playing the electronic file receives the feedback, it enters the T1 state, where the feedback is digitized and then transmitted to the content transformation device in the R0 state, or the raw data is transmitted directly to the content transformation device in the R0 state. When the content transformation device receives the feedback data, it leaves the R0 state and enters the R2 state, retrieves the relevant feedback tag in the scenario to determine the user feedback level and the corresponding operation, then returns to the R0 state and notifies the terminal device to perform the corresponding operation. In the collaboration mode of the application layer in the 8-layer model of the present disclosure, the content transformation device is responsible for the distribution of the electronic files without connecting to the processing device; when receiving feedback from the terminal device, the content transformation device first retrieves the relevant feedback tag in the scenario to determine the user feedback level and the corresponding operation.
When the terminal device stands by in the T0 state and receives feedback on the played electronic file from a living being, the terminal device enters the T2 state to identify the relevant feedback tag of the feedback, and then enters the O2 state to request harmonic playback from the other terminal devices. All the other terminal devices then enter the harmonic mode of the application layer in the 8-layer model of the present disclosure, without the participation of the content transformation device.
While the terminal device stands by in the T0 state, if it receives a request from another terminal device in the O3 state to perform harmonic playback, the terminal device in the T0 state will search whether a matching local electronic file is available for harmonic playing. While the content transformation device stands by in the R0 state, if it receives a request from another content transformation device in the O1 state to search for a matching electronic file, the content transformation device in the R0 state will enter the R2 state to search whether a matching electronic file is available for playing. Likewise, if the content transformation device in the R0 state receives a request from another processing device in the O4 state to search for a matching electronic file, it will enter the R2 state to search whether a matching electronic file is available for playing.
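The R0/R1/R2 behavior of the content transformation device in fig. 22 can be sketched as a small state machine. The method names, tag strings, and operation table are illustrative assumptions layered over the state diagram, not the disclosed implementation:

```python
class ContentTransformationDevice:
    """Minimal sketch of the R0 (standby) / R1 (distribute) / R2 (feedback) states."""

    def __init__(self):
        self.state = "R0"      # standby
        self.distributed = []  # (terminal, file) pairs sent out

    def receive_script(self, assignments: dict) -> None:
        """R0 -> R1: interpret the script, distribute files, return to R0."""
        assert self.state == "R0"
        self.state = "R1"
        self.distributed.extend(assignments.items())
        self.state = "R0"

    def receive_feedback(self, feedback_tag: str, feedback_table: dict) -> str:
        """R0 -> R2: look up the feedback tag, pick the operation, return to R0."""
        assert self.state == "R0"
        self.state = "R2"
        operation = feedback_table.get(feedback_tag, "no corresponding operation")
        self.state = "R0"
        return operation

device = ContentTransformationDevice()
device.receive_script({"billboard": "evacuation-map"})
op = device.receive_feedback("crowd-confused", {"crowd-confused": "repeat announcement"})
```

Each transition starts and ends in R0, mirroring the diagram's convention that every working state returns the device to standby.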
FIG. 23 is a data processing flow diagram according to some embodiments of the present disclosure. Fig. 23 shows a data processing flow of the present information data transmission communication system, which includes a data recording processing flow of the processing device 2301, a transition processing flow of the content transition device 2302, and a transition file processing flow of the terminal devices 2303-1 to 2303-n. As shown in fig. 23, the processing device 2301 digitizes 2305 the data record 2304 to generate an electronic file, and adds content tags 2306 and scripts 2307 to the electronic file. The processing device 2301 transmits the electronic file 2308, the content tag 2309 and the script 2310 to the content transformation device 2302.
As shown in fig. 23, the content transformation device 2302 receives the electronic file 2308 from the processing device 2301 and stores it in the electronic file database 2311 for use in different modes. The electronic file database 2311 holds electronic files 2308-1 through 2308-n and a content tag set 2309S, wherein the content tag set 2309S packages the content tags 2309 corresponding to the electronic files 2308-1 through 2308-n. The electronic file 2308 received from the processing device 2301 is also sent to the electronic file buffer 2312; after transition content editing 2313, the result is saved in the transition content buffer 2314 according to the different file requirements of the terminal devices 2303-1 to 2303-n. The scenario 2310 received from the processing device 2301 is processed by scenario interpretation 2307 in the content transformation device 2302, and the scenario timing 2317 generated after interpretation is stored in the transition content buffer 2314. After the received scenario 2310 generates the scenario timing 2317, the transition files 2315-1 through 2315-m in the transition content buffer 2314 are output under its control to the respective terminal devices 2303-1 through 2303-n to drive their communication of the information data. The transition content buffer 2314 has an atmosphere tag set 2316S comprising a plurality of atmosphere tags 2316, wherein each atmosphere tag is associated with a corresponding transition file 2315, electronic file 2308 and content tag 2309.
The sensing units of the terminal devices 2303-1 to 2303-n can obtain a signal 2318 from the user 2324 or from the environment 2325. The signal 2318 is digitized 2319 in the terminal devices 2303-1 to 2303-n, compared against the data in the internal transition file database 2328, and processed by atmosphere tag editing and identification 2320; the terminal devices 2303-1 to 2303-n can then generate a transition file 2315-x and an atmosphere tag 2316-x related to the signal and feed them back to the content transformation device 2302. The content transformation device 2302 receives the transition file, translates it into a corresponding electronic file 2308 and content tag 2309 via the translator 2326, performs electronic file collection 2327 on the electronic file database 2311 to collect the appropriate electronic files 2308, and sends them into the transition content buffer 2314 after transition content editing 2313.
The content transformation device 2302 may operate in the application-layer collaboration mode: it collects the appropriate electronic files into the content transformation device 2302, creates the scenario timing 2317 for each terminal device after scenario interpretation 2307, and sequentially sends the transition files 2315 to the terminal devices 2303-1 through 2303-n. As shown in fig. 23, after receiving the transition file 2315-1 and the atmosphere tag 2316-1 from the content transformation device 2302, the terminal device 2303-1 communicates and plays the information data to the control unit 2322 via the controller 2321. The transition file 2315-1 and the atmosphere tag 2316-1 received from the content transformation device 2302 are also stored in the transition file database 2328 as samples for comparison after the sensing unit 2323 obtains a signal 2318. The terminal device 2303-1 may also receive an atmosphere tag 2316-u from another terminal device 2303-u requesting the application-layer harmonic mode; after atmosphere tag editing and identification 2320, the appropriate corresponding atmosphere tag 2316 can be found in the transition file database 2328, and the information data is communicated to the control unit 2322 via the controller 2321. The content transformation device 2302 may also be requested to operate in the harmonic mode via a transition file 2315 and an atmosphere tag 2316 from another content transformation device 2302-1, or via an electronic file 2308 and a content tag 2309.
After the transition files 2315 and the atmosphere tags 2316 from the other content transition devices 2302-1 are translated by the translator 2326, the content transition device 2302 can perform electronic file collection 2327 in the electronic file database 2311 to collect appropriate electronic files, and then control the terminal device 2303 to perform information data content transmission.
As shown in fig. 7, the content tags are multi-dimensional data organizations with a semantic meaning and a plurality of feature values. In some embodiments, the content tags are used at the processing device and the content transformation device to compose the scenario 2307 and complete its scheduling. In some embodiments, after the terminal device 2303 obtains a sensing signal 2318 from the sensing unit 2323, the signal 2318 is digitized 2319 and processed by atmosphere tag editing and identification 2320 to generate an atmosphere tag 2316, and the atmosphere tag 2316 is transmitted to the content transformation device for identification of user feedback or environment feedback. In some embodiments, the atmosphere tag 2316 may be transmitted to other terminal devices operating in the application-layer harmonic mode. It should be appreciated that the atmosphere tag 2316 may serve as a content tag 2309 that all devices can share and recognize.
FIG. 24 is an atmosphere tag encoding scheme according to some embodiments of the present disclosure, that is, one exemplary encoding of the atmosphere tag. Atmosphere tags can be encoded in binary, ternary, or even higher-radix form. The atmosphere tag is a code of N-bit length and may include m levels of codes: a first level 2310-1, a second level 2310-2, ..., an mth level 2310-m. Each level includes a different number of bits. For example, the first level 2310-1 contains bits 2311-1 through 2311-p, the second level 2310-2 contains bits 2312-1 through 2312-q, and the mth level 2310-m contains bits 2313-1 through 2313-r. The first level is the highest-level classification, as in the Dewey Decimal Classification, and each atmosphere tag is given a group of identifiers according to this hierarchical classification. This coding method allows two differently coded atmosphere tags to be compared code by code in sequence, and the correlation of their atmospheres can be judged from the number of identical codes, from the higher-level codes down to the lower-level codes. The classification criteria of the atmosphere tags are a common basis shared by the devices of the present disclosure.
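The layered-code comparison can be sketched as a prefix match over the levels, counting how many hierarchy levels two tags share before they diverge. The digit strings and level widths below are illustrative assumptions:

```python
def shared_levels(tag_a: str, tag_b: str, level_widths) -> int:
    """Count hierarchy levels on which two atmosphere tag codes agree.

    Tags are digit strings; level_widths lists the number of digits per
    level, from the highest classification level downward, as in the
    layered N-bit code of FIG. 24.
    """
    pos, shared = 0, 0
    for width in level_widths:
        if tag_a[pos:pos + width] != tag_b[pos:pos + width]:
            break
        shared += 1
        pos += width
    return shared

# Two tags agreeing on the first two levels but diverging on the third:
print(shared_levels("123456", "123499", [2, 2, 2]))  # 2
```

A larger shared-level count means a closer atmosphere, which is how a device can pick the "most similar" tag from its database when no exact match exists.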
Both the processing device 2301 and the content transformation device 2302 are capable of translating the electronic file 2308 to generate a transition file 2315 and an atmosphere tag 2316. Templates of the contents of the transition file 2315 and the atmosphere tag 2316 are generally produced by the processing device 2301 and transmitted to all the terminal devices 2303 by the content transformation device 2302; when the terminal device 2303 or the content transformation device 2302 obtains a signal 2318 fed back by the user 2324 or the environment 2325, the corresponding atmosphere tag 2316, or a similar atmosphere tag 2316, can be obtained after comparison against the templates of the transition files 2315 in the transition file database 2328. The same atmosphere tag 2316 may have different transition files 2315 for different terminal devices 2303. For example, for an atmosphere tag 2316 describing a lightning strike, the transition files for terminal devices such as electronic billboards, surround sound controls, and light controls differ. Furthermore, even terminal devices 2303 of the same type require different transition files 2315 for driving, because their control element specifications differ. After the content transformation device 2302 obtains the electronic file 2308 and the content tag 2309 from the processing device 2301, it must re-translate the information data content of the electronic file 2308 to drive the control devices of the different types of terminal devices 2303 connected to it. This process is referred to as the transition of the electronic file 2308; the data generated by this process is the transition file 2315, and the label of the features of the transition file 2315 is the atmosphere tag 2316. FIG. 25 is a flow chart of a process for transitioning an electronic file according to some embodiments of the present disclosure. FIG. 25 discloses a transition process flow 2501 for an electronic file, a script 2502, electronic files 2503-1 to 2503-n, and content tags 2504-1 to 2504-n associated with the script. Taking a terminal device 2505-1 of the electronic billboard type as an example, the terminal device 2505-1 has a virtual screen function 2506; the virtual screen function 2506 can divide a screen 2518 into a plurality of screens, such as virtual screens W1, W2, W3 and W4. Each of the virtual screens W1, W2, W3, and W4 can support playback in text, photo, or movie formats, and can provide animated special effects for the objects 2519 in the associated electronic file. The transition file processing flow 2510-1 for the terminal device 2505-1 is disclosed in FIG. 25.
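The idea that one electronic file yields a different transition file per terminal type can be sketched as a dispatch over per-type translators. The terminal types and output fields are illustrative stand-ins for the device-specific data streams described here:

```python
def make_transition_file(electronic_file: dict, terminal_type: str) -> dict:
    """Re-translate one electronic file for a specific terminal's control spec.

    Each translator stands in for the device-specific re-encoding that
    turns shared content into a stream the terminal's control unit can play.
    """
    translators = {
        "billboard": lambda f: {"kind": "frame-stream", "payload": f["content"]},
        "light":     lambda f: {"kind": "light-cues",   "payload": f["content"]},
        "audio":     lambda f: {"kind": "pcm-samples",  "payload": f["content"]},
    }
    return translators[terminal_type](electronic_file)

tf = make_transition_file({"content": "lightning scene"}, "light")
```

The same source content ("lightning scene") would become a frame stream for a billboard but light cues for a lamp controller, which is why transition files, unlike electronic files, are not shared across terminal types.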
The "electronic file flow setting 2525-1" in the transition file process flow 2510-1 determines the flow of the electronic file to be played. The layout selection 2511 in the transition file process 2510-1 determines how many virtual screens to use and determines the size, location, resolution, etc. characteristics of each virtual screen. The "window content settings 2512" in the transition file process 2510-1 can determine the electronic file to be played by each virtual screen, the cut effect 2513 of each virtual screen, the animation effect 2514 of the object in the electronic file, etc. The transition profile processing flow 2510-1 is used for re-translating the electronic profile into a data stream playable by the terminal device 2505-1 according to the control specification of the terminal device 2505-1, the functions provided by the terminal device 2505-1 and the script to be transmitted.
The electronic file translated by the transition file processing flow 2510-1 is referred to as the transition file 2515-1. The transition file 2515-1 is applicable to a specific terminal device (i.e., terminal device 2505-1) and is not shared by all terminal devices. The atmosphere tag 2516-1 is attached to the section of the transition file 2515-1 delivered to the terminal device 2505-1, so that the terminal device 2505-1 can sort and manage the received transition file 2515-1 in its local database according to the atmosphere tag 2516-1. Based on the transition file 2515-1, the control unit 2517-1 controls the terminal device 2505-1 to communicate the contents of the transition file 2515-1. In addition, the terminal device 2505-1 may have a sensing unit 2520 (e.g., an image capturing device and an audio capturing device) to obtain feedback from the user or the external environment.
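The billboard-type transition flow above — choose a virtual screen layout, assign window content and effects, and label the result with an atmosphere tag — can be sketched as a small data model. All class, field, and file names here are illustrative assumptions, not the patent's own structures.

```python
# Illustrative sketch of transition-file processing flow 2510-1:
# layout selection, window content settings, and atmosphere tagging.
from dataclasses import dataclass, field

@dataclass
class VirtualScreen:
    name: str             # e.g. "W1"
    size: tuple           # (width, height) in pixels
    position: tuple       # (x, y) on the physical screen
    content: str = ""     # electronic file assigned to this window
    effect: str = "none"  # cut/animation effect

@dataclass
class TransitionFile:
    terminal_id: str
    atmosphere_tag: str
    screens: list = field(default_factory=list)

def build_transition_file(terminal_id, atmosphere_tag, layout, assignments):
    """layout: list of (name, size, position);
    assignments: screen name -> (electronic file, effect)."""
    tf = TransitionFile(terminal_id, atmosphere_tag)
    for name, size, pos in layout:
        file_, effect = assignments.get(name, ("", "none"))
        tf.screens.append(VirtualScreen(name, size, pos, file_, effect))
    return tf

tf = build_transition_file(
    "billboard-2505-1", "lightning-strike",
    layout=[("W1", (960, 540), (0, 0)), ("W2", (960, 540), (960, 0))],
    assignments={"W1": ("storm.mp4", "fade"), "W2": ("alert.txt", "scroll")},
)
print(tf.atmosphere_tag, len(tf.screens))
```

The atmosphere tag on the resulting object is what would let the terminal file the transition file in its local database, as the paragraph above describes.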
A light-control terminal device 2505-2 is also disclosed in FIG. 25. The terminal device 2505-2 includes lamps L1, L2, and L3 having different functions. The "electronic file flow setting 2525-2" in the transition file processing flow 2510-2 determines the sequence of electronic files to be presented. The "lamp configuration 2521" in the transition file processing flow 2510-2 determines how many lamps are used, and the characteristics of each lamp, such as brightness, color-light combination, location, projection mode, and projected pattern 2524. The "light cut settings 2522" in the transition file processing flow 2510-2 determine the lighting situation effect 2523 of each lamp, and so on. The control unit 2517-2 can control the brightness, color-light combination, position, projection mode, and projected pattern of the lamps L1, L2, and L3, so as to provide visual effects such as selection, field crossing, fantasy, simulation, and atmosphere. When electronic files are transitioned for the terminal device 2505-2 in the transition file processing flow 2510-2, the electronic files 2503-1 to 2503-n associated with the script may not include any electronic file for the terminal device 2505-2 (i.e., may not include an electronic file for controlling light). In that case, the transition file processing flow 2510-2 can arrange the lamp configuration, the light cut settings, the lighting situation effect, and the projected pattern in the terminal device according to the content tags in the script, and then mark the result with the atmosphere tag 2516-2.
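Deriving light cues from a script's content tags when no light-specific electronic file exists, as described above, could look like the following sketch. The tag-to-cue table and the cue tuple format are hypothetical stand-ins for the lamp configuration and light cut settings.

```python
# A minimal sketch, assuming a hypothetical tag-to-cue table, of how
# flow 2510-2 might derive lamp cues from content tags alone.
LIGHT_CUES = {  # content tag -> (brightness %, color, effect); illustrative
    "storm":  (100, "white",  "strobe"),
    "sunset": (40,  "orange", "fade"),
    "night":  (10,  "blue",   "steady"),
}

def arrange_lights(content_tags, lamps=("L1", "L2", "L3")):
    """Map each content tag onto a lamp; unknown tags fall back
    to a neutral steady cue."""
    cues = {}
    for lamp, tag in zip(lamps, content_tags):
        cues[lamp] = LIGHT_CUES.get(tag, (50, "white", "steady"))
    return cues

cues = arrange_lights(["storm", "sunset", "cafe"])
print(cues)
```

The resulting cue set is what would then be packaged as the terminal-specific transition file and labeled with the atmosphere tag 2516-2.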
For the terminal device 2505-n shown in FIG. 25, the "electronic file flow setting 2525-n" in the transition file processing flow 2510-n determines the sequence of electronic files to be presented. The "transition flow 2526" in the transition file processing flow 2510-n determines the characteristics associated with the terminal device 2505-n, translates the electronic files into the transition file 2515-n specific to the terminal device 2505-n, and marks the atmosphere tag 2516-n.
FIG. 26 is a diagram of data flows in different modes of operation of the application layer according to some embodiments of the present disclosure. The data streams 2620, 2623, 2626, and 2629 are data streams in the cooperation mode, wherein the data streams 2620 and 2623 include the electronic file 2604, the content tags 2605, and the script 2606, and the data streams 2626 and 2629 include the transition file 2607 and the atmosphere tags 2608. The data streams 2621, 2624, 2627, and 2630 are data streams in the collaboration mode, wherein the data streams 2621 and 2624 include the electronic file 2604 and the content tags 2605, and the data streams 2627 and 2630 include the transition file 2607 and the atmosphere tags 2608. The data streams 2622, 2625, 2628, 2631, 2632, and 2633 are data streams in the harmonic mode, wherein the data streams 2622 and 2625 include the electronic file 2604 and the content tag 2605, the data stream 2632 includes the electronic file 2604 and the atmosphere tag 2608, and the data streams 2628, 2631, and 2633 include the transition file 2607 and the atmosphere tag 2608.
FIG. 26 discloses a data flow diagram among a processing device 2601, content transition devices 2602-1 and 2602-2, and terminal devices 2603-1 and 2603-2 when operating in the cooperation mode, the collaboration mode, and the harmonic mode at the application layer. In the cooperation mode, the processing device 2601 digitizes a data record into the electronic file 2604. The processing device 2601 adds a content tag 2605 to the electronic file 2604 and composes a script 2606 based on the content tag 2605. The processing device 2601 assembles the electronic file 2604, the content tag 2605, and the script 2606 into the data stream 2620 or 2623 of the cooperation mode. After the data stream 2620 or 2623 including the electronic file 2604, the content tag 2605, and the script 2606 is transferred to the content transition device 2602-1 or 2602-2, the content transition process compiles it into the transition file 2607 and the atmosphere tag 2608, and the data stream 2626 or 2629 including the corresponding transition file 2607 and atmosphere tag 2608 is transferred to the connected terminal device 2603-1 or 2603-2 according to the schedule arranged by the script.
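The cooperation-mode chain above — processing device packages files, tags, and a script; content transition device compiles them per terminal — can be sketched as two small functions. Function names, dictionary fields, and the tag-joining scheme are assumptions made for illustration only.

```python
# Sketch of the cooperation-mode pipeline of FIG. 26 (illustrative names).
def processing_device(records):
    """Digitize data records and emit a cooperation-mode data stream
    containing electronic files, content tags, and a script."""
    files = [f"file:{r}" for r in records]
    tags = [f"tag:{r}" for r in records]
    script = {"order": records}  # simplified stand-in for script 2606
    return {"files": files, "tags": tags, "script": script}

def content_transition_device(stream, terminal_id):
    """Compile the stream into a transition file and atmosphere tag
    for one specific terminal device."""
    return {
        "terminal": terminal_id,
        "transition_file": [f"{terminal_id}/{f}" for f in stream["files"]],
        "atmosphere_tag": "+".join(stream["script"]["order"]),
    }

stream = processing_device(["intro", "storm"])
out = content_transition_device(stream, "2603-1")
print(out["atmosphere_tag"])
```

Note how the transition file is terminal-specific while the upstream stream is not, matching the division of labor between devices 2601 and 2602 in the figure.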
The terminal devices of the information data communication system are then started according to the settings of the script to communicate the information data. After the processing device 2601 digitizes the data record into electronic files and composes them into a script, it transmits them to the content transition device 2602-1 or 2602-2, where the content is converted into a format playable by the terminal device 2603-1 or 2603-2; this process is called the cooperation mode. The content transition devices and terminal devices store the received, edited, and used electronic files 2604 and content tags 2605 (i.e., data streams 2621, 2622, 2624, 2625, and 2632) or transition files 2607 and atmosphere tags 2608 (i.e., data streams 2627, 2628, 2630, 2631, and 2633) in their local databases as reference material for use in the collaboration mode and the harmonic mode.
In the collaboration mode of the application layer, the content transition device 2602-1 or 2602-2 is the control center of information data communication. When the script 2606 cannot be obtained from the processing device 2601 because the network is disconnected (i.e., the data streams 2620 to 2622 and the data streams 2623 to 2625 are interrupted), or when the terminal device 2603-1 or 2603-2 obtains feedback from the user, the content transition device 2602-1 or 2602-2 analyzes the feedback content, edits suitable electronic files 2604 from its local database, and finds the corresponding transition file 2607 or atmosphere tag 2608. The content transition device 2602-1 or 2602-2 can compile a plurality of suitable electronic files 2604 into transition files 2607 and atmosphere tags 2608 applicable to different terminal devices. The content transition device 2602-1 or 2602-2 thus dominates the information data communication of its connected terminal devices; this flow is called the collaboration mode. The processing device 2601 may transfer digitized electronic files 2604 and corresponding content tags 2605 associated with particular information data to the local database of the content transition device 2602-1 or 2602-2 (i.e., data streams 2621, 2622, 2624, and 2625). Such data may not include a script, so it does not initiate information data communication in the cooperation mode; instead, the content transition device uses the content as reference material for information data content in the collaboration mode.
In the harmonic mode of the application layer, no device acts as a dominant control center. When the terminal device 2603-1 or 2603-2 and the associated content transition device 2602-1 or 2602-2 are disconnected, if the terminal device 2603-1 or 2603-2 obtains feedback from the user or information such as an environmental change, the terminal device digitizes the feedback content, translates it into a transition file 2607 and an atmosphere tag 2608, and transmits them to the content transition device 2602-1 or 2602-2 to request support in the collaboration mode. If no response can be obtained from the content transition device 2602-1 or 2602-2, the transition file 2607 and the atmosphere tag 2608 are broadcast to other terminal devices or other content transition devices, and the harmonic mode is requested. When other devices receive the harmonic mode request and the related content, they search their local databases for suitable content and transmit it back to the requesting device for information data communication.
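The fallback order described above — local database first, then the content transition device, then a broadcast to peers — can be sketched as a simple lookup chain. The peer names, tag strings, and dictionary-based "databases" are hypothetical.

```python
# Sketch of the harmonic-mode fallback: local store, then upstream,
# then broadcast to peer devices (all stores modeled as dicts).
def harmonic_lookup(tag, local_db, upstream_db, peer_dbs):
    """Return (source, transition_file) from the first store holding the tag,
    or (None, None) if no device can serve the request."""
    if tag in local_db:
        return "local", local_db[tag]
    if upstream_db is not None and tag in upstream_db:  # upstream reachable
        return "upstream", upstream_db[tag]
    for peer, db in peer_dbs.items():                   # broadcast request
        if tag in db:
            return peer, db[tag]
    return None, None

local = {"rain": "rain.tf"}
peers = {"terminal-B": {"storm": "storm.tf"}}
# upstream_db=None models the disconnected content transition device
print(harmonic_lookup("storm", local, None, peers))
```

In this sketch the broadcast is a sequential scan of peers; a real system would send the request over the network and accept whichever reply arrives, but the ordering of fallbacks is the same.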
If a content transition device receives a harmonic mode data request from another content transition device, the data exchanged between the content transition devices are mainly electronic files that have not yet been transitioned, and these are transitioned into transition files before being transferred to the terminal device. In information data communication initiated by terminal devices, each terminal device obtains suitable electronic files by itself, acting as a local processing device, and participates in the information data communication; this flow is called the harmonic mode. The processing device 2601 transmits specific, personalized, and distinctive digitized electronic files 2604 and the corresponding content tags 2605 to the content transition device 2602-1 or 2602-2 (i.e., the data streams 2622 and 2625), which transitions them into transition files 2607 and atmosphere tags 2608 for the terminal devices (i.e., the data streams 2628 and 2631) as a reference database for the harmonic mode. The electronic files used in the harmonic mode may be electronic files having the same content tag. The electronic files used in the collaboration mode may be a collection of electronic files of one or more associated content tags.
Fig. 27 is a schematic diagram of an Internet of Things according to some embodiments of the present disclosure. FIG. 27 discloses a processing device 2701 connected to two levels of content transition devices, 2702 and 2703-1 to 2703-n. The front-stage content transition device 2702 is connected to one or more rear-stage content transition devices 2703-1 to 2703-n. In the network architecture of FIG. 27, the two levels of content transition devices have the same functional architecture and processing flow, but the front-stage content transition device 2702 has a very high-capacity electronic file database and a highly capable script editing function, so it can provide, on the local network, the electronic files and scripts used by the other content transition devices 2703-1 to 2703-n in the collaboration mode and the harmonic mode of the application layer. The rear-stage content transition devices 2703-1 to 2703-n focus mainly on the transition file editing capability for each terminal device, so as to control the many varied and complex terminal devices 2704-1 to 2704-m and 2705-1 to 2705-p, provide a better environment for communicating information data content, and handle the collaboration mode of the application layer.
The foregoing outlines features of several embodiments so that those skilled in the art may better understand the aspects of the present disclosure. Those skilled in the art should appreciate that they may readily use the present disclosure as a basis for designing methods and structures for carrying out the same purposes and/or achieving the same advantages as the embodiments introduced herein. Such modifications, substitutions, and alterations do not depart from the spirit and scope of the present disclosure.
Symbol description
101. Information content
102. Magnetic tape
103. Hard disk
104. Solid state disk
105. Optical disk
106. Content cloud
107. Internet network
108. Mobile device
109. Computer
110. Projection apparatus
111. Video and audio equipment
112. Television wall
200. Information data communication system
201. Data recording
202. Electronic file
202-1 to 202-p electronic files
204-11 to 204-1n content tags
204-21 to 204-2m content tags
204-p1 to 204-pq content tags
203. Processing device
204. Content label
205. Content transition device
210. Environment control device
211. Odor control device
212. Environmental parameter sensing device
213. Light control device
214. Image capturing apparatus
215. Biological sensing device
216. Audio receiving device
217. Audio playing device
218. Projection device
219. Display apparatus
220 3D printing device
221 3D projection device
222. Biological material
301. Information data layer
302. Digital layer
303. Editing layer
304. Application layer
305. Conversation layer
306. Network layer
307. Architecture layer
308. Physical layer
401. Processing device
401-1 to 401-3 treatment device
402. Content transition device
402-1 to 402-3 content transition device
403-1 to 403-n terminal device
404-1 and 404-2 other devices
410. Communication unit
420. Interactive space
421. Environmental parameters
422. Biological material
600. Message packet
610. Access code
611. Preamble code
612. Synchronous code
613. Suffix
620. Header
621. Network domain identifier of destination
622. Content-transitive device identifier of destination
623. Destination terminal device identifier
624. Network domain identifier of source
625. Content-transitive device identifier of source
626. Terminal device identifier of origin
627. Link control message
630. Protection section
640. Payload
650. Suffix
704-1 to 704-n parameters
1601. Information data
1602. Processing device
1603. Content transition device
1604. Environment control device
1605. Light control device
1606. Image capturing apparatus
1607. Audio playing device
1608. Display apparatus
1609. Odor control device
1610 3D projection device
1611. Projection device
1612. User
1701. Information data
1702. Processing device
1703. Content transition device
1704. Environment control device
1705. Light control device
1706. Image capturing apparatus
1707. Audio playing device
1708. Display apparatus
1709. Odor control device
1710. Production facility sensing device
1711. Optical scanning device
1712. Projection device
1713. Audio playing device
1714. Odor control device
1715 3D printing device
1716. User
1717. User
1801. Display apparatus
1802. Display apparatus
1803. Light control device
1804. Image capturing apparatus
1805. Traffic signal control device
1806. Electronic billboard
1807. Electronic billboard
1808. Electronic billboard
1809. Audio playing device
1810. User
1811. Small-sized vehicle
1812. Large-sized vehicle
1910. Control device
1911. Haptic integration device
1912 3D projection device
1913 3D printing device
1914. Virtual reality glasses
1915. Light control device
1916. Display apparatus
1917. Audio playing device
1918. Odor control device
1919. Infrared input/output device
1920. Environment control device
1930. Sensing device
1931. Global positioning system chip
1932. Human-machine input interface
1933. Image capturing apparatus
1934. Audio input device
1935. Smell sensing device
1936. Biological feature sensing device
1937. Environmental parameter sensing device
1938. Touch sensing device
1939. Scanning device
1940. Input/output driving interface device
1951. Central processing unit
1952. Memory
1953. Power management control device
1954. Cellular network modem device
1955. Wireless network modem device
1956. Wired network modem device
1960. Cellular network base station
1970. Network system
2010. Editing layer
2011. Script
2012. Script compilation
2020. Digital layer
2021. Labeling of content tags
2022 2D electronic file
2023. Atmosphere electronic file
2024. Video-audio electronic file
2025 3D electronic archive
2026. Multi-format digitization
2027. Composition formula analysis
2030. Information data layer
2031 2D scanning device
2032. Digital camera
2033. Smell sensing device
2034. Analog-digital conversion device
2035 3D holographic camera device
2036 3D scanning device
2037. Text
2038. Image processing apparatus
2039. Smell of
2040. Audio and video
2041. Model
2101-1 to 2101-n electronic files
2102. Content label
2110. Content tag index set
2110-1 to 2110-m content tag index
2120. Information data summary and content configuration
2130. Electronic file playing program
2140. Feedback tag set
2140-1 to 2140-p feedback tags
2150. User feedback hierarchy set
2150-1 to 2150-q user feedback levels
2160. Corresponding operation set
2160-1 to 2160-w correspond to operations
2170. Script
2171. Script abstract
2172. Electronic file configuration
2301. Processing device
2302. Content transition device
2302-1 content transition device
2303-1 to 2303-n terminal devices
2303-u terminal device
2304. Data recording
2305. Digitization
2306. Adding content tags
2307. Writing script
2308. Electronic file
2308-1 to 2308-n electronic files
2309. Content label
2309S content tab set
2310. Script
2311. Electronic archive database
2312. Electronic file buffer
2313. Transition file editing
2314. Transition file buffer
2315. Transition file
2315-1 to 2315-m transition files
2315-1 to 2315-n transition files
2315-1 to 2315-p transition files
2315-x transition file
2316. Atmosphere tag
2316-1 to 2316-n atmosphere tags
2316-u atmosphere tag
2316-x atmosphere tag
2316S atmosphere tag set
2316S-1 atmosphere tag set
2317. Script timing
2318. Signal
2319. Digitization
2320. Atmosphere label editing and identification
2321. Controller
2322. Control unit
2323. Sensing assembly
2324. User' s
2325. Environment
2326. Translation device
2327. Electronic document collection
2328. Transition file database
2310-1 First order
2310-2 Second order
2310-m m-th order
2311-1 to 2311-p bits
2312-1 to 2312-q bits
2313-1 to 2313-r bits
2501. Electronic file transition processing flow
2502. Script
2503-1 to 2503-n electronic files
2504-1 to 2504-n content tags
2505-1 to 2505-n terminal device
2506. Virtual screen function
2510-1 to 2510-n transition file processing flow
2511. Layout selection
2512. Window content setting
2513. Cut-out effect of virtual screen
2514. Animation effects of objects
2515-1 to 2515-n transition files
2516-1 to 2516-n atmosphere tags
2517-1 and 2517-2 control unit
2518. Screen
2519. Object
2520. Sensing unit
2521. Lamp arrangement
2522. Light cut settings
2523. Lighting situation effect
2524. Projecting a pattern
2525-1 to 2525-n electronic archive flow settings
2526. Transition flow
2601. Processing device
2602-1 content transition device
2602-2 content transition device
2603-1 terminal device
2603-2 terminal device
2604. Electronic file
2605. Content label
2606. Script
2607. Transition file
2608. Atmosphere label
2620. Data flow
2621. Data flow
2622. Data flow
2623. Data flow
2624. Data flow
2625. Data flow
2626. Data flow
2627. Data flow
2628. Data flow
2629. Data flow
2630. Data flow
2631. Data flow
2632. Data flow
2633. Data flow
2701. Processing device
2702. Content transition device
2703-1 to 2703-n content transition device
2704-1 to 2704-m terminal apparatus
2705-1 to 2705-p terminal device
A content label
B content label
C content label
S1 to Sm State
A1 to Am-1 State
G1 to Gm-1 state
L1 to L3 lamp
O0 to O3 state
P0 to P2 states
R0 to R2 states
T0 to T2 state
W1 to W4 virtual screen

Claims (34)

1. An information data communication system, comprising:
a processing device;
a content transition device;
a terminal device;
wherein when the terminal device detects that a user enters the range of the communication system, the communication system is configured to:
(a) Acquiring a plurality of files from an external network according to the subscribed information data and the biological characteristics of the user;
(b) Adding a plurality of content tags to each of the plurality of files;
(c) Writing a script of the information data according to content tags related to the information data in a plurality of content tags of each of the plurality of files, wherein the script comprises a plurality of feedback tags, and each of the plurality of feedback tags comprises a plurality of corresponding operations and a plurality of user feedback levels;
(d) According to the script, instructing the terminal device to present at least one of the plurality of files to the user; and
(e) Instruct the terminal device to sense feedback of the user, and perform at least one of the corresponding operations and determine at least one of the user feedback levels in response to the feedback of the user.
2. The information data communication system of claim 1, further comprising:
changing the file to be presented in the next time section and the playing mode of the file to be presented according to at least one of the feedback and the corresponding operation of the user.
3. The information data communication system of claim 1, further comprising:
in response to the feedback of the user determining one of the plurality of user feedback levels, the user satisfaction is updated based on the determined user feedback level.
4. The information data communication system of claim 1, wherein the terminal device comprises at least one of: the device comprises a 3D projection device, a 3D printing device, a display device, a projection device, an audio playing device, an audio receiving device, a biological sensing device, a photographing device, a light control device, an environment parameter sensing device, an odor control device and an environment control device.
5. The information data communication system of claim 1, wherein the plurality of files comprises one or more of: video files, audio files, image files, document files, and engineering drawing files.
6. The information data communication system of claim 5, wherein the plurality of content tags includes one or more of: the temporal position and pixel position of the video file, the temporal position of the audio file, the pixel position of the image file, the number of pages and lines of the file, and the pixel position, number of pages and lines of the engineering drawing file.
7. The information data communication system of claim 6, wherein each of the plurality of content tags further comprises a plurality of parameters for providing content summaries of the plurality of files for composing the transcript.
8. The information data communication system of claim 1, wherein the script of the information data comprises files to be played in each time segment and a playback manner of the files to be played.
9. The information data communication system of claim 6, wherein the script of the information data further comprises specifying in each time segment:
a play time position and a play pixel position of the video file to be played;
a play time position of the audio file to be played;
the position of the playing pixel of the image file to be played;
The number of pages and number of lines of the file to be played; or
The position of the playing pixel, the number of playing pages and the number of playing lines of the project file to be played.
10. The information data communication system of claim 8, wherein performing at least one of the corresponding operations based on the feedback from the user includes changing a file to be played or a playback manner of the file to be played in a next time period.
11. The information data communication system of claim 9, wherein performing at least one of the corresponding operations based on the feedback from the user comprises changing a playback time position and a playback pixel position of a video file to be played, a playback time position of an audio file to be played, a playback pixel position of an image file to be played, a playback number of pages and playback number of columns of a file to be played, or a playback pixel position, a playback number of pages and playback number of columns of an engineering drawing file to be played in a next time section.
12. The information data communication system of claim 1, wherein the processing device performs operations (a) through (e) to operate in a cooperative mode.
13. The information data communication system of claim 1, wherein if the processing device in the system fails to perform operations (a) through (e), the content transformation device performs operations (a) through (e) to operate in a collaborative mode.
14. The information data communication system of claim 1, wherein if the processing device and the content transformation device in the system are unable to perform operations (a) through (e), the terminal device performs operations (a) through (e) to operate in a harmonic mode.
15. An information data communication system as claimed in claim 12, wherein the processing means transmits the scenario to the terminal device via the content transformation means.
16. The information data communication system of claim 15, wherein the content transformation device is operable to send the transcript to another processing device or another content transformation device.
17. The information data communication system of claim 16, wherein the terminal device is capable of transmitting the script, at least one of the plurality of files, and the feedback of the user to another terminal device.
18. The information data communication system of claim 12, wherein the feedback of the user sensed by the terminal device is transmitted to the processing device via the content transformation device.
19. The information data communication system of claim 1, wherein the user's biometric characteristics include at least one of: the gender of the user, the skin tone of the user, the age of the user, and the height of the user.
20. The information data communication system of claim 1, wherein the information data is subscribed to based on a biometric characteristic of the user.
21. The information data communication system of claim 3, wherein at least one of the corresponding operations is performed in response to the feedback from the user such that user satisfaction is maintained within a predetermined range.
22. The information data communication system of claim 1, wherein a user feedback level determined in response to the feedback of the user is further transmitted to one of the processing device, the content transformation device, and the terminal device.
23. The information data communication system of claim 3, wherein at least one of said corresponding operations is performed in response to said feedback from said user to maximize user satisfaction.
24. A method of information data communication, comprising:
(a) Acquiring a plurality of files from an external network according to the subscribed information data and the biological characteristics of the user;
(b) Adding a plurality of content tags to each of the plurality of files, the plurality of content tags being associated with the information data;
(c) Writing a script of the information data according to a plurality of content tags of each of the plurality of files, the script comprising a plurality of feedback tags, each of the plurality of feedback tags comprising a plurality of corresponding operations and a plurality of user feedback levels;
(d) According to the script, a terminal device is instructed to present at least one of the files to the user; and
(e) Instruct the terminal device to sense feedback of the user and perform at least one of the corresponding operations according to the feedback of the user and determine at least one of the user feedback levels.
25. The method for communicating information data of claim 24, wherein each of the plurality of content tags further comprises a plurality of parameters for providing a summary of the content of the plurality of files.
26. The information data communication method of claim 24, wherein the scenario of the information data includes files to be played in each time section and a playback manner of the files to be played in each time section.
27. The method of communicating information data of claim 24, further comprising determining one of the plurality of user feedback levels in response to the feedback of the user, updating the user satisfaction based on the determined user feedback level.
28. The method of claim 26, wherein performing at least one of the corresponding operations based on the feedback from the user includes changing a file to be played in a next time period and a playback manner of the file to be played.
29. The information data communication method of claim 26, further comprising:
changing the file to be presented in the next time section and the playing mode of the file to be presented according to at least one of the feedback and the corresponding operation of the user.
30. An information data communication system, comprising:
a processing device;
a content transition device; and
Terminal device, wherein
The processing device is configured to perform the following operations:
acquiring a plurality of files from an external network according to the reserved information data;
adding a plurality of content tags to each of the plurality of files, the plurality of content tags being associated with the information data;
writing a script of the information data according to the plurality of content tags of each of the plurality of files; and
Transmitting the plurality of files, the plurality of content tags, and the script to the content transition device, and
The content transformation device is configured to perform the following operations:
converting a file associated with the terminal device from the plurality of files into a first transition file corresponding to the terminal device;
labeling the first transition file with a corresponding first atmosphere label; and
And transmitting the first transition file and the first atmosphere label to the terminal device so as to instruct the terminal device to present the transition file to a user.
31. The information data communication system of claim 30, further comprising:
the terminal device receiving a signal from the user or environment;
the terminal device searches a second atmosphere label and a second transition file which are consistent with the signals in a local database, transmits the second atmosphere label and the second transition file to the content transition device, and transmits update requests of the transition file and the atmosphere label.
32. The information data communication system of claim 31, further comprising:
the content transition device receives the second atmosphere tag, the second transition file and the update request from the terminal device;
the content transition device searches a local database for a third atmosphere label and a third transition file which are consistent with the second atmosphere label and the second transition file, and transmits the third atmosphere label and the third transition file to the terminal device so as to respond to the updating request of the terminal device.
33. The information data communication system of claim 31, further comprising:
the content transition device receives the second atmosphere tag, the second transition file and the update request from the terminal device;
if the local database of the content transition device does not have the third atmosphere label and the third transition file which are consistent with the second atmosphere label and the second transition file, transmitting the second atmosphere label, the second transition file and the update request to another content transition device so as to respond to the update request of the terminal device.
34. The information data communication system of claim 30, further comprising:
the terminal device receiving a second atmosphere label from another terminal device; and
the terminal device searching the local database for a second transition file corresponding to the second atmosphere label and playing the second transition file.
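The terminal-device side of claims 31 and 34 can likewise be sketched: a locally classified signal produces a label/file pair that is sent upstream with an update request (claim 31), while a label pushed by another terminal is resolved and played from the local cache (claim 34). The names (`TerminalDevice`, `on_signal`, `on_peer_label`) and the assumption that the signal arrives already classified into an atmosphere label are illustrative, not from the patent.

```python
# Hypothetical sketch of claims 31 and 34: the terminal device's
# local-database lookup for outgoing update requests and peer playback.

class TerminalDevice:
    def __init__(self, local_db):
        # local_db maps an atmosphere label to a locally cached transition file
        self.local_db = local_db
        self.played = []

    def on_signal(self, label):
        # Claim 31: match the (classified) signal against the local database
        # and build the message sent to the content transition device,
        # including the update request.
        file = self.local_db.get(label)
        if file is None:
            return None
        return {"label": label, "file": file, "request": "update"}

    def on_peer_label(self, label):
        # Claim 34: resolve a label received from another terminal device
        # and play the corresponding locally stored transition file.
        file = self.local_db.get(label)
        if file is not None:
            self.played.append(file)
        return file
```

Note that in this reading the peer-to-peer path of claim 34 never contacts the content transition device: only the label travels between terminals, and each terminal plays its own cached copy of the file.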
CN201811125155.2A 2018-09-26 2018-09-26 Information data communication system and method thereof Active CN110955326B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310472854.9A CN116560502A (en) 2018-09-26 2018-09-26 Cultural data communication system and method thereof
CN201811125155.2A CN110955326B (en) 2018-09-26 2018-09-26 Information data communication system and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811125155.2A CN110955326B (en) 2018-09-26 2018-09-26 Information data communication system and method thereof

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310472854.9A Division CN116560502A (en) 2018-09-26 2018-09-26 Cultural data communication system and method thereof

Publications (2)

Publication Number Publication Date
CN110955326A CN110955326A (en) 2020-04-03
CN110955326B true CN110955326B (en) 2023-08-04

Family

ID=69964657

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201811125155.2A Active CN110955326B (en) 2018-09-26 2018-09-26 Information data communication system and method thereof
CN202310472854.9A Pending CN116560502A (en) 2018-09-26 2018-09-26 Cultural data communication system and method thereof

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202310472854.9A Pending CN116560502A (en) 2018-09-26 2018-09-26 Cultural data communication system and method thereof

Country Status (1)

Country Link
CN (2) CN110955326B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115062148B (en) * 2022-06-23 2023-06-20 广东国义信息科技有限公司 Risk control method based on database

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160092402A1 (en) * 2014-09-25 2016-03-31 Monotype Imaging Inc. Selectable Styles for Text Messaging System Publishers
CN106612229B (en) * 2015-10-23 2019-06-25 腾讯科技(深圳)有限公司 The method and apparatus that user-generated content is fed back and shows feedback information
EP3398387B1 (en) * 2015-12-30 2023-06-21 InterDigital Patent Holdings, Inc. Devices for wireless transmit/receive unit cooperation
CN107436816B (en) * 2016-05-27 2020-07-14 腾讯科技(深圳)有限公司 Message delivery control method, system and terminal
US10073664B2 (en) * 2016-06-20 2018-09-11 Xerox Corporation System and method for conveying print device status information using a light indicator feedback mechanism
CN107609913B (en) * 2017-09-19 2020-06-19 上海恺英网络科技有限公司 Data analysis tracking method and system

Also Published As

Publication number Publication date
CN110955326A (en) 2020-04-03
CN116560502A (en) 2023-08-08

Similar Documents

Publication Publication Date Title
US11335210B2 (en) Apparatus and method for analyzing images
US20140289323A1 (en) Knowledge-information-processing server system having image recognition system
CN111339246B (en) Query statement template generation method, device, equipment and medium
US11955125B2 (en) Smart speaker and operation method thereof
US20170185276A1 (en) Method for electronic device to control object and electronic device
US20220207872A1 (en) Apparatus and method for processing prompt information
KR20100002756A (en) Matrix blogging system and service support method thereof
US20190095959A1 (en) Internet of advertisement method and system
CN110147467A (en) A kind of generation method, device, mobile terminal and the storage medium of text description
CN108476258B (en) Method for controlling object by electronic equipment and electronic equipment
US20220246135A1 (en) Information processing system, information processing method, and recording medium
WO2007069512A1 (en) Information processing device, and program
CN110955326B (en) Information data communication system and method thereof
JP2008198135A (en) Information delivery system, information delivery device and information delivery method
US11838587B1 (en) System and method of providing customized media content
CN116051192A (en) Method and device for processing data
CN111552794B (en) Prompt generation method, device, equipment and storage medium
JP2010003264A (en) Information presentation control device and information presentation control method
US20230033675A1 (en) Systems and methods for localized information provision using wireless communication
US20210004747A1 (en) Information processing device, information processing method, and program
JP4649944B2 (en) Moving image processing apparatus, moving image processing method, and program
TWM482779U (en) Object information retrieval system
JP6944920B2 (en) Smart interactive processing methods, equipment, equipment and computer storage media
CN103915094A (en) Shared voice control method and device based on target name recognition
CN106802943B (en) Music recommendation method and device based on movie and television information

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230705

Address after: 12th Floor-1, No. 325, Section 4, Chinese Road, Da'an District, Taipei City, Taiwan

Applicant after: Yichi jingcaizitong Co.,Ltd.

Applicant after: Zhang Yijia

Address before: 12th Floor-1, No. 325, Section 4, Chinese Road, Da'an District, Taipei City, Taiwan

Applicant before: Yichi jingcaizitong Co.,Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant