CN110955326A - Information data transmission communication system and method thereof - Google Patents

Information data transmission communication system and method thereof

Info

Publication number
CN110955326A
CN110955326A
Authority
CN
China
Prior art keywords
content
user
file
feedback
information data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811125155.2A
Other languages
Chinese (zh)
Other versions
CN110955326B (en)
Inventor
陈铭毅
张益嘉
汤一雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yichi Jingcaizitong Co ltd
Zhang Yijia
Original Assignee
Yichi Jingcaizitong Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yichi Jingcaizitong Co Ltd filed Critical Yichi Jingcaizitong Co Ltd
Priority to CN202310472854.9A (CN116560502A)
Priority to CN201811125155.2A (CN110955326B)
Publication of CN110955326A
Application granted
Publication of CN110955326B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

One embodiment of the present disclosure provides an information data communication system, comprising: a processing device; a content transition device; and a terminal device. When the terminal device detects that a user enters the range of the communication system, the communication system is configured to perform the following operations: obtaining a plurality of files from an external network according to predetermined information data and the user's biometric features; adding a plurality of content tags to each of the plurality of files; compiling a script of the information data according to those content tags of each file that are related to the information data, wherein the script comprises a plurality of feedback tags, and each of the feedback tags comprises a plurality of corresponding operations and a plurality of user feedback levels; instructing the terminal device to present at least one of the plurality of files to the user according to the script; and instructing the terminal device to sense feedback of the user and, in response to that feedback, to perform at least one of the corresponding operations and/or determine the user feedback level.

Description

Information data transmission communication system and method thereof
Technical Field
The present disclosure relates generally to a communication system and method, and more particularly to an information data communication system and method that digitizes specific information content, transmits it in cooperation with various devices, and guides the user to receive the information data content based on user feedback.
Background
Electronic display devices (including televisions, computer monitors, electronic billboards, mobile device screens, etc.) and audio playback devices are widely used in everyday life to communicate information. Electronic display devices and audio playback devices used as billboards are also used in workplaces, homes and residences, commercial establishments, and outdoor locations, including large signs, billboards, stadiums, and public areas.
Electronic display devices and audio playing devices used as billboards today usually show fixed content or cyclically play previously stored content in a set sequence. These devices cannot change the displayed content according to user feedback, nor can they be combined with other devices (e.g., 3D projection devices, 3D printing devices, projection devices, audio playing devices, audio receiving devices, bio-sensing devices, camera devices, light control devices, environmental parameter sensing devices, odor control devices, and environment control devices) to help the user receive information more easily.
Therefore, there is a need for a communication system and method that can cooperate with multiple devices and, based on user feedback, make it easier for users to receive specific information data content.
Disclosure of Invention
One embodiment of the present disclosure provides an information data communication system, comprising: a processing device; a content transition device; and a terminal device. When the terminal device detects that a user enters the range of the communication system, the communication system is configured to perform the following operations: obtaining a plurality of files from an external network according to predetermined information data and the user's biometric features; adding a plurality of content tags to each of the plurality of files; compiling a script of the information data according to those content tags of each file that are related to the information data, wherein the script comprises a plurality of feedback tags, and each of the feedback tags comprises a plurality of corresponding operations and a plurality of user feedback levels; instructing the terminal device to present at least one of the plurality of files to the user according to the script; and instructing the terminal device to sense feedback of the user and, in response to that feedback, to perform at least one of the corresponding operations and/or determine the user feedback level.
Another embodiment of the present disclosure provides an information data communication method, comprising: obtaining a plurality of files from an external network according to predetermined information data and the user's biometric features; adding a plurality of content tags, associated with the information data, to each of the plurality of files; compiling a script of the information data according to the content tags of each of the plurality of files, the script including a plurality of feedback tags, each feedback tag including a plurality of corresponding operations and a plurality of user feedback levels; instructing a terminal device to present at least one of the plurality of files to the user according to the script; and instructing the terminal device to sense the user's feedback and, according to that feedback, to perform at least one of the corresponding operations and/or determine the user feedback level.
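As a rough sketch, the claimed method steps (obtain files, tag them, compile a script of feedback tags, then react to sensed feedback) might be modeled as below. All class names, the feedback encoding (a score in [0, 1]), and the tag-matching rule are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class FeedbackTag:
    operations: list       # corresponding operations the system may perform
    feedback_levels: list  # user feedback levels this tag can assign

@dataclass
class Script:
    info_data: str
    feedback_tags: list = field(default_factory=list)

def compile_script(info_data, files):
    """Compile a script from the content tags of each file that are
    related to the predetermined information data (here: substring match)."""
    script = Script(info_data=info_data)
    for f in files:
        for tag in f["content_tags"]:
            if info_data in tag:  # hypothetical relatedness test
                script.feedback_tags.append(
                    FeedbackTag(operations=["play_next", "repeat"],
                                feedback_levels=["low", "medium", "high"]))
    return script

def handle_feedback(script, sensed_feedback):
    """Pick a corresponding operation and a user feedback level from the
    first feedback tag, given sensed feedback as a score in [0, 1]."""
    tag = script.feedback_tags[0]
    level = tag.feedback_levels[min(int(sensed_feedback * 3), 2)]
    op = tag.operations[0] if sensed_feedback >= 0.5 else tag.operations[-1]
    return op, level
```

Usage: compiling a script from one tagged file and feeding back a high score selects the "high" feedback level and the first operation.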
Yet another embodiment of the present disclosure provides an information data communication system, comprising: a processing device; a content transition device; and a terminal device. The processing device is configured to perform the following operations: obtaining a plurality of files from an external network according to predetermined information data; adding a plurality of content tags, associated with the information data, to each of the plurality of files; compiling a script of the information data based on the content tags of each of the plurality of files; and transmitting the plurality of files, the content tags, and the script to the content transition device. The content transition device is configured to perform the following operations: converting a file associated with the terminal device, from among the plurality of files, into a first transition file corresponding to the terminal device; labeling the first transition file with a corresponding first atmosphere label; and transmitting the first transition file and the first atmosphere label to the terminal device to instruct the terminal device to present the first transition file to a user.
Drawings
The following detailed description of the present disclosure can be better understood when read in conjunction with the following drawings. It should be noted that, in accordance with standard practice in the industry, the features in the drawings may not be drawn to scale. In fact, the dimensions of the features in the figures may be arbitrarily expanded or reduced to facilitate understanding of the present disclosure.
Fig. 1 is a schematic diagram of a communication system in accordance with some embodiments of the present disclosure.
Fig. 2 is a schematic diagram of an information data communication system in accordance with some embodiments of the present disclosure.
Fig. 3 is a schematic diagram of an information data communication system architecture model in accordance with some embodiments of the present disclosure.
Fig. 4 is a schematic diagram of an information data communication system in accordance with some embodiments of the present disclosure.
Fig. 5 is a schematic diagram of a network topology of an information data communication system in accordance with some embodiments of the present disclosure.
Fig. 6 is a schematic diagram of a message packet in accordance with some embodiments of the present disclosure.
Fig. 7 is a schematic diagram of a data record and a content tag in accordance with some embodiments of the present disclosure.
Fig. 8 is a schematic diagram of an electronic file in accordance with some embodiments of the present disclosure.
Fig. 9 is a schematic diagram of a network topology in accordance with some embodiments of the present disclosure.
Fig. 10 is a schematic diagram of a network topology in accordance with some embodiments of the present disclosure.
Fig. 11 is a schematic diagram of a network topology in accordance with some embodiments of the present disclosure.
Fig. 12 is a schematic diagram of a feedback tag in accordance with some embodiments of the present disclosure.
Fig. 13 is an algorithm state diagram in accordance with certain embodiments of the present disclosure.
Fig. 14 is an algorithm state diagram in accordance with certain embodiments of the present disclosure.
Fig. 15 is an algorithm state diagram in accordance with certain embodiments of the present disclosure.
Fig. 16 is a schematic diagram of an information data communication system application in accordance with some embodiments of the present disclosure.
Fig. 17 is a schematic diagram of an information data communication system application in accordance with some embodiments of the present disclosure.
Fig. 18 is a schematic diagram of an information data communication system application in accordance with some embodiments of the present disclosure.
Fig. 19 is a schematic diagram of a processing device, a content transition device, or a terminal device in an information data communication system in accordance with some embodiments of the present disclosure.
Fig. 20 is a flow chart of a data record digitization process in accordance with some embodiments of the present disclosure.
Fig. 21 is a flow chart in accordance with certain embodiments of the present disclosure.
Fig. 22 is a state diagram of an information data communication system in accordance with certain embodiments of the present disclosure.
Fig. 23 is a flow chart of data processing in accordance with some embodiments of the present disclosure.
Fig. 24 is an atmosphere label encoding approach in accordance with certain embodiments of the present disclosure.
Fig. 25 is a flow chart of a transition process for an electronic file in accordance with some embodiments of the present disclosure.
Fig. 26 is a data flow diagram for different modes of an application layer in accordance with some embodiments of the present disclosure.
Fig. 27 is a schematic diagram of an Internet of things in accordance with some embodiments of the present disclosure.
Detailed Description
To facilitate reading, the same reference numbers are used throughout the drawings to refer to the same elements of the disclosure.
The present disclosure provides an information data communication system and method, which can cooperate with a plurality of devices and make it easier for users to receive information according to user feedback.
The present disclosure provides an information data communication system that includes a cloud device with high computing power in the Internet of things. The cloud device can digitize various information contents into data files of different formats, or collect data files of different formats related to various information contents from an external or internal network. Feature values, keywords, and similar content in these data files can be labeled as content tags. The processed data files can then be transmitted through the Internet of things to a gateway with routing capability and forwarded to a plurality of terminal devices with different functions. An information data communication system includes a processing device, a content transition device, and one or more terminal devices. The system can capture images of a living being (e.g., a user) entering the communication unit and identify its biometric features, which may include skin color, pupil color, height, weight, gender, age, and the like. The system plays content according to predetermined information data and the user's biometric features. It may also sense feedback from the living being (e.g., the user) to change what is played in the next time segment. If no corresponding playable content exists in the system, its content transition device can obtain the corresponding content from other information data communication systems via the network for playing.
Fig. 1 is a schematic diagram of a communication system in accordance with some embodiments of the present disclosure. The information content 101 comprises the derivatives created by humans in the course of their development, such as literature, art, science, and religion, recorded by one or a combination of symbols, characters, musical notation, drawings, and the like; it is the information generated in human life and evolution. Fig. 1 illustrates various electronic storage devices and media that are widely used today. These devices convert various information contents 101 into electronic signals, store them in recording media such as magnetic tape 102, hard disk 103, solid state disk 104, or optical disc 105, and reproduce the contents through various playback devices. In addition, the content may be stored in a content cloud (or server) 106 in the Internet of things and transmitted through the Internet 107 (or mobile communication and radio broadcasting stations) to a specific playing device, for example a mobile device 108 (e.g., a mobile phone or tablet computer), a computer 109, a projection device 110, an audio/video device 111, a video wall 112, and the like.
Fig. 2 is a schematic diagram of an information data communication system in accordance with some embodiments of the present disclosure. Fig. 2 discloses an information data communication system 200 in which the information content 101 may be represented by a plurality of data records in different formats from a plurality of sources, such as physical properties, size, drawings, composition, manufacturing process, ownership history, and multiple national languages. In the information data communication system shown in fig. 2, a processing device 203 (e.g., a high-computing-power cloud or server) in a network (e.g., the Internet of things) digitizes data records of different formats into electronic files 202 of different formats. During digitization, the various data records 201 are converted into a plurality of electronic files 202 in different formats, such as images, engineering drawings, and voice recordings. The feature values and keywords (e.g., climate, taste, location, elements, names of people) in each electronic file 202 are labeled as content tags 204.
These electronic files 202, whose information data content carries the content tags 204, are transmitted from the network to a specific content transition device 205, which is further connected to the terminal devices 210 to 221 by wired or wireless means. The terminal devices 210 to 221 include an environment control device 210 (e.g., a temperature and humidity control device), an odor control device 211 (e.g., a fragrance or fragrance-diffusion device), an environmental parameter sensing device 212 (e.g., for sensing temperature and humidity), a light control device 213, an image capturing device 214, a bio-sensing device 215, an audio receiving device 216, an audio playing device 217, a projection device 218, a display device 219, a 3D printing device 220, a 3D projection device 221, a human-machine input interface (not shown), a position detection device (not shown), and the like. The terminal devices 210-221 may be used to present the electronic files 202 in a particular configuration. For example, the terminal devices 210-221 may present a particular electronic file 202 on a schedule. As another example, when a living being 222 (e.g., a user) enters the communication system 200, the image capturing device 214 can obtain images (e.g., eye and face images) to identify the living being 222, the bio-sensing device 215 can obtain biometric features (e.g., fingerprint, DNA, iris, voiceprint), and a human interface device (HID) can obtain a password and user input. Other terminal devices may obtain environmental parameters such as GPS coordinates, temperature, humidity, and air quality to further adjust the environment control device 210. Further, the 3D projection device 221 may visualize a virtual entity, and the 3D printing device 220 may materialize it for the creature 222 to view or touch.
The projection device 218 and the display device 219 can display images on multiple screens together with descriptions of the image data. Other controlled devices, such as the audio playing device 217, the light control device 213, the environment control device 210, and the odor control device 211, can further shape the feeling of being personally on the scene. The creature 222 may also provide feedback through the terminal devices; for example, gestures, body posture, and voice feedback may be obtained from the image capturing device 214 and the audio receiving device 216. Through the content tags 204 in the electronic files 202, the terminal device can retrieve related electronic files 202 for playing. If the terminal device cannot find the required electronic file 202 in the internal network, the content transition device 205 can obtain it from the processing device 203 via the network for playing.
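The fallback described here (search the internal network by content tag, then fetch from the processing device via the content transition device) can be sketched as follows. The class, method names, and the dict-backed stores are hypothetical stand-ins, not an API from the patent.

```python
class ContentTransitionDevice:
    """Minimal stand-in: relays tag lookups to an upstream processing device."""
    def __init__(self, processing_device_store):
        # processing_device_store: mapping of content tag -> electronic file
        self.upstream = processing_device_store

    def fetch_from_processing_device(self, tag):
        # Returns None when even the cloud-side store lacks the file.
        return self.upstream.get(tag)

def retrieve_file(tag, local_store, transition_device):
    """Look up an electronic file by content tag in the internal network
    first, falling back to the processing device through the gateway."""
    local = local_store.get(tag)
    if local is not None:
        return local
    return transition_device.fetch_from_processing_device(tag)
```

A local hit never touches the upstream store; only a local miss triggers the network fetch.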
Fig. 3 discloses an architectural model of the information data communication system of the present disclosure. Table 1 further describes this architectural model.
As shown in fig. 3, the architecture model of the information data communication system of the present disclosure comprises an 8-layer architecture: an information data layer 301, a digital layer 302, an editing layer 303, an application layer 304, a session layer 305, a network layer 306, an architecture layer 307, and a physical layer 308, whose interconnections are shown in fig. 3. The physical layer 308, the architecture layer 307, and the network layer 306 respectively correspond to the hardware device architecture and the network topology of the processing device 203, the content transition device 205, and the terminal devices 210 to 221 in the information data communication system 200.
TABLE 1 (reproduced as an image in the original publication)
The information data layer 301, the digital layer 302, and the editing layer 303 illustrate how the various data records 201 are digitized into electronic files 202 in a plurality of data formats. Content tags 204 are added to the electronic files 202 to facilitate subsequent retrieval, analysis, and application. The collected data records and/or electronic files are organized into a script by means of the content tags 204. The script provides a summary and outline of the electronic files 202 associated with a piece of information data, so that those files can be used more accurately and efficiently when transmitted in the network.
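One simple way to realize the tagging described above is keyword spotting over the text of a digitized file. The keyword table and category names below are invented for illustration; the patent does not specify a tagging algorithm.

```python
def tag_electronic_file(text, keyword_table):
    """Label feature values and keywords found in a digitized file's text
    as content tags. keyword_table maps a keyword to its tag category
    (e.g. climate, location, name of a person)."""
    lowered = text.lower()
    return [{"keyword": kw, "category": cat}
            for kw, cat in keyword_table.items()
            if kw.lower() in lowered]
```

Each resulting tag pairs the matched keyword with its category, ready to be attached to the electronic file and later collected into a script.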
The session layer 305 and the application layer 304 illustrate the process of communicating information data and the scene mode of the application in an information data communication system.
As shown in fig. 3, the 8-layer model includes a physical layer 308 corresponding to the various hardware devices in the network that implement the information data communication system. These hardware devices can be divided into processing devices, content transition devices, and terminal devices, which have computing, memory, sensing, control, and networking capabilities to different degrees.
Table 2 provides a comparison of the functionality of the terminal device, the processing device, and the content transition device.
Capability                        Processing device   Content transition device   Terminal device
Computing power                   Good                Medium                      Weak
Memory capacity                   Good                Medium                      Weak
Networking capability             Medium              Good                        Weak
Sensing and control capability    Weak                Weak                        Good
TABLE 2
As shown in Table 2, the terminal device, the processing device, and the content transition device have different capabilities. For example, the processing device has a high-performance computing processor and high-capacity memory, and is responsible for digitizing the various data records 201 into electronic files 202 of various formats, labeling the keywords in the digitized electronic files as content tags 204, and organizing a script according to a piece of information data and the content tags 204 in the electronic files 202, so as to manage, store, delete, and propagate the electronic files 202 of that information data.
The content transition device can convert the script from the processing device into electronic file formats playable by different terminal devices. An electronic file converted for playback by a specific terminal device is called a transition file, and the content tag of the electronic file, converted into a format the terminal device can recognize, is called an atmosphere label. The content transition device can generate a flow for controlling different terminal devices according to the playback flow in the script; such a flow is one method of transmitting information data. The content transition device can also receive user feedback to adjust the information data content being transmitted and guide the user toward receiving specific information data content. The content transition device has heterogeneous networking capabilities: long-distance networking (such as GSM, Wi-Fi, Cat-M1, NB-IoT) for connecting to the processing device; short-distance communication (wireless Wi-Fi/Bluetooth/ZigBee/RFID/RF/IR/ultrasonic/optical carrier, etc., or wired LAN/RS485/RS232/I2C/SPI/GPIO, etc.) for connecting to a plurality of terminal devices; and a computing processor for managing the applications in an information data communication system and the information data communication method. The terminal device has computing, memory, and networking capabilities, as well as control capabilities for presenting electronic files and sensing capabilities for sensing environmental parameters and biometric features.
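The conversion step above (electronic file plus content tags into a device-specific transition file plus atmosphere label) might look like the sketch below. The dict shapes, format names, and category filtering are assumptions for illustration only.

```python
def make_transition_file(electronic_file, device_profile):
    """Convert an electronic file into a transition file the terminal
    device can play, and filter its content tags down to an atmosphere
    label in a format the device recognizes."""
    transition_file = {
        "payload": electronic_file["payload"],
        # Target format is taken from the device's profile (assumed field).
        "format": device_profile["playable_format"],
    }
    atmosphere_label = [tag for tag in electronic_file["content_tags"]
                        if tag["category"] in device_profile["recognized_categories"]]
    return transition_file, atmosphere_label
```

An odor control device, for instance, would carry a profile recognizing only scent-related categories, so visual tags are dropped from its atmosphere label.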
The terminal device's capability of sensing environmental parameters and biometric features can be realized by input devices (such as a touch pad, infrared detector, audio receiving device, image capturing device, or human-machine interface), environmental parameter detection devices (such as those detecting brightness, ultraviolet light, humidity, and temperature), or bio-sensing devices (such as those recognizing physiological features of a fingerprint, voiceprint, retina, DNA, face, etc.).
The control capability shown in Table 2 covers control of odor control devices, electronic paper, display devices, audio players, light control devices, 3D printers, 3D laser developing devices, air temperature and humidity control devices, and the like, to present electronic files. Several terminal devices may be deployed in one predetermined area, such as a building, machine, vehicle, or cell, to form an interaction area. By detecting environmental parameters (such as light, temperature, humidity, and fragrance) with one terminal device, other terminal devices are further controlled to present the visual, auditory, tactile, and olfactory content of the electronic file to the living beings (such as humans, animals, and plants) in the predetermined area.
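A minimal control loop for such an interaction area compares sensed environmental parameters against targets (for example, targets derived from the script) and issues commands to the matching control devices. Parameter names, the command format, and the tolerance are illustrative assumptions.

```python
def control_interaction_area(sensed, targets, tolerance=0.5):
    """Return a command per controlled device whenever a sensed
    environmental parameter drifts from its target by more than
    the tolerance; parameters not sensed are skipped."""
    commands = {}
    for param, target in targets.items():
        current = sensed.get(param)
        if current is not None and abs(current - target) > tolerance:
            commands[param] = {"set": target}
    return commands
```

For example, a too-warm reading against a script target of 24 °C yields a single command to the temperature controller while the in-range humidity device is left alone.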
As shown in fig. 3, the 8-layer model includes an architecture layer 307, which describes how the processing device, the content transition device, and the terminal devices may be configured into an information data communication system.
Fig. 4 is a schematic diagram of an information data communication system in accordance with some embodiments of the present disclosure. As shown in fig. 4, the data records 201 are digitized and processed by the processing device 401 into electronic files 202 containing content tags 204 and a script. The electronic files 202, the content tags 204, and the script are transmitted from the processing device 401 to the content transition device 402 via a network (e.g., the Internet of things). The content transition device 402 is connected to the terminal devices 403-1 to 403-n, which comprise sensing means or control means. The electronic files 202 come in different formats, each playable by a corresponding terminal device. Fig. 4 illustrates that the basic components of an information data communication system may include a processing device 401, a content transition device 402, and a plurality of terminal devices 403-1 through 403-n.
A content transition device 402 and a plurality of terminal devices 403-1 to 403-n may be deployed as a communication unit 410 in a building, a moving vehicle, or any predetermined area. The communication unit 410, together with environmental parameters 421 and living beings (e.g., humans or animals) 422, forms an interactive space 420. The processing device 401 may obtain various environmental parameters and biometric data via the sensing devices among the terminal devices 403-1 through 403-n. The information data communication system presents different electronic files according to changes in the environment and to different biological subjects. For example, it may determine which electronic file to present, and how to present it, according to the height, sex, or age of the creature 422 together with the current environmental parameters.
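File selection driven by biometric features and environmental parameters, as described for communication unit 410, could be realized by scoring candidates as below. The metadata fields and scoring weights are invented for illustration; the patent does not define a selection formula.

```python
def select_electronic_file(files, biometrics, environment):
    """Pick the electronic file whose metadata best matches the detected
    creature's biometric features and the current environment."""
    def score(f):
        s = 0
        lo, hi = f.get("age_range", (0, 200))
        if lo <= biometrics["age"] <= hi:
            s += 2  # age match weighted highest (an assumed weight)
        if f.get("gender") in (None, biometrics.get("gender")):
            s += 1  # gender-neutral files always pass
        if environment.get("brightness", 1.0) >= f.get("min_brightness", 0.0):
            s += 1  # file is viewable under current lighting
        return s
    return max(files, key=score)
```

With a detected child, an age-gated cartoon file outscores an adult news file even when both are viewable in the current environment.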
As shown in fig. 3, the 8-layer model includes a network layer 306, which describes the network topology of the devices in an information data communication system. This topology is described with reference to fig. 5.
Fig. 5 discloses an exemplary network topology for an information data communication system according to the present disclosure, comprising processing devices 401-1 to 401-3, content transition devices 402-1 to 402-3, terminal devices 403-1 to 403-7, and devices 404-1 and 404-2 outside the system. As shown in fig. 5, a processing device 401, a content transition device 402, and a terminal device 403 may be connected to one another, wirelessly or by wire, as long as they are within connection range and capability.
As described above with respect to fig. 4, the basic components of an information data communication system include a processing device, a content transition device, and a plurality of terminal devices. In addition, each of the processing devices 401-1 through 401-3 may be connected to multiple content transition devices 402, and may thus support a plurality of different information data communication systems. For example, the processing device 401-1 in fig. 5 may be connected to the content transition devices 402-1 and 402-2 and form an information data communication system with each of them.
Each of the content transition devices 402-1 to 402-3 may be connected to a plurality of processing devices 401, and one content transition device 402 may receive data from several of them; for example, the content transition device 402-2 in fig. 5 is connected to the processing devices 401-1 and 401-2. Different content transition devices may also be connected to each other to extend the communication distance, such as the content transition device 402-1 connected to the content transition devices 402-2 and 402-3 in fig. 5.
Each of the end devices 403-1 through 403-7 may be connected to one or more content transition devices so as to join multiple information data communication systems; for example, the end device 403-2 in fig. 5 is connected to both content transition devices 402-1 and 402-2. Each of the terminal devices 403-1 through 403-7 may also be connected directly to any of the processing devices 401-1 through 401-3, as with terminal device 403-7 and processing device 401-3 in fig. 5, and may then act directly as a sensing or control device for that processing device. The terminal devices 403-1 through 403-7 may also be interconnected to operate in a harmonic mode (harmony mode), such as terminal device 403-5 connected to terminal device 403-6 in fig. 5. In some embodiments, each of the processing devices 401-1 through 401-3 may send a script to any of the terminal devices 403-1 through 403-7 via a content transition device. In some embodiments, the content transition devices 402-1 and 402-2 may send a script to another processing device or another content transition device. In some embodiments, each of the end devices 403-1 through 403-7 may send a script to another end device.
FIG. 6 is a schematic diagram of a message packet according to some embodiments of the present disclosure. FIG. 6 discloses an exemplary message packet 600 for use in an information data communication system. The packet 600 includes an access code 610, a header 620, a guard 630, a payload 640, and a suffix 650. The access code 610 includes a preamble 611, a synchronization code 612, and a suffix 613.
The header 620 includes six identifiers: identifiers 621 through 623 identify the destination, and identifiers 624 through 626 identify the source; together they identify the hardware receiving the packet and the hardware sending the packet. Specifically, the header 620 includes a destination network-domain identifier 621, a destination content state transition device identifier 622, a destination terminal device identifier 623, a source network-domain identifier 624, a source content state transition device identifier 625, and a source terminal device identifier 626. The header 620 also includes a link control message 627, which carries items such as a packet type code, a flow handle or acknowledgement indicator code, sequence numbers used to order data across packet transmissions, and a header error check.
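The header layout above can be sketched as a fixed-width record. The field widths below are illustrative assumptions (the disclosure does not specify bit widths); only the field order follows the description of identifiers 621 through 627.

```python
from dataclasses import dataclass
import struct

# Hypothetical layout: 1-byte domain/area ids, 2-byte device ids,
# 1-byte link control message; big-endian. Widths are NOT from the patent.
HEADER_FMT = ">BBHBBHB"

@dataclass
class Header:
    dst_domain: int   # destination network-domain identifier (621)
    dst_area: int     # destination content state transition device id (622)
    dst_device: int   # destination terminal device identifier (623)
    src_domain: int   # source network-domain identifier (624)
    src_area: int     # source content state transition device id (625)
    src_device: int   # source terminal device identifier (626)
    link_ctrl: int    # link control message (627)

    def pack(self) -> bytes:
        """Serialize the header fields in the order 621..627."""
        return struct.pack(HEADER_FMT, self.dst_domain, self.dst_area,
                           self.dst_device, self.src_domain, self.src_area,
                           self.src_device, self.link_ctrl)

    @classmethod
    def unpack(cls, raw: bytes) -> "Header":
        """Parse a serialized header back into its seven fields."""
        return cls(*struct.unpack(HEADER_FMT, raw))
```

A packed header under these assumed widths occupies 9 bytes; the access code 610, guard 630, payload 640, and suffix 650 would wrap around it.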
Each processing device is responsible for domain management of an information data communication system. Each processing device has a network domain identifier.
Each content state transition device is responsible for the management of a deployed area within an information data communication system, and each content state transition device has an area identifier.
Each terminal device has a device identifier. Combined with the area identifier of the upper-layer routing device (gateway) through which it is networked, this yields an identifier that is unique within the network domain of the same processing device.
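Composing that domain-unique identifier might look like the following sketch. The dotted notation and field names are hypothetical; the disclosure only states that the area identifier and device identifier combine into an identifier unique within a processing device's network domain.

```python
def full_device_id(domain_id: int, area_id: int, device_id: int) -> str:
    """Compose a domain-scoped identifier (hypothetical dotted form):
    network domain -> area (gateway) -> terminal device."""
    return f"{domain_id}.{area_id}.{device_id}"
```

Two terminal devices may share the same device identifier as long as they sit behind different gateways, since the area identifier disambiguates them within the domain.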
As shown in FIG. 3, the 8-layer model includes an information data layer 301. Information data is a means of presenting human information. Information comprises the derivatives created by humans over the course of history, such as literature, art, science, religion, politics, and history. These derivatives are preserved by means of symbols, characters, voice, physical objects, and so on. One piece of information data may be a combination of multiple data records. Historically, various recording materials have been used to store human information, and the resulting data records can be roughly classified as follows:
3D: material objects, 3D images, models …;
2D: drawing, 2D picture, photograph …;
video and audio: magnetic tape, audio tape, spoken language, music score …;
engineering: building diagrams, mold diagrams, equations …;
odor: recipe, process ….
As shown in FIG. 3, the 8-layer model includes a digital layer 302. The digital layer digitizes data records into electronic files that include content tags. A data record 201 may be a film, a drawing, a text, an engineering drawing, an equation, and so on.
FIG. 7 is a schematic diagram of a data record and a content tag, according to some embodiments of the present disclosure. As shown in FIG. 7, the data record 201 is digitized into electronic files 202 of different formats. The electronic files may include video, audio, images, engineering drawings, and documents. Video formats include AVI, FLV, WMV, MOV, MPEG, etc.; audio formats include MP3, WAV, AAC, BWF, etc.; image formats include GIF, PNG, X3D, JPEG, RAW, etc. Engineering drawing formats include construction drawings, structure drawings, mold drawings, mechanical drawings, electrical drawings, piping drawings, programs, equations, chemical formulas, and the like. Document formats include DOC, PDF, TIFF, INI, RSS, etc.
Each format has a corresponding terminal device capable of playing it. If the data record is a physical object, it can be digitized into the electronic file 202 after 3D scanning and photographing. Keywords (e.g., specific meanings, characteristics, and values) in the electronic file 202 can be labeled as content tags 204. In some embodiments, a content tag 204 may indicate: the temporal position and pixel position in a video file, the temporal position in an audio file, the pixel position in an image file, the page number and row number in a document, or the pixel position, page number, and row number in an engineering drawing file, together with the meaning, feature, or numerical value at that position. Each content tag may include parameters 704-1 through 704-n. The parameters 704-1 through 704-n may be light parameters, environmental parameters, odor parameters, terrain parameters, climate parameters, situational parameters, tool parameters, biometric parameters, and so on. The parameters 704-1 through 704-n provide a summary of the content of the electronic file 202 from which a script is composed.
FIG. 8 is a schematic diagram of an electronic file according to some embodiments of the present disclosure. FIG. 8 shows that the data record 201 is digitized into p electronic files 202-1 to 202-p, wherein the electronic file 202-1 has n content tags 204-11 to 204-1n, the electronic file 202-2 has m content tags 204-21 to 204-2m, and the electronic file 202-p has q content tags 204-p1 to 204-pq. Content tags 204-11 through 204-pq can be viewed as a two-dimensional data structure.
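The two-dimensional arrangement of content tags can be sketched as a mapping from each electronic file to its list of tags, where each tag carries a keyword plus position fields appropriate to its format. The concrete keys and values below are illustrative assumptions, not taken from the disclosure.

```python
# tags[file_id] -> list of content tags; together a jagged 2-D structure
# like content tags 204-11 through 204-pq across files 202-1 through 202-p.
content_tags = {
    "202-1": [{"keyword": "sports shoes", "time_pos_s": 12.5, "pixel": (640, 360)}],
    "202-2": [{"keyword": "hat", "page": 3, "row": 7},
              {"keyword": "sports shoes", "page": 5, "row": 2}],
}

def files_with_keyword(tags, keyword):
    """Search the structure for every electronic file tagged with a keyword,
    the kind of lookup a script would use to classify and identify files."""
    return [fid for fid, tag_list in tags.items()
            if any(t["keyword"] == keyword for t in tag_list)]
```

Note that the same keyword may appear in tags of several files, which is why the editing layer described next must handle duplicate keyword attributes across electronic files.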
As shown in FIG. 3, the 8-layer model includes an editing layer 303. After the data record 201 is digitized into electronic files 202 with content tags 204, each piece of information data needs a script describing its summary, outline, and so on. The summary in the script is a collection of the content tags 204 in the electronic files 202; note that content tags in different electronic files may share the same keyword attributes. A script may instruct the terminal devices in the communication unit 410 to play at least one electronic file in each time segment; the function and use of the script are described in detail later. One of the functions of the editing layer is to sort the content tags into an n-dimensional (e.g., 2-dimensional) data array. Through the script of the information data, the electronic files in the information data can be searched, classified, and identified.
As shown in FIG. 3, the 8-layer model includes a session layer 305. The session layer handles interactive management during information data transmission. The processing device may set the schedule of electronic files in the information data according to the interactive space 420. Biometric characteristics of a living being can be obtained from the terminal devices in the interactive space 420; after identification, the electronic files in the information data are transmitted and played, and playback can stop when the living being leaves the interactive space 420. A living being can provide feedback through a terminal device (such as an input/output device or a sensing device) to control the playing of electronic files in the communication unit 410. A living being can also input electronic files through a terminal device, which broadcasts the input electronic files and content tags to the content state transition device and the terminal devices of the communication unit 410 for presentation of the related information data.
As shown in FIG. 3, the 8-layer model includes an application layer 304. The application layer starts an application mode of the communication broadcast for the communication unit 410. The application modes include a collaboration mode, a cooperative mode, and a harmonic mode (harmony mode).
Figure 9 is a schematic diagram of a network topology in accordance with some embodiments of the present disclosure. Fig. 9 discloses a collaboration mode of an information-data conveying communication system, in accordance with certain embodiments of the present disclosure. In the collaboration mode, the scheduling and management of the broadcast of the electronic file 202 is performed by the processing device 401 to the content transition device 402 and the terminal devices 403-1 to 403-n in the communication unit 410.
FIG. 10 is a schematic diagram of a network topology, in accordance with some embodiments of the present disclosure. FIG. 10 discloses a cooperative mode of an information data transmission communication system in accordance with certain embodiments of the present disclosure. Connections in the network may drop or degrade in quality (e.g., exhibit time delays). In the cooperative mode, if the processing device 401 is disconnected, the content state transition device 402 takes over the broadcast management of the electronic files. In addition, as shown in FIG. 10, the content state transition device 402 can broadcast an electronic file received as input from the terminal device 403-1. Furthermore, in some embodiments, if the content state transition device 402 receives an electronic file (or transition file) input from the terminal device 403-1, it can translate the input into the related content tags, edit from its own database the electronic files (or transition files) suitable for the selected information data, translate them into transition files suitable for the other terminal devices (e.g., 403-2, 403-3, etc.), and then broadcast them to those terminal devices.
FIG. 11 is a schematic diagram of a network topology in accordance with some embodiments of the present disclosure. FIG. 11 discloses a harmonic mode of an information data communication system in accordance with certain embodiments of the present disclosure. In the harmonic mode, when the content state transition device 402 in the communication unit 410 is disconnected, a terminal device 403-1 that receives an input electronic file can add content tags to it and broadcast the file to the other terminal devices (e.g., 403-2, 403-3, etc.). The other terminal devices then provide corresponding electronic files from their own storage for the harmonic broadcast. After disconnecting from the content state transition device 402, the terminal device 403-1 can transmit and receive signals through an infrared sensor, or through carrier-modulated signals, in order to broadcast and receive data in the space where the information data is conveyed and to notify the other terminal devices (e.g., 403-2, 403-3, etc.) to enter the local harmonic mode. A microphone and a speaker can serve as transceivers for an ultrasonic signal carrier: sound waves modulated onto the carrier transmit data and can likewise notify the terminal device 403-1 of the local harmonic mode. A light control device and a photosensitive sensor can serve as a carrier for light-wave data. Sensing components such as a microphone and a photoresistor mounted on a surround-sound speaker device can receive data over the ultrasonic and optical carriers. When both wired and wireless networking are disconnected, the harmonic operation mode can thus be carried out with the other terminal devices through communication messages over these heterogeneous networking capabilities.
Fig. 12 is a schematic diagram of a feedback tag in accordance with some embodiments of the present disclosure. The present disclosure provides a method for providing feedback to an information data communication system from a living being (e.g., a user).
The sensing devices in the information data transmission communication system can detect the posture and physiological parameters of a living being (such as a user) to obtain the being's intuitive responses. As shown in FIG. 12, the user's responses include facial expressions, pupil responses, gestures, posture, voice, physiology, and so on. In detail, facial expression responses include happiness, anger, sorrow, joy, etc.; pupil responses include dilation, contraction, eye closure, etc.; gesture responses include waving, a thumbs-up, holding the arms across the chest ("chest holding"), etc.; posture responses include standing, walking, turning the head away, sitting down, etc.; physiological responses include sweating, changes in heart rate, changes in blood pressure, etc.; and voice responses include spoken language, volume changes, commands, etc. These user responses express the user's wishes and provide feedback to the information data communication system.
User feedback during a time segment depends strongly on the electronic file presented by the information data communication system during that segment. For example, a V sign made with the fingers may represent victory, the letter V, or the number 2. The same user feedback may therefore express different wishes in different time segments.
Assuming the person being communicated with (the user) responds rationally, user feedback can be tested in advance to establish expected responses. In the communication of information data, a script schedules the playing of a plurality of different electronic files, and each electronic file to be played has a feedback tag in each time segment, including, for example, a user feedback table. The wish expressed by the user's response in each time segment can be determined by looking up the user feedback table in that segment's feedback tag. The same response may represent different wishes in different contexts: for example, in a happy context the user feedback level may be +1, but in a sad context it may be -1.
After the electronic file is played in a given time segment, the user response can be obtained through the sensing function of the terminal device 403, and the processing device 401 compares it against the script of the information data to obtain the user feedback level for that segment. The user feedback levels for all time segments are summed to obtain the user's degree of satisfaction with the information data. In addition, according to the user response in each time segment, the electronic files to be presented in the next time segment, and the manner in which they are played, can be changed relative to the script of the information data. In the cooperative mode, the content state transition device 402 may be responsible for comparing the user response with the script; in the harmonic mode, the terminal device 403 may be responsible for this comparison.
A script may instruct the terminal devices in the communication unit 410 to play at least one electronic file in each time segment. The script also includes a plurality of feedback tags, with at least one feedback tag in each time segment. Each feedback tag includes a plurality of corresponding operations and a plurality of user feedback levels. Regarding the corresponding operations, for example, a feedback tag may indicate: if the user's feedback in the time segment is the facial expression response "like", one corresponding operation applies; if the user's feedback is the gesture response "chest holding", another corresponding operation applies; and if the user's feedback is not listed in the feedback tag for the time segment, yet another corresponding operation applies. A corresponding operation may change the files to be played in the next time segment or the manner in which they are played. In some embodiments, a corresponding operation changes, for the next time segment, the playing time position and playing pixel position of a video file, the playing time position of an audio file, the playing pixel position of an image file, the page number and row number of a document, or the playing pixel position, page number, and row number of an engineering drawing file.
Regarding the user feedback levels, for example, a feedback tag may indicate: if the user's feedback in the time segment is the facial expression response "like", a certain user feedback level applies (e.g., +10); if the user's feedback is the gesture response "chest holding", another level applies (e.g., -5); and if the user's feedback is not listed in the feedback tag, yet another level applies (e.g., +0). Each presentation of the information data may therefore produce a series of user feedback levels. The information data transmission communication system of the present disclosure determines a user feedback level in response to user feedback, continuously updates the user satisfaction based on the level determined in each time segment, and obtains a final user satisfaction after playback of the electronic files is completed. In some embodiments, the final user satisfaction is the sum of the user feedback levels.
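The lookup-and-sum scheme above can be sketched as follows. The table reuses the example levels from the text ("like" maps to +10, "chest holding" to -5, unlisted feedback to 0); the channel names and data shapes are illustrative assumptions.

```python
# Sketch of one feedback tag's user feedback table.
feedback_tag = {
    ("facial_expression", "like"): +10,
    ("gesture", "chest holding"): -5,
}

def feedback_level(tag, channel, response, default=0):
    """Look up the user feedback level for one observed user response;
    responses not listed in the feedback tag fall back to the default."""
    return tag.get((channel, response), default)

def final_satisfaction(levels):
    """Final user satisfaction as the running sum of per-segment levels."""
    return sum(levels)
```

In this sketch each time segment contributes one level, so three segments yielding +10, -5, and 0 would produce a final satisfaction of +5.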
The information data transmission can include the following four modes according to purposes: "information data broadcast", "information data modeling", "interest acquisition", and "information data satisfaction".
"message data broadcast" refers to an electronic file that broadcasts specific message data to both general and specific users. For example, an electronic file broadcasting specific information data to users of different groups of users of specific gender, specific age, specific height, guests, employees, etc. For example, the message data to be transmitted in "" message data broadcasting "" mode is specified based on the biometric characteristic of the user (the person to be transmitted).
"Information data modeling" plays specific electronic files to the users in the communication unit, creates an atmosphere through the control devices of the terminal devices, learns the users' feedback through the sensing devices of the terminal devices, and adjusts the played electronic files in time so that the users can become immersed in the information data. For example, in the "information data modeling" mode, at least one of the corresponding operations is performed in response to the user's feedback so that the user satisfaction is maintained within a predetermined range; in other words, the corresponding operations to be performed are selected with the goal of keeping the user satisfaction within that range.
"interest acquisition" refers to the user's response to a specific feedback tag and user feedback level obtained by playing a specific electronic file, and obtaining various interest orientations of the user after statistical analysis. For example, in the "interest acquisition" mode, the user feedback level determined in response to the user's (communicated) feedback may be transmitted to a processing device, a content state-transfer device, or a terminal device, and various interest orientations of the user may be acquired after statistical analysis.
The term "information data satisfied" refers to playing a specific electronic file for the user, obtaining the user feedback and user feedback level, and then adjusting the played electronic file to increase the user's satisfaction. For example, in the "informational data satisfy" mode, at least one of the corresponding operations is performed in response to feedback from the user (who is being communicated) to maximize user satisfaction. In other words, at least one of the corresponding operations to be performed is selected with the goal of maximizing user satisfaction in response to user feedback.
The four information data transmission modes above can be achieved by the same algorithm under different conditions for playing electronic files and controlling the environment. In some embodiments, the four modes can instead be achieved by different algorithms.
FIG. 13 is an algorithmic state diagram in accordance with certain embodiments of the present disclosure.
The transmission of a piece of information data can be divided into a plurality of time segments according to the playing time. The type and presentation of the electronic files 202 to be presented in each time segment are provided within the script.
The information data transmission in FIG. 13 is divided into m time segments. Each time segment is the smallest unit of calculation, and each comprises an S state, an A state, and a G state. The S state is a playing state; the A state obtains the environmental conditions and the user's response and determines the corresponding operation; and the G state determines the user feedback level.
S1, S2, S3, ..., Sm are arranged in sequence on the time axis, as are A1, A2, A3, ..., Am-1 and G1, G2, G3, ..., Gm-1.
In the S1 state, the terminal devices are initialized according to the script configured by the processing device and then play the electronic files. In each subsequent state up to Sm, the electronic files to be played in the next time segment, and their presentation mode, are adjusted according to the information data transmission mode in the script. After the terminal devices start playing in the S1 state, the sensing devices in the terminal devices enter the A state to obtain the user's response to a feedback tag, obtain the environmental parameters of the interactive space 420, and determine the corresponding operation that serves as the reference for playing in the S2 state. In the G state, the user's response is compared against the corresponding feedback tag in the script to determine the user feedback level.
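The S-A-G cycle over m time segments can be sketched as a simple loop. The callback names (`play`, `sense`, `decide`, `level`) are hypothetical stand-ins for the terminal devices, the sensing devices, the corresponding-operation selection, and the feedback-tag lookup described above.

```python
def run_segments(segments, play, sense, decide, level):
    """Run m time segments. Each segment plays electronic files (S state),
    senses the user and environment and picks a corresponding operation
    (A state), then determines the user feedback level (G state). The
    final segment Sm has no A/G state after it. Returns summed satisfaction."""
    total = 0
    adjust = None  # corresponding operation carried into the next S state
    for i, seg in enumerate(segments):
        play(seg, adjust)                    # S state
        if i == len(segments) - 1:
            break                            # no Am/Gm after the final Sm
        response, env = sense(i)             # A state: user + environment
        adjust = decide(seg, response, env)  # corresponding operation
        total += level(seg, response)        # G state: feedback level
    return total
```

In the information data broadcast mode (FIG. 14, where the G state is never entered), the `level` step would simply be omitted while the S and A states run unchanged.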
FIG. 14 is an algorithm state diagram in accordance with certain embodiments of the present disclosure. When operating in the "information data broadcast" mode, the system algorithm states are as shown in FIG. 14. After the electronic files set in the script are played starting from S1, in the A state of each time segment the sensing devices in the terminal devices confirm whether the environmental parameters, such as temperature and light, are as expected and whether the user has changed. Once any changes are confirmed, the corresponding operation is determined so as to adjust the electronic files played in the next S state. In the information data broadcast mode, the user feedback level is not determined from the user's response, so the G state is never entered; this absence of the G state is the difference between FIG. 14 and FIG. 13.
FIG. 15 is an algorithm state diagram in accordance with certain embodiments of the present disclosure. When operating in the "information data modeling", "interest acquisition", and "information data satisfaction" modes, the system algorithm states are as shown in FIG. 15. All three modes act on the user's response, so the G state is included in the calculation. According to the user feedback level, the played electronic files are adjusted in the next S state under the conditions of each mode, as are the environmental parameters and user responses that the sensing devices in the terminal devices must obtain in the next A state.
The "information data creation" mode is a mode for instructing the user to feel the same scene and atmosphere as the information data. For example, if the communicated information data is set to be comfortable, the electronic file indicated to be played in the script may be soft music, light, warm and fresh. After the user response is obtained through the A state, the G state is compared with the feedback label to determine the user feedback level, and the terminal device can be adjusted at the lower position of the user feedback level in the specific feedback label in the next S state. For example, if the user feels too hot or loud, the user may adjust the air conditioning or music content. When the information data modeling mode is operated, a user can feel the situation that the information data is to be modeled.
The "interest acquisition" mode is to transmit a specific message data and to acquire the user feedback level of the user to the specific feedback tag. For example, a series of electronic files are set for the user, and the user feedback level of the user to the feedback label is recorded. In the interest acquisition mode, the electronic file schedule played in the S state is the same as the information data broadcasting mode, and the electronic file set according to the scenario is played in each time segment. However, the G state is added to the interest acquisition model to obtain the user feedback level of the user for a specific feedback tag for further statistical analysis. Generally, the interest-obtaining model can be applied to market research or opinion research. The user can receive the information data transmission and simultaneously perform statistical analysis on the played electronic file. In addition, an interest acquisition confirmation status may be further included in the interest acquisition mode. In the interest obtaining confirmation state, after obtaining the user feedback level of the user to the specific feedback label, the electronic file with the similar content label can be further provided for the feedback label of the higher user feedback level and the user feedback level can be further obtained to confirm the highest user feedback level of the electronic file of the similar content label for the user.
The "message data satisfied" mode shapes a scene of message data to be transmitted and satisfies the user with the message data. The information data satisfying mode is similar to the interest acquiring mode, and a specific information data is transmitted to the user, and the positive satisfaction of the user such as the interest and the like of the information data is improved. For example, in a gymnasium environment, if the information data set for a user is exercise-enhancing caloric consumption , the exercise caloric consumption can meet the requirements set by the script of the original information data by playing the electronic file associated with the information data, adjusting the associated environmental control device, and sensing the physiological condition of the user. The information data satisfying mode satisfies the user for the specific information data, and the interest acquiring mode sets the information data with multiple test purposes to acquire the interest orientation of the user.
Figure 16 is a schematic diagram of an information-data-conveying communication system application, in accordance with some embodiments of the present disclosure. Fig. 16 shows information data 1601, a processing device 1602, a content transition device 1603, an environment control device (air conditioner) 1604, a lighting control device 1605, an image capture device 1606, an audio playback device 1607, a display device 1608, a scent control device (scent diffusion device) 1609, a 3D projection device 1610, a projection device 1611, and a user 1612. The environment control device 1604, the light control device 1605, the image capture device 1606, the audio playback device 1607, the display device 1608, the odor control device 1609, the 3D projection device 1610, and the projection device 1611 are all terminal devices.
In this embodiment, user 1612 is a potential customer walking a dog. The content state transition device 1603, the environment control device 1604, the light control device 1605, the image capture device 1606, the audio playback device 1607, the display device 1608, the odor control device 1609, the 3D projection device 1610, the projection device 1611, and the user 1612 form an interactive space 1600.
FIG. 16 discloses a scenario in which the present disclosure is applied to product sales. FIG. 16 shows an interactive space 1600 of product sales information data to shape the scene of the product sales information data. An environment control device 1604, a light control device 1605, an audio player 1607, and a smell control device 1609 are used to form an interactive space 1600 with appropriate temperature, light, music, and smell. The user's interest in the feedback tag is obtained by the image capture device 1606, for example, the image capture device 1606 detects that the user 1612 is looking at sports shoes or hat merchandise in the file being played on the display device 1608. Then, the 3D projection device 1610 and the projection device 1611 present further information of the sports shoes or caps to the user 1612 through the setting of the script, thereby achieving the purpose of enhancing the consumption will of the user 1612.
Figure 17 is a schematic diagram of an information-data-conveying communication system application, in accordance with some embodiments of the present disclosure. Fig. 17 discloses information data 1701, a processing device 1702, a content transition device 1703, an environment control device (air conditioner) 1704, a light control device 1705, an image capturing apparatus 1706, audio playback devices 1707 and 1713, a display device 1708, odor control devices (fragrance diffusion devices) 1709 and 1714, a production device sensing device 1710, an optical scanning device 1711, a projection device 1712, a 3D printing device 1715, and users 1716 and 1717.
The environment control device 1704, the light control device 1705, the image capture device 1706, the audio playback devices 1707 and 1713, the display device 1708, the odor control devices 1709 and 1714, the production equipment sensing device 1710, the optical scanning device 1711, the projection device 1712, and the 3D printing device 1715 are all terminal devices.
In this embodiment, user 1716 is an operator on the production line and user 1717 is a manager in the control room. The content transition device 1703, the environment control device 1704, the light control device 1705, the image capturing device 1706, the audio playing device 1707, the display device 1708, the odor control device 1709, the production device sensing device 1710 and the optical scanning device 1711 form an interactive space 1700 of the production line. The content state transition device 1703, the light control device 1705, the audio playing device 1713, the odor control device 1714, the projection device 1712, and the 3D printing device 1715 form an interactive space 1750 in the control room.
FIG. 17 discloses a scenario in which the present disclosure is applied to a production line and a control room in a plant. In the production line's interactive space 1700, the present disclosure can shape a work environment that satisfies user 1716. The mental state of the operator 1716 is obtained by the image capture device 1706, and the quality status of the production equipment and the product is obtained by the production equipment sensing device 1710 and the optical scanning device 1711. By analyzing the data obtained by the image capture device 1706, the production equipment sensing device 1710, and the optical scanning device 1711, the operator's current responses and feedback (e.g., work efficiency) can be derived. When the operator exhibits reduced work quality or poor work efficiency, the display device 1708 can display the SOP job descriptions for the production step to provide assistance. The interactive space 1750 in the control room may also interact with the interactive space 1700.
Figure 18 is a schematic diagram of an information-data-conveying communication system application, in accordance with some embodiments of the present disclosure. Fig. 18 discloses display devices 1801 and 1802, a light control device 1803, an image capture device 1804, a traffic sign control device 1805, electronic billboards 1806, 1807, and 1808, an audio player 1809, a user 1810, a small vehicle 1811, and a large vehicle 1812. The display devices 1801 and 1802, the lighting control device 1803, the image capture device 1804, the traffic sign control device 1805, the electronic billboards 1806, 1807, and 1808, and the audio playback device (broadcasting device) 1809 are terminal devices.
The user 1810 is a person who receives evacuation instructions, and the small vehicle 1811 and the large vehicle 1812 are vehicles that receive evacuation instructions. FIG. 18 illustrates the application of the present disclosure to outdoor public services, and more particularly to public services for outdoor refuge and evacuation guidance. FIG. 18 shows an interactive space 1800 at a street corner. The display devices 1801 and 1802 play electronic files, while the light control device 1803, the traffic sign control device 1805, and the electronic billboards 1806 to 1808 provide further information to the user 1810, the small vehicle 1811, and the large vehicle 1812. For example, when the image capture device 1804 detects that pedestrians intend to cross the intersection, the traffic sign control device 1805 can switch the traffic lights to let them pass; when the image capture device 1804 detects that the large vehicle 1812 is heading into a road it is not permitted to enter, the electronic billboard 1806 displays a no-entry sign, and the electronic billboards 1807 and 1808 further display the directions that pedestrians and vehicles should follow.
In the embodiment of FIG. 18, the information data is tsunami evacuation information that provides evacuation directions for the user 1810, the small vehicle 1811, and the large vehicle 1812. The embodiment of FIG. 18 operates in the information data satisfying mode. The directions of movement of the user 1810, the small vehicle 1811, and the large vehicle 1812, obtained by the image capture device 1804, can be regarded as their interests. Through the light control device 1803, the traffic sign control device 1805, and the audio playback device 1809, these interest directions are guided toward the default direction of movement specified in the information data.
Further, the embodiment of FIG. 18 may operate in either a collaboration mode or a coordination mode. In the embodiment of FIG. 18, the display devices 1801 and 1802, the light control device 1803, the image capture device 1804, the traffic sign control device 1805, the electronic billboards 1806, 1807, and 1808, and the audio playback device (broadcasting device) 1809 all have a computing processor, memory, and a communication interface, providing processing, storage, and networking capabilities. If a serious emergency occurs, the interactive space 1800 may lose its connection to the processing device, or the terminal devices may lose their connection to the content transition device. In such cases, the interactive space 1800 may continue to operate on its own power in either the collaboration mode (still connected to the content transition device) or the coordination mode (not connected to the content transition device). The collaboration mode and the coordination mode prevent the information data communication system from becoming inoperable when the network is disconnected or power is interrupted and no connection to the processing device or the content transition device is available. In this respect, the 8-layer model of the information data communication system of the present disclosure is a specific application of the concepts of edge computing and fog computing: even when the network is disconnected, each interactive space or each terminal device can still continue to convey information data.
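As a hedged illustration of this fallback behavior (not an implementation taken from the disclosure), the mode selection can be sketched as a function of which upstream devices are reachable; the names `Mode` and `select_mode` are assumptions:

```python
from enum import Enum

class Mode(Enum):
    """Application-layer operating modes described for the 8-layer model."""
    SATISFYING = "satisfying"        # processing device and transition device reachable
    COLLABORATION = "collaboration"  # only the content transition device reachable
    COORDINATION = "coordination"    # terminal devices only, fully disconnected

def select_mode(processing_reachable: bool, transition_reachable: bool) -> Mode:
    # Fall back gracefully so information data keeps flowing when links drop.
    if processing_reachable and transition_reachable:
        return Mode.SATISFYING
    if transition_reachable:
        return Mode.COLLABORATION
    return Mode.COORDINATION
```

In this sketch a terminal device re-evaluates the mode whenever a link goes down, mirroring the edge/fog idea that each tier keeps operating with whatever connectivity remains.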
FIG. 19 is a schematic diagram of a processing device, a content transition device, or a terminal device in the information data communication system, according to some embodiments of the present disclosure. FIG. 19 illustrates an exemplary block diagram of such a device. According to Table 2 of the present disclosure, any of the processing device, the content transition device, and the terminal device may include control devices 1910 and sensing devices 1930. The control devices 1910 include a tactile integration device 1911, a 3D projection device 1912, a 3D printing device 1913, virtual reality glasses 1914, a light control device 1915, a display device 1916, an audio playback device 1917, an odor control device (e.g., an aroma diffusion device) 1918, an infrared input/output device 1919, and an environment control device (e.g., an air conditioner) 1920. The sensing devices 1930 include a global positioning system chip 1931, a human-machine input interface 1932, an image capture device 1933, an audio input device 1934, an odor sensing device (e.g., an electronic nose) 1935, a biometric sensing device 1936, an environmental parameter sensing device (e.g., a temperature and humidity sensor) 1937, a tactile sensing device 1938, the infrared input/output device 1919, and a scanning device 1939.
According to Table 2 of the present disclosure, any of the processing device, the content transition device, and the terminal device may include an I/O driver interface device 1940. The I/O driver interface device 1940 integrates the data from the sensing devices 1930 and transmits the integrated data to the central processing unit (CPU) 1951, and transmits instructions from the CPU 1951 to the control devices 1910. According to Table 2, any of the processing device, the content transition device, and the terminal device may also include the CPU 1951, a memory 1952, a power management control device 1953, a cellular modem device 1954, a wireless modem device 1955, and a wired modem device 1956.
The cellular modem device 1954 may communicate with a cellular base station 1960, and the wireless modem device 1955 and the wired modem device 1956 may communicate with a network 1970. In some embodiments, the cellular modem device 1954 connects to a wide-area cellular network via the cellular base station 1960. The CPU 1951 is coupled to the memory 1952, which stores the computer programs, code, and databases executed by the processing device. The CPU 1951 is also coupled to the power management control device 1953, which implements a power-saving schedule for the processing device, powering it up and down on a regular basis.
The CPU 1951 is connected to the I/O driver interface device 1940, which transmits the data and instructions of the CPU 1951 to the control devices 1910 so that they play the electronic files. The I/O driver interface device 1940 also transmits the data obtained from the sensing devices 1930 to the CPU 1951 for analysis.
The global positioning system chip 1931 in the sensing devices 1930 can obtain geographic coordinates. The human-machine input interface 1932 can accept data input from a keyboard, a handwriting pad, a touch pad, and the like, and can also accept data input from recording media such as solid-state memory, optical discs, and hard discs. The image capture device 1933 can perform image recognition.
The audio input device 1934 may record ambient sounds as well as speech. The odor sensing device 1935 may obtain air quality indicators of the environment, such as CO, combustible gas, and smoke. The biometric sensing device 1936 can acquire fingerprints, voiceprints, DNA, electrocardiograms, pupil features, and the like. The tactile sensing device 1938 can sense the roughness of an object. The infrared input/output device 1919 may emit and receive signals to detect the presence and distance of objects in the environment, and may also transmit and receive modulated signals to broadcast and receive data in the space in which the information data is conveyed. The scanning device 1939 can be a 3D scanning device or a 2D scanning device, obtaining 3D stereo data or 2D planar data of a real object.
The tactile integration device 1911 in the control devices 1910 may be a tactile brush providing tactile sensation. The 3D projection device 1912 may provide 3D projection. The 3D printing device 1913 may print 3D models. The virtual reality glasses 1914 can present virtual reality content. The light control device 1915 may provide various light sources to create different light scenes within the interactive space. The display device 1916 includes projectors, panels, video walls, and the like, and may be a multi-screen display. The audio playback device 1917 may be a mono speaker or a high-quality stereo surround speaker, or an ultrasonic speaker for transmitting modulated sound signals. The odor control device 1918 may create a scent atmosphere. The environment control device 1920 may provide on-site temperature and humidity control.
FIG. 20 is a flow chart of a data record digitization process according to some embodiments of the present disclosure. FIG. 20 illustrates an exemplary process for generating a script from a variety of data records in the information data communication system model of the present disclosure. Information data can be expressed by a plurality of data records, such as data records with text 2037, images 2038, odors 2039, audio-visual content 2040, and models (or objects) 2041. The information data layer 2030 contains these original data records.
Depending on the nature of a data record, the processing device may decide to digitize it with different sensing devices. For example, the model 2041 may be imaged layer by layer using a 3D holography device 2035 or a 3D scanning device 2036 (e.g., using X-rays or laser light). The audio-visual content 2040 may be converted by an analog-to-digital conversion device 2034. The odor 2039 can be sampled via an odor sensing device (e.g., an electronic nose) 2033. The image 2038 may be sampled via a 2D scanning device 2031 or a digital camera 2032.
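The routing just described can be summarized as a simple dispatch table; the mapping follows the text of FIG. 20, while the function name and record-type keys are assumptions for illustration:

```python
# Illustrative routing of data record types to the digitizing devices of FIG. 20.
DIGITIZERS = {
    "text":        "2D scanning device 2031",
    "image":       "2D scanning device 2031 or digital camera 2032",
    "odor":        "odor sensing device (electronic nose) 2033",
    "audio_video": "analog-to-digital conversion device 2034",
    "model":       "3D holography device 2035 or 3D scanning device 2036",
}

def route_record(record_type: str) -> str:
    """Return the sensing device used to digitize the given data record type."""
    if record_type not in DIGITIZERS:
        raise ValueError(f"no digitizer registered for {record_type!r}")
    return DIGITIZERS[record_type]
```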
These data records are first sampled and then processed through multi-format digitization 2026. The control devices of the terminal devices have different capabilities; for example, display devices include projectors, televisions, flat panels, and the like, with different resolutions, and the optimal playback format of each terminal device's control device also differs. The data records are therefore processed by multi-format digitization 2026. The odor 2039 sampled by the odor sensing device 2033 is digitized by compositional formula analysis 2027.
The text 2037 and the image 2038 are digitized to form a 2D electronic file 2022, the odor is digitized to form an atmosphere electronic file 2023, the audio-visual content 2040 is digitized to form a video/audio electronic file 2024, and the model 2041 is digitized to form a 3D electronic file 2025. Once digitized, the electronic files have different attributes, such as 3D, video, atmosphere, image, description, and touch. Because electronic files with different attributes can describe the same or different information data, their contents may or may not be related. After analysis by the processing device or human inspection, content tags 2021 are applied to the features in the content of each electronic file, so each electronic file includes a plurality of content tags. The above process takes place in the digitization layer 2020 of the 8-layer model.
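A minimal sketch of an electronic file carrying content tags, assuming a simple dataclass representation (the class and field names are not defined by the disclosure):

```python
from dataclasses import dataclass, field

@dataclass
class ContentTag:
    name: str                                  # semantic label, e.g. "thunder"
    parameters: dict = field(default_factory=dict)

@dataclass
class ElectronicFile:
    attribute: str                             # e.g. "3D", "video", "atmosphere"
    tags: list = field(default_factory=list)   # list of ContentTag

    def has_tag(self, name: str) -> bool:
        # Content tags let files that describe the same information data be related.
        return any(t.name == name for t in self.tags)

clip = ElectronicFile("video", tags=[ContentTag("thunder", {"loudness": 0.8})])
```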
Then, according to the information data to be conveyed, the plurality of electronic files containing a plurality of content tags undergo script compilation 2012 based on the content tags related to the information data, completing the script 2011. More specifically, the script compilation 2012 identifies the content tags and their parameters associated with the information data to be conveyed, and compiles the electronic files having those content tags into the script 2011. The above process occurs in the editing layer 2010 of the 8-layer model of the present disclosure.
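This editing-layer selection step can be sketched as follows; files are modeled as (name, tag set) pairs purely for illustration, and the function name is an assumption:

```python
def compile_script(files, info_tags):
    """Pick the electronic files whose content tags overlap the tags
    associated with the information data to be conveyed."""
    wanted = set(info_tags)
    return [name for name, tags in files if wanted & set(tags)]

# Hypothetical files: tags "A"-"D" stand in for content tags.
files = [("sea.mp4", {"A", "B"}), ("logo.png", {"D"}), ("rain.wav", {"C"})]
```

For information data tagged {"A", "C"}, `compile_script` keeps `sea.mp4` and `rain.wav` while skipping the unrelated `logo.png`.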
The processing device and the content transition device also have Internet networking capability; they can generate electronic files by digitizing data records locally as shown in FIG. 20, or gather data records via the Internet and digitize them for use.
FIG. 21 is a flow chart in accordance with certain embodiments of the present disclosure, disclosing further details of the editing layer 2010 of FIG. 20. Information data may be represented via a plurality of electronic files (e.g., electronic files 2101-1 through 2101-n), each having a plurality of content tags 2102. Content tags 2102 that recur frequently within an electronic file may be marked as important content tags (see A, B, and C in FIG. 21). The processing device edits the content tag index set 2110 based on the important content tags. The content tag index set 2110 includes content tag indices 2110-1 through 2110-m; in the embodiment of FIG. 21, indices 2110-1 through 2110-3 may be the indices of content tags A, B, and C.
The information data summary and content configuration 2120 are edited according to the content tag index set 2110, as is the electronic file playing program 2130. The feedback tag set 2140, the user feedback level set 2150, and the corresponding operation set 2160 are likewise edited according to the content tag index set 2110. The feedback tag set 2140 includes feedback tags 2140-1 to 2140-p, and each feedback tag includes a user feedback level set 2150 and a corresponding operation set 2160. A user feedback level set 2150 includes user feedback levels 2150-1 to 2150-q, and a corresponding operation set 2160 includes corresponding operations 2160-1 to 2160-m.
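The nesting of feedback tags, feedback levels, and corresponding operations can be illustrated with a small dictionary; the tag, level, and operation names below are invented, not taken from the disclosure:

```python
# Hypothetical contents of one feedback tag: a map from user feedback level
# to the corresponding operation set.
feedback_tags = {
    "attention": {
        "high": ["continue_playback"],
        "low":  ["switch_scene", "raise_volume"],
    },
}

def operations_for(tag: str, level: str) -> list:
    """Return the corresponding operation set for a user feedback level,
    or an empty list when no matching feedback tag/level exists."""
    return feedback_tags.get(tag, {}).get(level, [])
```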
The processing device edits the script 2170 based on the content tag index set 2110, the information data summary and content configuration 2120, the electronic file playing program 2130, the feedback tag set 2140, the user feedback level set 2150, and the corresponding operation set 2160. Based on the information data summary and content configuration 2120, the processing device compiles a script summary 2171 and an electronic file configuration 2172 of the script 2170, where the electronic file configuration 2172 specifies which control device of a terminal device the format of each electronic file is suited to, and the script summary 2171 may be authored automatically by the processing device or entered by a human. In some embodiments, the content tags and their parameters associated with the information data to be conveyed are compiled into the content tag index set 2110, and a specific portion of the script 2170 is compiled based on the content tag index set 2110; the parameters can be used to provide summaries of the contents of the files when composing the script.
The electronic file playing program 2130 in the script 2170 includes the playback relationships among the electronic files, so that the electronic files can be synchronized during playback. The script 2170 further includes the electronic files 2101-1 to 2101-n, the content tag indices 2110-1 to 2110-m, and the feedback tags 2140-1 to 2140-p. The operations illustrated in FIG. 21 are defined in the semantic layer of the 8-layer model of the present disclosure. The script 2170 is transmitted from the processing device to the content transition device, which interprets the script 2170 and completes the setup of each terminal device in the communication unit.
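One way to picture the playback relationships is a schedule keyed to script time, so each terminal device starts its file at the right moment; the entries, field names, and helper below are assumptions for the sketch:

```python
# Sketch of an electronic file playing program: each entry states when a
# file starts relative to script time so terminal devices stay synchronized.
playing_program = [
    {"file": "ocean.mp4", "start_s": 0.0, "device": "display"},
    {"file": "waves.wav", "start_s": 0.0, "device": "audio"},
    {"file": "mist.cfg",  "start_s": 5.0, "device": "odor"},
]

def due_by(program, t):
    """Files whose playback should have started by script time t (seconds)."""
    return [e["file"] for e in program if e["start_s"] <= t]
```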
It should be appreciated that in some embodiments the operations shown in FIGS. 20 and 21 are performed at the processing device, and the script 2170 is transmitted from the processing device to the content transition device, which interprets the script 2170 and performs the settings for each terminal device within the communication unit. In embodiments where connection to the processing device is not possible, the content transition device and the terminal devices may operate in the collaboration mode, in which the content transition device performs the operations shown in FIGS. 20 and 21, interprets the script 2170, and performs the settings for each terminal device within the communication unit. Further, in embodiments where connection to neither the processing device nor the content transition device is possible, the terminal devices may operate in the coordination mode, in which a terminal device performs the operations shown in FIGS. 20 and 21, interprets the script 2170, and performs the settings for each terminal device within the communication unit.
FIG. 22 is a state diagram of the information data communication system, in accordance with certain embodiments of the present disclosure. The initial state of the processing device is the P0 state, in which the processing device stands by. The processing device enters the P1 state, gathers data records, and processes them into a script. The processing device returns to the P0 state and then enters the P2 state, in which it transmits the script to the content transition device. After the processing and transmission of the script are completed, the processing device returns to the P0 standby state.
The initial state of the content transition device is the R0 state, in which the content transition device stands by. After receiving the script transmitted by the processing device in the P2 state, the content transition device enters the R1 state, decodes the script, sends the electronic files to each terminal device of the corresponding communication unit, and then returns to the R0 state.
In the R0 state, the content transition device notifies each terminal device to play the electronic files according to the electronic file playing program (e.g., the electronic file playing program 2130 of FIG. 21). In the collaboration mode of the application layer in the 8-layer model of the present disclosure, the script can be transmitted by the processing device and then distributed to the terminal devices by the content transition device, after which the information data is conveyed according to the program and conditions.
While a terminal device plays an electronic file in the T0 state, a living being (e.g., the user) can give feedback using gestures, a human-machine interface, voice, and so on. When the terminal device playing the electronic file receives the feedback, it enters the T1 state, digitizes the feedback and transmits it to the content transition device in the R0 state, or transmits the raw data directly to the content transition device in the R0 state. On receiving the feedback data, the content transition device enters the R2 state from the R0 state, searches the relevant feedback tags in the script to determine the user feedback level and the corresponding operation, then returns to the R0 state and notifies the terminal device to perform the corresponding operation. In the collaboration mode of the application layer in the 8-layer model of the present disclosure, the content transition device is responsible for distributing the electronic files without being connected to the processing device, and when it receives feedback from a terminal device, it first searches the relevant feedback tags in the script to determine the user feedback level and the corresponding operation.
A terminal device stands by in the T0 state; when it receives feedback on the playing electronic file from a living being, it enters the T2 state to identify the relevant feedback tag, and then enters the O2 state to request coordination actions from other terminal devices. All the terminal devices then enter the coordination mode of the application layer in the 8-layer model of the present disclosure, without the content transition device participating.
A terminal device stands by in the T0 state; if it receives a coordination request from another terminal device in the O3 state, it searches locally for a matching electronic file to play in coordination. If a content transition device in the R0 state receives a request from another content transition device in the O1 state to search for a matching electronic file, it enters the R2 state to search for a matching electronic file to play. Likewise, if a content transition device standing by in the R0 state receives a request from a processing device in the O4 state to retrieve a matching electronic file, it enters the R2 state to retrieve a matching electronic file to play.
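The state flow described above can be sketched as a transition table; the states follow FIG. 22, while the event names are invented for illustration:

```python
# Transition-table sketch of the FIG. 22 state machine.
TRANSITIONS = {
    ("P0", "gather_records"):    "P1",  # processing device compiles the script
    ("P1", "script_ready"):      "P0",
    ("P0", "send_script"):       "P2",
    ("P2", "script_sent"):       "P0",
    ("R0", "script_received"):   "R1",  # content transition device decodes it
    ("R1", "files_distributed"): "R0",
    ("R0", "feedback_received"): "R2",  # search feedback tags in the script
    ("R2", "operation_chosen"):  "R0",
    ("T0", "feedback_sensed"):   "T1",  # terminal digitizes and forwards feedback
    ("T1", "feedback_sent"):     "T0",
}

def step(state: str, event: str) -> str:
    # Unknown events leave the device in its current state (standby behavior).
    return TRANSITIONS.get((state, event), state)
```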
FIG. 23 is a flow chart of data processing according to some embodiments of the present disclosure. FIG. 23 shows the data processing flow of the information data communication system, which includes the data record processing flow of the processing device 2301, the transition processing flow of the content transition device 2302, and the transition file processing flows of the terminal devices 2303-1 to 2303-n. As shown in FIG. 23, the processing device 2301 digitizes 2305 the data records 2304 to generate electronic files, adds content tags 2306 to the electronic files, and writes a script 2307. The processing device 2301 then transmits the electronic files 2308, the content tags 2309, and the script 2310 to the content transition device 2302.
As shown in FIG. 23, the content transition device 2302 receives the electronic files 2308 from the processing device 2301 and stores them in the electronic file database 2311 for use in the different modes. The electronic file database 2311 holds electronic files 2308-1 through 2308-n and a content tag set 2309S, where the content tag set 2309S is composed of the content tags 2309 corresponding to the electronic files 2308-1 through 2308-n. The electronic files 2308 received from the processing device 2301 are also sent to the electronic file buffer 2312 and, after transition file editing 2313, the resulting transition files are stored in the transition file buffer 2314 according to the different file requirements of each terminal device 2303-1 to 2303-n. The script received from the processing device 2301 may be edited by the content transition device 2302 to regenerate the script 2307, or may be used without such editing; in either case the received script 2310 yields a script timing sequence 2317. Based on the script timing sequence 2317, the transition files 2315-1 to 2315-m in the transition file buffer 2314 are output to the terminal devices 2303-1 to 2303-n, so as to control the conveyance of the information data. The transition file buffer 2314 holds an atmosphere tag set 2316S including a plurality of atmosphere tags 2316, each associated with a corresponding transition file 2315, electronic file 2308, and content tag 2309.
Signals 2318 from the user 2324 or from the environment 2325 may be acquired by the sensing units in the terminal devices 2303-1 to 2303-n. The signals 2318 are digitized 2319 within the terminal devices 2303-1 to 2303-n and, after data comparison and atmosphere tag editing 2320 against the internal transition file database 2328, the terminal devices 2303-1 to 2303-n generate a transition file 2315-x and an atmosphere tag 2316-x for the signal and return them to the content transition device 2302. After the content transition device 2302 receives the transition file and the translator 2326 translates it into the corresponding electronic file 2308 and content tag 2309, electronic file collection 2327 is performed in the electronic file database 2311 to collect the appropriate electronic files, which are sent to the transition file buffer 2314 after transition file editing 2313.
The content transition device 2302 may execute the application-layer collaboration mode: it integrates the appropriate electronic files to compose the script 2307, generates the script timing sequence 2317 for each terminal device, and sends the transition files 2315 to the terminal devices 2303-1 to 2303-n in sequence. As shown in FIG. 23, after the terminal device 2303-1 receives the transition file 2315-1 and the atmosphere tag 2316-1 from the content transition device 2302, the controller 2321 drives the control unit 2322 to play them, conveying the information data. The transition file 2315-1 and the atmosphere tag 2316-1 received from the content transition device 2302 are also stored in the transition file database 2328 as samples for comparison after the sensing unit 2323 obtains a signal 2318. The terminal device 2303-1 may also receive an atmosphere tag 2316-u and a request to execute the application-layer coordination mode from another terminal device 2303-u; after atmosphere tag editing and identification 2320, an appropriate corresponding atmosphere tag 2316 can be found in the transition file database 2328, and the controller 2321 drives the control unit 2322 to convey the information data. The content transition device 2302 can also be requested to operate in the coordination mode via a transition file 2315 and an atmosphere tag 2316 from another content transition device 2302-1, or via an electronic file 2308 and a content tag 2309.
After the transition file 2315 and the atmosphere tag 2316 from the other content transition device 2302-1 are translated by the translator 2326, the content transition device 2302 can perform electronic file collection 2327 in its electronic file database 2311 to collect the appropriate electronic files, and then control its terminal devices 2303 to convey the information data content.
As shown in FIG. 7, the content tags are multi-dimensional data structures with semantic meaning and a plurality of feature values. In some embodiments, the content tags are used when authoring the script 2307 on the processing device and the content transition device to complete the script arrangement. In some embodiments, after a terminal device 2303 obtains sensing signals 2318 from the sensing unit 2323, the signals 2318 are digitized 2319 and atmosphere tag editing and identification 2320 is performed to generate an atmosphere tag 2316, which is transmitted to the content transition device for recognition of user feedback or environment feedback. In some implementations, the atmosphere tag 2316 may be communicated to other terminal devices operating in the application-layer coordination mode. It should be appreciated that the atmosphere tags 2316 may share with the content tags 2309 an identification scheme common to all devices.
FIG. 24 shows an atmosphere tag encoding approach in accordance with certain embodiments of the present disclosure. An exemplary encoding of the atmosphere tag is shown in FIG. 24. The atmosphere tag can use a binary, ternary, or even higher-base encoding. The atmosphere tag is an N-bit code and may include m levels of sub-codes, e.g., a first level 2310-1, a second level 2310-2, ..., and an m-th level 2310-m. Each level includes a different number of bits; for example, the first level 2310-1 includes bits 2311-1 through 2311-p, the second level 2310-2 includes bits 2312-1 through 2312-q, and the m-th level 2310-m includes bits 2313-1 through 2313-r. The first level is the highest-level classification; as in the Dewey Decimal Classification, each atmosphere tag is given a group of identifiers according to the hierarchical classification. The classification criteria of the atmosphere tags are a common basis shared by the devices of the present disclosure.
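A hedged sketch of such a hierarchical N-bit code: each level gets a fixed bit width and the levels are packed most-significant-first, so sorting by the integer value groups tags by their top-level class. The widths and values below are illustrative, not taken from the disclosure:

```python
def encode_tag(levels, widths):
    """Pack per-level codes into one integer, first level most significant."""
    code = 0
    for value, width in zip(levels, widths):
        if value >= (1 << width):
            raise ValueError("level value exceeds its bit width")
        code = (code << width) | value
    return code

def decode_tag(code, widths):
    """Recover the per-level codes from a packed atmosphere tag."""
    levels = []
    for width in reversed(widths):
        levels.append(code & ((1 << width) - 1))
        code >>= width
    return list(reversed(levels))
```

For example, `encode_tag([2, 5, 9], [4, 4, 8])` packs a three-level tag into a 16-bit code, and `decode_tag` inverts it exactly.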
The processing device 2301 and the content transition device 2302 are both capable of translating the electronic files 2308 to generate the transition files 2315 and the atmosphere tags 2316. The processing device 2301 typically creates templates of the contents of the transition files 2315 and the atmosphere tags 2316, which are communicated to all the terminal devices 2303 via the content transition device 2302. When a terminal device 2303 or the content transition device 2302 obtains a signal 2318 fed back by the user 2324 or the environment 2325, it can compare the signal against the transition file templates in the transition file database 2328 to obtain the corresponding atmosphere tag 2316 or a similar atmosphere tag 2316. The same atmosphere tag 2316 may have different transition files 2315 corresponding to different terminal devices 2303; for example, an atmosphere tag 2316 describing a thunder strike has different transition files for terminal devices such as an electronic billboard, a surround sound control terminal, and a light control terminal. Furthermore, even terminal devices 2303 of the same type may require different transition files 2315 because their control element specifications differ. After the content transition device 2302 acquires the electronic files 2308 and the content tags 2309 from the processing device 2301, the information data content of the electronic files 2308 needs to be retranslated to drive the control elements of the different types of terminal devices 2303 connected to it. This process is referred to as the transition of the electronic file 2308; the data it produces is the transition file 2315, and the features of the transition file 2315 are labeled with the atmosphere tag 2316.

FIG. 25 is a flow chart of a transition process for an electronic file according to some embodiments of the present disclosure. FIG. 25 shows a transition processing flow 2501 for the electronic file. FIG. 25 discloses a script 2502 together with the electronic files 2503-1 to 2503-n and content tags 2504-1 to 2504-n associated with the script. Taking the electronic-billboard terminal device 2505-1 as an example, the terminal device 2505-1 has a virtual screen function 2506 that can divide a screen 2518 into a plurality of screens, such as virtual screens W1, W2, W3, and W4. Each of the virtual screens W1, W2, W3, and W4 may support playback in text, photo, or movie format, and may provide animated special effects for the objects 2519 in the associated electronic file. FIG. 25 shows the transition file processing flow 2510-1 for the terminal device 2505-1.
The "electronic file flow setting 2525-1" in the transition file processing flow 2510-1 determines the flow of the electronic files to be played. The "layout selection 2511" in the transition file processing flow 2510-1 determines how many virtual screens are used, as well as the size, position, resolution, and so on of each virtual screen. The "window content setting 2512" in the transition file processing flow 2510-1 determines the electronic file to be played by each virtual screen, the scene effect 2513 of each virtual screen, and the animation effect 2514 of the objects in the electronic file. The transition file processing flow 2510-1 retranslates the electronic files into a data stream playable by the terminal device 2505-1, according to the control specification of the terminal device 2505-1, the functions it provides, and the script to be conveyed.
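The layout selection and window content settings described above can be pictured as two per-screen configuration maps merged into one per-terminal transition file; all field names and values here are assumptions for the sketch:

```python
# Illustrative settings produced by the transition file processing flow for
# the electronic billboard: layout selection defines the virtual screens,
# window content settings assign a file and effects to each.
layout = {
    "W1": {"size": (960, 540), "pos": (0, 0)},
    "W2": {"size": (960, 540), "pos": (960, 0)},
}

window_content = {
    "W1": {"file": "evacuation_map.png", "scene_effect": "fade_in"},
    "W2": {"file": "notice.txt", "scene_effect": "scroll"},
}

def build_transition_file(layout, window_content):
    """Merge layout and content settings into one per-terminal transition file."""
    return {w: {**layout[w], **window_content[w]} for w in layout}
```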
The electronic file translated in the transition file processing flow 2510-1 is referred to as the transition file 2515-1, and the transition file 2515-1 is applicable only to a specific terminal device (i.e., the terminal device 2505-1), not to all terminal devices generally. The transition file 2515-1 is given an atmosphere label 2516-1, so that the terminal device 2505-1 can sort and manage the received transition file 2515-1 in its local database according to the atmosphere label 2516-1. Based on the transition file 2515-1, the control unit 2517-1 controls the terminal device 2505-1 to communicate the contents of the transition file 2515-1. In addition, the terminal device 2505-1 may have a sensing unit 2520 (e.g., an image capturing device and an audio capturing device) to obtain feedback from the user or the external environment.
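The billboard transition flow described above (layout selection, window content settings, scene and animation effects, then labeling with an atmosphere label) can be sketched as a simple data transformation. This is only an illustrative sketch, not the claimed implementation; the function name, field names and effect values are assumptions:

```python
# Hypothetical sketch of transition file processing flow 2510-1 for an
# electronic-billboard terminal: pick a layout of virtual screens, assign
# electronic files to them, and tag the result with an atmosphere label.
def make_billboard_transition(electronic_files, atmosphere_label,
                              layout=("W1", "W2", "W3", "W4")):
    """Retranslate electronic files into a billboard-specific transition file."""
    windows = {}
    for window, efile in zip(layout, electronic_files):
        windows[window] = {
            "source": efile["name"],        # which electronic file to play
            "format": efile["format"],      # text, photo, or movie
            "scene_effect": "fade-in",      # per-window scene effect (2513)
            "object_animation": "slide",    # animation of objects (2514)
        }
    # The resulting transition file is valid only for this terminal type.
    return {"terminal_type": "electronic_billboard",
            "atmosphere_label": atmosphere_label,
            "windows": windows}

transition = make_billboard_transition(
    [{"name": "storm.mp4", "format": "movie"},
     {"name": "warning.txt", "format": "text"}],
    atmosphere_label="thunder")
```

Unused virtual screens simply receive no entry, mirroring the idea that the layout selection decides how many screens are actually used.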
Also shown in FIG. 25 is a light-controlled terminal device 2505-2. The terminal device 2505-2 includes lamps L1, L2 and L3 having different functions. The "electronic file flow setting 2525-2" in the transition file processing flow 2510-2 determines the flow of the electronic files to be presented. The "lamp configuration 2521" in the transition file processing flow 2510-2 determines how many lamps are used, and determines the characteristics of each lamp, such as brightness, color light combination, position, projection manner and projection pattern 2524. The "light scene setting 2522" in the transition file processing flow 2510-2 determines the light situation effect 2523 of each lamp, and so on. The control unit 2517-2 can control the brightness, color combination, position, projection mode and projected pattern of the lamps L1, L2 and L3, so as to provide visual control of selection, scene setting, illusion, simulation and atmosphere. When the electronic files are transitioned for the terminal device 2505-2 in the transition file processing flow 2510-2, the electronic files 2503-1 to 2503-n associated with the script may not include an electronic file for the terminal device 2505-2 (i.e., may not include an electronic file for controlling the lights). In that case, the transition file processing flow 2510-2 may arrange the lamp configuration, the light scene setting, the light situation effect and the projected light pattern of the light-controlled terminal device according to the content tags in the script, and further mark the atmosphere label 2516-2.
Regarding the terminal device 2505-n shown in FIG. 25, the "electronic file flow setting 2525-n" in the transition file processing flow 2510-n determines the flow of the electronic files to be presented. A "transition flow 2526" in the transition file processing flow 2510-n determines the characteristics associated with the terminal device 2505-n, translates the electronic files into a transition file 2515-n specific to the terminal device 2505-n, and marks the atmosphere label 2516-n.
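As noted, one atmosphere label (e.g. a thunder strike) maps to a different transition file on a billboard, a surround-sound controller and a light controller, because each terminal type has its own control elements. A minimal lookup sketch; the device names and payload fields are illustrative assumptions, not the patent's data formats:

```python
# Hypothetical: one atmosphere label indexes a different transition file
# for each terminal type, since each type drives different control elements.
TRANSITION_DB = {
    ("thunder", "electronic_billboard"): {"video": "lightning.mp4"},
    ("thunder", "surround_sound"):       {"track": "thunder.wav", "gain_db": 6},
    ("thunder", "light_controller"):     {"lamps": ["L1", "L2", "L3"],
                                          "pattern": "strobe"},
}

def transition_for(atmosphere_label, terminal_type):
    """Return the terminal-specific transition file for an atmosphere label,
    or None if no transition file has been compiled for that pairing."""
    return TRANSITION_DB.get((atmosphere_label, terminal_type))
```

Keying the database on the (label, terminal type) pair reflects the text's point that the same label cannot be served by one common transition file.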
FIG. 26 is a data flow diagram in different modes of operation of the application layer according to some embodiments of the present disclosure. Data streams 2620, 2623, 2626 and 2629 are data streams in the collaboration mode, wherein data streams 2620 and 2623 include the electronic file 2604, the content tag 2605 and the script 2606, and data streams 2626 and 2629 include the transition file 2607 and the atmosphere label 2608. Data streams 2621, 2624, 2627 and 2630 are data streams in the cooperation mode, wherein data streams 2621 and 2624 include the electronic file 2604 and the content tag 2605, and data streams 2627 and 2630 include the transition file 2607 and the atmosphere label 2608. Data streams 2622, 2625, 2628, 2631, 2632 and 2633 are data streams in the harmonic mode, wherein data streams 2622 and 2625 include the electronic file 2604 and the content tag 2605, data stream 2632 includes the electronic file 2604 and the atmosphere label 2608, and data streams 2628, 2631 and 2633 include the transition file 2607 and the atmosphere label 2608.
FIG. 26 discloses a data flow diagram between the processing device 2601, the content transition devices 2602-1 and 2602-2, and the terminal devices 2603-1 and 2603-2 when operating in the collaboration mode at the application layer. In the collaboration mode, the data record is digitized into the electronic file 2604 by the processing device 2601. The processing device 2601 adds a content tag 2605 to the electronic file 2604, and composes a script 2606 based on the content tag 2605. The processing device 2601 assembles the electronic file 2604, the content tag 2605 and the script 2606 into the data stream 2620 or 2623 in the collaboration mode. The data stream 2620 or 2623 containing the electronic file 2604, the content tag 2605 and the script 2606 is transmitted to the content transition device 2602-1 or 2602-2, is then compiled into the transition file 2607 and the atmosphere label 2608 by the content transition process, and the data stream 2626 or 2629 containing the corresponding transition file 2607 and atmosphere label 2608 is transmitted to the connected terminal device 2603-1 or 2603-2 according to the flow of the script layout.
The terminal devices of the information data communication system are then started according to the conditions set in the script to communicate the information data. The flow in which the data record is digitized into the electronic file 2604 by the processing device 2601, the electronic file is compiled into a script and transmitted to the content transition device 2602-1 or 2602-2, and the content is transited into a format that can be played by the terminal device 2603-1 or 2603-2, is called the collaboration mode. The content transition devices and terminal devices store the received, edited and used electronic files 2604 and content tags 2605 (i.e., data streams 2621, 2622, 2624, 2625 and 2632) or transition files 2607 and atmosphere labels 2608 (i.e., data streams 2627, 2628, 2630, 2631 and 2633) in their local databases as reference material for the cooperation mode and the harmonic mode.
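The collaboration-mode pipeline (digitize a record, tag it, compose a script, transit it for a terminal) can be sketched end to end. This is a hedged illustration only; the function names, the one-tag-per-file simplification and the payload fields are assumptions:

```python
# Hypothetical end-to-end collaboration-mode flow: the processing device
# digitizes a data record, attaches a content tag and composes a script;
# the content transition device compiles the script entries into
# transition files and atmosphere labels for a specific terminal type.
def processing_device(record):
    electronic_file = {"content": record}                 # digitization (2604)
    content_tag = {"topic": record.split()[0]}            # content tag (2605)
    script = [{"file": electronic_file, "tag": content_tag}]  # script (2606)
    return electronic_file, content_tag, script

def content_transition_device(script, terminal_type):
    # Compile each script entry into a transition file plus atmosphere label.
    return [{"transition_file": {"for": terminal_type,
                                 "payload": entry["file"]["content"]},
             "atmosphere_label": entry["tag"]["topic"]}
            for entry in script]

_, _, script = processing_device("storm warning tonight")
stream = content_transition_device(script, "electronic_billboard")
```

The output `stream` plays the role of data stream 2626/2629: terminal-specific transition files paired with their atmosphere labels.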
In the cooperation mode of the application layer, the content transition device 2602-1 or 2602-2 is the control center for information data communication. When the script 2606 cannot be obtained from the processing device 2601 because the network is disconnected (i.e., the data streams 2620 to 2622 and the data streams 2623 to 2625 are interrupted), or when the terminal device 2603-1 or 2603-2 receives feedback from the user, the content transition device 2602-1 or 2602-2 analyzes the feedback content and edits an appropriate electronic file 2604 from its local database to find the corresponding transition file 2607 or atmosphere label 2608. The content transition device 2602-1 or 2602-2 can compile a number of suitable electronic files 2604 into transition files 2607 and atmosphere labels 2608 for different terminal devices. The content transition device 2602-1 or 2602-2 controls the communication of information data by the terminal devices connected to it; this process is called the cooperation mode. The processing device 2601 can transfer the digitized electronic file 2604 and the corresponding content tag 2605 associated with specific information data to the local database of the content transition device 2602-1 or 2602-2 (i.e., data streams 2621, 2622, 2624 and 2625). Such data may not contain a script, so the information data communication of the collaboration mode is not started, and the content transition device takes these contents as reference material for the cooperation mode.
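In cooperation mode the content transition device must answer feedback from its local database when the processing device is unreachable. A minimal sketch of that fallback, with an assumed feedback vocabulary and database layout:

```python
# Hypothetical cooperation-mode fallback: when the processing device is
# unreachable, the content transition device serves user feedback from
# its local database instead of waiting for a script.
LOCAL_DB = {"applause": {"transition_file": "encore.bin",
                         "atmosphere_label": "celebration"}}

def handle_feedback(feedback, processing_device_online):
    if processing_device_online:
        # Normal collaboration mode: defer to the script source.
        return {"action": "forward_to_processing_device"}
    match = LOCAL_DB.get(feedback)          # analyze the feedback content
    if match is None:
        return {"action": "no_content"}
    return {"action": "play", **match}      # drive the terminal locally
```

The dictionary lookup stands in for the richer "edit an appropriate electronic file" step; the point is only the control transfer from script-driven to locally-driven communication.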
In the harmonic mode of the application layer, there is no device acting as the dominant control center. When the terminal device 2603-1 or 2603-2 and its associated content transition device 2602-1 or 2602-2 are disconnected, if the terminal device 2603-1 or 2603-2 obtains feedback from the user or a change in the environment, the terminal device digitizes the feedback content and translates it into a transition file 2607 and an atmosphere label 2608, which are first transmitted to the content transition device 2602-1 or 2602-2 to request the support of the cooperation mode. If no response is received from the content transition device 2602-1 or 2602-2, the transition file 2607 and the atmosphere label 2608 are broadcast to other terminal devices or other content transition devices, and the harmonic mode is requested. When another device receives the request and the related content in the harmonic mode, it searches its local database for appropriate content and transmits it back to the requesting device for information data communication.
If a content transition device receives a harmonic mode data request from another content transition device, the data exchanged between them are mainly electronic files that have not been transited, while the data transferred to terminal devices are transition files. The information data communication initiated by a terminal device proceeds by each terminal device obtaining an appropriate electronic file from a local processing device and participating in the information data communication; this process is called the harmonic mode. The processing device 2601 sends specific, personalized, explicit and digitized electronic files 2604 and corresponding content tags 2605 to the content transition device 2602-1 or 2602-2 (i.e., data streams 2622 and 2625), which transits them into transition files 2607 and atmosphere labels 2608 and then sends them to the terminal devices (i.e., data streams 2628 and 2631) as a reference database for the harmonic mode. The electronic file used in the harmonic mode may be an electronic file having the same content tag, or may be a collection of one or more electronic files associated with the content tags.
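The escalation order in harmonic mode (ask the associated content transition device first; on no response, broadcast to peers and accept the first match) can be sketched as follows. Dictionaries stand in for device databases; all names and payloads are illustrative assumptions:

```python
# Hypothetical harmonic-mode escalation: a terminal first asks its own
# content transition device; if there is no response it broadcasts the
# request to peer devices and accepts the first peer that answers.
def request_harmonic_support(atmosphere_label, own_device, peers):
    response = own_device.get(atmosphere_label)   # cooperation-mode request
    if response is not None:
        return ("cooperation", response)
    for peer in peers:                            # broadcast: harmonic mode
        response = peer.get(atmosphere_label)
        if response is not None:
            return ("harmonic", response)
    return ("none", None)

own = {}                                          # disconnected: no response
peer = {"thunder": "thunder_transition.bin"}
mode, payload = request_harmonic_support("thunder", own, [peer])
```

Returning the mode alongside the payload makes explicit which fallback level actually served the request.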
Fig. 27 is a schematic diagram of an internet of things according to some embodiments of the present disclosure. FIG. 27 shows the processing device 2701 connected to two levels of content transition devices 2702 and 2703-1 to 2703-n. The front-level content transition device 2702 is connected to one or more of the next-level content transition devices 2703-1 to 2703-n. As shown in the network architecture of FIG. 27, the two levels of content transition devices have the same functional architecture and processing flow. The front-level content transition device 2702 has a very high-capacity electronic file database and a highly capable script editing function, and can provide the electronic files and scripts for the collaboration mode and the cooperation mode of the other content transition devices 2703-1 to 2703-n in the application layer of the local network. The next-level content transition devices 2703-1 to 2703-n mainly handle the transition file editing for each terminal device, so as to control more numerous, varied and complicated terminal devices 2704-1 to 2704-m and 2705-1 to 2705-p, thereby providing a better information data communication environment and enabling each terminal device to operate in the harmonic mode of the application layer.
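The two-level arrangement (a front-level device holding the high-capacity archive and script editing, next-level devices handling per-terminal transition editing) can be sketched as an escalating lookup. The class name, method name and file names below are assumptions for illustration only:

```python
# Hypothetical two-level lookup: a next-level content transition device
# tries its own small database first and escalates misses to the
# front-level device, which holds the high-capacity electronic archive.
class ContentTransitionDevice:
    def __init__(self, database, upstream=None):
        self.database = database
        self.upstream = upstream    # front-level device, if any

    def find_electronic_file(self, content_tag):
        if content_tag in self.database:
            return self.database[content_tag]
        if self.upstream is not None:
            return self.upstream.find_electronic_file(content_tag)
        return None                 # not available anywhere in the hierarchy

front = ContentTransitionDevice({"storm": "storm.file", "calm": "calm.file"})
lower = ContentTransitionDevice({"calm": "calm.cached"}, upstream=front)
```

A local hit on the next-level device avoids the round trip, while misses fall through to the front-level archive, matching the division of labor the text describes.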
The foregoing description outlines features of various embodiments so that those skilled in the art may better understand the present disclosure. Those skilled in the art should appreciate that they may readily use the present disclosure as a basis for designing or modifying other methods and structures for achieving the same purposes and/or the same benefits as the embodiments described above. Such modifications, substitutions and variations do not depart from the spirit and scope of the present disclosure.
Description of the symbols
101 information content
102 magnetic tape
103 hard disk
104 solid state disk
105 optical disc
106 content cloud
107 internet
108 action device
109 computer
110 projection device
111 video and audio equipment
112 television wall
200 information data communication system
201 data record
202 electronic file
202-1 to 202-p electronic files
204-11 to 204-1n content tags
204-21 to 204-2m content tag
204-p1 through 204-pq content tags
203 processing device
204 content tag
205 content state transition device
210 environment control device
211 odour control device
212 environmental parameter sensing device
213 light control device
214 image capturing device
215 biological sensing device
216 audio receiving device
217 audio player
218 projection arrangement
219 display device
220 3D printing device
221 3D projection device
222 organisms
301 information data layer
302 digital layer
303 edit layer
304 application layer
305 meeting layer
306 network layer
307 framework layer
308 physical layer
401 processing device
401-1 to 401-3 processing devices
402 content state transition device
402-1 to 402-3 content state-changing device
403-1 to 403-n terminal devices
404-1 and 404-2 other devices
410 communication unit
420 interaction space
421 environmental parameter
422 organism
600 message packet
610 access code
611 preamble
612 synchronization code
613 suffix
620 header
621 network domain identifier of destination
622 destination content transitive device identifier
623 destination terminal device identifier
624 network domain identifier of the source
625 content transitive device identifier of source
626 terminal device identifier of source
627 link control message
630 protection section
640 payload
650 suffix
704-1 to 704-n parameters
1601 information data
1602 processing device
1603 content transfer device
1604 environment control device
1605 light control device
1606 image capturing device
1607 Audio playing device
1608 display device
1609 smell control device
1610 3D projector
1611 projection device
1612 user
1701 information data
1702 processing device
1703 content state transition device
1704 environment control device
1705 light control device
1706 image capturing apparatus
1707 Audio playing device
1708 display device
1709 odor control device
1710 production equipment sensing device
1711 optical scanning device
1712 projection arrangement
1713 Audio player
1714 odor control device
1715 3D printing device
1716 users
1717 users
1801 display device
1802 display apparatus
1803 light control device
1804 image capture equipment
1805 traffic signal control device
1806 electronic billboard
1807 electronic billboard
1808 electronic billboard
1809 Audio player
1810 user
1811 Small vehicle
1812 Large vehicle
1910 control device
1911 tactile sense integration device
1912 3D projection device
1913 3D printing device
1914 virtual reality glasses
1915 lamplight control device
1916 display device
1917 Audio playing device
1918 smell control device
1919 Infrared input/output device
1920 environment control device
1930 sensing device
1931 Global positioning System chip
1932 human-machine input interface
1933 image capturing device
1934 Audio input device
1935 odor sensing device
1936 biometric sensing device
1937 environmental parameter sensing device
1938 tactile sensing device
1939 scanning device
1940 output/input drive interface device
1951 central processing unit
1952 memory
1953 power management control device
1954 cellular network modem device
1955 Wireless network modem device
1956 Modem for wired network
1960 cellular network base station
1970 network
2010 edit layer
2011 script
2012 script compilation
2020 digital layer
2021 labeling of content tags
2022 2D electronic file
2023 atmosphere electronic file
2024 video/audio electronic file
2025 3D electronic file
2026 Multi-Format digitization
2027 compositional formulation analysis
2030 information data layer
2031 2D scanning device
2032 digital camera
2033 odor sensing device
2034 analog-to-digital converter
2035 3D holographic device
2036 3D scanning device
2037 letters
2038 image
2039 smell
2040 Audio-visual communication
2041 model
2101-1 to 2101-n electronic files
2102 content tag
2110 content tag index set
2110-1 through 2110-m content tag index
2120 information data summarization and content configuration
2130 electronic file playing program
2140 feedback tag set
2140-1 to 2140-p feedback labels
2150 user feedback hierarchy set
2150-1 to 2150-q user feedback levels
2160 corresponding operation set
2160-1 to 2160-w corresponding operations
2170 Theater
2171 transcript abstract
2172 electronic file configuration
2301 processing device
2302 content state transition device
2302-1 content state transfer device
2303-1 to 2303-n terminal devices
2303-u terminal device
2304 data recording
2305 digitizing
2306 adding content tags
2307 composing script
2308 electronic files
2308-1 to 2308-n electronic files
2309 content tag
2309S content tag set
2310 Theater
2311 electronic archive database
2312 electronic file buffer
2313 editing transition files
2314 transition state file buffer
2315 transition file
2315-1 to 2315-m transition files
2315-1 to 2315-n transition files
2315-1 to 2315-p transition files
2315-x transition files
2316 atmosphere label
2316-1 to 2316-n atmosphere label
2316-u atmosphere label
2316-x atmosphere label
2316S atmosphere label set
2316S-1 atmosphere label set
2317 script sequence
2318 signal
2319 digitalization
2320 atmosphere tag editing and recognition
2321 controller
2322 control unit
2323 sensing assembly
2324 user
2325 Environment
2326 translator
2327 electronic document collection
2328 status-transferring archive database
2310-1 first stage
2310-2 second stage
2310-m mth stage
2311-1 to 2311-p position
2312-1 to 2312-q position
2313-1 to 2313-r positions
2501 electronic file state transition processing flow
2502 script
2503-1 to 2503-n electronic files
2504-1 to 2504-n content tags
2505-1 to 2505-n terminal device
2506 virtual Screen function
2510-1 to 2510-n transition state file processing flow
2511 layout selection
2512 Window content settings
2513 scene effect of virtual screen
2514 animation effects of objects
2515-1 to 2515-n transition state file
2516-1 to 2516-n atmosphere label
2517-1 and 2517-2 control units
2518 Screen
2519 article
2520 sensing unit
2521 Lamp arrangement
2522 light scene setting
2523 light situation effect
2524 projection Pattern
2525-1 to 2525-n electronic file flow setup
2526 transition flow
2601 processing device
2602-1 content state-changing device
2602-2 content state-changing device
2603-1 terminal device
2603-2 terminal device
2604 electronic files
2605 content tags
2606 script
2607 transition files
2608 atmosphere label
2620 data flow
2621 data flow
2622 data flow
2623 data flow
2624 data flow
2625 data flow
2626 data flow
2627 data flow
2628 data flow
2629 data flow
2630 data flow
2631 data flow
2632 data flow
2633 data flow
2701 processing device
2702 content state-changing device
2703-1 to 2703-n content state transition device
2704-1 to 2704-m terminal devices
2705-1 to 2705-p terminal devices
A content tag
B content tag
C content tag
S1 to Sm states
A1 to Am-1 states
G1 to Gm-1 states
L1 to L3 lamps
O0 to O3 states
P0 to P2 states
R0 to R2 states
T0 to T2 states
W1-W4 virtual screens

Claims (34)

1. An information-data communication system, comprising:
a processing device;
a content transition device; and
a terminal device;
wherein when the terminal device detects that a user enters the range of the communication system, the communication system is configured to perform the following operations:
(a) obtaining a plurality of files from an external network according to the predetermined information data and the biological characteristics of the user;
(b) adding a plurality of content tags to each of the plurality of files;
(c) compiling a script of the information data according to content tags related to the information data in the content tags of each of the files, wherein the script comprises a plurality of feedback tags, and each of the feedback tags comprises a plurality of corresponding operations and a plurality of user feedback levels;
(d) instructing the terminal device to present at least one of the plurality of files to the user in accordance with the script; and
(e) instructing the terminal device to sense feedback of the user, and at least one of performing the corresponding operation and determining the user feedback level in response to the feedback of the user.
2. The messaging data communication system of claim 1, further comprising:
changing the files to be presented and the playing mode of the files to be presented in the next time segment according to at least one of the feedback of the user and the corresponding operation.
3. The messaging data communication system of claim 1, further comprising:
determining one of the plurality of user feedback levels in response to the feedback of the user, updating the user satisfaction based on the determined user feedback level.
4. The messaging system of claim 1, wherein the terminal device comprises at least one of: the device comprises a 3D projection device, a 3D printing device, a display device, a projection device, an audio playing device, an audio receiving device, a biological sensing device, a photographic device, a light control device, an environmental parameter sensing device, an odor control device and an environmental control device.
5. The messaging system of claim 1, wherein the plurality of files comprises one or more of: video files, audio files, image files, document files, and engineering drawing files.
6. The messaging data communication system of claim 5, wherein the plurality of content tags comprise one or more of: the temporal location and pixel location of the video file, the temporal location of the audio file, the pixel location of the image file, the number of pages and number of lines of the file, and the pixel location, number of pages and number of lines of the engineering drawing file.
7. The messaging system of claim 6, wherein each of the plurality of content tags further comprises a plurality of parameters for providing a summary of the content of the plurality of files to compose the script.
8. The information data communication system according to claim 1, wherein the scenario of the information data includes a file to be played in each time zone and a playback manner of the file to be played.
9. The information data communication system of claim 6, wherein the scenario of the information data further comprises, in each time zone, specifying:
the playing time position and the playing pixel position of the video file to be played;
the playing time position of the audio file to be played;
the position of the playing pixel of the image file to be played;
the number of playing pages and the number of playing lines of the file to be played; or
The position of the pixel to be played, the number of pages to be played and the number of lines to be played in the engineering drawing file.
10. The messaging communication system of claim 8, wherein performing at least one of the corresponding operations based on the feedback from the user comprises changing a file to be played or a manner in which the file to be played is played in a next time segment.
11. The messaging system of claim 9, wherein performing at least one of the corresponding operations based on the feedback from the user comprises changing a playing time position and a playing pixel position of a video file to be played, a playing time position of an audio file to be played, a playing pixel position of an image file to be played, a number of pages and a number of lines of a file to be played, or a playing pixel position, a number of pages and a number of lines of an engineering drawing file to be played.
12. The messaging data communication system of claim 1, wherein the processing device performs operations (a) through (e) to operate in a cooperative mode.
13. The messaging system of claim 1 wherein if the processing device in the system is unable to perform operations (a) through (e), the content state transition device performs operations (a) through (e) to operate in a collaboration mode.
14. The messaging data communication system of claim 1, wherein if the processing device and the content transition device in the system are unable to perform operations (a) through (e), the terminal device performs operations (a) through (e) to operate in a harmonic mode.
15. The messaging system of claim 12, wherein the processing device transmits the script to the terminal device via the content transition device.
16. The messaging system of claim 15, wherein the content transition device is capable of sending the script to another processing device or another content transition device.
17. The messaging system of claim 16, wherein the terminal device is capable of sending the script, at least one of the plurality of files, and the feedback from the user to another terminal device.
18. The messaging system of claim 12 wherein said feedback of said user sensed by said terminal device is transmitted to said processing device via said content state transition device.
19. The messaging system of claim 1, wherein the biometric characteristic of the user comprises at least one of: the gender of the user, the skin tone of the user, the age of the user, and the height of the user.
20. The information data communication system according to claim 1, wherein the information data is predetermined based on a biometric characteristic of the user. ("information data broadcast").
21. The messaging data communication system of claim 3, wherein at least one of the corresponding actions is performed in response to the feedback from the user such that the user satisfaction is maintained within a predetermined range. (information data modeling).
22. The information data communication system according to claim 1, wherein a user feedback level determined in response to the feedback of the user is further transmitted to one of the processing device, the content transition device, and the terminal device. ("interest acquisition").
23. The messaging data communication system of claim 3, wherein at least one of the corresponding actions is performed in response to the feedback from the user to maximize the user satisfaction. ("information data satisfies").
24. A method of information-data communication, comprising:
(a) obtaining a plurality of files from an external network according to the predetermined information data and the biological characteristics of the user;
(b) adding a plurality of content tags to each of the plurality of files, the plurality of content tags being associated with the information data;
(c) compiling a script of the information data according to a plurality of content tags of each of the plurality of files, the script including a plurality of feedback tags, each of the plurality of feedback tags including a plurality of corresponding operations and a plurality of user feedback levels;
(d) instructing a terminal device to present at least one of the plurality of files to the user according to the script; and
(e) instructing the terminal device to sense feedback of the user and at least one of perform the corresponding operation and determine the user feedback level according to the feedback of the user.
25. The messaging method of claim 24, wherein each of the plurality of content tags further comprises a plurality of parameters for providing a summary of the content of the plurality of files.
26. The information data communication method according to claim 24, wherein the scenario of the information data includes a file to be played in each time zone and a playback manner of the file to be played in each time zone.
27. The messaging data communication method of claim 24, further comprising determining one of the plurality of user feedback levels in response to the feedback of the user, updating the user satisfaction level based on the determined user feedback level.
28. The messaging method of claim 26, wherein performing at least one of the corresponding operations based on the feedback from the user comprises changing a file to be played in a next time segment and a manner in which the file is played.
29. The method of communicating information data as in claim 26, further comprising:
changing the files to be presented and the playing mode of the files to be presented in the next time segment according to at least one of the feedback of the user and the corresponding operation.
30. An information-data communication system, comprising:
a processing device;
a content state transition device; and
terminal device, wherein
The processing device is configured to perform the following operations:
obtaining a plurality of files from an external network according to the predetermined information data;
adding a plurality of content tags to each of the plurality of files, the plurality of content tags being associated with the information data;
compiling a script of the information data based on a plurality of content tags of each of the plurality of files; and
transmitting the plurality of files, the plurality of content tags and the script to a content transition device, and
the content state transition device is configured to perform the following operations:
converting a file associated with the terminal device from the plurality of files into a first transition file corresponding to the terminal device;
labeling the first transfer file with a corresponding first atmosphere label; and
transmitting the first transition file and the first atmosphere label to the terminal device to instruct the terminal device to present the transition file to a user.
31. The messaging data communication system of claim 30, further comprising:
the terminal device receiving a signal from the user or environment;
the terminal device searches a second atmosphere label and a second state transition file which are consistent with the signal in a local database, transmits the second atmosphere label and the second state transition file to the content state transition device, and transmits an update request of the state transition file and the atmosphere label.
32. The messaging data communication system of claim 31, further comprising:
the content transition device receives the second atmosphere label, the second transition file and the updating request from the terminal device;
and the content state transition device searches a local database for a third atmosphere label and a third transition file consistent with the second atmosphere label and the second transition file, and transmits the third atmosphere label and the third transition file to the terminal device in response to the update request of the terminal device.
33. The messaging data communication system of claim 31, further comprising:
the content transition device receives the second atmosphere label, the second transition file and the updating request from the terminal device;
and if the local database of the content state transition device does not have a third atmosphere label and a third state transition file which are consistent with the second atmosphere label and the second state transition file, transmitting the second atmosphere label, the second state transition file and the updating request to another content state transition device so as to respond to the updating request of the terminal device.
34. The messaging data communication system of claim 30, further comprising:
the terminal device receives a second atmosphere label from another terminal device;
and the terminal device searches a second state transition file which is consistent with the second atmosphere label in a local database and plays the second state transition file.
CN201811125155.2A 2018-09-26 2018-09-26 Information data communication system and method thereof Active CN110955326B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310472854.9A CN116560502A (en) 2018-09-26 2018-09-26 Cultural data communication system and method thereof
CN201811125155.2A CN110955326B (en) 2018-09-26 2018-09-26 Information data communication system and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811125155.2A CN110955326B (en) 2018-09-26 2018-09-26 Information data communication system and method thereof

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310472854.9A Division CN116560502A (en) 2018-09-26 2018-09-26 Cultural data communication system and method thereof

Publications (2)

Publication Number Publication Date
CN110955326A true CN110955326A (en) 2020-04-03
CN110955326B CN110955326B (en) 2023-08-04

Family

ID=69964657

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202310472854.9A Pending CN116560502A (en) 2018-09-26 2018-09-26 Cultural data communication system and method thereof
CN201811125155.2A Active CN110955326B (en) 2018-09-26 2018-09-26 Information data communication system and method thereof

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202310472854.9A Pending CN116560502A (en) 2018-09-26 2018-09-26 Cultural data communication system and method thereof

Country Status (1)

Country Link
CN (2) CN116560502A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105611049A (en) * 2014-09-25 2016-05-25 单版画股份有限公司 Selectable styles for text messaging system publishers
CN106612229A (en) * 2015-10-23 2017-05-03 腾讯科技(深圳)有限公司 Method and device for performing feedback on UGC (User Generated Content) and method and device for displaying feedback information
CN108476493A (en) * 2015-12-30 2018-08-31 Idac控股公司 Method, system and equipment for wireless transmitter/receiver unit cooperation
CN107436816A (en) * 2016-05-27 2017-12-05 腾讯科技(深圳)有限公司 Control method, system and the terminal that a kind of message is sent to
US20170364319A1 (en) * 2016-06-20 2017-12-21 Xerox Corporation System and method for conveying print device status information using a light indicator feedback mechanism
CN107609913A (en) * 2017-09-19 2018-01-19 上海恺英网络科技有限公司 A kind of method and system of data analysis tracking

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115062148A (en) * 2022-06-23 2022-09-16 广东国义信息科技有限公司 Database-based risk control method
CN115062148B (en) * 2022-06-23 2023-06-20 广东国义信息科技有限公司 Risk control method based on database

Also Published As

Publication number Publication date
CN110955326B (en) 2023-08-04
CN116560502A (en) 2023-08-08

Similar Documents

Publication Publication Date Title
US11335210B2 (en) Apparatus and method for analyzing images
CN111339246B (en) Query statement template generation method, device, equipment and medium
US20140289323A1 (en) Knowledge-information-processing server system having image recognition system
US11809213B2 (en) Controlling duty cycle in wearable extended reality appliances
McMullan A new understanding of ‘New Media’: Online platforms as digital mediums
CN101356526B (en) Method for generating a work of communication
CN104035995B (en) Group's label generating method and device
US20190095959A1 (en) Internet of advertisement method and system
WO2007043679A1 (en) Information processing device, and program
KR20100002756A (en) Matrix blogging system and service support method thereof
US20220246135A1 (en) Information processing system, information processing method, and recording medium
JP2010224715A (en) Image display system, digital photo-frame, information processing system, program, and information storage medium
US11838587B1 (en) System and method of providing customized media content
Stamatiadou et al. Semantic crowdsourcing of soundscapes heritage: a mojo model for data-driven storytelling
Fraisse et al. Comprehensive framework for describing interactive sound installations: Highlighting trends through a systematic review
JP2008198135A (en) Information delivery system, information delivery device and information delivery method
CN110955326B (en) Information data communication system and method thereof
CN116051192A (en) Method and device for processing data
US20230033675A1 (en) Systems and methods for localized information provision using wireless communication
US20210004747A1 (en) Information processing device, information processing method, and program
Meyer et al. Intelligent video highlights generation with front-camera emotion sensing
JP2006121264A (en) Motion picture processor, processing method and program
JP4546125B2 (en) Interface presenting method and interface presenting system
Beebe A Complete Bibliography of Publications in IEEE MultiMedia
CN116021530A (en) Explanation robot, exhibit explanation method and exhibit explanation system thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230705

Address after: 12F-1, No. 325, Section 4, Chinese Road, Da'an District, Taipei City, Taiwan

Applicant after: Yichi jingcaizitong Co.,Ltd.

Applicant after: Zhang Yijia

Address before: 12F-1, No. 325, Section 4, Chinese Road, Da'an District, Taipei City, Taiwan

Applicant before: Yichi jingcaizitong Co.,Ltd.

GR01 Patent grant