WO2023109924A1 - Serialization system for random data access

Serialization system for random data access

Info

Publication number
WO2023109924A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
sub
data access
user
server
Prior art date
Application number
PCT/CN2022/139426
Other languages
English (en)
French (fr)
Inventor
Lei Xu
Wenbo Jin
Qingpeng ZHAO
Original Assignee
Lei Xu
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lei Xu filed Critical Lei Xu
Publication of WO2023109924A1 publication Critical patent/WO2023109924A1/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45 Controlling the progress of the video game
    • A63F13/48 Starting a game, e.g. activating a game device or waiting for other players to join a multiplayer session
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00 Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F13/38 Information transfer, e.g. on bus
    • G06F13/42 Bus transfer protocol, e.g. handshake; Synchronisation
    • G06F13/4282 Bus transfer protocol, e.g. handshake; Synchronisation on a serial bus, e.g. I2C bus, SPI bus
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35 Details of game servers
    • A63F13/358 Adapting the game course according to the network or server load, e.g. for reducing latency due to different connection speeds between clients
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/67 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00 Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F13/38 Information transfer, e.g. on bus
    • G06F13/382 Information transfer, e.g. on bus using universal interface adapter
    • G06F13/385 Information transfer, e.g. on bus using universal interface adapter for adaptation of a particular data processing system to different peripheral devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/602 Providing cryptographic facilities or services
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Definitions

  • the present invention belongs to the field of computer technologies, and specifically, relates to a serialization system for random data access.
  • a machine learning model calculates, from the historical state and current context of a program, the set of data likely to be accessed next together with its probability; random data access can therefore be represented in an approximately serial manner, enabling preloading, predictive loading, and random reading of data.
  • the present invention provides a serialization system for random data access that "serializes" access to individual pieces of data and serves data access in a manner similar to streaming media, thereby resolving the problems of distributing and accessing games, programs, and data.
  • a game can be clicked and played without waiting for installation or download.
  • a program can likewise be clicked and opened quickly, without waiting for download and installation; big data can be accessed without first downloading it locally, and random access at any time becomes possible without worrying about local storage capacity.
  • a serialization system for random data access includes a big data analysis and processing sub-system, a data access serialization server sub-system, a client runtime sub-system, and a prediction and instruction-combined data distribution sub-system.
  • the big data analysis and processing sub-system includes a machine learning algorithm sub-system, a behavioral data processing and modeling sub-system, and a model evaluation sub-system.
  • the machine learning algorithm sub-system selects machine learning algorithms, such as Apriori, Naive Bayes, Bayesian networks, K-Means, KNN, DBSCAN, SVM, LSTM, CNN, AdaBoost, GBDT, and Random Forest, according to feedback from research and actual production, and uses one or more of them, alone or in combination, to construct a machine learning algorithm pool.
  • the behavioral data processing and modeling sub-system uses different algorithms in the machine learning algorithm pool to process collected user behaviors and program behaviors, and establishes a data access prediction model for different users and different data with reference to types and characteristics of the data.
  • the models predict, based on a user behavior and a program behavior, data that a user needs next.
  • the model evaluation sub-system evaluates, for each user and each piece of data, the prediction success rate and its stability across samples, and computes indicators such as model complexity, in order to select the best-performing models.
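The following minimal sketch (not part of the patent text) shows one way such an algorithm pool and evaluation step could be wired together, assuming scikit-learn estimators and hypothetical `features`/`next_block` arrays of user-behavior features and next-accessed-block labels:

```python
# Hypothetical sketch of the big data analysis and processing sub-system:
# build a pool of candidate algorithms, fit per-user data-access prediction
# models, and keep the best-scoring ones. Feature and label names are illustrative.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

ALGORITHM_POOL = {
    "naive_bayes": lambda: GaussianNB(),
    "random_forest": lambda: RandomForestClassifier(n_estimators=100),
    "gbdt": lambda: GradientBoostingClassifier(),
}

def train_and_evaluate(features: np.ndarray, next_block: np.ndarray, top_k: int = 2):
    """Fit every algorithm in the pool on (user behavior, program context) features
    labelled with the block that was actually accessed next, then rank models by
    cross-validated success rate and its variance, a stand-in for the patent's
    'stability under samples' and complexity indicators."""
    scored = []
    for name, make_model in ALGORITHM_POOL.items():
        model = make_model()
        scores = cross_val_score(model, features, next_block, cv=5)
        model.fit(features, next_block)
        scored.append((scores.mean(), scores.std(), name, model))
    # Prefer high mean accuracy, then low variance across folds.
    scored.sort(key=lambda s: (-s[0], s[1]))
    return scored[:top_k]
```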
  • the data access serialization server sub-system authenticates both the user and the requested data when it receives a data access request from a client, and rejects invalid access; if the request is a preloading request, it selects the corresponding model and generates a preloaded packet; if the request is a real-time reading request, it looks up the corresponding compressed data block directly, encrypts it, and returns it to the client; if the request is a predicted download request, it selects the corresponding model, feeds in the user behavior and program context transmitted from the client, calculates the data that will be needed next, looks up the corresponding compressed data blocks, encrypts them, and returns them to the client.
  • the client may negotiate with a server to select non-compressed and non-encrypted data.
  • the server divides the data into data blocks and selects, for the blocks, compression algorithms with high compression ratios and low decompression cost, together with suitable compression parameters.
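A hedged sketch of the server-side dispatch described above; zlib and Fernet stand in for the unspecified compression and encryption algorithms, and the request kinds and model interface (`initial_blocks`, `next_blocks`) are assumptions made only for illustration:

```python
# Illustrative server-side dispatch for the data access serialization sub-system.
# zlib and Fernet are placeholders for the unspecified compression/encryption;
# the authentication check and the model interface below are assumed, not quoted.
import zlib
from cryptography.fernet import Fernet

class SerializationServer:
    def __init__(self, blocks: dict[str, bytes], models: dict[str, object]):
        # Store blocks pre-compressed, as the description suggests.
        self.blocks = {k: zlib.compress(v, 9) for k, v in blocks.items()}
        self.models = models
        self.fernet = Fernet(Fernet.generate_key())

    def authenticate(self, user: str, data_id: str) -> bool:
        return bool(user) and data_id in self.models   # stub check

    def handle(self, user, data_id, kind, block_id=None, behavior=None, context=None):
        if not self.authenticate(user, data_id):
            return {"status": "rejected"}
        model = self.models[data_id]
        if kind == "preload":
            wanted = model.initial_blocks(context)
        elif kind == "realtime_read":
            wanted = [block_id]
        elif kind == "predicted_download":
            wanted = model.next_blocks(behavior, context)
        else:
            return {"status": "unknown_request"}
        # Encrypt each already-compressed block before returning it.
        return {"status": "ok",
                "blocks": [self.fernet.encrypt(self.blocks[b]) for b in wanted]}
```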
  • the client runtime sub-system takes over data access requests and maps them to accesses to a local cache and the server; it satisfies data access requirements through preloading, real-time reading, and predictive loading, and is transparent to both the user and the game (program) developer; it records the user's behavior during game playing (program use), including key presses, key press duration, dwell time in each interface, program responses, the program's running context, and program information such as data accesses, and reports these behaviors to the server, where they are analyzed to establish the data access model;
  • it intercepts a program's data access requests, maps them to accesses to the local cache and the server, and is transparent to the user;
  • the local cache stores data compressed with a high-compression-ratio algorithm, and uses an aggressive cache invalidation algorithm to evict stale entries.
  • a greedier preloading and prediction policy is negotiated with the server to reduce the impact of network jitter.
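A minimal sketch of the client runtime behavior, assuming a hypothetical `fetch_from_server` callback and simple LRU eviction as one possible "aggressive" invalidation scheme (the patent does not name one):

```python
# Illustrative client runtime: intercept block reads, serve them from a compressed
# local cache when possible, otherwise fetch from the server, and keep the cache
# small with aggressive LRU eviction. All names here are illustrative.
import zlib
from collections import OrderedDict
from typing import Callable

class ClientRuntime:
    def __init__(self, fetch_from_server: Callable[[str], bytes], max_entries: int = 64):
        self.fetch_from_server = fetch_from_server   # returns compressed block bytes
        self.cache: OrderedDict[str, bytes] = OrderedDict()
        self.max_entries = max_entries
        self.behavior_log: list[dict] = []           # reported to the server for modelling

    def read_block(self, block_id: str) -> bytes:
        """Transparent replacement for a local file read of one data block."""
        self.behavior_log.append({"event": "read", "block": block_id})
        compressed = self.cache.pop(block_id, None)
        if compressed is None:
            compressed = self.fetch_from_server(block_id)
        self.cache[block_id] = compressed            # mark as most recently used
        while len(self.cache) > self.max_entries:    # aggressive eviction of stale entries
            self.cache.popitem(last=False)
        return zlib.decompress(compressed)
```

Keeping blocks compressed in the cache trades a little CPU on every read for a much larger effective cache, which matches the high-compression-ratio local cache described above.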
  • the prediction and instruction-combined data distribution sub-system normally delivers, according to the user behavior, the program behavior, and the model generated by the big data analysis and processing sub-system, the data that a user is using or is about to use.
  • the prediction and instruction-combined data distribution sub-system also allows a game (program) developer to implant instructions into the game (program) code, or on the server, according to a protocol; when the game (program) runs to a specific position, or its context satisfies a specified condition, the data distribution sub-system executes the corresponding instruction, for example delivering specific data or deleting specific data from the cache; by combining prediction with instructions, data access achieves the same zero network latency as access to local data.
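One way this instruction mechanism could be realized is a small registry of condition-action rules evaluated against the program context; the action names and the reuse of the ClientRuntime sketch above are illustrative assumptions, not the patent's protocol:

```python
# Illustrative "prediction + instruction" hook: a developer registers directives
# tied to program positions or context conditions, and the runtime executes them
# when triggered, e.g. pre-delivering a block or dropping it from the cache.
from typing import Callable

class InstructionTable:
    def __init__(self):
        self.rules: list[tuple[Callable[[dict], bool], str, str]] = []

    def register(self, condition: Callable[[dict], bool], action: str, block_id: str):
        """action is 'deliver' or 'evict'; condition inspects the program context."""
        self.rules.append((condition, action, block_id))

    def on_context(self, context: dict, runtime) -> None:
        """Called whenever the program context changes; runtime is a ClientRuntime."""
        for condition, action, block_id in self.rules:
            if condition(context):
                if action == "deliver":
                    runtime.read_block(block_id)      # warms the cache ahead of use
                elif action == "evict":
                    runtime.cache.pop(block_id, None)

# Hypothetical usage: when the player reaches level 3, pre-deliver the level-4 assets.
# table.register(lambda ctx: ctx.get("level") == 3, "deliver", "level4_assets")
```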
  • the serialization system for random data access provided in the present invention is properly designed and includes the following advantages:
  • the present invention greatly reduces capacity limits in game (program) development, and is transparent to both users and developers.
  • a game developer can access a game resource of any capacity by accessing a local file.
  • the present invention naturally supports an open-ended game, such as a metaverse game, and also greatly reduces the difficulty of game development.
  • the present invention naturally supports interactive videos such as AR/VR.
  • the present invention unifies the access interfaces for cloud data and local data, lowering the technical threshold of the cloud storage + local computation mode, which makes cloud storage genuinely practical and more convenient.
  • FIG. 1 is a schematic block diagram of a serialization system for random data access according to the present invention.
  • FIG. 2 is a block diagram of a big data analysis sub-system for universal data streaming of a serialization system for random data access according to the present invention.
  • FIG. 3 is a block flowchart of server data access processing of a serialization system for random data access according to the present invention.
  • FIG. 4 is a flowchart of client data access processing of a serialization system for random data access according to the present invention.
  • FIG. 5 is a block diagram of a client runtime sub-system of a serialization system for random data access according to the present invention.
  • orientations or positional relationships such as “center”, “upper”, “lower”, “left”, “right”, “vertical”, “horizontal”, “inner”, and “outer”, are based on the orientations or positional relationships shown in the accompanying drawings, are used merely for the convenience of describing the present invention and simplifying the description, rather than indicating or implying that the indicated apparatuses or elements need to have specific orientations or be constructed and perform operations in specific orientations, and therefore, should not be construed as limitations on the present invention.
  • “first”, “second”, and “third” are used merely for the purpose of description, and should not be understood as indicating or implying relative importance.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Hardware Design (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
PCT/CN2022/139426 2021-12-15 2022-12-15 Serialization system for random data access WO2023109924A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111532690.1A CN114791893B (zh) 2021-12-15 2021-12-15 Serialization system for random data access
CN202111532690.1 2021-12-15

Publications (1)

Publication Number Publication Date
WO2023109924A1 true WO2023109924A1 (en) 2023-06-22

Family

ID=82460746

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/139426 WO2023109924A1 (en) 2021-12-15 2022-12-15 Serialization system for random data access

Country Status (2)

Country Link
CN (1) CN114791893B (zh)
WO (1) WO2023109924A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114791893B (zh) * 2021-12-15 2023-05-09 Lei Xu Serialization system for random data access

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109887098A (zh) * 2019-02-13 2019-06-14 Communication University of Zhejiang Web AR data presentation method based on distributed computing
CN110023910A (zh) * 2016-10-03 2019-07-16 PayPal, Inc. Predictive analysis of computing patterns for preloading data to reduce processing downtime
US20200278797A1 (en) * 2019-02-28 2020-09-03 Micron Technology, Inc. Use of outstanding command queues for separate read-only cache and write-read cache in a memory sub-system
CN113730902A (zh) * 2021-08-13 2021-12-03 Lei Xu Download-free running method for games
CN114791893A (zh) * 2021-12-15 2022-07-26 Lei Xu Serialization system for random data access

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060080702A1 (en) * 2004-05-20 2006-04-13 Turner Broadcasting System, Inc. Systems and methods for delivering content over a network
US9336483B1 (en) * 2015-04-03 2016-05-10 Pearson Education, Inc. Dynamically updated neural network structures for content distribution networks
CN106383768A (zh) * 2016-09-14 2017-02-08 Jiangsu Beigong Intelligent Technology Co., Ltd. Supervision and analysis system based on mobile device operation behavior, and method thereof
CN106874521B (zh) * 2017-03-20 2020-07-28 Nanjing Yunkai Technology Co., Ltd. Big data learning analysis system and method
CN108549583B (zh) * 2018-04-17 2021-05-07 Zhiyun Technology Co., Ltd. Big data processing method and apparatus, server, and readable storage medium
CN109925718B (zh) * 2019-01-14 2023-04-28 Zhuhai Kingsoft Digital Network Technology Co., Ltd. System and method for distributing game micro-client maps
CN113347170B (zh) * 2021-05-27 2023-04-18 Beijing Institute of Computer Technology and Application Intelligent analysis platform design method based on a big data framework

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110023910A (zh) * 2016-10-03 2019-07-16 PayPal, Inc. Predictive analysis of computing patterns for preloading data to reduce processing downtime
CN109887098A (zh) * 2019-02-13 2019-06-14 Communication University of Zhejiang Web AR data presentation method based on distributed computing
US20200278797A1 (en) * 2019-02-28 2020-09-03 Micron Technology, Inc. Use of outstanding command queues for separate read-only cache and write-read cache in a memory sub-system
CN113730902A (zh) * 2021-08-13 2021-12-03 Lei Xu Download-free running method for games
CN114791893A (zh) * 2021-12-15 2022-07-26 Lei Xu Serialization system for random data access

Also Published As

Publication number Publication date
CN114791893A (zh) 2022-07-26
CN114791893B (zh) 2023-05-09

Similar Documents

Publication Publication Date Title
US9317209B2 (en) Using external memory devices to improve system performance
US9058212B2 (en) Combining memory pages having identical content
TW201019110A (en) Managing cache data and metadata
WO2007106383A1 (en) Selective address translation for a resource such as a hardware device
US20200201820A1 (en) Coordinator for preloading time-based content selection graphs
WO2023109925A1 (en) Universal computing task cooperation system
US11720488B2 (en) Garbage collection of preloaded time-based graph data
WO2023109924A1 (en) Serialization system for random data access
CN103329128A (zh) Utilizing content via a personal cloud
US20230023208A1 (en) Preloaded content selection graph validation
US20130091326A1 (en) System for providing user data storage environment using network-based file system in n-screen environment
WO2023116795A1 (en) Streaming system for general data
JP2006005759A (ja) Server device, playback device, content transmission method, content playback method, content playback system, and program
US8127079B2 (en) Intelligent cache injection
US11409670B2 (en) Managing lock coordinator rebalance in distributed file systems
US20220147646A1 (en) Enabling applications to access cloud data
WO2023125875A1 (en) Correlation-based streaming method for game data
Jiang et al. Coordinated multilevel buffer cache management with consistent access locality quantification
CN111666559A Data bus management method and apparatus supporting permission management, electronic device, and storage medium
Alazzawe et al. Efficient big-data access: Taxonomy and a comprehensive survey
US20220100764A1 (en) Collection of timepoints and mapping preloaded graphs
Shen et al. Ditto: An elastic and adaptive memory-disaggregated caching system
Li et al. An efficient large‐scale whole‐genome sequencing analyses practice with an average daily analysis of 100Tbp: ZBOLT
US7664916B2 (en) Global smartcard cache methods and apparatuses
CN101202758B Multi-client network virtual storage method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22906669

Country of ref document: EP

Kind code of ref document: A1