Command and Staff College, National Defence University (NDU), Pakistan
Disseminating data globally in near-real-time is a challenge, and data replication as an alternative is costly and slow. One possible solution is a protocol that classifies data at the source. This sorting of data is based on spatial, temporal, and spectral significance defined at the source. Parametric thresholds, i.e. boundary conditions for classification, are part of the architecture, resulting in propagation-ready warehousing of data at the source. Further optimization is achieved by propagating only variations from the standard data, or from the most specific conditions, which yields a sizable reduction in the data transmitted over the networks. The complete data set, or a subset as applicable, can be rebuilt at the remote site by adding the received variations to the standard conditions, regenerating the data as recorded by the sensors. The protocol can be implemented at the transport or higher layers of the Open Systems Interconnection (OSI) model. New fields are required in existing database designs to accommodate keeping only records of the received variations. Recording only variations at the sensors themselves, even prior to analogue-to-digital conversion and transmission, could also be investigated as a further optimization with possibly beneficial outcomes.
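The variation-based scheme described above can be sketched in a few lines. This is an illustrative Python sketch, not the authors' implementation: the baseline value, the single deviation threshold, and the (index, variation) packet format are all assumptions standing in for the "standard conditions" and "parametric thresholds" of the abstract.

```python
BASELINE = 20.0    # hypothetical "standard condition" for one sensor
THRESHOLD = 0.5    # hypothetical parametric threshold for classification

def encode(readings):
    """Source side: keep only (index, variation) pairs whose deviation
    from the baseline exceeds the threshold; all other samples are
    implied by the standard condition and are not transmitted."""
    return [(i, r - BASELINE) for i, r in enumerate(readings)
            if abs(r - BASELINE) > THRESHOLD]

def decode(variations, length):
    """Remote side: rebuild the full series by adding the received
    variations back onto the standard baseline."""
    series = [BASELINE] * length
    for i, delta in variations:
        series[i] = BASELINE + delta
    return series

readings = [20.0, 20.1, 23.7, 20.2, 18.4, 20.0]
packet = encode(readings)                 # only 2 of 6 samples are sent
restored = decode(packet, len(readings))
```

Note that sub-threshold deviations (e.g. 20.1 above) are reconstructed as the baseline value; this lossy treatment of near-standard readings is exactly the approximation that produces the reduction in transmitted data.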
Start time: 29/Jun/2017, 09:45 (local time)
Duration: 15 minutes
Location: Hofburg, Rittersaal