A New Hope for Network Model Generalization

HotNets '22: Proceedings of the 21st ACM Workshop on Hot Topics in Networks

Abstract

Generalizing machine learning (ML) models for network traffic dynamics tends to be considered a lost cause. Hence, for every new task, we design new models and train them on model-specific datasets that closely mimic the deployment environments. Yet, an ML architecture called the Transformer has enabled previously unimaginable generalization in other domains. Nowadays, one can download a model pre-trained on massive datasets and fine-tune it for a specific task and context with comparatively little time and data. These fine-tuned models are now state-of-the-art on many benchmarks. We believe this progress could translate to networking and propose a Network Traffic Transformer (NTT), a transformer adapted to learn network dynamics from packet traces. Our initial results are promising: NTT seems able to generalize to new prediction tasks and environments. This study suggests there is still hope for generalization through future research.
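To make the idea concrete, the core building block a transformer applies to a sequence is self-attention: each element of a packet trace is embedded and then re-weighted by its similarity to every other element. The following is a minimal numpy sketch of one such attention step over a toy packet trace; the per-packet features, dimensions, and prediction head are illustrative assumptions, not the NTT design from the paper.

```python
import numpy as np

# Illustrative sketch only: a single self-attention step over a packet
# trace, in the spirit of learning network dynamics from packet data.
# Feature choices and dimensions below are assumptions for illustration.

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (seq_len, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # pairwise packet similarities
    return softmax(scores) @ V                # mix packets by attention weight

# A toy "packet trace": per-packet [inter-arrival time, size, delay].
trace = rng.random((16, 3))

d_model = 8
W_embed = rng.normal(size=(3, d_model))           # embed raw packet features
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
W_out = rng.normal(size=(d_model, 1))             # hypothetical delay-prediction head

H = self_attention(trace @ W_embed, Wq, Wk, Wv)   # contextual packet embeddings
pred = H @ W_out                                  # one prediction per packet
print(pred.shape)
```

In the pre-train/fine-tune paradigm the abstract describes, the attention layers and embeddings would be trained once on large generic traces, and only a small head such as `W_out` (plus light fine-tuning) adapted per task.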

Research Area: Network Analysis and Reasoning

People

Dr. Alexander Dietmüller
PhD student
2018–2024

BibTeX

@inproceedings{dietmueller2022network,
  author    = {Dietm{\"{u}}ller, Alexander and Ray, Siddhant and Jacob, Romain and Vanbever, Laurent},
  title     = {{A New Hope for Network Model Generalization}},
  booktitle = {HotNets '22: Proceedings of the 21st ACM Workshop on Hot Topics in Networks},
  address   = {Austin, TX, USA},
  year      = 2022,
  month     = nov,
  publisher = {Association for Computing Machinery},
  doi       = {10.1145/3563766.3564104},
  url       = {https://www.research-collection.ethz.ch/bitstream/handle/20.500.11850/577569/main.pdf}
}

Research Collection: 20.500.11850/577569

Slide Sources: https://gitlab.ethz.ch/projects/41272