Browse/Search Results: 1-3 of 3

EINS: Edge-Cloud Deep Model Inference with Network-Efficiency Schedule in Serverless (Conference paper)
Peng, Shijie; Lin, Yanying; Chen, Wenyan; Tang, Yingfei; Duan, Xu; Ye, Kejiang. EINS: Edge-Cloud Deep Model Inference with Network-Efficiency Schedule in Serverless[C]. IEEE, 2024: 1376-1381.
TC[Scopus]: 0 | Submit date: 2024/08/05
Keywords: Edge-cloud Collaborative; Network-efficiency; Serverless Inference
QUART: Latency-Aware FaaS System for Pipelining Large Model Inference (Conference paper)
Lin, Yanying; Li, Yanbo; Peng, Shijie; Tang, Yingfei; Luo, Shutian; Shen, Haiying; Xu, Chengzhong; Ye, Kejiang. QUART: Latency-Aware FaaS System for Pipelining Large Model Inference[C]. Institute of Electrical and Electronics Engineers Inc., 2024: 1-12.
TC[WOS]: 0 | TC[Scopus]: 0 | Submit date: 2024/10/10
Keywords: Large Model; Latency Aware; Pipeline Inference; Serverless
BBServerless: A Bursty Traffic Benchmark for Serverless (Conference paper)
Lin, Yanying; Ye, Kejiang; Li, Yongkang; Lin, Peng; Tang, Yingfei; Xu, Chengzhong. BBServerless: A Bursty Traffic Benchmark for Serverless[C]. 2022: 45-60.
TC[WOS]: 0 | TC[Scopus]: 4 | Submit date: 2022/05/17
Keywords: Architectural Analytics; Benchmark Suite; Bursty Traffic; Serverless Computing; Serverless Workloads