
Browse/Search Results: 1-8 of 8

Method for Supporting High-Concurrency, Low-Latency Cloud Services Based on Serverless Computing (基于无服务器计算的高并发低时延云服务支撑方法) Patent
Patent type: Invention
Authors:  Ye, Kejiang;  Lin, Yanying;  Xu, Chengzhong
Submit date:2022/08/26
Serverless  Low Latency  High Throughput  
Heterogeneity-aware Proactive Elastic Resource Allocation for Serverless Applications Journal article
Feng, Binbin, Ding, Zhijun, Zhou, Xiaobo, Jiang, Changjun. Heterogeneity-aware Proactive Elastic Resource Allocation for Serverless Applications[J]. IEEE Transactions on Services Computing, 2024, 17(5), 2473-2487.
Authors:  Feng, Binbin;  Ding, Zhijun;  Zhou, Xiaobo;  Jiang, Changjun
TC[WOS]:1 TC[Scopus]:2  IF:5.5/5.9 | Submit date:2024/05/16
Instance Allocation  NUMA  Resource Estimation  Server Scaling  Serverless  Workflow  Workload Prediction  
EINS: Edge-Cloud Deep Model Inference with Network-Efficiency Schedule in Serverless Conference paper
Peng, Shijie, Lin, Yanying, Chen, Wenyan, Tang, Yingfei, Duan, Xu, Ye, Kejiang. EINS: Edge-Cloud Deep Model Inference with Network-Efficiency Schedule in Serverless[C]:IEEE, 2024, 1376-1381.
Authors:  Peng, Shijie;  Lin, Yanying;  Chen, Wenyan;  Tang, Yingfei;  Duan, Xu; et al.
TC[Scopus]:0 | Submit date:2024/08/05
Edge-cloud Collaborative  Network-efficiency  Serverless Inference  
Incendio: Priority-based Scheduling for Alleviating Cold Start in Serverless Computing Journal article
Cai, Xinquan, Sang, Qianlong, Hu, Chuang, Gong, Yili, Suo, Kun, Zhou, Xiaobo, Cheng, Dazhao. Incendio: Priority-based Scheduling for Alleviating Cold Start in Serverless Computing[J]. IEEE Transactions on Computers, 2024, 73(7), 1780-1794.
Authors:  Cai, Xinquan;  Sang, Qianlong;  Hu, Chuang;  Gong, Yili;  Suo, Kun; et al.
TC[WOS]:1 TC[Scopus]:1  IF:3.6/3.2 | Submit date:2024/05/16
Serverless Computing  Cold Start  Priority  Prediction  Scheduling  In-memory Computing  Distributed Systems  
QUART: Latency-Aware FaaS System for Pipelining Large Model Inference Conference paper
Lin, Yanying, Li, Yanbo, Peng, Shijie, Tang, Yingfei, Luo, Shutian, Shen, Haiying, Xu, Chengzhong, Ye, Kejiang. QUART: Latency-Aware FaaS System for Pipelining Large Model Inference[C]:Institute of Electrical and Electronics Engineers Inc., 2024, 1-12.
Authors:  Lin, Yanying;  Li, Yanbo;  Peng, Shijie;  Tang, Yingfei;  Luo, Shutian; et al.
TC[WOS]:0 TC[Scopus]:0 | Submit date:2024/10/10
Large Model  Latency Aware  Pipeline Inference  Serverless  
Serverless Computing: State-of-the-Art, Challenges and Opportunities Journal article
Li, Yongkang, Lin, Yanying, Wang, Yang, Ye, Kejiang, Xu, Chengzhong. Serverless Computing: State-of-the-Art, Challenges and Opportunities[J]. IEEE Transactions on Services Computing, 2022, 16(2), 1522-1539.
Authors:  Li, Yongkang;  Lin, Yanying;  Wang, Yang;  Ye, Kejiang;  Xu, Chengzhong
TC[WOS]:32 TC[Scopus]:53  IF:5.5/5.9 | Submit date:2022/05/17
Survey  Serverless Computing  FaaS And BaaS  Startup Latency  Isolation  Scheduling  
BBServerless: A Bursty Traffic Benchmark for Serverless Conference paper
Lin, Yanying, Ye, Kejiang, Li, Yongkang, Lin, Peng, Tang, Yingfei, Xu, Chengzhong. BBServerless: A Bursty Traffic Benchmark for Serverless[C], 2022, 45-60.
Authors:  Lin, Yanying;  Ye, Kejiang;  Li, Yongkang;  Lin, Peng;  Tang, Yingfei; et al.
TC[WOS]:0 TC[Scopus]:4 | Submit date:2022/05/17
Architectural Analytics  Benchmark Suite  Bursty Traffic  Serverless Computing  Serverless Workloads  
An Experimental Analysis of Function Performance with Resource Allocation on Serverless Platform Conference paper
Zhang, Yonghe, Ye, Kejiang, Xu, Chengzhong. An Experimental Analysis of Function Performance with Resource Allocation on Serverless Platform[C], 2022, 17-31.
Authors:  Zhang, Yonghe;  Ye, Kejiang;  Xu, Chengzhong
TC[WOS]:0 TC[Scopus]:0 | Submit date:2022/05/17
Cloud Native  Container  Performance Analysis  Serverless Computing