The Library
Browse by Warwick Author
Number of items: 7.
2020
Li, Junyu, He, Ligang, Ren, Shenyuan and Mao, Rui (2020) Developing a loss prediction-based asynchronous stochastic gradient descent algorithm for distributed training of deep neural networks. In: 49th International Conference on Parallel Processing (ICPP2020), Virtual conference, 17-20 Aug 2020. Published in: ICPP '20: 49th International Conference on Parallel Processing - ICPP ISBN 9781450388160. doi:10.1145/3404397.3404432
2019
Ren, Shenyuan, He, Ligang, Li, Junyu, Chen, Zhiyan, Jiang, Peng and Li, Chang-Tsun (2019) Contention-aware prediction for performance impact of task co-running in multicore computers. Wireless Networks. pp. 1-8. doi:10.1007/s11276-018-01902-7 ISSN 1022-0038.
Jiang, Peng, He, Ligang, Ren, Shenyuan, Chen, Zhiyan and Mao, Rui (2019) vChecker: an application-level demand-based co-scheduler for improving the performance of parallel jobs in Xen. Wireless Networks. pp. 1-7. doi:10.1007/s11276-018-01914-3 ISSN 1022-0038.
2018
Ren, Shenyuan, He, Ligang, Li, Junyu, Chen, Chao, Gu, Zhuoer and Chen, Zhiyan (2018) Scheduling DAG applications for time sharing systems. ICA3PP 2018: Algorithms and Architectures for Parallel Processing, 11335. pp. 272-286. doi:10.1007/978-3-030-05054-2_21 ISSN 0302-9743.
Jiang, Peng, He, Ligang, Ren, Shenyuan, Chen, Zhiyan and Mao, Rui (2018) vPlacer: a co-scheduler for optimizing the performance of parallel jobs in Xen. ICA3PP 2018: Algorithms and Architectures for Parallel Processing, 11334. pp. 19-33. doi:10.1007/978-3-030-05051-1_2 ISSN 0302-9743.
Ren, Shenyuan (2018) Performance-aware task scheduling in multi-core computers. PhD thesis, University of Warwick.
2017
Ren, Shenyuan, He, Ligang, Zhu, Huanzhou, Gu, Zhuoer, Song, Wei and Shang, Jiandong (2017) Developing power-aware scheduling mechanisms for computing systems virtualized by Xen. Concurrency and Computation: Practice and Experience, 29 (3). e3888. doi:10.1002/cpe.3888 ISSN 1532-0626.
This list was generated on Thu Mar 28 19:16:45 2024 GMT.