1-Bit GC-DL | Distributed learning based on 1-bit gradient coding

Summary
In distributed learning, the gradient coding (GC) technique has been adopted to mitigate the negative impact of stragglers on the training time. On the other hand, to deal with the high communication burden in distributed learning, 1-bit gradient vectors can be transmitted instead of real-valued ones. However, existing distributed learning methods based on 1-bit data do not take stragglers into account. In addition, current GC techniques are designed only for distributed learning schemes in which real-valued encoded vectors are transmitted, and they are difficult to apply when 1-bit vectors are transmitted.
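For context, the sketch below illustrates the kind of 1-bit (sign) quantization of gradients referred to above. The scaling by the mean absolute value and the message format are illustrative assumptions, not the scheme used in this project.

```python
import numpy as np

def one_bit_encode(grad):
    """Compress a real-valued gradient into signs plus a single scalar scale."""
    scale = float(np.mean(np.abs(grad)))    # one float sent alongside the 1-bit signs
    signs = np.sign(grad).astype(np.int8)   # 1 bit per coordinate (bit-packed in practice)
    return signs, scale

def one_bit_decode(signs, scale):
    """Reconstruct an approximate gradient from the 1-bit message."""
    return scale * signs.astype(np.float64)

# A worker would send (signs, scale); the server decodes an approximation.
g = np.random.default_rng(1).standard_normal(8)
signs, scale = one_bit_encode(g)
g_hat = one_bit_decode(signs, scale)
```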

To overcome the above drawbacks and to reduce the communication overhead and the training time simultaneously, this project aims to propose novel distributed learning methods based on GC with 1-bit data. First, the project will propose a distributed learning method named 1-Bit GC-DL, which develops a 1-bit GC strategy to encode the locally computed gradient vectors of the allocated data subsets into 1-bit data. Based on this, an aggregation rule for the received 1-bit data will be designed at the central server, guaranteeing that the server computes an approximated version of the true gradient vector in the presence of a certain number of stragglers. Second, to further reduce the training time of 1-Bit GC-DL, the project will propose a lazily aggregated distributed learning method based on 1-bit GC, named 1-Bit LA-GC-DL, which combines 1-Bit GC-DL with a lazy aggregation strategy. In 1-Bit LA-GC-DL, only a fraction of the workers participate in local training during each iteration, and the criterion for selecting the participating workers will be based on Age of Information. The proposed methods will be compared with other state-of-the-art methods for distributed learning on both simulated and realistic datasets under practical scenarios.
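The following sketch shows, under simplified assumptions, how a 1-bit GC scheme of this general kind could operate: data subsets are replicated across workers via a cyclic assignment, each responding worker transmits only the signs of its local gradient, and the server aggregates the received 1-bit vectors by a coordinate-wise majority vote that tolerates some stragglers. The actual encoding and aggregation rules of 1-Bit GC-DL, and the AoI-based worker-selection rule of 1-Bit LA-GC-DL, are outcomes of the project and are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n_workers, redundancy, dim = 6, 3, 10
# Stand-ins for the per-subset gradients a real system would compute.
subset_grads = [rng.standard_normal(dim) for _ in range(n_workers)]

# Cyclic assignment: worker i holds subsets i, i+1, ..., i+redundancy-1 (mod n).
assignment = [[(i + r) % n_workers for r in range(redundancy)] for i in range(n_workers)]

def worker_message(i):
    """Worker i sums the gradients of its assigned subsets and sends only their signs."""
    local = sum(subset_grads[k] for k in assignment[i])
    return np.sign(local).astype(np.int8)

# Simulate stragglers that never respond in this round.
stragglers = {2, 5}
received = [worker_message(i) for i in range(n_workers) if i not in stragglers]

# Server side: a coordinate-wise majority vote over the received 1-bit vectors
# yields an approximate descent direction despite the missing workers.
approx_direction = np.sign(np.sum(received, axis=0))
true_direction = np.sign(np.sum(subset_grads, axis=0))
print("sign agreement with the full gradient:", float(np.mean(approx_direction == true_direction)))
```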
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/101146669
Start date: 06-11-2024
End date: 05-11-2026
Total budget - Public funding: - 206 887,00 Euro
Cordis data


Status: SIGNED
Call topic: HORIZON-MSCA-2023-PF-01-01
Update Date: 29-09-2024
Structured mapping
Horizon Europe
  HORIZON.1 Excellent Science
    HORIZON.1.2 Marie Skłodowska-Curie Actions (MSCA)
      HORIZON.1.2.0 Cross-cutting call topics
        HORIZON-MSCA-2023-PF-01
          HORIZON-MSCA-2023-PF-01-01 MSCA Postdoctoral Fellowships 2023