Over the last two years, federated learning, a distributed machine learning paradigm, has been developing rapidly. It addresses the data-silo problem by allowing participants to jointly build models without sharing their original data, enabling AI collaboration across organizations. As the world's first industrial-scale open-source federated learning framework, FATE provides a high-performance federated learning mechanism for machine learning, deep learning, and transfer learning. In addition, FATE supports multiple secure multi-party computation protocols, including homomorphic encryption, secret sharing, and hashing, all coming with a user-friendly solution for managing cross-domain information exchange.

Recently, version 1.2 of FATE was officially released. This version brings two major updates: a homogeneous federated deep neural network (DNN) and support for the secret-sharing protocol SPDZ. As the first release in the FATE series to support a homogeneous federated DNN algorithm, it provides robust support for developers in classification, regression, and ranking scenarios of federated modeling. The SPDZ secret-sharing support expands and enriches the range of application scenarios.

In the previous major release, 1.0, FATE introduced its first federated learning visualization product and a production-grade federated pipeline service. Then, in version 1.1, FATE released the KubeFATE project together with the team from the Cloud Native Lab at the VMware China R&D Open Innovation Center. By packaging all FATE components into containers, KubeFATE can deploy FATE with Docker Compose or Kubernetes (Helm Charts). While the previous two versions significantly improved the visualization and deployment experience, FATE v1.2 focuses on further expanding algorithm support. Apart from the two major additions, FATE v1.2 also adds a heterogeneous SQN (stochastic quasi-Newton) optimizer and a data management module. The SQN optimizer is available for Hetero-LogisticRegression and Hetero-LinearRegression, and improves training efficiency by accelerating convergence. The data management module records the data tables uploaded and the intermediate data produced while jobs run, and provides a CLI for querying and cleanup.

FederatedML: beginning the journey of federated deep learning and multi-party computation protocols

In version 1.2 of FATE, we released a homogeneous federated deep learning framework, which supports federated training of deep neural networks. Developers can customize the deep neural network structure, and the latest version of FATE supports TensorFlow as the backend. We will introduce a PyTorch backend soon, so that developers can carry over their TensorFlow or PyTorch experience at low cost.
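To make the idea concrete, here is a minimal sketch of homogeneous federated training in the FedAvg style: every party trains the same model architecture on its own data, and only model weights are exchanged and averaged. This is a generic, hypothetical illustration in NumPy, not FATE's actual Homo NN API; all function names and parameters below are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def local_update(w, X, y, lr=0.1, epochs=5):
    """Run a few local gradient steps of logistic regression on one party's data."""
    w = w.copy()
    for _ in range(epochs):
        grad = X.T @ (sigmoid(X @ w) - y) / len(y)
        w -= lr * grad
    return w

def federated_round(w_global, parties):
    """One round: each party updates locally, then the weights are averaged.
    Raw data never leaves a party; only model parameters are shared."""
    local_ws = [local_update(w_global, X, y) for X, y in parties]
    return np.mean(local_ws, axis=0)

rng = np.random.default_rng(0)
# Two parties sharing the same feature space (homogeneous federation).
w_true = np.array([1.5, -2.0, 0.5])
parties = []
for _ in range(2):
    X = rng.normal(size=(200, 3))
    y = (X @ w_true > 0).astype(float)
    parties.append((X, y))

w = np.zeros(3)
for _ in range(30):
    w = federated_round(w, parties)
print("learned weights:", np.round(w, 2))
```

In a real deployment the averaging step would run on a coordinating server (or use secure aggregation), but the structure of the loop is the same.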

FATE also supports the secret-sharing protocol SPDZ, which means FATE now offers developers a choice of multi-party computation protocols alongside the existing homomorphic encryption protocols. Developers are free to choose the protocol best suited to their algorithms, further expanding the range of applications of federated learning. It is worth mentioning that the SPDZ protocol is used for the first time in the implementation of heterogeneous Pearson correlation calculation.
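The building block behind SPDZ-style protocols is additive secret sharing: each party splits its private value into random shares that sum to the value modulo a prime, so no single share reveals anything, yet the parties can jointly compute an aggregate. The sketch below illustrates only this basic sharing and secure-sum step; the full SPDZ protocol (multiplication via precomputed triples, MACs for malicious security) is considerably more involved, and this is not FATE's implementation.

```python
import random

P = 2**61 - 1  # a large prime modulus for the share arithmetic

def share(secret, n_parties):
    """Split `secret` into n additive shares modulo P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Three parties each hold a private value.
secrets = [42, 1000, 7]
all_shares = [share(s, 3) for s in secrets]

# Party i holds one share of every secret and locally adds its shares.
partial_sums = [sum(all_shares[j][i] for j in range(3)) % P for i in range(3)]

# Publishing only the partial sums reveals the total, not the individual inputs.
total = reconstruct(partial_sums)
print(total)  # 1049
```

Computing a Pearson correlation on top of such shares additionally requires secure multiplication of shared values, which is where the SPDZ machinery comes in.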

As for algorithm performance, the new version includes a second-order optimization algorithm for the first time, in the form of the heterogeneous SQN optimizer. The algorithm has been successfully applied to heterogeneous generalized linear models and improves training performance. In addition, heterogeneous feature binning and feature selection are now available for multi-host federated modeling, providing full support for multi-host scenarios.
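The reason a second-order optimizer such as SQN can cut training time is that it uses curvature information, so each iteration makes much more progress than a plain gradient step. The comparison below, on ordinary (non-federated) logistic regression, contrasts gradient descent with full Newton steps after the same number of iterations. It is a generic illustration of the second-order idea, not FATE's heterogeneous SQN implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w, X, y):
    p = sigmoid(X @ w)
    return -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
w_true = np.array([1.0, -1.0, 2.0, 0.5])
y = (rng.random(500) < sigmoid(X @ w_true)).astype(float)  # noisy labels

# First order: 20 steps of gradient descent.
w_gd = np.zeros(4)
for _ in range(20):
    grad = X.T @ (sigmoid(X @ w_gd) - y) / len(y)
    w_gd -= 0.5 * grad

# Second order: 20 Newton steps (Hessian = X^T diag(p(1-p)) X / n).
w_nt = np.zeros(4)
for _ in range(20):
    p = sigmoid(X @ w_nt)
    grad = X.T @ (p - y) / len(y)
    H = (X * (p * (1 - p))[:, None]).T @ X / len(y)
    w_nt -= np.linalg.solve(H, grad)

print("GD loss:    ", round(loss(w_gd, X, y), 4))
print("Newton loss:", round(loss(w_nt, X, y), 4))
```

SQN approximates this curvature stochastically from recent gradient differences rather than forming the full Hessian, which keeps the per-iteration cost manageable in the federated setting.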

FATE-Board: two new visualization tools further improve practicality

Since its introduction in version 1.0 of FATE, FATE-Board has been well received by developers. Version 1.2 improves FATE-Board again, adding visualizations for heterogeneous feature correlation and for the LocalBaseline component. The former lets developers directly analyze the correlation distribution between features, helping them evaluate and choose features quickly. The latter lets developers directly compare the federated training model with a local sklearn model, and draw conclusions from the visualized comparison report.
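Conceptually, what a local baseline provides is a plain sklearn model trained on one party's data alone, whose metrics can then be placed next to the federated model's metrics. The sketch below shows that local half of the comparison on hypothetical synthetic data; it is not the actual LocalBaseline component code, and the dataset and parameters are assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical single-party dataset standing in for one participant's data.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Train a plain local sklearn model: this is the baseline.
baseline = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
baseline_auc = roc_auc_score(y_te, baseline.predict_proba(X_te)[:, 1])
print(f"local sklearn baseline AUC: {baseline_auc:.3f}")

# A developer would compare this number with the federated model's AUC
# reported in FATE-Board to judge the benefit of federation.
```

If the federated model clearly beats this baseline, the extra data contributed by the other parties is paying off.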

The new version of FATE-Board also refines the user experience in areas such as the workflow view, model output charts, and evaluation curves. In addition to greatly improving the visualization and interaction experience, these updates enhance practicality and improve the overall user experience of FATE.

FATE-Flow: data management module, the first step toward data governance

This version of FATE adds a new data management module, the first step toward data governance. With this release, the data produced across the whole job lifecycle becomes traceable. In addition, the data management module provides common administrative commands such as query and deletion, tools that greatly enhance developers' control over data.

In summary, FATE has begun expanding into new areas. Both the homogeneous federated deep learning framework and the secret-sharing protocol SPDZ extend the underlying framework, preparing for possible extension to more application scenarios. These updates show that FATE is continuously optimizing and improving its existing modeling components, focusing on efficiency, diversity, and practicality to provide a better experience for developers.

We welcome more users who are interested in federated learning to contribute code and submit issues or pull requests.

For more details, you can check out the contributor guidelines on the FATE official website:

If you have any questions, please leave a message. You can also add our WeChat (FATEZS001) for further communication.