Habanalabs Open-Source TPC LLVM compiler and SynapseAI Core library
From: Oded Gabbay
Date: Fri Sep 10 2021 - 03:27:33 EST
Hi Greg,
Following our conversations a couple of months ago, I'm happy to tell you that
Habanalabs has open-sourced its TPC (Tensor Processing Core) LLVM compiler,
which is a fork of the LLVM open-source project.
The project can be found on Habanalabs GitHub website at:
https://github.com/HabanaAI/tpc_llvm
There is a companion guide on how to write TPC kernels at:
https://docs.habana.ai/en/latest/TPC_User_Guide/TPC_User_Guide.html
The guide details the architecture of the TPC compute engine,
how to write TPC kernels in the TPC-C language, and more.
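To give a flavor of what TPC-C looks like, below is a rough sketch of an
elementwise-add kernel. This is illustrative only: the intrinsic and type
names (int5, float64, get_index_space_offset, v_f32_ld_tnsr_b, v_f32_st_tnsr)
follow the conventions in the TPC user guide, but please treat the guide and
the kernels shipped with SynapseAI Core as the authoritative references for
exact signatures and index-space semantics.

```c
/* Illustrative sketch only -- consult the TPC user guide for exact
 * intrinsic signatures. A TPC kernel is launched over an index space;
 * each invocation processes the slice described by the index-space
 * intrinsics below. float64 is a vector of 64 fp32 lanes.
 */
void main(tensor in0, tensor in1, tensor out)
{
    const int5 start = get_index_space_offset();
    const int5 end   = get_index_space_size() + start;

    int5 coords = {0};
    for (int i = start[0]; i < end[0]; i++) {
        /* Assumes dim 0 of the index space maps to one vector of
         * 64 fp32 elements per step. */
        coords[0] = i * 64;

        /* Load one vector from each input tensor */
        float64 a = v_f32_ld_tnsr_b(coords, in0);
        float64 b = v_f32_ld_tnsr_b(coords, in1);

        /* Vector add and store the result */
        float64 c = a + b;
        v_f32_st_tnsr(coords, out, c);
    }
}
```

Such a kernel is compiled with the TPC LLVM compiler linked above and loaded
onto the device by the runtime; it does not run on the host.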
In addition, we have written a reference implementation of the SynapseAI API,
called SynapseAI Core, and released its code under the MIT license to the
open-source community at:
https://github.com/HabanaAI/SynapseAI_Core
SynapseAI Core contains all the building blocks necessary to run Deep Learning
training on Gaudi, although it is not as optimized as the closed-source library.
The project repository contains a couple of TPC kernels that implement basic
DL operators. These kernels can serve as an example of how to implement more
complex operators.
To work with the Gaudi device, the library calls the Habanalabs kernel driver
uAPI through the already open-source hl-thunk library at:
https://github.com/HabanaAI/hl-thunk
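As a hedged illustration of how a userspace consumer talks to the driver
through hl-thunk, the snippet below opens a Gaudi device and queries basic
hardware info. The function and struct names (hlthunk_open,
hlthunk_get_hw_ip_info, hlthunk_close, HLTHUNK_DEVICE_GAUDI) are taken from
the hl-thunk repository; verify them against its headers before relying on
them, and note this requires a Gaudi device and the hl-thunk library to run.

```c
#include <stdio.h>
#include "hlthunk.h"  /* from the hl-thunk repository */

int main(void)
{
    /* Open the first available Gaudi device (NULL = no specific bus id) */
    int fd = hlthunk_open(HLTHUNK_DEVICE_GAUDI, NULL);
    if (fd < 0) {
        fprintf(stderr, "failed to open Gaudi device\n");
        return 1;
    }

    /* Query device hardware info through the driver uAPI */
    struct hlthunk_hw_ip_info hw_ip;
    if (!hlthunk_get_hw_ip_info(fd, &hw_ip))
        printf("DRAM size: %llu bytes\n",
               (unsigned long long)hw_ip.dram_size);

    hlthunk_close(fd);
    return 0;
}
```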
Moreover, the library contains a few tests (and more will follow soon) that
demonstrate how to use the SynapseAI API to run workloads that utilize the
TPC engines on Gaudi devices. We provide a short readme that explains
how to build and run the included tests.
It is important to note that we provide all the APIs necessary to connect this
library to any Deep Learning framework, by writing an appropriate backend in
the framework and additional TPC kernels to implement the various operators.
Once the driver(s) for the Gaudi NIC ports are upstreamed, this library
may be used together with IBverbs to perform training on multiple Gaudi devices.
Thanks,
Oded