This repository contains basic installation steps for the Poplar SDK on a Graphcore IPU. In the future, I plan to implement and add basic example code for parallel computing algorithms.
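The repository's own steps are not reproduced on this page, but Poplar SDK setup typically follows the pattern below. The tarball name, version numbers, and directory names are placeholders for illustration, not taken from the repository:

```shell
# Extract the SDK tarball downloaded from Graphcore's support portal
# (filename and version below are placeholders).
tar -xzf poplar_sdk-ubuntu_20_04-3.x.x.tar.gz
cd poplar_sdk-ubuntu_20_04-3.x.x

# Enable Poplar and PopART by sourcing their enable scripts;
# this adds the tools and libraries to PATH and LD_LIBRARY_PATH.
source poplar-ubuntu_20_04-3.x.x/enable.sh
source popart-ubuntu_20_04-3.x.x/enable.sh

# Sanity check: the Poplar compiler should now be on PATH.
popc --version
```

The enable scripts must be sourced (not executed) in every shell session that uses the SDK, since they modify the current environment.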
Updated Mar 29, 2022
Related repositories under the graphcore topic:

- Track reconstruction on the Graphcore IPU
- JAX for the Graphcore IPU (experimental)
- PyTorch interface for the IPU
- Poplar Advanced Runtime for the IPU
- TensorFlow for the IPU
- Code for the CoNLL BabyLM workshop paper "Mini Minds: Exploring Bebeshka and Zlata Baby Models"
- Example code and applications for machine learning on Graphcore IPUs
- Blazing-fast training of 🤗 Transformers on Graphcore IPUs
- Poplar implementation of FlashAttention for the IPU
- TessellateIPU: low-level Poplar tile programming from Python
- A PyTorch library for knowledge graph embedding on Graphcore IPUs, implementing the BESS distribution framework
- An implementation of the Search by Triplet track reconstruction algorithm on the Graphcore IPU
- ⚡️ An easy-to-use, fast deep-learning model deployment toolkit for ☁️ cloud, 📱 mobile, and 📹 edge, covering 20+ mainstream scenarios and 150+ SOTA models across image, video, text, and audio, with end-to-end optimization and multi-platform, multi-framework support
- 🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy-to-use hardware optimization tools