dummy_collectives

A minimal demo of the PyTorch c10d extension APIs.
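
The extension provides a custom process group backend and makes it available to torch.distributed under the name "dummy", which is what lets dist.init_process_group("dummy", ...) construct it. Below is a minimal sketch of what such a registration typically looks like; createProcessGroupDummy is a hypothetical factory name, and in this repo the equivalent step appears to happen automatically when dummy_collectives is imported.

# Sketch only: how a custom c10d backend is typically registered.
# "createProcessGroupDummy" is a hypothetical name for the factory the
# C++ extension would expose; this repo likely performs the equivalent
# registration on import.
import torch.distributed as dist
import dummy_collectives

dist.Backend.register_backend("dummy", dummy_collectives.createProcessGroupDummy)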

Build

python setup.py install
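
For reference, c10d extensions like this are typically built with torch.utils.cpp_extension. The sketch below shows the general shape of such a setup.py; the source file name is an assumption and may not match this repository.

# Hypothetical setup.py sketch for a c10d extension built with
# torch.utils.cpp_extension; the source file name is assumed.
from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CppExtension

setup(
    name="dummy_collectives",
    ext_modules=[
        CppExtension(
            name="dummy_collectives",
            sources=["dummy.cpp"],  # assumed source file name
        ),
    ],
    cmdclass={"build_ext": BuildExtension},
)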

Usage

import os

import torch
import dummy_collectives  # importing the extension makes the "dummy" backend available

import torch.distributed as dist

# Single-process setup; the rendezvous still needs an address and port.
os.environ['MASTER_ADDR'] = 'localhost'
os.environ['MASTER_PORT'] = '29500'

dist.init_process_group("dummy", rank=0, world_size=1)

# The dummy backend handles all_reduce on both CPU and CUDA tensors.
x = torch.ones(6)
dist.all_reduce(x)
y = x.cuda()  # requires a CUDA-enabled PyTorch build and a GPU
dist.all_reduce(y)

print(f"cpu allreduce: {x}")
print(f"cuda allreduce: {y}")

# broadcast is not implemented by the dummy backend, so it raises RuntimeError.
try:
    dist.broadcast(x, 0)
except RuntimeError:
    print("got RuntimeError when calling broadcast")
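
To try the backend with more than one process on a single machine, a launch along the following lines should work. This is a sketch only: it assumes the dummy backend accepts world_size > 1, which the original example does not exercise.

# Sketch: run the demo on two local processes via torch.multiprocessing.
# Assumes the dummy backend accepts world_size > 1.
import os

import torch
import torch.distributed as dist
import torch.multiprocessing as mp

import dummy_collectives


def run(rank, world_size):
    os.environ['MASTER_ADDR'] = 'localhost'
    os.environ['MASTER_PORT'] = '29500'
    dist.init_process_group("dummy", rank=rank, world_size=world_size)

    x = torch.ones(6)
    dist.all_reduce(x)
    print(f"rank {rank} cpu allreduce: {x}")

    dist.destroy_process_group()


if __name__ == "__main__":
    world_size = 2
    mp.spawn(run, args=(world_size,), nprocs=world_size, join=True)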
