
Commit

[fix] Ensure we check deepspeed/sharded in multinode DDP (#6297)
* Ensure we check deepspeed/sharded in multinode

* Add CHANGELOG.md

* Drop mock, use actual multi-gpu node
SeanNaren authored and tchaton committed Mar 9, 2021
1 parent 23c1d41 commit 7f785c2
Showing 2 changed files with 5 additions and 6 deletions.
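
For context on what the fix addresses, the sketch below illustrates the kind of check the commit message describes: plugin selection for a multinode run should keep a requested DeepSpeed or Sharded DDP plugin rather than silently falling back to plain DDP. All names in the sketch (select_training_type_plugin, the plugin strings, the num_nodes handling) are hypothetical placeholders and do not reproduce the actual pytorch-lightning code changed in this commit.

# Hypothetical sketch only: check for deepspeed/sharded plugins before
# defaulting to DDP on multinode runs. Function and plugin names are
# placeholders, not pytorch-lightning internals.
from typing import Optional

def select_training_type_plugin(requested_plugin: Optional[str], num_nodes: int) -> str:
    """Return the distributed training plugin to use for this run."""
    # A user-requested DeepSpeed/Sharded plugin must win even when
    # num_nodes > 1; skipping this check silently downgrades the run
    # to plain DDP on multinode setups.
    if requested_plugin in ("deepspeed", "ddp_sharded"):
        return requested_plugin
    # Otherwise fall back to DDP for multinode, DDP-spawn for a single node.
    return "ddp" if num_nodes > 1 else "ddp_spawn"

For example, select_training_type_plugin("deepspeed", num_nodes=2) should return "deepspeed" rather than "ddp".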
CHANGELOG.md: 4 changes (4 additions, 0 deletions)
@@ -32,6 +32,10 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

- Fixed `SingleTPU` calling `all_gather` ([#6296](https://github.com/PyTorchLightning/pytorch-lightning/pull/6296))


- Ensure we check deepspeed/sharded in multinode DDP ([#6297](https://github.com/PyTorchLightning/pytorch-lightning/pull/6297))


## [1.2.2] - 2021-03-02

### Added
pytorch_lightning/profiler/profilers.py: 7 changes (1 addition, 6 deletions)
@@ -13,24 +13,19 @@
# limitations under the License.
"""Profiler to check if there are any bottlenecks in your code."""
import cProfile
import inspect
import io
import os
import pstats
import time
from abc import ABC, abstractmethod
from collections import defaultdict
from contextlib import contextmanager
from typing import List, Optional, Union
from typing import Optional, Union

import numpy as np
import torch

from pytorch_lightning import _logger as log
from pytorch_lightning.utilities import rank_zero_only
from pytorch_lightning.utilities.cloud_io import get_filesystem
from pytorch_lightning.utilities.distributed import rank_zero_warn
from pytorch_lightning.utilities.exceptions import MisconfigurationException


class BaseProfiler(ABC):
