Update pytorch-lightning requirement from <2.0.0,>1.7.0 to >1.7.0,<2.1.0 in /requirements #1031

Draft · wants to merge 18 commits into base: master
2 changes: 1 addition & 1 deletion docs/source/callbacks/monitor.rst
@@ -24,7 +24,7 @@ Data Monitoring in LightningModule
 The data monitoring callbacks allow you to log and inspect the distribution of data that passes through
 the training step and layers of the model. When used in combination with a supported logger, the
 :class:`~pl_bolts.callbacks.data_monitor.TrainingDataMonitor` creates a histogram for each `batch` input in
-:meth:`~pytorch_lightning.core.lightning.LightningModule.training_step` and sends it to the logger:
+:meth:`~pytorch_lightning.core.LightningModule.training_step` and sends it to the logger:

 .. code-block:: python

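The documentation above describes the `TrainingDataMonitor` callback; for context, here is a minimal usage sketch (not part of this diff). The `log_every_n_steps` argument and the `LitClassifier` / `dm` names are assumptions for illustration only:

from pl_bolts.callbacks import TrainingDataMonitor
from pytorch_lightning import Trainer

# Log a histogram of every tensor passed to training_step (interval argument assumed).
monitor = TrainingDataMonitor(log_every_n_steps=25)

model = LitClassifier()  # placeholder LightningModule
trainer = Trainer(callbacks=[monitor])  # requires a supported logger, e.g. TensorBoard (the default)
trainer.fit(model, datamodule=dm)  # `dm` is a placeholder LightningDataModule
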
2 changes: 1 addition & 1 deletion requirements/base.txt
@@ -1,5 +1,5 @@
 numpy <1.26.0
-pytorch-lightning >1.7.0, <2.0.0 # strict
+pytorch-lightning >1.7.0, <2.1.0
 torchmetrics <0.11.0 # strict
 lightning-utilities >0.3.1 # this is needed for PL 1.7
 torchvision >=0.10.0 # todo: move to topic related extras

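As a quick sanity check of what the relaxed upper bound admits, a small sketch using the third-party `packaging` library (not part of this PR; the version numbers are illustrative):

from packaging.specifiers import SpecifierSet
from packaging.version import Version

spec = SpecifierSet(">1.7.0,<2.1.0")  # the new constraint from base.txt
print(Version("1.9.5") in spec)  # True: already allowed before this PR
print(Version("2.0.9") in spec)  # True: newly allowed by the raised bound
print(Version("2.1.0") in spec)  # False: still excluded
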
2 changes: 1 addition & 1 deletion src/pl_bolts/callbacks/data_monitor.py
@@ -2,10 +2,10 @@

 import numpy as np
 import torch
+from lightning_utilities import apply_to_collection
 from pytorch_lightning import Callback, LightningModule, Trainer
 from pytorch_lightning.loggers import TensorBoardLogger, WandbLogger
 from pytorch_lightning.utilities import rank_zero_warn
-from pytorch_lightning.utilities.apply_func import apply_to_collection
 from torch import Tensor, nn
 from torch.nn import Module
 from torch.utils.hooks import RemovableHandle

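The only change here is the import location of `apply_to_collection`. A minimal sketch of the helper, assuming `lightning_utilities` re-exports it at the top level exactly as the new import in this diff implies:

import torch
from lightning_utilities import apply_to_collection

batch = {"image": torch.randn(4, 3, 32, 32), "label": torch.zeros(4)}
# Apply a function to every Tensor found in an arbitrarily nested collection.
shapes = apply_to_collection(batch, torch.Tensor, lambda t: tuple(t.shape))
print(shapes)  # {'image': (4, 3, 32, 32), 'label': (4,)}
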
2 changes: 1 addition & 1 deletion src/pl_bolts/callbacks/verification/batch_gradient.py
@@ -4,8 +4,8 @@

 import torch
 import torch.nn as nn
+from lightning_utilities import apply_to_collection
 from pytorch_lightning import LightningModule, Trainer
-from pytorch_lightning.utilities.apply_func import apply_to_collection
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
 from torch import Tensor

6 changes: 5 additions & 1 deletion src/pl_bolts/models/detection/yolo/yolo_module.py
@@ -4,10 +4,14 @@
 import torch
 import torch.nn as nn
 from pytorch_lightning import LightningModule
-from pytorch_lightning.utilities.cli import LightningCLI
 from pytorch_lightning.utilities.types import STEP_OUTPUT
 from torch import Tensor, optim

+try:  # Backward compatibility for Lightning CLI
+    from pytorch_lightning.cli import LightningCLI  # PL v1.9+
+except ImportError:
+    from pytorch_lightning.utilities.cli import LightningCLI  # PL v1.8
+
 # It seems to be impossible to avoid mypy errors if using import instead of getattr().
 # See https://github.com/python/mypy/issues/8823
 try:

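The guarded import above keeps the module working across Lightning releases. A rough sketch of how the resolved class would then be used; the `pl_bolts.models.detection.YOLO` import path and the bare `LightningCLI(YOLO)` invocation are assumptions for illustration, not code from this PR:

try:  # same version fallback as in the diff above
    from pytorch_lightning.cli import LightningCLI  # PL v1.9+
except ImportError:
    from pytorch_lightning.utilities.cli import LightningCLI  # PL v1.8

from pl_bolts.models.detection import YOLO  # the model class defined alongside this module (assumed export)


def cli_main() -> None:
    # Whichever import succeeded, the class is used identically:
    # LightningCLI parses command-line/config arguments and builds the Trainer.
    LightningCLI(YOLO)


if __name__ == "__main__":
    cli_main()
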
6 changes: 5 additions & 1 deletion src/pl_bolts/models/rl/dqn_model.py
@@ -7,11 +7,15 @@
 import torch
 from pytorch_lightning import LightningModule, Trainer, seed_everything
 from pytorch_lightning.callbacks import ModelCheckpoint
-from pytorch_lightning.strategies import DataParallelStrategy
 from torch import Tensor, optim
 from torch.optim.optimizer import Optimizer
 from torch.utils.data import DataLoader

+try:
+    from pytorch_lightning.strategies import DataParallelStrategy  # for PL v1.X
+except ImportError:
+    from lightning_fabric.strategies import DataParallelStrategy  # for PL v2.X
+
 from pl_bolts.datamodules.experience_source import Experience, ExperienceSourceDataset
 from pl_bolts.losses.rl import dqn_loss
 from pl_bolts.models.rl.common.agents import ValueAgent

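A small, generic check (not part of this PR) to confirm which package ends up providing `DataParallelStrategy` once the fallback above resolves; the printed module name reveals whether the class came from `pytorch_lightning` or `lightning_fabric`:

try:
    from pytorch_lightning.strategies import DataParallelStrategy  # for PL v1.X
except ImportError:
    from lightning_fabric.strategies import DataParallelStrategy  # for PL v2.X

# On a 1.x install this prints a pytorch_lightning.* module path,
# on a 2.x install a lightning_fabric.* one.
print(DataParallelStrategy.__module__)
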
2 changes: 1 addition & 1 deletion tests/callbacks/test_ort.py
@@ -16,7 +16,7 @@
 from pl_bolts.callbacks import ORTCallback
 from pl_bolts.utils import _TORCH_ORT_AVAILABLE
 from pytorch_lightning import Callback, Trainer
-from pytorch_lightning.core.lightning import LightningModule
+from pytorch_lightning.core import LightningModule
 from pytorch_lightning.utilities.exceptions import MisconfigurationException

 from tests.helpers.boring_model import BoringModel

2 changes: 1 addition & 1 deletion tests/callbacks/test_sparseml.py
@@ -19,7 +19,7 @@
 from pl_bolts.callbacks import SparseMLCallback
 from pl_bolts.utils import _SPARSEML_AVAILABLE
 from pytorch_lightning import Callback, Trainer
-from pytorch_lightning.core.lightning import LightningModule
+from pytorch_lightning.core import LightningModule
 from pytorch_lightning.utilities.exceptions import MisconfigurationException

 from tests.helpers.boring_model import BoringModel

6 changes: 5 additions & 1 deletion tests/conftest.py
@@ -7,7 +7,11 @@
 import torch
 from pl_bolts.utils import _IS_WINDOWS, _TORCHVISION_AVAILABLE, _TORCHVISION_LESS_THAN_0_13
 from pl_bolts.utils.stability import UnderReviewWarning
-from pytorch_lightning.trainer.connectors.signal_connector import SignalConnector
+
+try:
+    from pytorch_lightning.trainer.connectors.signal_connector import SignalConnector
+except ImportError:  # patch for PL v2.0+
+    from pytorch_lightning.trainer.connectors.signal_connector import _SignalConnector as SignalConnector

 # GitHub Actions use this path to cache datasets.
 # Use `datadir` fixture where possible and use `DATASETS_PATH` in