Use merlin-dataloader package #1694

Merged
merged 7 commits into from
Nov 1, 2022

Conversation

@benfred (Member) commented Oct 25, 2022

Use the merlin-dataloader package under the hood for the nvt dataloader.

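For orientation, here is a rough sketch of the API the merlin-dataloader package provides (the framework-specific Loader classes that the CI tracebacks below resolve to under merlin/loader/). The constructor arguments shown are assumptions for illustration only, not taken from this PR's diff:

    import merlin.io
    from merlin.loader.tensorflow import Loader  # shipped by the merlin-dataloader package

    # Any parquet/csv source can back the loader (hypothetical path)
    dataset = merlin.io.Dataset("train/*.parquet")
    loader = Loader(dataset, batch_size=1024, shuffle=True)

    for features, labels in loader:  # batches come back as (dict of tensors, labels)
        pass                         # e.g. hand the loader straight to model.fit(loader)

With this change, the classes in nvtabular.loader delegate their batching to these Loader classes instead of carrying a separate implementation.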
@review-notebook-app

Check out this pull request on ReviewNB

See visual diffs & provide feedback on Jupyter Notebooks.


Powered by ReviewNB

@benfred marked this pull request as draft October 25, 2022 00:18
@nvidia-merlin-bot (Contributor)

Click to view CI Results
GitHub pull request #1694 of commit 0230b67be62ea6dbca061b098e0adf165799a8cd, no merge conflicts.
Running as SYSTEM
Setting status of 0230b67be62ea6dbca061b098e0adf165799a8cd to PENDING with url http://10.20.17.181:8080/job/nvtabular_tests/4758/ and message: 'Build started for merge commit.'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1694/*:refs/remotes/origin/pr/1694/* # timeout=10
 > git rev-parse 0230b67be62ea6dbca061b098e0adf165799a8cd^{commit} # timeout=10
Checking out Revision 0230b67be62ea6dbca061b098e0adf165799a8cd (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0230b67be62ea6dbca061b098e0adf165799a8cd # timeout=10
Commit message: "Use merlin-dataloader package"
 > git rev-list --no-walk 89ce276422a4531c741fc7e291978e62f7cbc2d2 # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins13140158748642544412.sh
GLOB sdist-make: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/setup.py
test-gpu create: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
test-gpu installdeps: pytest, pytest-cov
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu inst: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/.tmp/package/1/nvtabular-1.5.0+2.g0230b67be.zip
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
ERROR: invocation failed (exit code 1), logfile: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/log/test-gpu-2.log
================================== log start ===================================
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Processing ./.tox/.tmp/package/1/nvtabular-1.5.0+2.g0230b67be.zip
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: merlin-core>=0.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from nvtabular==1.5.0+2.g0230b67be) (0.3.0+12.g78ecddd)
Requirement already satisfied: scipy in /usr/local/lib/python3.8/dist-packages (from nvtabular==1.5.0+2.g0230b67be) (1.9.1)
ERROR: Could not find a version that satisfies the requirement merlin-dataloader (from nvtabular) (from versions: none)
ERROR: No matching distribution found for merlin-dataloader

[notice] A new release of pip available: 22.2.2 -> 22.3
[notice] To update, run: pip install --upgrade pip

=================================== log end ====================================
___________________________________ summary ____________________________________
ERROR: test-gpu: InvocationError for command /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python -m pip install --exists-action w .tox/.tmp/package/1/nvtabular-1.5.0+2.g0230b67be.zip (exited with code 1)
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins9831646555239630743.sh

@nvidia-merlin-bot (Contributor)

Click to view CI Results
GitHub pull request #1694 of commit dfaecf3b46e123de4369757aadc9b2fac7d0beb4, no merge conflicts.
Running as SYSTEM
Setting status of dfaecf3b46e123de4369757aadc9b2fac7d0beb4 to PENDING with url http://10.20.17.181:8080/job/nvtabular_tests/4759/ and message: 'Build started for merge commit.'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1694/*:refs/remotes/origin/pr/1694/* # timeout=10
 > git rev-parse dfaecf3b46e123de4369757aadc9b2fac7d0beb4^{commit} # timeout=10
Checking out Revision dfaecf3b46e123de4369757aadc9b2fac7d0beb4 (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f dfaecf3b46e123de4369757aadc9b2fac7d0beb4 # timeout=10
Commit message: "Merge branch 'main' into merlin_dataloader"
 > git rev-list --no-walk 0230b67be62ea6dbca061b098e0adf165799a8cd # timeout=10
[nvtabular_tests] $ /bin/bash /tmp/jenkins674685108540776710.sh
GLOB sdist-make: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/setup.py
test-gpu create: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
test-gpu installdeps: pytest, pytest-cov
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu inst: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/.tmp/package/1/nvtabular-1.6.0+3.gdfaecf3b4.zip
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
ERROR: invocation failed (exit code 1), logfile: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/log/test-gpu-2.log
================================== log start ===================================
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Processing ./.tox/.tmp/package/1/nvtabular-1.6.0+3.gdfaecf3b4.zip
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: scipy in /usr/local/lib/python3.8/dist-packages (from nvtabular==1.6.0+3.gdfaecf3b4) (1.9.1)
Requirement already satisfied: merlin-core>=0.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from nvtabular==1.6.0+3.gdfaecf3b4) (0.3.0+12.g78ecddd)
ERROR: Could not find a version that satisfies the requirement merlin-dataloader (from nvtabular) (from versions: none)
ERROR: No matching distribution found for merlin-dataloader

[notice] A new release of pip available: 22.2.2 -> 22.3
[notice] To update, run: pip install --upgrade pip

=================================== log end ====================================
___________________________________ summary ____________________________________
ERROR: test-gpu: InvocationError for command /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python -m pip install --exists-action w .tox/.tmp/package/1/nvtabular-1.6.0+3.gdfaecf3b4.zip (exited with code 1)
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins1237728581524987061.sh

@benfred added the chore label Oct 25, 2022
@github-actions

Documentation preview

https://nvidia-merlin.github.io/NVTabular/review/pr-1694

@benfred marked this pull request as ready for review October 26, 2022 17:30
@nvidia-merlin-bot (Contributor)

Click to view CI Results
GitHub pull request #1694 of commit c0c0029bf2e94fad4ef77b399b9acaa54e7ba44b, no merge conflicts.
Running as SYSTEM
Setting status of c0c0029bf2e94fad4ef77b399b9acaa54e7ba44b to PENDING with url http://10.20.17.181:8080/job/nvtabular_tests/4760/ and message: 'Build started for merge commit.'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1694/*:refs/remotes/origin/pr/1694/* # timeout=10
 > git rev-parse c0c0029bf2e94fad4ef77b399b9acaa54e7ba44b^{commit} # timeout=10
Checking out Revision c0c0029bf2e94fad4ef77b399b9acaa54e7ba44b (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c0c0029bf2e94fad4ef77b399b9acaa54e7ba44b # timeout=10
Commit message: "Share code to transform inputs to schema"
 > git rev-list --no-walk dfaecf3b46e123de4369757aadc9b2fac7d0beb4 # timeout=10
[nvtabular_tests] $ /bin/bash /tmp/jenkins3717466962464540402.sh
GLOB sdist-make: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/setup.py
test-gpu create: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
test-gpu installdeps: pytest, pytest-cov
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu inst: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/.tmp/package/1/nvtabular-1.6.0+6.gc0c0029bf.zip
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu installed: absl-py==1.2.0,aiohttp==3.8.1,aiosignal==1.2.0,alabaster==0.7.12,anyio==3.6.1,argon2-cffi==21.3.0,argon2-cffi-bindings==21.2.0,astroid==2.5.6,asttokens==2.0.8,astunparse==1.6.3,asv==0.5.1,asvdb==0.4.2,async-timeout==4.0.2,attrs==22.1.0,awscli==1.26.0,Babel==2.10.3,backcall==0.2.0,beautifulsoup4==4.11.1,betterproto==1.2.5,black==22.6.0,bleach==5.0.1,boto3==1.24.75,botocore==1.28.0,Brotli==1.0.9,cachetools==5.2.0,certifi==2019.11.28,cffi==1.15.1,chardet==3.0.4,charset-normalizer==2.1.1,clang==5.0,click==8.1.3,cloudpickle==2.2.0,cmake==3.24.1.1,colorama==0.4.4,contourpy==1.0.5,coverage==6.5.0,cuda-python==11.7.1,cupy-cuda117==10.6.0,cycler==0.11.0,Cython==0.29.32,dask==2022.1.1,dbus-python==1.2.16,debugpy==1.6.3,decorator==5.1.1,defusedxml==0.7.1,dill==0.3.5.1,distlib==0.3.6,distributed==2022.5.1,distro==1.7.0,dm-tree==0.1.6,docker-pycreds==0.4.0,docutils==0.16,emoji==1.7.0,entrypoints==0.4,execnet==1.9.0,executing==1.0.0,faiss==1.7.2,faiss-gpu==1.7.2,fastai==2.7.9,fastapi==0.85.0,fastavro==1.6.1,fastcore==1.5.27,fastdownload==0.0.7,fastjsonschema==2.16.1,fastprogress==1.0.3,fastrlock==0.8,feast==0.19.4,fiddle==0.2.2,filelock==3.8.0,flatbuffers==1.12,fonttools==4.37.3,frozenlist==1.3.1,fsspec==2022.5.0,gast==0.4.0,gevent==21.12.0,geventhttpclient==2.0.2,gitdb==4.0.9,GitPython==3.1.27,google==3.0.0,google-api-core==2.10.1,google-auth==2.11.1,google-auth-oauthlib==0.4.6,google-pasta==0.2.0,googleapis-common-protos==1.52.0,graphviz==0.20.1,greenlet==1.1.3,grpcio==1.41.0,grpcio-channelz==1.49.0,grpcio-reflection==1.48.1,grpclib==0.4.3,h11==0.13.0,h2==4.1.0,h5py==3.7.0,HeapDict==1.0.1,hpack==4.0.0,httptools==0.5.0,hugectr2onnx==0.0.0,huggingface-hub==0.9.1,hyperframe==6.0.1,idna==2.8,imagesize==1.4.1,implicit==0.6.1,importlib-metadata==4.12.0,importlib-resources==5.9.0,iniconfig==1.1.1,ipykernel==6.15.3,ipython==8.5.0,ipython-genutils==0.2.0,ipywidgets==7.7.0,jedi==0.18.1,Jinja2==3.1.2,jmespath==1.0.1,joblib==1.2.0,json5==0.9.10,jsonschema==4.16.0,jupyter-cache==0.4.3,jupyter-core==4.11.1,jupyter-server==1.18.1,jupyter-server-mathjax==0.2.5,jupyter-sphinx==0.3.2,jupyter_client==7.3.5,jupyterlab==3.4.7,jupyterlab-pygments==0.2.2,jupyterlab-widgets==1.1.0,jupyterlab_server==2.15.1,keras==2.9.0,Keras-Preprocessing==1.1.2,kiwisolver==1.4.4,lazy-object-proxy==1.7.1,libclang==14.0.6,libcst==0.4.7,lightfm==1.16,lightgbm==3.3.2,linkify-it-py==1.0.3,llvmlite==0.39.1,locket==1.0.0,lxml==4.9.1,Markdown==3.4.1,markdown-it-py==1.1.0,MarkupSafe==2.1.1,matplotlib==3.6.0,matplotlib-inline==0.1.6,mdit-py-plugins==0.2.8,merlin-core==0.6.0+1.g5926fcf,merlin-dataloader==0.0.2,merlin-models==0.7.0+11.g280956aa4,merlin-systems==0.5.0+4.g15074ad,mistune==2.0.4,mmh3==3.0.0,mpi4py==3.1.3,msgpack==1.0.4,multidict==6.0.2,mypy-extensions==0.4.3,myst-nb==0.13.2,myst-parser==0.15.2,natsort==8.1.0,nbclassic==0.4.3,nbclient==0.6.8,nbconvert==7.0.0,nbdime==3.1.1,nbformat==5.5.0,nest-asyncio==1.5.5,ninja==1.10.2.3,notebook==6.4.12,notebook-shim==0.1.0,numba==0.56.2,numpy==1.22.4,nvidia-pyindex==1.0.9,-e 
git+https://github.com/NVIDIA-Merlin/NVTabular.git@c0c0029bf2e94fad4ef77b399b9acaa54e7ba44b#egg=nvtabular,nvtx==0.2.5,oauthlib==3.2.1,oldest-supported-numpy==2022.8.16,onnx==1.12.0,onnxruntime==1.11.1,opt-einsum==3.3.0,packaging==21.3,pandas==1.3.5,pandavro==1.5.2,pandocfilters==1.5.0,parso==0.8.3,partd==1.3.0,pathtools==0.1.2,pexpect==4.8.0,pickleshare==0.7.5,Pillow==9.2.0,pkgutil_resolve_name==1.3.10,platformdirs==2.5.2,pluggy==1.0.0,prometheus-client==0.14.1,promise==2.3,prompt-toolkit==3.0.31,proto-plus==1.19.6,protobuf==3.19.5,psutil==5.9.2,ptyprocess==0.7.0,pure-eval==0.2.2,py==1.11.0,pyarrow==7.0.0,pyasn1==0.4.8,pyasn1-modules==0.2.8,pybind11==2.10.0,pycparser==2.21,pydantic==1.10.2,pydot==1.4.2,Pygments==2.13.0,PyGObject==3.36.0,pynvml==11.4.1,pyparsing==3.0.9,pyrsistent==0.18.1,pytest==7.1.3,pytest-cov==4.0.0,pytest-forked==1.4.0,pytest-xdist==2.5.0,python-apt==2.0.0+ubuntu0.20.4.8,python-dateutil==2.8.2,python-dotenv==0.21.0,python-rapidjson==1.8,pytz==2022.2.1,PyYAML==5.4.1,pyzmq==24.0.0,regex==2022.9.13,requests==2.22.0,requests-oauthlib==1.3.1,requests-unixsocket==0.2.0,rsa==4.7.2,s3fs==2022.2.0,s3transfer==0.6.0,sacremoses==0.0.53,scikit-build==0.15.0,scikit-learn==1.1.2,scipy==1.9.1,seedir==0.3.0,Send2Trash==1.8.0,sentry-sdk==1.9.8,setproctitle==1.3.2,setuptools-scm==7.0.5,shortuuid==1.0.9,six==1.15.0,sklearn==0.0,smmap==5.0.0,sniffio==1.3.0,snowballstemmer==2.2.0,sortedcontainers==2.4.0,soupsieve==2.3.2.post1,Sphinx==5.3.0,sphinx-multiversion==0.2.4,sphinx-togglebutton==0.3.1,sphinx_external_toc==0.3.0,sphinxcontrib-applehelp==1.0.2,sphinxcontrib-copydirs @ git+https://github.com/mikemckiernan/sphinxcontrib-copydirs.git@bd8c5d79b3f91cf5f1bb0d6995aeca3fe84b670e,sphinxcontrib-devhelp==1.0.2,sphinxcontrib-htmlhelp==2.0.0,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-qthelp==1.0.3,sphinxcontrib-serializinghtml==1.1.5,SQLAlchemy==1.4.36,stack-data==0.5.0,starlette==0.20.4,stringcase==1.2.0,supervisor==4.1.0,tabulate==0.8.10,tblib==1.7.0,tdqm==0.0.1,tenacity==8.0.1,tensorboard==2.9.1,tensorboard-data-server==0.6.1,tensorboard-plugin-wit==1.8.1,tensorflow==2.6.2,tensorflow-estimator==2.9.0,tensorflow-gpu==2.9.2,tensorflow-io-gcs-filesystem==0.27.0,tensorflow-metadata==1.10.0,termcolor==2.0.1,terminado==0.15.0,testbook==0.4.2,threadpoolctl==3.1.0,tinycss2==1.1.1,tokenizers==0.10.3,toml==0.10.2,tomli==2.0.1,toolz==0.12.0,torch==1.12.1+cu113,torchmetrics==0.3.2,tornado==6.2,tox==3.26.0,tqdm==4.64.1,traitlets==5.4.0,transformers==4.12.0,transformers4rec==0.1.12+2.gbcc939255,treelite==2.3.0,treelite-runtime==2.3.0,tritonclient==2.25.0,typing-inspect==0.8.0,typing_extensions==4.3.0,uc-micro-py==1.0.1,urllib3==1.26.12,uvicorn==0.18.3,uvloop==0.17.0,versioneer==0.20,virtualenv==20.16.5,wandb==0.13.3,watchfiles==0.17.0,wcwidth==0.2.5,webencodings==0.5.1,websocket-client==1.4.1,websockets==10.3,Werkzeug==2.2.2,widgetsnbextension==3.6.0,wrapt==1.12.1,xgboost==1.6.2,yarl==1.8.1,zict==2.2.0,zipp==3.8.1,zope.event==4.5.0,zope.interface==5.4.0
test-gpu run-test-pre: PYTHONHASHSEED='1494577504'
test-gpu run-test: commands[0] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/core.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/core.git
  Cloning https://github.com/NVIDIA-Merlin/core.git to /tmp/pip-req-build-usf2h74o
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/core.git /tmp/pip-req-build-usf2h74o
  Resolved https://github.com/NVIDIA-Merlin/core.git to commit 2c621a26a2b1b7ed786c99bd7c2790b9fc675098
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+3.g2c621a2) (21.3)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+3.g2c621a2) (1.3.5)
Requirement already satisfied: distributed>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+3.g2c621a2) (2022.3.0)
Requirement already satisfied: dask>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+3.g2c621a2) (2022.3.0)
Requirement already satisfied: fsspec==2022.5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+3.g2c621a2) (2022.5.0)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+3.g2c621a2) (0.55.1)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+3.g2c621a2) (1.2.5)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+3.g2c621a2) (3.19.5)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+3.g2c621a2) (4.64.1)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+3.g2c621a2) (1.10.0)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+3.g2c621a2) (7.0.0)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.8.0+3.g2c621a2) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.8.0+3.g2c621a2) (1.2.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (2.2.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (5.4.1)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (0.12.0)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (1.2.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (1.0.4)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (3.1.2)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (6.1)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (2.0.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (8.1.3)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (5.8.0)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (1.7.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (2.4.0)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+3.g2c621a2) (65.4.1)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+3.g2c621a2) (1.20.3)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+3.g2c621a2) (0.38.1)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core==0.8.0+3.g2c621a2) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+3.g2c621a2) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+3.g2c621a2) (2022.2.1)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.8.0+3.g2c621a2) (1.2.0)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.8.0+3.g2c621a2) (1.52.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+3.g2c621a2) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (1.0.1)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.8.0+3.g2c621a2) (6.0.2)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.8.0+3.g2c621a2) (4.1.0)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2022.3.0->merlin-core==0.8.0+3.g2c621a2) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.8.0+3.g2c621a2) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.8.0+3.g2c621a2) (6.0.1)
Building wheels for collected packages: merlin-core
  Building wheel for merlin-core (pyproject.toml): started
  Building wheel for merlin-core (pyproject.toml): finished with status 'done'
  Created wheel for merlin-core: filename=merlin_core-0.8.0+3.g2c621a2-py3-none-any.whl size=118254 sha256=efe6a3fa3898c9320c7d52ba13787e62ab66a450c876ddc5fa93be48eccd3791
  Stored in directory: /tmp/pip-ephem-wheel-cache-0ic2gibk/wheels/c8/38/16/a6968787eafcec5fa772148af8408b089562f71af0752e8e84
Successfully built merlin-core
Installing collected packages: merlin-core
  Attempting uninstall: merlin-core
    Found existing installation: merlin-core 0.3.0+12.g78ecddd
    Not uninstalling merlin-core at /var/jenkins_home/.local/lib/python3.8/site-packages, outside environment /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
    Can't uninstall 'merlin-core'. No files were found to uninstall.
Successfully installed merlin-core-0.8.0+3.g2c621a2

[notice] A new release of pip available: 22.2.2 -> 22.3
[notice] To update, run: pip install --upgrade pip
test-gpu run-test: commands[1] | python -m pytest --cov-report term --cov merlin -rxs tests/unit
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.3, pluggy-1.0.0
cachedir: .tox/test-gpu/.pytest_cache
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: anyio-3.5.0, xdist-2.5.0, forked-1.4.0, cov-4.0.0
collected 1436 items / 1 skipped

tests/unit/test_dask_nvt.py ............................................ [ 3%]
........................................................................ [ 8%]
.... [ 8%]
tests/unit/test_notebooks.py F.F. [ 8%]
tests/unit/test_tf4rec.py . [ 8%]
tests/unit/test_tools.py ...................... [ 10%]
tests/unit/test_triton_inference.py ................................ [ 12%]
tests/unit/examples/test_01-Getting-started.py . [ 12%]
tests/unit/examples/test_02-Advanced-NVTabular-workflow.py . [ 12%]
tests/unit/examples/test_03-Running-on-multiple-GPUs-or-on-CPU.py . [ 12%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 12%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 14%]
................................................... [ 18%]
tests/unit/framework_utils/test_torch_layers.py . [ 18%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 20%]
........................................s.. [ 23%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 25%]
..................................................... [ 29%]
tests/unit/ops/test_categorify.py ...................................... [ 31%]
........................................................................ [ 36%]
..................................................... [ 40%]
tests/unit/ops/test_column_similarity.py ........................ [ 42%]
tests/unit/ops/test_drop_low_cardinality.py .. [ 42%]
tests/unit/ops/test_fill.py ............................................ [ 45%]
........ [ 45%]
tests/unit/ops/test_groupyby.py ....................... [ 47%]
tests/unit/ops/test_hash_bucket.py ......................... [ 49%]
tests/unit/ops/test_join.py ............................................ [ 52%]
........................................................................ [ 57%]
.................................. [ 59%]
tests/unit/ops/test_lambda.py .......... [ 60%]
tests/unit/ops/test_normalize.py ....................................... [ 63%]
.. [ 63%]
tests/unit/ops/test_ops.py ............................................. [ 66%]
.................... [ 67%]
tests/unit/ops/test_ops_schema.py ...................................... [ 70%]
........................................................................ [ 75%]
........................................................................ [ 80%]
........................................................................ [ 85%]
....................................... [ 88%]
tests/unit/ops/test_reduce_dtype_size.py .. [ 88%]
tests/unit/ops/test_target_encode.py ..................... [ 89%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 90%]
tests/unit/workflow/test_workflow.py ................................... [ 92%]
.......................................................... [ 96%]
tests/unit/workflow/test_workflow_chaining.py ... [ 96%]
tests/unit/workflow/test_workflow_node.py ........... [ 97%]
tests/unit/workflow/test_workflow_ops.py ... [ 97%]
tests/unit/workflow/test_workflow_schemas.py ........................... [ 99%]
... [100%]

=================================== FAILURES ===================================
___________________________ test_criteo_tf_notebook ____________________________

tmpdir = local('/tmp/pytest-of-jenkins/pytest-14/test_criteo_tf_notebook0')

def test_criteo_tf_notebook(tmpdir):
    tor = pytest.importorskip("tensorflow")  # noqa
    # create a toy dataset in tmpdir, and point environment variables so the notebook
    # will read from it
    os.system("mkdir -p " + os.path.join(tmpdir, "converted/criteo"))
    for i in range(24):
        df = _get_random_criteo_data(1000)
        df.to_parquet(os.path.join(tmpdir, "converted/criteo", f"day_{i}.parquet"))
    os.environ["BASE_DIR"] = str(tmpdir)

    def _nb_modify(line):
        # Disable LocalCUDACluster
        line = line.replace("client.run(_rmm_pool)", "# client.run(_rmm_pool)")
        line = line.replace("if cluster is None:", "if False:")
        line = line.replace("client = Client(cluster)", "# client = Client(cluster)")
        line = line.replace(
            "workflow = nvt.Workflow(features, client=client)", "workflow = nvt.Workflow(features)"
        )
        line = line.replace("client", "# client")
        line = line.replace("NUM_GPUS = [0, 1, 2, 3, 4, 5, 6, 7]", "NUM_GPUS = [0]")
        line = line.replace("part_size = int(part_mem_frac * device_size)", "part_size = '128MB'")

        return line

    _run_notebook(
        tmpdir,
        os.path.join(
            dirname(TEST_PATH),
            "examples/scaling-criteo/",
            "02-ETL-with-NVTabular.ipynb",
        ),
        # disable rmm.reinitialize, seems to be causing issues
        transform=_nb_modify,
    )

    def _modify_tf_nb(line):
        return line.replace(
            # don't require graphviz/pydot
            "tf.keras.utils.plot_model(model)",
            "# tf.keras.utils.plot_model(model)",
        )
  _run_notebook(
        tmpdir,
        os.path.join(
            dirname(TEST_PATH),
            "examples/scaling-criteo/",
            "03-Training-with-TF.ipynb",
        ),
        transform=_modify_tf_nb,
    )

tests/unit/test_notebooks.py:80:


tests/unit/test_notebooks.py:223: in _run_notebook
subprocess.check_output([sys.executable, script_path])
/usr/lib/python3.8/subprocess.py:415: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,


input = None, capture_output = False, timeout = None, check = True
popenargs = (['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-14/test_criteo_tf_notebook0/notebook.py'],)
kwargs = {'stdout': -1}, process = <subprocess.Popen object at 0x7fb794cba280>
stdout = b'', stderr = None, retcode = 1

def run(*popenargs,
        input=None, capture_output=False, timeout=None, check=False, **kwargs):
    """Run command with arguments and return a CompletedProcess instance.

    The returned instance will have attributes args, returncode, stdout and
    stderr. By default, stdout and stderr are not captured, and those attributes
    will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

    If check is True and the exit code was non-zero, it raises a
    CalledProcessError. The CalledProcessError object will have the return code
    in the returncode attribute, and output & stderr attributes if those streams
    were captured.

    If timeout is given, and the process takes too long, a TimeoutExpired
    exception will be raised.

    There is an optional argument "input", allowing you to
    pass bytes or a string to the subprocess's stdin.  If you use this argument
    you may not also use the Popen constructor's "stdin" argument, as
    it will be used internally.

    By default, all communication is in bytes, and therefore any "input" should
    be bytes, and the stdout and stderr will be bytes. If in text mode, any
    "input" should be a string, and stdout and stderr will be strings decoded
    according to locale encoding, or by "encoding" if set. Text mode is
    triggered by setting any of text, encoding, errors or universal_newlines.

    The other arguments are the same as for the Popen constructor.
    """
    if input is not None:
        if kwargs.get('stdin') is not None:
            raise ValueError('stdin and input arguments may not both be used.')
        kwargs['stdin'] = PIPE

    if capture_output:
        if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
            raise ValueError('stdout and stderr arguments may not be used '
                             'with capture_output.')
        kwargs['stdout'] = PIPE
        kwargs['stderr'] = PIPE

    with Popen(*popenargs, **kwargs) as process:
        try:
            stdout, stderr = process.communicate(input, timeout=timeout)
        except TimeoutExpired as exc:
            process.kill()
            if _mswindows:
                # Windows accumulates the output in a single blocking
                # read() call run on child threads, with the timeout
                # being done in a join() on those threads.  communicate()
                # _after_ kill() is required to collect that and add it
                # to the exception.
                exc.stdout, exc.stderr = process.communicate()
            else:
                # POSIX _communicate already populated the output so
                # far into the TimeoutExpired exception.
                process.wait()
            raise
        except:  # Including KeyboardInterrupt, communicate handled that.
            process.kill()
            # We don't call process.wait() as .__exit__ does that for us.
            raise
        retcode = process.poll()
        if check and retcode:
          raise CalledProcessError(retcode, process.args,
                                     output=stdout, stderr=stderr)

E subprocess.CalledProcessError: Command '['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-14/test_criteo_tf_notebook0/notebook.py']' returned non-zero exit status 1.

/usr/lib/python3.8/subprocess.py:516: CalledProcessError
----------------------------- Captured stderr call -----------------------------
/tmp/pytest-of-jenkins/pytest-14/test_criteo_tf_notebook0/notebook.py:92: UserWarning: BEWARE - 2.366636032 GB is already occupied on device 0!
warnings.warn(f"BEWARE - {used} GB is already occupied on device {int(dev)}!")
2022-10-26 17:35:07.494481: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-10-26 17:35:10.453344: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-10-26 17:35:10.454089: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:1 with 14500 MB memory: -> device: 1, name: Tesla P100-DGXS-16GB, pci bus id: 0000:08:00.0, compute capability: 6.0
2022-10-26 17:35:10.454669: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:2 with 15149 MB memory: -> device: 2, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0e:00.0, compute capability: 6.0
2022-10-26 17:35:10.455278: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:3 with 15149 MB memory: -> device: 3, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0f:00.0, compute capability: 6.0
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.12) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
Traceback (most recent call last):
File "/tmp/pytest-of-jenkins/pytest-14/test_criteo_tf_notebook0/notebook.py", line 105, in
history = model.fit(train_dataset_tf, callbacks=[validation_callback], epochs=EPOCHS)
File "/usr/local/lib/python3.8/dist-packages/keras/utils/traceback_utils.py", line 67, in error_handler
raise e.with_traceback(filtered_tb) from None
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/tensorflow.py", line 144, in getitem
return LoaderBase.next(self)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 243, in next
return self._get_next_batch()
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 278, in _get_next_batch
batch = next(self._batch_itr)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 402, in
return (self._handle_tensors(batch) for batch in batches)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/tensorflow.py", line 286, in _handle_tensors
to_return = super()._handle_tensors(tensors)
File "/usr/local/lib/python3.8/dist-packages/nvtx/nvtx.py", line 101, in inner
result = func(*args, **kwargs)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 534, in _handle_tensors
tensors = self._tensor_split(tensor, len(names), axis=1)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/tensorflow.py", line 180, in _tensor_split
return tf.split(tensor, idx, axis=axis)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Number of ways to split should evenly divide the split dimension, but got split_dim 1 (size = 27) and num_split 13 [Op:Split] name: split
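For clarity, the InvalidArgumentError above is TensorFlow's standard behavior whenever an integer split count does not evenly divide the tensor's size along the split axis; a standalone illustration with the same numbers (27 columns split 13 ways), independent of the notebook data:

    import tensorflow as tf

    x = tf.zeros([2, 27])    # 27 elements along axis 1, matching the failing batch above
    tf.split(x, 13, axis=1)  # raises InvalidArgumentError: 27 is not evenly divisible by 13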
____________________________ test_movielens_example ____________________________

tmpdir = local('/tmp/pytest-of-jenkins/pytest-14/test_movielens_example0')

def test_movielens_example(tmpdir):
    _get_random_movielens_data(tmpdir, 10000, dataset="movie")
    _get_random_movielens_data(tmpdir, 10000, dataset="ratings")
    _get_random_movielens_data(tmpdir, 5000, dataset="ratings", valid=True)

    triton_model_path = os.path.join(tmpdir, "models")
    os.environ["INPUT_DATA_DIR"] = str(tmpdir)
    os.environ["MODEL_PATH"] = triton_model_path

    notebook_path = os.path.join(
        dirname(TEST_PATH),
        "examples/getting-started-movielens/",
        "02-ETL-with-NVTabular.ipynb",
    )
    _run_notebook(tmpdir, notebook_path)

    def _modify_tf_nb(line):
        return line.replace(
            # don't require graphviz/pydot
            "tf.keras.utils.plot_model(model)",
            "# tf.keras.utils.plot_model(model)",
        )

    def _modify_tf_triton(line):
        # models are already preloaded
        line = line.replace("triton_client.load_model", "# triton_client.load_model")
        line = line.replace("triton_client.unload_model", "# triton_client.unload_model")
        return line

    notebooks = []
    try:
        import torch  # noqa

        notebooks.append("03-Training-with-PyTorch.ipynb")
    except Exception:
        pass
    try:
        import nvtabular.inference.triton  # noqa
        import nvtabular.loader.tensorflow  # noqa

        notebooks.append("03-Training-with-TF.ipynb")
        has_tf = True

    except Exception:
        has_tf = False

    for notebook in notebooks:
        notebook_path = os.path.join(
            dirname(TEST_PATH),
            "examples/getting-started-movielens/",
            notebook,
        )
        if notebook == "03-Training-with-TF.ipynb":
            _run_notebook(tmpdir, notebook_path, transform=_modify_tf_nb)
        else:
          _run_notebook(tmpdir, notebook_path)

tests/unit/test_notebooks.py:169:


tests/unit/test_notebooks.py:223: in _run_notebook
subprocess.check_output([sys.executable, script_path])
/usr/lib/python3.8/subprocess.py:415: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,


input = None, capture_output = False, timeout = None, check = True
popenargs = (['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-14/test_movielens_example0/notebook.py'],)
kwargs = {'stdout': -1}, process = <subprocess.Popen object at 0x7fb764731370>
stdout = b'', stderr = None, retcode = 1

def run(*popenargs,
        input=None, capture_output=False, timeout=None, check=False, **kwargs):
    """Run command with arguments and return a CompletedProcess instance.

    The returned instance will have attributes args, returncode, stdout and
    stderr. By default, stdout and stderr are not captured, and those attributes
    will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

    If check is True and the exit code was non-zero, it raises a
    CalledProcessError. The CalledProcessError object will have the return code
    in the returncode attribute, and output & stderr attributes if those streams
    were captured.

    If timeout is given, and the process takes too long, a TimeoutExpired
    exception will be raised.

    There is an optional argument "input", allowing you to
    pass bytes or a string to the subprocess's stdin.  If you use this argument
    you may not also use the Popen constructor's "stdin" argument, as
    it will be used internally.

    By default, all communication is in bytes, and therefore any "input" should
    be bytes, and the stdout and stderr will be bytes. If in text mode, any
    "input" should be a string, and stdout and stderr will be strings decoded
    according to locale encoding, or by "encoding" if set. Text mode is
    triggered by setting any of text, encoding, errors or universal_newlines.

    The other arguments are the same as for the Popen constructor.
    """
    if input is not None:
        if kwargs.get('stdin') is not None:
            raise ValueError('stdin and input arguments may not both be used.')
        kwargs['stdin'] = PIPE

    if capture_output:
        if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
            raise ValueError('stdout and stderr arguments may not be used '
                             'with capture_output.')
        kwargs['stdout'] = PIPE
        kwargs['stderr'] = PIPE

    with Popen(*popenargs, **kwargs) as process:
        try:
            stdout, stderr = process.communicate(input, timeout=timeout)
        except TimeoutExpired as exc:
            process.kill()
            if _mswindows:
                # Windows accumulates the output in a single blocking
                # read() call run on child threads, with the timeout
                # being done in a join() on those threads.  communicate()
                # _after_ kill() is required to collect that and add it
                # to the exception.
                exc.stdout, exc.stderr = process.communicate()
            else:
                # POSIX _communicate already populated the output so
                # far into the TimeoutExpired exception.
                process.wait()
            raise
        except:  # Including KeyboardInterrupt, communicate handled that.
            process.kill()
            # We don't call process.wait() as .__exit__ does that for us.
            raise
        retcode = process.poll()
        if check and retcode:
          raise CalledProcessError(retcode, process.args,
                                     output=stdout, stderr=stderr)

E subprocess.CalledProcessError: Command '['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-14/test_movielens_example0/notebook.py']' returned non-zero exit status 1.

/usr/lib/python3.8/subprocess.py:516: CalledProcessError
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/tmp/pytest-of-jenkins/pytest-14/test_movielens_example0/notebook.py", line 115, in
batch = next(iter(train_loader))
File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/dataloader.py", line 681, in next
data = self._next_data()
File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/dataloader.py", line 721, in _next_data
data = self._dataset_fetcher.fetch(index) # may raise StopIteration
File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/_utils/fetch.py", line 39, in fetch
data = next(self.dataset_iter)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 243, in next
return self._get_next_batch()
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 278, in _get_next_batch
batch = next(self._batch_itr)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 402, in
return (self._handle_tensors(batch) for batch in batches)
File "/usr/local/lib/python3.8/dist-packages/nvtx/nvtx.py", line 101, in inner
result = func(*args, **kwargs)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 528, in _handle_tensors
names = self.dtype_reverse_map[np.dtype(dtype)] if dtype is not None else []
KeyError: dtype('float32')
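The KeyError above is a plain dictionary lookup failing: the loader keeps a reverse map from dtype to column names (see the dtype_reverse_map lookup in the frame above), and a float32 batch arrives for which nothing was registered. A minimal, hypothetical illustration of that mechanism, with invented map contents:

    import numpy as np

    # Hypothetical reverse map, as the loader might build it from a schema that
    # only registered integer-typed columns:
    dtype_reverse_map = {np.dtype("int64"): ["userId", "movieId"]}

    # A float32 column shows up at batch time -> KeyError: dtype('float32')
    names = dtype_reverse_map[np.dtype("float32")]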
=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33
/usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
DASK_VERSION = LooseVersion(dask.__version__)

.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: 34 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
other = LooseVersion(other)

tests/unit/test_dask_nvt.py: 6 warnings
tests/unit/workflow/test_workflow.py: 78 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/dask/base.py:1282: UserWarning: Running on a single-machine scheduler when a distributed client is active might lead to unexpected results.
warnings.warn(

tests/unit/test_dask_nvt.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 8 files.
warnings.warn(

tests/unit/test_dask_nvt.py::test_merlin_core_execution_managers
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/core/utils.py:431: UserWarning: Existing Dask-client object detected in the current context. New cuda cluster will not be deployed. Set force_new to True to ignore running clusters.
warnings.warn(

tests/unit/loader/test_tf_dataloader.py: 2 warnings
tests/unit/loader/test_torch_dataloader.py: 12 warnings
tests/unit/workflow/test_workflow.py: 9 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 2 files.
warnings.warn(

tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops_schema.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py:148: UserWarning: Compound tags like Tags.USER_ID have been deprecated and will be removed in a future version. Please use the atomic versions of these tags, like [<Tags.USER: 'user'>, <Tags.ID: 'id'>].
warnings.warn(

tests/unit/ops/test_ops_schema.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py:148: UserWarning: Compound tags like Tags.ITEM_ID have been deprecated and will be removed in a future version. Please use the atomic versions of these tags, like [<Tags.ITEM: 'item'>, <Tags.ID: 'id'>].
warnings.warn(

tests/unit/workflow/test_cpu_workflow.py: 6 warnings
tests/unit/workflow/test_workflow.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 10 files.
warnings.warn(

tests/unit/workflow/test_workflow.py: 48 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 20 files.
warnings.warn(

tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-None]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 4 files.
warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name                                Stmts   Miss  Cover
---------------------------------------------------------
merlin/transforms/__init__.py           1      1     0%
merlin/transforms/ops/__init__.py       1      1     0%
---------------------------------------------------------
TOTAL                                   2      2     0%

=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:14: could not import 'moto': No module named 'moto'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:529: needs horovod
===== 2 failed, 1433 passed, 2 skipped, 257 warnings in 1090.09s (0:18:10) =====
/usr/local/lib/python3.8/dist-packages/coverage/control.py:801: CoverageWarning: No data was collected. (no-data-collected)
self._warn("No data was collected.", slug="no-data-collected")
ERROR: InvocationError for command /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python -m pytest --cov-report term --cov merlin -rxs tests/unit (exited with code 1)
___________________________________ summary ____________________________________
ERROR: test-gpu: commands failed
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins16786173575937406914.sh

@benfred (Member, Author) commented Oct 26, 2022

Current CI failures need a fix included with NVIDIA-Merlin/dataloader#38

@benfred (Member, Author) commented Oct 28, 2022

Rerun tests

@nvidia-merlin-bot (Contributor)

Click to view CI Results
GitHub pull request #1694 of commit c0c0029bf2e94fad4ef77b399b9acaa54e7ba44b, no merge conflicts.
Running as SYSTEM
Setting status of c0c0029bf2e94fad4ef77b399b9acaa54e7ba44b to PENDING with url http://10.20.17.181:8080/job/nvtabular_tests/4767/ and message: 'Build started for merge commit.'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1694/*:refs/remotes/origin/pr/1694/* # timeout=10
 > git rev-parse c0c0029bf2e94fad4ef77b399b9acaa54e7ba44b^{commit} # timeout=10
Checking out Revision c0c0029bf2e94fad4ef77b399b9acaa54e7ba44b (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c0c0029bf2e94fad4ef77b399b9acaa54e7ba44b # timeout=10
Commit message: "Share code to transform inputs to schema"
 > git rev-list --no-walk 1b7a041b1e3720967ec9c8abcf2f98419cbe967a # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins482984488374566132.sh
GLOB sdist-make: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/setup.py
test-gpu create: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
test-gpu installdeps: pytest, pytest-cov
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu inst: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/.tmp/package/1/nvtabular-1.6.0+6.gc0c0029bf.zip
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu installed: absl-py==1.2.0,aiohttp==3.8.1,aiosignal==1.2.0,alabaster==0.7.12,alembic==1.8.1,anyio==3.6.1,argon2-cffi==21.3.0,argon2-cffi-bindings==21.2.0,astroid==2.5.6,asttokens==2.0.8,astunparse==1.6.3,asv==0.5.1,asvdb==0.4.2,async-timeout==4.0.2,attrs==22.1.0,autopage==0.5.1,awscli==1.26.3,Babel==2.10.3,backcall==0.2.0,beautifulsoup4==4.11.1,betterproto==1.2.5,black==22.6.0,bleach==5.0.1,boto3==1.24.75,botocore==1.28.3,Brotli==1.0.9,cachetools==5.2.0,certifi==2019.11.28,cffi==1.15.1,chardet==3.0.4,charset-normalizer==2.1.1,clang==5.0,click==8.1.3,cliff==4.0.0,cloudpickle==2.2.0,cmaes==0.8.2,cmake==3.24.1.1,cmd2==2.4.2,colorama==0.4.4,colorlog==6.7.0,contourpy==1.0.5,coverage==6.5.0,cuda-python==11.7.1,cupy-cuda117==10.6.0,cycler==0.11.0,Cython==0.29.32,dask==2022.1.1,dbus-python==1.2.16,debugpy==1.6.3,decorator==5.1.1,defusedxml==0.7.1,dill==0.3.5.1,distlib==0.3.6,distributed==2022.5.1,distro==1.7.0,dm-tree==0.1.6,docker-pycreds==0.4.0,docutils==0.16,emoji==1.7.0,entrypoints==0.4,execnet==1.9.0,executing==1.0.0,faiss==1.7.2,faiss-gpu==1.7.2,fastai==2.7.9,fastapi==0.85.0,fastavro==1.6.1,fastcore==1.5.27,fastdownload==0.0.7,fastjsonschema==2.16.1,fastprogress==1.0.3,fastrlock==0.8,feast==0.19.4,fiddle==0.2.2,filelock==3.8.0,flatbuffers==1.12,fonttools==4.37.3,frozenlist==1.3.1,fsspec==2022.5.0,gast==0.4.0,gevent==21.12.0,geventhttpclient==2.0.2,gitdb==4.0.9,GitPython==3.1.27,google==3.0.0,google-api-core==2.10.1,google-auth==2.11.1,google-auth-oauthlib==0.4.6,google-pasta==0.2.0,googleapis-common-protos==1.52.0,graphviz==0.20.1,greenlet==1.1.3,grpcio==1.41.0,grpcio-channelz==1.49.0,grpcio-reflection==1.48.1,grpclib==0.4.3,h11==0.13.0,h2==4.1.0,h5py==3.7.0,HeapDict==1.0.1,horovod==0.26.1,hpack==4.0.0,httptools==0.5.0,hugectr2onnx==0.0.0,huggingface-hub==0.9.1,hyperframe==6.0.1,idna==2.8,imagesize==1.4.1,implicit==0.6.1,importlib-metadata==4.12.0,importlib-resources==5.9.0,iniconfig==1.1.1,ipykernel==6.15.3,ipython==8.5.0,ipython-genutils==0.2.0,ipywidgets==7.7.0,jedi==0.18.1,Jinja2==3.1.2,jmespath==1.0.1,joblib==1.2.0,json5==0.9.10,jsonschema==4.16.0,jupyter-cache==0.4.3,jupyter-core==4.11.1,jupyter-server==1.18.1,jupyter-server-mathjax==0.2.5,jupyter-sphinx==0.3.2,jupyter_client==7.3.5,jupyterlab==3.4.7,jupyterlab-pygments==0.2.2,jupyterlab-widgets==1.1.0,jupyterlab_server==2.15.1,keras==2.9.0,Keras-Preprocessing==1.1.2,kiwisolver==1.4.4,lazy-object-proxy==1.8.0,libclang==14.0.6,libcst==0.4.7,lightfm==1.16,lightgbm==3.3.2,linkify-it-py==1.0.3,llvmlite==0.39.1,locket==1.0.0,lxml==4.9.1,Mako==1.2.3,Markdown==3.4.1,markdown-it-py==1.1.0,MarkupSafe==2.1.1,matplotlib==3.6.0,matplotlib-inline==0.1.6,mdit-py-plugins==0.2.8,merlin-core==0.6.0+1.g5926fcf,merlin-dataloader==0.0.2,merlin-models==0.7.0+11.g280956aa4,merlin-systems==0.5.0+4.g15074ad,mistune==2.0.4,mmh3==3.0.0,mpi4py==3.1.3,msgpack==1.0.4,multidict==6.0.2,mypy-extensions==0.4.3,myst-nb==0.13.2,myst-parser==0.15.2,natsort==8.1.0,nbclassic==0.4.3,nbclient==0.6.8,nbconvert==7.0.0,nbdime==3.1.1,nbformat==5.5.0,nest-asyncio==1.5.5,ninja==1.10.2.3,notebook==6.4.12,notebook-shim==0.1.0,numba==0.56.2,numpy==1.22.4,nvidia-pyindex==1.0.9,-e 
git+https://github.com/NVIDIA-Merlin/NVTabular.git@c0c0029bf2e94fad4ef77b399b9acaa54e7ba44b#egg=nvtabular,nvtx==0.2.5,oauthlib==3.2.1,oldest-supported-numpy==2022.8.16,onnx==1.12.0,onnxruntime==1.11.1,opt-einsum==3.3.0,optuna==3.0.3,packaging==21.3,pandas==1.3.5,pandavro==1.5.2,pandocfilters==1.5.0,parso==0.8.3,partd==1.3.0,pathtools==0.1.2,pbr==5.11.0,pexpect==4.8.0,pickleshare==0.7.5,Pillow==9.2.0,pkgutil_resolve_name==1.3.10,platformdirs==2.5.2,plotly==5.11.0,pluggy==1.0.0,prettytable==3.4.1,prometheus-client==0.14.1,promise==2.3,prompt-toolkit==3.0.31,proto-plus==1.19.6,protobuf==3.19.5,psutil==5.9.2,ptyprocess==0.7.0,pure-eval==0.2.2,py==1.11.0,pyarrow==7.0.0,pyasn1==0.4.8,pyasn1-modules==0.2.8,pybind11==2.10.0,pycparser==2.21,pydantic==1.10.2,pydot==1.4.2,Pygments==2.13.0,PyGObject==3.36.0,pynvml==11.4.1,pyparsing==3.0.9,pyperclip==1.8.2,pyrsistent==0.18.1,pytest==7.1.3,pytest-cov==4.0.0,pytest-xdist==3.0.2,python-apt==2.0.0+ubuntu0.20.4.8,python-dateutil==2.8.2,python-dotenv==0.21.0,python-rapidjson==1.8,pytz==2022.2.1,PyYAML==5.4.1,pyzmq==24.0.0,regex==2022.9.13,requests==2.22.0,requests-oauthlib==1.3.1,requests-unixsocket==0.2.0,rsa==4.7.2,s3fs==2022.2.0,s3transfer==0.6.0,sacremoses==0.0.53,scikit-build==0.15.0,scikit-learn==1.1.2,scipy==1.8.1,seedir==0.3.0,Send2Trash==1.8.0,sentry-sdk==1.9.8,setproctitle==1.3.2,setuptools-scm==7.0.5,shortuuid==1.0.9,six==1.15.0,sklearn==0.0,smmap==5.0.0,sniffio==1.3.0,snowballstemmer==2.2.0,sortedcontainers==2.4.0,soupsieve==2.3.2.post1,Sphinx==5.3.0,sphinx-multiversion==0.2.4,sphinx-togglebutton==0.3.1,sphinx_external_toc==0.3.0,sphinxcontrib-applehelp==1.0.2,sphinxcontrib-copydirs @ git+https://github.com/mikemckiernan/sphinxcontrib-copydirs.git@bd8c5d79b3f91cf5f1bb0d6995aeca3fe84b670e,sphinxcontrib-devhelp==1.0.2,sphinxcontrib-htmlhelp==2.0.0,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-qthelp==1.0.3,sphinxcontrib-serializinghtml==1.1.5,SQLAlchemy==1.4.42,stack-data==0.5.0,starlette==0.20.4,stevedore==4.1.0,stringcase==1.2.0,supervisor==4.1.0,tabulate==0.8.10,tblib==1.7.0,tdqm==0.0.1,tenacity==8.0.1,tensorboard==2.9.1,tensorboard-data-server==0.6.1,tensorboard-plugin-wit==1.8.1,tensorflow==2.9.2,tensorflow-estimator==2.9.0,tensorflow-gpu==2.9.2,tensorflow-io-gcs-filesystem==0.27.0,tensorflow-metadata==1.10.0,termcolor==2.0.1,terminado==0.15.0,testbook==0.4.2,threadpoolctl==3.1.0,tinycss2==1.1.1,tokenizers==0.10.3,toml==0.10.2,tomli==2.0.1,toolz==0.12.0,torch==1.12.1+cu113,torchmetrics==0.3.2,tornado==6.2,tox==3.26.0,tqdm==4.64.1,traitlets==5.4.0,transformers==4.12.0,transformers4rec==0.1.12+2.gbcc939255,treelite==2.3.0,treelite-runtime==2.3.0,tritonclient==2.25.0,typing-inspect==0.8.0,typing_extensions==4.3.0,uc-micro-py==1.0.1,urllib3==1.26.12,uvicorn==0.18.3,uvloop==0.17.0,versioneer==0.20,virtualenv==20.16.5,wandb==0.13.3,watchfiles==0.17.0,wcwidth==0.2.5,webencodings==0.5.1,websocket-client==1.4.1,websockets==10.3,Werkzeug==2.2.2,widgetsnbextension==3.6.0,wrapt==1.12.1,xgboost==1.6.2,yarl==1.8.1,zict==2.2.0,zipp==3.8.1,zope.event==4.5.0,zope.interface==5.4.0
test-gpu run-test-pre: PYTHONHASHSEED='573005511'
test-gpu run-test: commands[0] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/core.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/core.git
  Cloning https://github.com/NVIDIA-Merlin/core.git to /tmp/pip-req-build-6qvdbwk4
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/core.git /tmp/pip-req-build-6qvdbwk4
  Resolved https://github.com/NVIDIA-Merlin/core.git to commit eda153c663aa864da66927c7a0a9d4e64c073120
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (21.3)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (7.0.0)
Requirement already satisfied: dask>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.3.0)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (1.3.5)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (0.55.1)
Requirement already satisfied: distributed>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.3.0)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (1.2.5)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (3.19.5)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (1.10.0)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (4.64.1)
Requirement already satisfied: fsspec==2022.5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.5.0)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (5.4.1)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (0.12.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.2.0)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.7.0)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (3.1.2)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.0.4)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.0.0)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (6.1)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (8.1.3)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (5.8.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.4.0)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (65.4.1)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (0.38.1)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (1.20.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core==0.8.0+4.geda153c) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (2022.2.1)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.52.0)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.0.1)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (4.1.0)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (6.0.2)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.1.1)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (6.0.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (4.0.0)
Building wheels for collected packages: merlin-core
  Building wheel for merlin-core (pyproject.toml): started
  Building wheel for merlin-core (pyproject.toml): finished with status 'done'
  Created wheel for merlin-core: filename=merlin_core-0.8.0+4.geda153c-py3-none-any.whl size=118258 sha256=e187938a9133724515f6b3a3b0a76af0874b1b54ad4e9be919118dd7afc53253
  Stored in directory: /tmp/pip-ephem-wheel-cache-cj4ynxjk/wheels/c8/38/16/a6968787eafcec5fa772148af8408b089562f71af0752e8e84
Successfully built merlin-core
Installing collected packages: merlin-core
  Attempting uninstall: merlin-core
    Found existing installation: merlin-core 0.3.0+12.g78ecddd
    Not uninstalling merlin-core at /var/jenkins_home/.local/lib/python3.8/site-packages, outside environment /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
    Can't uninstall 'merlin-core'. No files were found to uninstall.
Successfully installed merlin-core-0.8.0+4.geda153c

[notice] A new release of pip available: 22.2.2 -> 22.3
[notice] To update, run: pip install --upgrade pip
test-gpu run-test: commands[1] | python -m pytest --cov-report term --cov merlin -rxs tests/unit
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.3, pluggy-1.0.0
cachedir: .tox/test-gpu/.pytest_cache
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: anyio-3.5.0, xdist-3.0.2, cov-4.0.0
collected 1436 items / 1 skipped

tests/unit/test_dask_nvt.py ............................................ [ 3%]
........................................................................ [ 8%]
.... [ 8%]
tests/unit/test_notebooks.py F.F. [ 8%]
tests/unit/test_tf4rec.py . [ 8%]
tests/unit/test_tools.py ...................... [ 10%]
tests/unit/test_triton_inference.py ................................ [ 12%]
tests/unit/examples/test_01-Getting-started.py . [ 12%]
tests/unit/examples/test_02-Advanced-NVTabular-workflow.py . [ 12%]
tests/unit/examples/test_03-Running-on-multiple-GPUs-or-on-CPU.py . [ 12%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 12%]
tests/unit/framework_utils/test_tf_layers.py .F......................... [ 14%]
................................................... [ 18%]
tests/unit/framework_utils/test_torch_layers.py . [ 18%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 20%]
........................................F.. [ 23%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 25%]
..................................................... [ 29%]
tests/unit/ops/test_categorify.py ...................................... [ 31%]
........................................................................ [ 36%]
..................................................... [ 40%]
tests/unit/ops/test_column_similarity.py ........................ [ 42%]
tests/unit/ops/test_drop_low_cardinality.py .. [ 42%]
tests/unit/ops/test_fill.py ............................................ [ 45%]
........ [ 45%]
tests/unit/ops/test_groupyby.py ....................... [ 47%]
tests/unit/ops/test_hash_bucket.py ......................... [ 49%]
tests/unit/ops/test_join.py ............................................ [ 52%]
........................................................................ [ 57%]
.................................. [ 59%]
tests/unit/ops/test_lambda.py .......... [ 60%]
tests/unit/ops/test_normalize.py ....................................... [ 63%]
.. [ 63%]
tests/unit/ops/test_ops.py ............................................. [ 66%]
.................... [ 67%]
tests/unit/ops/test_ops_schema.py ...................................... [ 70%]
........................................................................ [ 75%]
........................................................................ [ 80%]
........................................................................ [ 85%]
....................................... [ 88%]
tests/unit/ops/test_reduce_dtype_size.py .. [ 88%]
tests/unit/ops/test_target_encode.py ..................... [ 89%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 90%]
tests/unit/workflow/test_workflow.py ................................... [ 92%]
.......................................................... [ 96%]
tests/unit/workflow/test_workflow_chaining.py ... [ 96%]
tests/unit/workflow/test_workflow_node.py ........... [ 97%]
tests/unit/workflow/test_workflow_ops.py ... [ 97%]
tests/unit/workflow/test_workflow_schemas.py ........................... [ 99%]
... [100%]

=================================== FAILURES ===================================
___________________________ test_criteo_tf_notebook ____________________________

tmpdir = local('/tmp/pytest-of-jenkins/pytest-13/test_criteo_tf_notebook0')

def test_criteo_tf_notebook(tmpdir):
    tor = pytest.importorskip("tensorflow")  # noqa
    # create a toy dataset in tmpdir, and point environment variables so the notebook
    # will read from it
    os.system("mkdir -p " + os.path.join(tmpdir, "converted/criteo"))
    for i in range(24):
        df = _get_random_criteo_data(1000)
        df.to_parquet(os.path.join(tmpdir, "converted/criteo", f"day_{i}.parquet"))
    os.environ["BASE_DIR"] = str(tmpdir)

    def _nb_modify(line):
        # Disable LocalCUDACluster
        line = line.replace("client.run(_rmm_pool)", "# client.run(_rmm_pool)")
        line = line.replace("if cluster is None:", "if False:")
        line = line.replace("client = Client(cluster)", "# client = Client(cluster)")
        line = line.replace(
            "workflow = nvt.Workflow(features, client=client)", "workflow = nvt.Workflow(features)"
        )
        line = line.replace("client", "# client")
        line = line.replace("NUM_GPUS = [0, 1, 2, 3, 4, 5, 6, 7]", "NUM_GPUS = [0]")
        line = line.replace("part_size = int(part_mem_frac * device_size)", "part_size = '128MB'")

        return line

    _run_notebook(
        tmpdir,
        os.path.join(
            dirname(TEST_PATH),
            "examples/scaling-criteo/",
            "02-ETL-with-NVTabular.ipynb",
        ),
        # disable rmm.reinitialize, seems to be causing issues
        transform=_nb_modify,
    )

    def _modify_tf_nb(line):
        return line.replace(
            # don't require graphviz/pydot
            "tf.keras.utils.plot_model(model)",
            "# tf.keras.utils.plot_model(model)",
        )
  _run_notebook(
        tmpdir,
        os.path.join(
            dirname(TEST_PATH),
            "examples/scaling-criteo/",
            "03-Training-with-TF.ipynb",
        ),
        transform=_modify_tf_nb,
    )

tests/unit/test_notebooks.py:80:


tests/unit/test_notebooks.py:223: in _run_notebook
subprocess.check_output([sys.executable, script_path])
/usr/lib/python3.8/subprocess.py:415: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,


input = None, capture_output = False, timeout = None, check = True
popenargs = (['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-13/test_criteo_tf_notebook0/notebook.py'],)
kwargs = {'stdout': -1}, process = <subprocess.Popen object at 0x7f04348a9190>
stdout = b'', stderr = None, retcode = 1

def run(*popenargs,
        input=None, capture_output=False, timeout=None, check=False, **kwargs):
    """Run command with arguments and return a CompletedProcess instance.

    The returned instance will have attributes args, returncode, stdout and
    stderr. By default, stdout and stderr are not captured, and those attributes
    will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

    If check is True and the exit code was non-zero, it raises a
    CalledProcessError. The CalledProcessError object will have the return code
    in the returncode attribute, and output & stderr attributes if those streams
    were captured.

    If timeout is given, and the process takes too long, a TimeoutExpired
    exception will be raised.

    There is an optional argument "input", allowing you to
    pass bytes or a string to the subprocess's stdin.  If you use this argument
    you may not also use the Popen constructor's "stdin" argument, as
    it will be used internally.

    By default, all communication is in bytes, and therefore any "input" should
    be bytes, and the stdout and stderr will be bytes. If in text mode, any
    "input" should be a string, and stdout and stderr will be strings decoded
    according to locale encoding, or by "encoding" if set. Text mode is
    triggered by setting any of text, encoding, errors or universal_newlines.

    The other arguments are the same as for the Popen constructor.
    """
    if input is not None:
        if kwargs.get('stdin') is not None:
            raise ValueError('stdin and input arguments may not both be used.')
        kwargs['stdin'] = PIPE

    if capture_output:
        if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
            raise ValueError('stdout and stderr arguments may not be used '
                             'with capture_output.')
        kwargs['stdout'] = PIPE
        kwargs['stderr'] = PIPE

    with Popen(*popenargs, **kwargs) as process:
        try:
            stdout, stderr = process.communicate(input, timeout=timeout)
        except TimeoutExpired as exc:
            process.kill()
            if _mswindows:
                # Windows accumulates the output in a single blocking
                # read() call run on child threads, with the timeout
                # being done in a join() on those threads.  communicate()
                # _after_ kill() is required to collect that and add it
                # to the exception.
                exc.stdout, exc.stderr = process.communicate()
            else:
                # POSIX _communicate already populated the output so
                # far into the TimeoutExpired exception.
                process.wait()
            raise
        except:  # Including KeyboardInterrupt, communicate handled that.
            process.kill()
            # We don't call process.wait() as .__exit__ does that for us.
            raise
        retcode = process.poll()
        if check and retcode:
          raise CalledProcessError(retcode, process.args,
                                     output=stdout, stderr=stderr)

E subprocess.CalledProcessError: Command '['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-13/test_criteo_tf_notebook0/notebook.py']' returned non-zero exit status 1.

/usr/lib/python3.8/subprocess.py:516: CalledProcessError
----------------------------- Captured stderr call -----------------------------
2022-10-28 22:18:41.699401: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-10-28 22:18:44.620713: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-10-28 22:18:44.621534: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:1 with 15149 MB memory: -> device: 1, name: Tesla P100-DGXS-16GB, pci bus id: 0000:08:00.0, compute capability: 6.0
2022-10-28 22:18:44.622169: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:2 with 15149 MB memory: -> device: 2, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0e:00.0, compute capability: 6.0
2022-10-28 22:18:44.622793: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:3 with 15149 MB memory: -> device: 3, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0f:00.0, compute capability: 6.0
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.12) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
Traceback (most recent call last):
File "/tmp/pytest-of-jenkins/pytest-13/test_criteo_tf_notebook0/notebook.py", line 105, in
history = model.fit(train_dataset_tf, callbacks=[validation_callback], epochs=EPOCHS)
File "/var/jenkins_home/.local/lib/python3.8/site-packages/keras/utils/traceback_utils.py", line 67, in error_handler
raise e.with_traceback(filtered_tb) from None
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/tensorflow.py", line 144, in getitem
return LoaderBase.next(self)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 243, in next
return self._get_next_batch()
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 278, in _get_next_batch
batch = next(self._batch_itr)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 402, in
return (self._handle_tensors(batch) for batch in batches)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/tensorflow.py", line 286, in _handle_tensors
to_return = super()._handle_tensors(tensors)
File "/usr/local/lib/python3.8/dist-packages/nvtx/nvtx.py", line 101, in inner
result = func(*args, **kwargs)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 534, in _handle_tensors
tensors = self._tensor_split(tensor, len(names), axis=1)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/tensorflow.py", line 180, in _tensor_split
return tf.split(tensor, idx, axis=axis)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Number of ways to split should evenly divide the split dimension, but got split_dim 1 (size = 27) and num_split 13 [Op:Split] name: split
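The InvalidArgumentError above is the immediate cause of the notebook failure: the loader asked tf.split to cut a width-27 tensor into 13 named columns, and tf.split with an integer count requires the axis size to be divisible by it. A standalone reproduction with assumed shapes:

import tensorflow as tf

x = tf.zeros((8, 27))
tf.split(x, 3, axis=1)       # fine: 27 divides into 3 pieces of width 9
try:
    tf.split(x, 13, axis=1)  # InvalidArgumentError, same message as the log
except tf.errors.InvalidArgumentError as err:
    print(err)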
____________________________ test_movielens_example ____________________________

tmpdir = local('/tmp/pytest-of-jenkins/pytest-13/test_movielens_example0')

def test_movielens_example(tmpdir):
    _get_random_movielens_data(tmpdir, 10000, dataset="movie")
    _get_random_movielens_data(tmpdir, 10000, dataset="ratings")
    _get_random_movielens_data(tmpdir, 5000, dataset="ratings", valid=True)

    triton_model_path = os.path.join(tmpdir, "models")
    os.environ["INPUT_DATA_DIR"] = str(tmpdir)
    os.environ["MODEL_PATH"] = triton_model_path

    notebook_path = os.path.join(
        dirname(TEST_PATH),
        "examples/getting-started-movielens/",
        "02-ETL-with-NVTabular.ipynb",
    )
    _run_notebook(tmpdir, notebook_path)

    def _modify_tf_nb(line):
        return line.replace(
            # don't require graphviz/pydot
            "tf.keras.utils.plot_model(model)",
            "# tf.keras.utils.plot_model(model)",
        )

    def _modify_tf_triton(line):
        # models are already preloaded
        line = line.replace("triton_client.load_model", "# triton_client.load_model")
        line = line.replace("triton_client.unload_model", "# triton_client.unload_model")
        return line

    notebooks = []
    try:
        import torch  # noqa

        notebooks.append("03-Training-with-PyTorch.ipynb")
    except Exception:
        pass
    try:
        import nvtabular.inference.triton  # noqa
        import nvtabular.loader.tensorflow  # noqa

        notebooks.append("03-Training-with-TF.ipynb")
        has_tf = True

    except Exception:
        has_tf = False

    for notebook in notebooks:
        notebook_path = os.path.join(
            dirname(TEST_PATH),
            "examples/getting-started-movielens/",
            notebook,
        )
        if notebook == "03-Training-with-TF.ipynb":
            _run_notebook(tmpdir, notebook_path, transform=_modify_tf_nb)
        else:
          _run_notebook(tmpdir, notebook_path)

tests/unit/test_notebooks.py:169:


tests/unit/test_notebooks.py:223: in _run_notebook
subprocess.check_output([sys.executable, script_path])
/usr/lib/python3.8/subprocess.py:415: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,


input = None, capture_output = False, timeout = None, check = True
popenargs = (['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-13/test_movielens_example0/notebook.py'],)
kwargs = {'stdout': -1}, process = <subprocess.Popen object at 0x7f03f0833520>
stdout = b'', stderr = None, retcode = 1

def run(*popenargs,
        input=None, capture_output=False, timeout=None, check=False, **kwargs):
    """Run command with arguments and return a CompletedProcess instance.

    The returned instance will have attributes args, returncode, stdout and
    stderr. By default, stdout and stderr are not captured, and those attributes
    will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

    If check is True and the exit code was non-zero, it raises a
    CalledProcessError. The CalledProcessError object will have the return code
    in the returncode attribute, and output & stderr attributes if those streams
    were captured.

    If timeout is given, and the process takes too long, a TimeoutExpired
    exception will be raised.

    There is an optional argument "input", allowing you to
    pass bytes or a string to the subprocess's stdin.  If you use this argument
    you may not also use the Popen constructor's "stdin" argument, as
    it will be used internally.

    By default, all communication is in bytes, and therefore any "input" should
    be bytes, and the stdout and stderr will be bytes. If in text mode, any
    "input" should be a string, and stdout and stderr will be strings decoded
    according to locale encoding, or by "encoding" if set. Text mode is
    triggered by setting any of text, encoding, errors or universal_newlines.

    The other arguments are the same as for the Popen constructor.
    """
    if input is not None:
        if kwargs.get('stdin') is not None:
            raise ValueError('stdin and input arguments may not both be used.')
        kwargs['stdin'] = PIPE

    if capture_output:
        if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
            raise ValueError('stdout and stderr arguments may not be used '
                             'with capture_output.')
        kwargs['stdout'] = PIPE
        kwargs['stderr'] = PIPE

    with Popen(*popenargs, **kwargs) as process:
        try:
            stdout, stderr = process.communicate(input, timeout=timeout)
        except TimeoutExpired as exc:
            process.kill()
            if _mswindows:
                # Windows accumulates the output in a single blocking
                # read() call run on child threads, with the timeout
                # being done in a join() on those threads.  communicate()
                # _after_ kill() is required to collect that and add it
                # to the exception.
                exc.stdout, exc.stderr = process.communicate()
            else:
                # POSIX _communicate already populated the output so
                # far into the TimeoutExpired exception.
                process.wait()
            raise
        except:  # Including KeyboardInterrupt, communicate handled that.
            process.kill()
            # We don't call process.wait() as .__exit__ does that for us.
            raise
        retcode = process.poll()
        if check and retcode:
          raise CalledProcessError(retcode, process.args,
                                     output=stdout, stderr=stderr)

E subprocess.CalledProcessError: Command '['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-13/test_movielens_example0/notebook.py']' returned non-zero exit status 1.

/usr/lib/python3.8/subprocess.py:516: CalledProcessError
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/tmp/pytest-of-jenkins/pytest-13/test_movielens_example0/notebook.py", line 115, in
batch = next(iter(train_loader))
File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/dataloader.py", line 681, in next
data = self._next_data()
File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/dataloader.py", line 721, in _next_data
data = self._dataset_fetcher.fetch(index) # may raise StopIteration
File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/_utils/fetch.py", line 39, in fetch
data = next(self.dataset_iter)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 243, in next
return self._get_next_batch()
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 278, in _get_next_batch
batch = next(self._batch_itr)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 402, in
return (self._handle_tensors(batch) for batch in batches)
File "/usr/local/lib/python3.8/dist-packages/nvtx/nvtx.py", line 101, in inner
result = func(*args, **kwargs)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 528, in _handle_tensors
names = self.dtype_reverse_map[np.dtype(dtype)] if dtype is not None else []
KeyError: dtype('float32')
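What the last two frames amount to: the loader keeps a dtype-to-column-names map so it can split each per-dtype concatenated tensor back into named features, and a dtype that was never registered (float32 here) makes that reverse lookup fail. A toy sketch of that bookkeeping, with assumed map contents and no Merlin imports:

import numpy as np

# assumed: only the int64 categorical columns were registered at setup time
dtype_reverse_map = {np.dtype("int64"): ["userId", "movieId", "genres"]}

batch_dtype = np.dtype("float32")        # e.g. a continuous or label column
names = dtype_reverse_map[batch_dtype]   # raises KeyError: dtype('float32')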
____________________ test_dense_embedding_layer[sum-concat] ____________________

aggregation = 'concat', combiner = 'sum'

@pytest.mark.parametrize("aggregation", ["stack", "concat"])
@pytest.mark.parametrize("combiner", ["sum", "mean"])  # TODO: add sqrtn
def test_dense_embedding_layer(aggregation, combiner):
    raw_good_columns = get_good_feature_columns()
    scalar_numeric, vector_numeric, one_hot, multi_hot = raw_good_columns
    one_hot_embedding = tf.feature_column.indicator_column(one_hot)
    multi_hot_embedding = tf.feature_column.embedding_column(multi_hot, 8, combiner=combiner)

    # should raise ValueError if passed categorical columns
    with pytest.raises(ValueError):
        embedding_layer = layers.DenseFeatures(raw_good_columns, aggregation=aggregation)

    if aggregation == "stack":
        # can't pass numeric to stack aggregation unless dims are 1
        with pytest.raises(ValueError):
            embedding_layer = layers.DenseFeatures(
                [
                    scalar_numeric,
                    vector_numeric,
                    one_hot_embedding,
                    multi_hot_embedding,
                ],
                aggregation=aggregation,
            )
        # can't have mismatched dims with stack aggregation
        with pytest.raises(ValueError):
            embedding_layer = layers.DenseFeatures(
                [one_hot_embedding, multi_hot_embedding], aggregation=aggregation
            )

        # reset b embedding to have matching dims
        multi_hot_embedding = tf.feature_column.embedding_column(multi_hot, 100, combiner=combiner)
        cols = [one_hot_embedding, multi_hot_embedding]
    else:
        cols = [scalar_numeric, vector_numeric, one_hot_embedding, multi_hot_embedding]

    embedding_layer = layers.DenseFeatures(cols, aggregation=aggregation)
    inputs = {
        "scalar_continuous": tf.keras.Input(name="scalar_continuous", shape=(1,), dtype=tf.float32),
        "vector_continuous": tf.keras.Input(
            name="vector_continuous__values", shape=(1,), dtype=tf.float32
        ),
        "one_hot": tf.keras.Input(name="one_hot", shape=(1,), dtype=tf.int64),
        "multi_hot": (
            tf.keras.Input(name="multi_hot__values", shape=(1,), dtype=tf.int64),
            tf.keras.Input(name="multi_hot__nnzs", shape=(1,), dtype=tf.int64),
        ),
    }
    if aggregation == "stack":
        inputs.pop("scalar_continuous")
        inputs.pop("vector_continuous")

    output = embedding_layer(inputs)
    model = tf.keras.Model(inputs=inputs, outputs=output)
    model.compile("sgd", "mse")

    # TODO: check for out-of-range categorical behavior
    scalar = np.array([0.1, -0.2, 0.3], dtype=np.float32)
    vector = np.random.randn(3, 128).astype("float32")
    one_hot = np.array([44, 21, 32])
    multi_hot_values = np.array([0, 2, 1, 4, 1, 3, 1])
    multi_hot_nnzs = np.array([1, 2, 4])
    x = {
        "scalar_continuous": scalar[:, None],
        "vector_continuous": vector.flatten()[:, None],
        "one_hot": one_hot[:, None],
        "multi_hot": (multi_hot_values[:, None], multi_hot_nnzs[:, None]),
    }
    if aggregation == "stack":
        x.pop("scalar_continuous")
        x.pop("vector_continuous")

    multi_hot_embedding_table = embedding_layer.embedding_tables["multi_hot"].numpy()
    multi_hot_embedding_rows = _compute_expected_multi_hot(
        multi_hot_embedding_table, multi_hot_values, multi_hot_nnzs, combiner
    )

    # check that shape and values match up
    y_hat = model(x).numpy()
    assert y_hat.shape[0] == 3
    if aggregation == "stack":
        assert len(y_hat.shape) == 3
        # len of columns is 2 because of mh (vals, nnzs) struct
        assert y_hat.shape[1] == (len(x))
        assert y_hat.shape[2] == 100
        np.testing.assert_allclose(y_hat[:, 0], multi_hot_embedding_rows, rtol=1e-03)
        y_c = y_hat[:, 1]

    elif aggregation == "concat":
        assert len(y_hat.shape) == 2
        assert y_hat.shape[1] == 1 + 100 + 8 + 128

        assert (y_hat[:, 108] == scalar).all()
        assert (y_hat[:, 109:] == vector).all()
      np.testing.assert_allclose(y_hat[:, :8], multi_hot_embedding_rows, rtol=1e-05)

E AssertionError:
E Not equal to tolerance rtol=1e-05, atol=0
E
E Mismatched elements: 1 / 24 (4.17%)
E Max absolute difference: 5.9604645e-08
E Max relative difference: 4.9838025e-05
E x: array([[ 1.602328e-01, -6.203187e-01, 8.058422e-02, -4.399309e-01,
E -3.360389e-01, -2.151353e-01, 2.805609e-01, -3.414805e-01],
E [-6.753068e-02, 6.907119e-01, -1.032804e-01, 1.146152e+00,...
E y: array([[ 1.602328e-01, -6.203187e-01, 8.058422e-02, -4.399309e-01,
E -3.360389e-01, -2.151353e-01, 2.805609e-01, -3.414805e-01],
E [-6.753068e-02, 6.907119e-01, -1.032804e-01, 1.146152e+00,...

tests/unit/framework_utils/test_tf_layers.py:148: AssertionError
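This mismatch is a tolerance issue rather than a logic bug: one element out of 24 differs by about 5e-05 relative, which exceeds the rtol=1e-05 used on the concat branch but sits well inside the rtol=1e-03 the same test already accepts for the stack branch. A minimal numeric illustration with representative values (not the actual embedding numbers):

import numpy as np

desired = 1.196e-3
actual = desired + 5.96e-8   # roughly the absolute and relative gap reported above

np.testing.assert_allclose(actual, desired, rtol=1e-03)      # passes
try:
    np.testing.assert_allclose(actual, desired, rtol=1e-05)  # fails, as in this run
except AssertionError as err:
    print(err)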
____________________________ test_horovod_multigpu _____________________________

tmpdir = local('/tmp/pytest-of-jenkins/pytest-13/test_horovod_multigpu0')

@pytest.mark.skipif(
    os.environ.get("NR_USER") is not None, reason="not working correctly in ci environment"
)
@pytest.mark.skipif(importlib.util.find_spec("horovod") is None, reason="needs horovod")
@pytest.mark.skipif(
    cupy and cupy.cuda.runtime.getDeviceCount() <= 1,
    reason="This unittest requires multiple gpu's to run",
)
def test_horovod_multigpu(tmpdir):
    json_sample = {
        "conts": {},
        "cats": {
            "genres": {
                "dtype": None,
                "cardinality": 50,
                "min_entry_size": 1,
                "max_entry_size": 5,
                "multi_min": 2,
                "multi_max": 4,
                "multi_avg": 3,
            },
            "movieId": {
                "dtype": None,
                "cardinality": 500,
                "min_entry_size": 1,
                "max_entry_size": 5,
            },
            "userId": {"dtype": None, "cardinality": 500, "min_entry_size": 1, "max_entry_size": 5},
        },
        "labels": {"rating": {"dtype": None, "cardinality": 2}},
    }
    cols = datagen._get_cols_from_schema(json_sample)
    df_gen = datagen.DatasetGen(datagen.UniformDistro(), gpu_frac=0.0001)
    target_path = os.path.join(tmpdir, "input/")
    os.mkdir(target_path)
    df_files = df_gen.full_df_create(10000, cols, output=target_path)
    # process them
    cat_features = nvt.ColumnSelector(["userId", "movieId", "genres"]) >> nvt.ops.Categorify()
    ratings = nvt.ColumnSelector(["rating"]) >> nvt.ops.LambdaOp(
        lambda col: (col > 3).astype("int8")
    )
    output = cat_features + ratings
    proc = nvt.Workflow(output)
    target_path_train = os.path.join(tmpdir, "train/")
    os.mkdir(target_path_train)
    proc.fit_transform(nvt.Dataset(df_files)).to_parquet(
        output_path=target_path_train, out_files_per_proc=5
    )
    # add new location
    target_path = os.path.join(tmpdir, "workflow/")
    os.mkdir(target_path)
    proc.save(target_path)
    curr_path = os.path.abspath(__file__)
    repo_root = os.path.relpath(os.path.normpath(os.path.join(curr_path, "../../../..")))
    hvd_wrap_path = os.path.join(repo_root, "examples/multi-gpu-movielens/hvd_wrapper.sh")
    hvd_exam_path = os.path.join(repo_root, "examples/multi-gpu-movielens/tf_trainer.py")
    with subprocess.Popen(
        [
            "horovodrun",
            "-np",
            "2",
            "-H",
            "localhost:2",
            "sh",
            hvd_wrap_path,
            "python",
            hvd_exam_path,
            "--dir_in",
            f"{tmpdir}",
            "--batch_size",
            "1024",
        ],
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
    ) as process:
        process.wait()
        stdout, stderr = process.communicate()
        print(stdout, stderr)
      assert "Loss:" in str(stdout)

E assert 'Loss:' in "b''"
E + where "b''" = str(b'')

tests/unit/loader/test_tf_dataloader.py:607: AssertionError
----------------------------- Captured stdout call -----------------------------
b'' b'Was unable to run mpirun --version:\nmpirun: Error: unknown option "--version"\n\nWas unable to run mpirun --version:\nmpirun: Error: unknown option "--version"\n\nWas unable to run mpirun --version:\nmpirun: Error: unknown option "--version"\n\nWas unable to run mpirun --version:\nmpirun: Error: unknown option "--version"\n\nTraceback (most recent call last):\n File "/usr/local/bin/horovodrun", line 8, in \n sys.exit(run_commandline())\n File "/var/jenkins_home/.local/lib/python3.8/site-packages/horovod/runner/launch.py", line 837, in run_commandline\n _run(args)\n File "/var/jenkins_home/.local/lib/python3.8/site-packages/horovod/runner/launch.py", line 827, in _run\n return _run_static(args)\n File "/var/jenkins_home/.local/lib/python3.8/site-packages/horovod/runner/launch.py", line 685, in _run_static\n _launch_job(args, settings, nics, command)\n File "/var/jenkins_home/.local/lib/python3.8/site-packages/horovod/runner/launch.py", line 800, in _launch_job\n run_controller(args.use_gloo, gloo_run_fn,\n File "/var/jenkins_home/.local/lib/python3.8/site-packages/horovod/runner/launch.py", line 774, in run_controller\n mpi_run()\n File "/var/jenkins_home/.local/lib/python3.8/site-packages/horovod/runner/launch.py", line 795, in mpi_run_fn\n mpi_run(settings, nics, env, command)\n File "/var/jenkins_home/.local/lib/python3.8/site-packages/horovod/runner/mpi_run.py", line 154, in mpi_run\n raise Exception(_MPI_NOT_FOUND_ERROR_MSG)\nException: horovod does not find an installed MPI.\n\nChoose one of:\n1. Install Open MPI 4.0.0+ or IBM Spectrum MPI or MPICH and re-install Horovod (use --no-cache-dir pip option).\n2. Run distributed training script using the standard way provided by your MPI distribution (usually mpirun, srun, or jsrun).\n3. Use built-in gloo option (horovodrun --gloo ...).\n'
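The stderr above lists the available workarounds itself; in a CI container without a usable MPI, horovod's built-in gloo controller is the least invasive. A sketch of the same launch the test performs, with the --gloo flag added (the flag comes from the error message; the wrapper, trainer, and data paths are placeholders for the ones the test builds from the repo root):

import subprocess

subprocess.run(
    [
        "horovodrun", "--gloo", "-np", "2", "-H", "localhost:2",
        "sh", "examples/multi-gpu-movielens/hvd_wrapper.sh",
        "python", "examples/multi-gpu-movielens/tf_trainer.py",
        "--dir_in", "/path/to/tmpdir", "--batch_size", "1024",
    ],
    check=True,
)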
=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33
/usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
DASK_VERSION = LooseVersion(dask.__version__)

.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: 34 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
other = LooseVersion(other)

tests/unit/test_dask_nvt.py: 6 warnings
tests/unit/workflow/test_workflow.py: 78 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/dask/base.py:1282: UserWarning: Running on a single-machine scheduler when a distributed client is active might lead to unexpected results.
warnings.warn(

tests/unit/test_dask_nvt.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 8 files.
warnings.warn(

tests/unit/test_dask_nvt.py::test_merlin_core_execution_managers
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/core/utils.py:431: UserWarning: Existing Dask-client object detected in the current context. New cuda cluster will not be deployed. Set force_new to True to ignore running clusters.
warnings.warn(

tests/unit/loader/test_tf_dataloader.py::test_horovod_multigpu
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 5 files.
warnings.warn(

tests/unit/loader/test_tf_dataloader.py: 2 warnings
tests/unit/loader/test_torch_dataloader.py: 12 warnings
tests/unit/workflow/test_workflow.py: 9 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 2 files.
warnings.warn(

tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops_schema.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py:148: UserWarning: Compound tags like Tags.USER_ID have been deprecated and will be removed in a future version. Please use the atomic versions of these tags, like [<Tags.USER: 'user'>, <Tags.ID: 'id'>].
warnings.warn(

tests/unit/ops/test_ops_schema.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py:148: UserWarning: Compound tags like Tags.ITEM_ID have been deprecated and will be removed in a future version. Please use the atomic versions of these tags, like [<Tags.ITEM: 'item'>, <Tags.ID: 'id'>].
warnings.warn(

tests/unit/workflow/test_cpu_workflow.py: 6 warnings
tests/unit/workflow/test_workflow.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 10 files.
warnings.warn(

tests/unit/workflow/test_workflow.py: 48 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 20 files.
warnings.warn(

tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-None]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 4 files.
warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name                                Stmts   Miss  Cover
-------------------------------------------------------
merlin/transforms/__init__.py           1      1     0%
merlin/transforms/ops/__init__.py       1      1     0%
-------------------------------------------------------
TOTAL                                   2      2     0%

=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:14: could not import 'moto': No module named 'moto'
===== 4 failed, 1432 passed, 1 skipped, 258 warnings in 1108.71s (0:18:28) =====
/usr/local/lib/python3.8/dist-packages/coverage/control.py:801: CoverageWarning: No data was collected. (no-data-collected)
self._warn("No data was collected.", slug="no-data-collected")
/usr/local/lib/python3.8/dist-packages/coverage/data.py:130: CoverageWarning: Data file '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.coverage.10.20.17.231.16212.717022' doesn't seem to be a coverage data file: cannot unpack non-iterable NoneType object
data._warn(str(exc))
/usr/local/lib/python3.8/dist-packages/coverage/data.py:130: CoverageWarning: Data file '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.coverage.10.20.17.231.16216.549596' doesn't seem to be a coverage data file: cannot unpack non-iterable NoneType object
data._warn(str(exc))
ERROR: InvocationError for command /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python -m pytest --cov-report term --cov merlin -rxs tests/unit (exited with code 1)
___________________________________ summary ____________________________________
ERROR: test-gpu: commands failed
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins7392462394996495619.sh

@nvidia-merlin-bot
Contributor

Click to view CI Results
GitHub pull request #1694 of commit 6c4510c978521f604c3384cbed06bb8123e65d11, no merge conflicts.
Running as SYSTEM
Setting status of 6c4510c978521f604c3384cbed06bb8123e65d11 to PENDING with url http://10.20.17.181:8080/job/nvtabular_tests/4768/ and message: 'Build started for merge commit.'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1694/*:refs/remotes/origin/pr/1694/* # timeout=10
 > git rev-parse 6c4510c978521f604c3384cbed06bb8123e65d11^{commit} # timeout=10
Checking out Revision 6c4510c978521f604c3384cbed06bb8123e65d11 (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6c4510c978521f604c3384cbed06bb8123e65d11 # timeout=10
Commit message: "fix"
 > git rev-list --no-walk c0c0029bf2e94fad4ef77b399b9acaa54e7ba44b # timeout=10
[nvtabular_tests] $ /bin/bash /tmp/jenkins13678725165529892833.sh
GLOB sdist-make: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/setup.py
test-gpu create: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
test-gpu installdeps: pytest, pytest-cov
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu inst: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/.tmp/package/1/nvtabular-1.6.0+7.g6c4510c97.zip
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu installed: absl-py==1.2.0,aiohttp==3.8.1,aiosignal==1.2.0,alabaster==0.7.12,alembic==1.8.1,anyio==3.6.1,argon2-cffi==21.3.0,argon2-cffi-bindings==21.2.0,astroid==2.5.6,asttokens==2.0.8,astunparse==1.6.3,asv==0.5.1,asvdb==0.4.2,async-timeout==4.0.2,attrs==22.1.0,autopage==0.5.1,awscli==1.26.3,Babel==2.10.3,backcall==0.2.0,beautifulsoup4==4.11.1,betterproto==1.2.5,black==22.6.0,bleach==5.0.1,boto3==1.24.75,botocore==1.28.3,Brotli==1.0.9,cachetools==5.2.0,certifi==2019.11.28,cffi==1.15.1,chardet==3.0.4,charset-normalizer==2.1.1,clang==5.0,click==8.1.3,cliff==4.0.0,cloudpickle==2.2.0,cmaes==0.8.2,cmake==3.24.1.1,cmd2==2.4.2,colorama==0.4.4,colorlog==6.7.0,contourpy==1.0.5,coverage==6.5.0,cuda-python==11.7.1,cupy-cuda117==10.6.0,cycler==0.11.0,Cython==0.29.32,dask==2022.1.1,dbus-python==1.2.16,debugpy==1.6.3,decorator==5.1.1,defusedxml==0.7.1,dill==0.3.5.1,distlib==0.3.6,distributed==2022.5.1,distro==1.7.0,dm-tree==0.1.6,docker-pycreds==0.4.0,docutils==0.16,emoji==1.7.0,entrypoints==0.4,execnet==1.9.0,executing==1.0.0,faiss==1.7.2,faiss-gpu==1.7.2,fastai==2.7.9,fastapi==0.85.0,fastavro==1.6.1,fastcore==1.5.27,fastdownload==0.0.7,fastjsonschema==2.16.1,fastprogress==1.0.3,fastrlock==0.8,feast==0.19.4,fiddle==0.2.2,filelock==3.8.0,flatbuffers==1.12,fonttools==4.37.3,frozenlist==1.3.1,fsspec==2022.5.0,gast==0.4.0,gevent==21.12.0,geventhttpclient==2.0.2,gitdb==4.0.9,GitPython==3.1.27,google==3.0.0,google-api-core==2.10.1,google-auth==2.11.1,google-auth-oauthlib==0.4.6,google-pasta==0.2.0,googleapis-common-protos==1.52.0,graphviz==0.20.1,greenlet==1.1.3,grpcio==1.41.0,grpcio-channelz==1.49.0,grpcio-reflection==1.48.1,grpclib==0.4.3,h11==0.13.0,h2==4.1.0,h5py==3.7.0,HeapDict==1.0.1,horovod==0.26.1,hpack==4.0.0,httptools==0.5.0,hugectr2onnx==0.0.0,huggingface-hub==0.9.1,hyperframe==6.0.1,idna==2.8,imagesize==1.4.1,implicit==0.6.1,importlib-metadata==4.12.0,importlib-resources==5.9.0,iniconfig==1.1.1,ipykernel==6.15.3,ipython==8.5.0,ipython-genutils==0.2.0,ipywidgets==7.7.0,jedi==0.18.1,Jinja2==3.1.2,jmespath==1.0.1,joblib==1.2.0,json5==0.9.10,jsonschema==4.16.0,jupyter-cache==0.4.3,jupyter-core==4.11.1,jupyter-server==1.18.1,jupyter-server-mathjax==0.2.5,jupyter-sphinx==0.3.2,jupyter_client==7.3.5,jupyterlab==3.4.7,jupyterlab-pygments==0.2.2,jupyterlab-widgets==1.1.0,jupyterlab_server==2.15.1,keras==2.9.0,Keras-Preprocessing==1.1.2,kiwisolver==1.4.4,lazy-object-proxy==1.8.0,libclang==14.0.6,libcst==0.4.7,lightfm==1.16,lightgbm==3.3.2,linkify-it-py==1.0.3,llvmlite==0.39.1,locket==1.0.0,lxml==4.9.1,Mako==1.2.3,Markdown==3.4.1,markdown-it-py==1.1.0,MarkupSafe==2.1.1,matplotlib==3.6.0,matplotlib-inline==0.1.6,mdit-py-plugins==0.2.8,merlin-core==0.6.0+1.g5926fcf,merlin-dataloader==0.0.2,merlin-models==0.7.0+11.g280956aa4,merlin-systems==0.5.0+4.g15074ad,mistune==2.0.4,mmh3==3.0.0,mpi4py==3.1.3,msgpack==1.0.4,multidict==6.0.2,mypy-extensions==0.4.3,myst-nb==0.13.2,myst-parser==0.15.2,natsort==8.1.0,nbclassic==0.4.3,nbclient==0.6.8,nbconvert==7.0.0,nbdime==3.1.1,nbformat==5.5.0,nest-asyncio==1.5.5,ninja==1.10.2.3,notebook==6.4.12,notebook-shim==0.1.0,numba==0.56.2,numpy==1.22.4,nvidia-pyindex==1.0.9,-e 
git+https://github.com/NVIDIA-Merlin/NVTabular.git@6c4510c978521f604c3384cbed06bb8123e65d11#egg=nvtabular,nvtx==0.2.5,oauthlib==3.2.1,oldest-supported-numpy==2022.8.16,onnx==1.12.0,onnxruntime==1.11.1,opt-einsum==3.3.0,optuna==3.0.3,packaging==21.3,pandas==1.3.5,pandavro==1.5.2,pandocfilters==1.5.0,parso==0.8.3,partd==1.3.0,pathtools==0.1.2,pbr==5.11.0,pexpect==4.8.0,pickleshare==0.7.5,Pillow==9.2.0,pkgutil_resolve_name==1.3.10,platformdirs==2.5.2,plotly==5.11.0,pluggy==1.0.0,prettytable==3.4.1,prometheus-client==0.14.1,promise==2.3,prompt-toolkit==3.0.31,proto-plus==1.19.6,protobuf==3.19.5,psutil==5.9.2,ptyprocess==0.7.0,pure-eval==0.2.2,py==1.11.0,pyarrow==7.0.0,pyasn1==0.4.8,pyasn1-modules==0.2.8,pybind11==2.10.0,pycparser==2.21,pydantic==1.10.2,pydot==1.4.2,Pygments==2.13.0,PyGObject==3.36.0,pynvml==11.4.1,pyparsing==3.0.9,pyperclip==1.8.2,pyrsistent==0.18.1,pytest==7.1.3,pytest-cov==4.0.0,pytest-xdist==3.0.2,python-apt==2.0.0+ubuntu0.20.4.8,python-dateutil==2.8.2,python-dotenv==0.21.0,python-rapidjson==1.8,pytz==2022.2.1,PyYAML==5.4.1,pyzmq==24.0.0,regex==2022.9.13,requests==2.22.0,requests-oauthlib==1.3.1,requests-unixsocket==0.2.0,rsa==4.7.2,s3fs==2022.2.0,s3transfer==0.6.0,sacremoses==0.0.53,scikit-build==0.15.0,scikit-learn==1.1.2,scipy==1.8.1,seedir==0.3.0,Send2Trash==1.8.0,sentry-sdk==1.9.8,setproctitle==1.3.2,setuptools-scm==7.0.5,shortuuid==1.0.9,six==1.15.0,sklearn==0.0,smmap==5.0.0,sniffio==1.3.0,snowballstemmer==2.2.0,sortedcontainers==2.4.0,soupsieve==2.3.2.post1,Sphinx==5.3.0,sphinx-multiversion==0.2.4,sphinx-togglebutton==0.3.1,sphinx_external_toc==0.3.0,sphinxcontrib-applehelp==1.0.2,sphinxcontrib-copydirs @ git+https://github.com/mikemckiernan/sphinxcontrib-copydirs.git@bd8c5d79b3f91cf5f1bb0d6995aeca3fe84b670e,sphinxcontrib-devhelp==1.0.2,sphinxcontrib-htmlhelp==2.0.0,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-qthelp==1.0.3,sphinxcontrib-serializinghtml==1.1.5,SQLAlchemy==1.4.42,stack-data==0.5.0,starlette==0.20.4,stevedore==4.1.0,stringcase==1.2.0,supervisor==4.1.0,tabulate==0.8.10,tblib==1.7.0,tdqm==0.0.1,tenacity==8.0.1,tensorboard==2.9.1,tensorboard-data-server==0.6.1,tensorboard-plugin-wit==1.8.1,tensorflow==2.9.2,tensorflow-estimator==2.9.0,tensorflow-gpu==2.9.2,tensorflow-io-gcs-filesystem==0.27.0,tensorflow-metadata==1.10.0,termcolor==2.0.1,terminado==0.15.0,testbook==0.4.2,threadpoolctl==3.1.0,tinycss2==1.1.1,tokenizers==0.10.3,toml==0.10.2,tomli==2.0.1,toolz==0.12.0,torch==1.12.1+cu113,torchmetrics==0.3.2,tornado==6.2,tox==3.26.0,tqdm==4.64.1,traitlets==5.4.0,transformers==4.12.0,transformers4rec==0.1.12+2.gbcc939255,treelite==2.3.0,treelite-runtime==2.3.0,tritonclient==2.25.0,typing-inspect==0.8.0,typing_extensions==4.3.0,uc-micro-py==1.0.1,urllib3==1.26.12,uvicorn==0.18.3,uvloop==0.17.0,versioneer==0.20,virtualenv==20.16.5,wandb==0.13.3,watchfiles==0.17.0,wcwidth==0.2.5,webencodings==0.5.1,websocket-client==1.4.1,websockets==10.3,Werkzeug==2.2.2,widgetsnbextension==3.6.0,wrapt==1.12.1,xgboost==1.6.2,yarl==1.8.1,zict==2.2.0,zipp==3.8.1,zope.event==4.5.0,zope.interface==5.4.0
test-gpu run-test-pre: PYTHONHASHSEED='2388154641'
test-gpu run-test: commands[0] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/core.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/core.git
  Cloning https://github.com/NVIDIA-Merlin/core.git to /tmp/pip-req-build-ni84o0kv
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/core.git /tmp/pip-req-build-ni84o0kv
  Resolved https://github.com/NVIDIA-Merlin/core.git to commit eda153c663aa864da66927c7a0a9d4e64c073120
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: dask>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.3.0)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (7.0.0)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (3.19.5)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (0.55.1)
Requirement already satisfied: fsspec==2022.5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.5.0)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (1.10.0)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (21.3)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (1.3.5)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (4.64.1)
Requirement already satisfied: distributed>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.3.0)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (1.2.5)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (0.4.3)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (5.4.1)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.2.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (0.12.0)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.0.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.0.4)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (6.1)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (3.1.2)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (5.8.0)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.7.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (8.1.3)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.4.0)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (0.38.1)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (1.20.3)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (65.4.1)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core==0.8.0+4.geda153c) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (2022.2.1)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.52.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.0.1)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (6.0.2)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (4.1.0)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (6.0.1)
Building wheels for collected packages: merlin-core
  Building wheel for merlin-core (pyproject.toml): started
  Building wheel for merlin-core (pyproject.toml): finished with status 'done'
  Created wheel for merlin-core: filename=merlin_core-0.8.0+4.geda153c-py3-none-any.whl size=118258 sha256=86f02f1473f5a17e94a3396361979ef5da86e67d4256aa95bd313ecc0c5682fe
  Stored in directory: /tmp/pip-ephem-wheel-cache-n2_evtxt/wheels/c8/38/16/a6968787eafcec5fa772148af8408b089562f71af0752e8e84
Successfully built merlin-core
Installing collected packages: merlin-core
  Attempting uninstall: merlin-core
    Found existing installation: merlin-core 0.3.0+12.g78ecddd
    Not uninstalling merlin-core at /var/jenkins_home/.local/lib/python3.8/site-packages, outside environment /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
    Can't uninstall 'merlin-core'. No files were found to uninstall.
Successfully installed merlin-core-0.8.0+4.geda153c

[notice] A new release of pip available: 22.2.2 -> 22.3
[notice] To update, run: pip install --upgrade pip
test-gpu run-test: commands[1] | python -m pytest --cov-report term --cov merlin -rxs tests/unit
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.3, pluggy-1.0.0
cachedir: .tox/test-gpu/.pytest_cache
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: anyio-3.5.0, xdist-3.0.2, cov-4.0.0
collected 1436 items / 1 skipped

tests/unit/test_dask_nvt.py ............................................ [ 3%]
........................................................................ [ 8%]
.... [ 8%]
tests/unit/test_notebooks.py F.F. [ 8%]
tests/unit/test_tf4rec.py . [ 8%]
tests/unit/test_tools.py ...................... [ 10%]
tests/unit/test_triton_inference.py ................................ [ 12%]
tests/unit/examples/test_01-Getting-started.py . [ 12%]
tests/unit/examples/test_02-Advanced-NVTabular-workflow.py . [ 12%]
tests/unit/examples/test_03-Running-on-multiple-GPUs-or-on-CPU.py . [ 12%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 12%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 14%]
................................................... [ 18%]
tests/unit/framework_utils/test_torch_layers.py . [ 18%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 20%]
........................................s.. [ 23%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 25%]
..................................................... [ 29%]
tests/unit/ops/test_categorify.py ...................................... [ 31%]
........................................................................ [ 36%]
..................................................... [ 40%]
tests/unit/ops/test_column_similarity.py ........................ [ 42%]
tests/unit/ops/test_drop_low_cardinality.py .. [ 42%]
tests/unit/ops/test_fill.py ............................................ [ 45%]
........ [ 45%]
tests/unit/ops/test_groupyby.py ....................... [ 47%]
tests/unit/ops/test_hash_bucket.py ......................... [ 49%]
tests/unit/ops/test_join.py ............................................ [ 52%]
........................................................................ [ 57%]
.................................. [ 59%]
tests/unit/ops/test_lambda.py .......... [ 60%]
tests/unit/ops/test_normalize.py ....................................... [ 63%]
.. [ 63%]
tests/unit/ops/test_ops.py ............................................. [ 66%]
.................... [ 67%]
tests/unit/ops/test_ops_schema.py ...................................... [ 70%]
........................................................................ [ 75%]
........................................................................ [ 80%]
........................................................................ [ 85%]
....................................... [ 88%]
tests/unit/ops/test_reduce_dtype_size.py .. [ 88%]
tests/unit/ops/test_target_encode.py ..................... [ 89%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 90%]
tests/unit/workflow/test_workflow.py ................................... [ 92%]
.......................................................... [ 96%]
tests/unit/workflow/test_workflow_chaining.py ... [ 96%]
tests/unit/workflow/test_workflow_node.py ........... [ 97%]
tests/unit/workflow/test_workflow_ops.py ... [ 97%]
tests/unit/workflow/test_workflow_schemas.py ........................... [ 99%]
... [100%]

=================================== FAILURES ===================================
___________________________ test_criteo_tf_notebook ____________________________

tmpdir = local('/tmp/pytest-of-jenkins/pytest-14/test_criteo_tf_notebook0')

def test_criteo_tf_notebook(tmpdir):
    tor = pytest.importorskip("tensorflow")  # noqa
    # create a toy dataset in tmpdir, and point environment variables so the notebook
    # will read from it
    os.system("mkdir -p " + os.path.join(tmpdir, "converted/criteo"))
    for i in range(24):
        df = _get_random_criteo_data(1000)
        df.to_parquet(os.path.join(tmpdir, "converted/criteo", f"day_{i}.parquet"))
    os.environ["BASE_DIR"] = str(tmpdir)

    def _nb_modify(line):
        # Disable LocalCUDACluster
        line = line.replace("client.run(_rmm_pool)", "# client.run(_rmm_pool)")
        line = line.replace("if cluster is None:", "if False:")
        line = line.replace("client = Client(cluster)", "# client = Client(cluster)")
        line = line.replace(
            "workflow = nvt.Workflow(features, client=client)", "workflow = nvt.Workflow(features)"
        )
        line = line.replace("client", "# client")
        line = line.replace("NUM_GPUS = [0, 1, 2, 3, 4, 5, 6, 7]", "NUM_GPUS = [0]")
        line = line.replace("part_size = int(part_mem_frac * device_size)", "part_size = '128MB'")

        return line

    _run_notebook(
        tmpdir,
        os.path.join(
            dirname(TEST_PATH),
            "examples/scaling-criteo/",
            "02-ETL-with-NVTabular.ipynb",
        ),
        # disable rmm.reinitialize, seems to be causing issues
        transform=_nb_modify,
    )

    def _modify_tf_nb(line):
        return line.replace(
            # don't require graphviz/pydot
            "tf.keras.utils.plot_model(model)",
            "# tf.keras.utils.plot_model(model)",
        )
  _run_notebook(
        tmpdir,
        os.path.join(
            dirname(TEST_PATH),
            "examples/scaling-criteo/",
            "03-Training-with-TF.ipynb",
        ),
        transform=_modify_tf_nb,
    )

tests/unit/test_notebooks.py:80:


tests/unit/test_notebooks.py:223: in _run_notebook
subprocess.check_output([sys.executable, script_path])
/usr/lib/python3.8/subprocess.py:415: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,


input = None, capture_output = False, timeout = None, check = True
popenargs = (['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-14/test_criteo_tf_notebook0/notebook.py'],)
kwargs = {'stdout': -1}, process = <subprocess.Popen object at 0x7efc3447ddc0>
stdout = b'', stderr = None, retcode = 1

def run(*popenargs,
        input=None, capture_output=False, timeout=None, check=False, **kwargs):
    """Run command with arguments and return a CompletedProcess instance.

    The returned instance will have attributes args, returncode, stdout and
    stderr. By default, stdout and stderr are not captured, and those attributes
    will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

    If check is True and the exit code was non-zero, it raises a
    CalledProcessError. The CalledProcessError object will have the return code
    in the returncode attribute, and output & stderr attributes if those streams
    were captured.

    If timeout is given, and the process takes too long, a TimeoutExpired
    exception will be raised.

    There is an optional argument "input", allowing you to
    pass bytes or a string to the subprocess's stdin.  If you use this argument
    you may not also use the Popen constructor's "stdin" argument, as
    it will be used internally.

    By default, all communication is in bytes, and therefore any "input" should
    be bytes, and the stdout and stderr will be bytes. If in text mode, any
    "input" should be a string, and stdout and stderr will be strings decoded
    according to locale encoding, or by "encoding" if set. Text mode is
    triggered by setting any of text, encoding, errors or universal_newlines.

    The other arguments are the same as for the Popen constructor.
    """
    if input is not None:
        if kwargs.get('stdin') is not None:
            raise ValueError('stdin and input arguments may not both be used.')
        kwargs['stdin'] = PIPE

    if capture_output:
        if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
            raise ValueError('stdout and stderr arguments may not be used '
                             'with capture_output.')
        kwargs['stdout'] = PIPE
        kwargs['stderr'] = PIPE

    with Popen(*popenargs, **kwargs) as process:
        try:
            stdout, stderr = process.communicate(input, timeout=timeout)
        except TimeoutExpired as exc:
            process.kill()
            if _mswindows:
                # Windows accumulates the output in a single blocking
                # read() call run on child threads, with the timeout
                # being done in a join() on those threads.  communicate()
                # _after_ kill() is required to collect that and add it
                # to the exception.
                exc.stdout, exc.stderr = process.communicate()
            else:
                # POSIX _communicate already populated the output so
                # far into the TimeoutExpired exception.
                process.wait()
            raise
        except:  # Including KeyboardInterrupt, communicate handled that.
            process.kill()
            # We don't call process.wait() as .__exit__ does that for us.
            raise
        retcode = process.poll()
        if check and retcode:
          raise CalledProcessError(retcode, process.args,
                                     output=stdout, stderr=stderr)

E subprocess.CalledProcessError: Command '['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-14/test_criteo_tf_notebook0/notebook.py']' returned non-zero exit status 1.

/usr/lib/python3.8/subprocess.py:516: CalledProcessError
----------------------------- Captured stderr call -----------------------------
2022-10-28 22:50:17.405229: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-10-28 22:50:20.370093: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-10-28 22:50:20.370877: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:1 with 15149 MB memory: -> device: 1, name: Tesla P100-DGXS-16GB, pci bus id: 0000:08:00.0, compute capability: 6.0
2022-10-28 22:50:20.371495: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:2 with 15149 MB memory: -> device: 2, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0e:00.0, compute capability: 6.0
2022-10-28 22:50:20.372113: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:3 with 15149 MB memory: -> device: 3, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0f:00.0, compute capability: 6.0
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.12) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
Traceback (most recent call last):
File "/tmp/pytest-of-jenkins/pytest-14/test_criteo_tf_notebook0/notebook.py", line 105, in
history = model.fit(train_dataset_tf, callbacks=[validation_callback], epochs=EPOCHS)
File "/var/jenkins_home/.local/lib/python3.8/site-packages/keras/utils/traceback_utils.py", line 67, in error_handler
raise e.with_traceback(filtered_tb) from None
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/tensorflow.py", line 144, in getitem
return LoaderBase.next(self)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 243, in next
return self._get_next_batch()
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 278, in _get_next_batch
batch = next(self._batch_itr)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 402, in
return (self._handle_tensors(batch) for batch in batches)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/tensorflow.py", line 286, in _handle_tensors
to_return = super()._handle_tensors(tensors)
File "/usr/local/lib/python3.8/dist-packages/nvtx/nvtx.py", line 101, in inner
result = func(*args, **kwargs)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 534, in _handle_tensors
tensors = self._tensor_split(tensor, len(names), axis=1)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/tensorflow.py", line 180, in _tensor_split
return tf.split(tensor, idx, axis=axis)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Number of ways to split should evenly divide the split dimension, but got split_dim 1 (size = 27) and num_split 13 [Op:Split] name: split
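For context, the InvalidArgumentError above is the standard tf.split failure mode: an integer split count must evenly divide the size of the split axis. A minimal sketch reproducing it outside the dataloader (shapes are illustrative, not taken from the notebook):

    import tensorflow as tf

    x = tf.zeros((2, 27))      # axis 1 has size 27, matching the logged split_dim size
    tf.split(x, 9, axis=1)     # works: 27 is divisible by 9
    tf.split(x, 13, axis=1)    # raises InvalidArgumentError: 13 does not divide 27

In the failing notebook the loader produces a tensor whose axis-1 size is 27 but asks tf.split for 13 pieces, which trips the same check.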
____________________________ test_movielens_example ____________________________

tmpdir = local('/tmp/pytest-of-jenkins/pytest-14/test_movielens_example0')

def test_movielens_example(tmpdir):
    _get_random_movielens_data(tmpdir, 10000, dataset="movie")
    _get_random_movielens_data(tmpdir, 10000, dataset="ratings")
    _get_random_movielens_data(tmpdir, 5000, dataset="ratings", valid=True)

    triton_model_path = os.path.join(tmpdir, "models")
    os.environ["INPUT_DATA_DIR"] = str(tmpdir)
    os.environ["MODEL_PATH"] = triton_model_path

    notebook_path = os.path.join(
        dirname(TEST_PATH),
        "examples/getting-started-movielens/",
        "02-ETL-with-NVTabular.ipynb",
    )
    _run_notebook(tmpdir, notebook_path)

    def _modify_tf_nb(line):
        return line.replace(
            # don't require graphviz/pydot
            "tf.keras.utils.plot_model(model)",
            "# tf.keras.utils.plot_model(model)",
        )

    def _modify_tf_triton(line):
        # models are already preloaded
        line = line.replace("triton_client.load_model", "# triton_client.load_model")
        line = line.replace("triton_client.unload_model", "# triton_client.unload_model")
        return line

    notebooks = []
    try:
        import torch  # noqa

        notebooks.append("03-Training-with-PyTorch.ipynb")
    except Exception:
        pass
    try:
        import nvtabular.inference.triton  # noqa
        import nvtabular.loader.tensorflow  # noqa

        notebooks.append("03-Training-with-TF.ipynb")
        has_tf = True

    except Exception:
        has_tf = False

    for notebook in notebooks:
        notebook_path = os.path.join(
            dirname(TEST_PATH),
            "examples/getting-started-movielens/",
            notebook,
        )
        if notebook == "03-Training-with-TF.ipynb":
            _run_notebook(tmpdir, notebook_path, transform=_modify_tf_nb)
        else:
          _run_notebook(tmpdir, notebook_path)

tests/unit/test_notebooks.py:169:


tests/unit/test_notebooks.py:223: in _run_notebook
subprocess.check_output([sys.executable, script_path])
/usr/lib/python3.8/subprocess.py:415: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,


input = None, capture_output = False, timeout = None, check = True
popenargs = (['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-14/test_movielens_example0/notebook.py'],)
kwargs = {'stdout': -1}, process = <subprocess.Popen object at 0x7efbefbde520>
stdout = b'', stderr = None, retcode = 1

def run(*popenargs,
        input=None, capture_output=False, timeout=None, check=False, **kwargs):
    """Run command with arguments and return a CompletedProcess instance.

    The returned instance will have attributes args, returncode, stdout and
    stderr. By default, stdout and stderr are not captured, and those attributes
    will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

    If check is True and the exit code was non-zero, it raises a
    CalledProcessError. The CalledProcessError object will have the return code
    in the returncode attribute, and output & stderr attributes if those streams
    were captured.

    If timeout is given, and the process takes too long, a TimeoutExpired
    exception will be raised.

    There is an optional argument "input", allowing you to
    pass bytes or a string to the subprocess's stdin.  If you use this argument
    you may not also use the Popen constructor's "stdin" argument, as
    it will be used internally.

    By default, all communication is in bytes, and therefore any "input" should
    be bytes, and the stdout and stderr will be bytes. If in text mode, any
    "input" should be a string, and stdout and stderr will be strings decoded
    according to locale encoding, or by "encoding" if set. Text mode is
    triggered by setting any of text, encoding, errors or universal_newlines.

    The other arguments are the same as for the Popen constructor.
    """
    if input is not None:
        if kwargs.get('stdin') is not None:
            raise ValueError('stdin and input arguments may not both be used.')
        kwargs['stdin'] = PIPE

    if capture_output:
        if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
            raise ValueError('stdout and stderr arguments may not be used '
                             'with capture_output.')
        kwargs['stdout'] = PIPE
        kwargs['stderr'] = PIPE

    with Popen(*popenargs, **kwargs) as process:
        try:
            stdout, stderr = process.communicate(input, timeout=timeout)
        except TimeoutExpired as exc:
            process.kill()
            if _mswindows:
                # Windows accumulates the output in a single blocking
                # read() call run on child threads, with the timeout
                # being done in a join() on those threads.  communicate()
                # _after_ kill() is required to collect that and add it
                # to the exception.
                exc.stdout, exc.stderr = process.communicate()
            else:
                # POSIX _communicate already populated the output so
                # far into the TimeoutExpired exception.
                process.wait()
            raise
        except:  # Including KeyboardInterrupt, communicate handled that.
            process.kill()
            # We don't call process.wait() as .__exit__ does that for us.
            raise
        retcode = process.poll()
        if check and retcode:
          raise CalledProcessError(retcode, process.args,
                                     output=stdout, stderr=stderr)

E subprocess.CalledProcessError: Command '['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-14/test_movielens_example0/notebook.py']' returned non-zero exit status 1.

/usr/lib/python3.8/subprocess.py:516: CalledProcessError
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/tmp/pytest-of-jenkins/pytest-14/test_movielens_example0/notebook.py", line 115, in
batch = next(iter(train_loader))
File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/dataloader.py", line 681, in next
data = self._next_data()
File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/dataloader.py", line 721, in _next_data
data = self._dataset_fetcher.fetch(index) # may raise StopIteration
File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/_utils/fetch.py", line 39, in fetch
data = next(self.dataset_iter)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 243, in next
return self._get_next_batch()
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 278, in _get_next_batch
batch = next(self._batch_itr)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 402, in
return (self._handle_tensors(batch) for batch in batches)
File "/usr/local/lib/python3.8/dist-packages/nvtx/nvtx.py", line 101, in inner
result = func(*args, **kwargs)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 528, in _handle_tensors
names = self.dtype_reverse_map[np.dtype(dtype)] if dtype is not None else []
KeyError: dtype('float32')
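The KeyError above is a plain dictionary lookup keyed by numpy dtypes: the loader's dtype_reverse_map has no float32 entry for this dataset, so indexing it fails. A small sketch of the failure pattern (the map contents below are hypothetical; only the lookup mirrors loader_base.py):

    import numpy as np

    # Hypothetical contents -- the real map is built from the dataset's schema.
    dtype_reverse_map = {np.dtype("int64"): ["userId", "movieId"]}

    key = np.dtype("float32")
    print(key in dtype_reverse_map)   # False
    dtype_reverse_map[key]            # KeyError: dtype('float32'), as in the traceback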
=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33
/usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
DASK_VERSION = LooseVersion(dask.__version__)

.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: 34 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
other = LooseVersion(other)

tests/unit/test_dask_nvt.py: 6 warnings
tests/unit/workflow/test_workflow.py: 78 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/dask/base.py:1282: UserWarning: Running on a single-machine scheduler when a distributed client is active might lead to unexpected results.
warnings.warn(

tests/unit/test_dask_nvt.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 8 files.
warnings.warn(

tests/unit/test_dask_nvt.py::test_merlin_core_execution_managers
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/core/utils.py:431: UserWarning: Existing Dask-client object detected in the current context. New cuda cluster will not be deployed. Set force_new to True to ignore running clusters.
warnings.warn(

tests/unit/loader/test_tf_dataloader.py::test_horovod_multigpu
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 5 files.
warnings.warn(

tests/unit/loader/test_tf_dataloader.py: 2 warnings
tests/unit/loader/test_torch_dataloader.py: 12 warnings
tests/unit/workflow/test_workflow.py: 9 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 2 files.
warnings.warn(

tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops_schema.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py:148: UserWarning: Compound tags like Tags.USER_ID have been deprecated and will be removed in a future version. Please use the atomic versions of these tags, like [<Tags.USER: 'user'>, <Tags.ID: 'id'>].
warnings.warn(

tests/unit/ops/test_ops_schema.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py:148: UserWarning: Compound tags like Tags.ITEM_ID have been deprecated and will be removed in a future version. Please use the atomic versions of these tags, like [<Tags.ITEM: 'item'>, <Tags.ID: 'id'>].
warnings.warn(

tests/unit/workflow/test_cpu_workflow.py: 6 warnings
tests/unit/workflow/test_workflow.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 10 files.
warnings.warn(

tests/unit/workflow/test_workflow.py: 48 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 20 files.
warnings.warn(

tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-None]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 4 files.
warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Cover

merlin/transforms/__init__.py 1 1 0%
merlin/transforms/ops/__init__.py 1 1 0%

TOTAL 2 2 0%

=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:14: could not import 'moto': No module named 'moto'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:609: Skipping test because of horovod missing MPI
===== 2 failed, 1433 passed, 2 skipped, 258 warnings in 1130.66s (0:18:50) =====
/usr/local/lib/python3.8/dist-packages/coverage/control.py:801: CoverageWarning: No data was collected. (no-data-collected)
self._warn("No data was collected.", slug="no-data-collected")
/usr/local/lib/python3.8/dist-packages/coverage/data.py:130: CoverageWarning: Data file '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.coverage.10.20.17.231.23101.227914' doesn't seem to be a coverage data file: cannot unpack non-iterable NoneType object
data._warn(str(exc))
ERROR: InvocationError for command /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python -m pytest --cov-report term --cov merlin -rxs tests/unit (exited with code 1)
___________________________________ summary ____________________________________
ERROR: test-gpu: commands failed
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins8212195491524927905.sh

@benfred
Member Author

benfred commented Nov 1, 2022

rerun tests

@nvidia-merlin-bot
Contributor

Click to view CI Results
GitHub pull request #1694 of commit 6c4510c978521f604c3384cbed06bb8123e65d11, no merge conflicts.
Running as SYSTEM
Setting status of 6c4510c978521f604c3384cbed06bb8123e65d11 to PENDING with url http://10.20.17.181:8080/job/nvtabular_tests/4777/ and message: 'Build started for merge commit.'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1694/*:refs/remotes/origin/pr/1694/* # timeout=10
 > git rev-parse 6c4510c978521f604c3384cbed06bb8123e65d11^{commit} # timeout=10
Checking out Revision 6c4510c978521f604c3384cbed06bb8123e65d11 (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6c4510c978521f604c3384cbed06bb8123e65d11 # timeout=10
Commit message: "fix"
 > git rev-list --no-walk 69daa93cfc04db06f8c3d9b0ae5393448442c560 # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins13685151105850663206.sh
GLOB sdist-make: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/setup.py
test-gpu create: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
test-gpu installdeps: pytest, pytest-cov
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu inst: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/.tmp/package/1/nvtabular-1.6.0+7.g6c4510c97.zip
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu installed: absl-py==1.2.0,aiohttp==3.8.1,aiosignal==1.2.0,alabaster==0.7.12,alembic==1.8.1,anyio==3.6.1,argon2-cffi==21.3.0,argon2-cffi-bindings==21.2.0,astroid==2.5.6,asttokens==2.0.8,astunparse==1.6.3,asv==0.5.1,asvdb==0.4.2,async-timeout==4.0.2,attrs==22.1.0,autopage==0.5.1,awscli==1.26.5,Babel==2.10.3,backcall==0.2.0,beautifulsoup4==4.11.1,betterproto==1.2.5,black==22.6.0,bleach==5.0.1,boto3==1.24.75,botocore==1.28.5,Brotli==1.0.9,cachetools==5.2.0,certifi==2019.11.28,cffi==1.15.1,chardet==3.0.4,charset-normalizer==2.1.1,clang==5.0,click==8.1.3,cliff==4.0.0,cloudpickle==2.2.0,cmaes==0.8.2,cmake==3.24.1.1,cmd2==2.4.2,colorama==0.4.4,colorlog==6.7.0,contourpy==1.0.5,coverage==6.5.0,cuda-python==11.7.1,cupy-cuda117==10.6.0,cycler==0.11.0,Cython==0.29.32,dask==2022.1.1,dbus-python==1.2.16,debugpy==1.6.3,decorator==5.1.1,defusedxml==0.7.1,dill==0.3.5.1,distlib==0.3.6,distributed==2022.5.1,distro==1.7.0,dm-tree==0.1.6,docker-pycreds==0.4.0,docutils==0.16,emoji==1.7.0,entrypoints==0.4,execnet==1.9.0,executing==1.0.0,faiss==1.7.2,faiss-gpu==1.7.2,fastai==2.7.9,fastapi==0.85.0,fastavro==1.6.1,fastcore==1.5.27,fastdownload==0.0.7,fastjsonschema==2.16.1,fastprogress==1.0.3,fastrlock==0.8,feast==0.19.4,fiddle==0.2.2,filelock==3.8.0,flatbuffers==1.12,fonttools==4.37.3,frozenlist==1.3.1,fsspec==2022.5.0,gast==0.4.0,gevent==21.12.0,geventhttpclient==2.0.2,gitdb==4.0.9,GitPython==3.1.27,google==3.0.0,google-api-core==2.10.1,google-auth==2.11.1,google-auth-oauthlib==0.4.6,google-pasta==0.2.0,googleapis-common-protos==1.52.0,graphviz==0.20.1,greenlet==1.1.3,grpcio==1.41.0,grpcio-channelz==1.49.0,grpcio-reflection==1.48.1,grpclib==0.4.3,h11==0.13.0,h2==4.1.0,h5py==3.7.0,HeapDict==1.0.1,horovod==0.26.1,hpack==4.0.0,httptools==0.5.0,hugectr2onnx==0.0.0,huggingface-hub==0.9.1,hyperframe==6.0.1,idna==2.8,imagesize==1.4.1,implicit==0.6.1,importlib-metadata==4.12.0,importlib-resources==5.9.0,iniconfig==1.1.1,ipykernel==6.15.3,ipython==8.5.0,ipython-genutils==0.2.0,ipywidgets==7.7.0,jedi==0.18.1,Jinja2==3.1.2,jmespath==1.0.1,joblib==1.2.0,json5==0.9.10,jsonschema==4.16.0,jupyter-cache==0.4.3,jupyter-core==4.11.1,jupyter-server==1.18.1,jupyter-server-mathjax==0.2.5,jupyter-sphinx==0.3.2,jupyter_client==7.3.5,jupyterlab==3.4.7,jupyterlab-pygments==0.2.2,jupyterlab-widgets==1.1.0,jupyterlab_server==2.15.1,keras==2.9.0,Keras-Preprocessing==1.1.2,kiwisolver==1.4.4,lazy-object-proxy==1.8.0,libclang==14.0.6,libcst==0.4.7,lightfm==1.16,lightgbm==3.3.2,linkify-it-py==1.0.3,llvmlite==0.39.1,locket==1.0.0,lxml==4.9.1,Mako==1.2.3,Markdown==3.4.1,markdown-it-py==1.1.0,MarkupSafe==2.1.1,matplotlib==3.6.0,matplotlib-inline==0.1.6,mdit-py-plugins==0.2.8,merlin-core==0.6.0+1.g5926fcf,merlin-dataloader==0.0.2,merlin-models==0.7.0+11.g280956aa4,merlin-systems==0.5.0+4.g15074ad,mistune==2.0.4,mmh3==3.0.0,mpi4py==3.1.3,msgpack==1.0.4,multidict==6.0.2,mypy-extensions==0.4.3,myst-nb==0.13.2,myst-parser==0.15.2,natsort==8.1.0,nbclassic==0.4.3,nbclient==0.6.8,nbconvert==7.0.0,nbdime==3.1.1,nbformat==5.5.0,nest-asyncio==1.5.5,ninja==1.10.2.3,notebook==6.4.12,notebook-shim==0.1.0,numba==0.56.2,numpy==1.22.4,nvidia-pyindex==1.0.9,-e 
git+https://github.com/NVIDIA-Merlin/NVTabular.git@6c4510c978521f604c3384cbed06bb8123e65d11#egg=nvtabular,nvtx==0.2.5,oauthlib==3.2.1,oldest-supported-numpy==2022.8.16,onnx==1.12.0,onnxruntime==1.11.1,opt-einsum==3.3.0,optuna==3.0.3,packaging==21.3,pandas==1.3.5,pandavro==1.5.2,pandocfilters==1.5.0,parso==0.8.3,partd==1.3.0,pathtools==0.1.2,pbr==5.11.0,pexpect==4.8.0,pickleshare==0.7.5,Pillow==9.2.0,pkgutil_resolve_name==1.3.10,platformdirs==2.5.2,plotly==5.11.0,pluggy==1.0.0,prettytable==3.5.0,prometheus-client==0.14.1,promise==2.3,prompt-toolkit==3.0.31,proto-plus==1.19.6,protobuf==3.19.5,psutil==5.9.2,ptyprocess==0.7.0,pure-eval==0.2.2,py==1.11.0,pyarrow==7.0.0,pyasn1==0.4.8,pyasn1-modules==0.2.8,pybind11==2.10.0,pycparser==2.21,pydantic==1.10.2,pydot==1.4.2,Pygments==2.13.0,PyGObject==3.36.0,pynvml==11.4.1,pyparsing==3.0.9,pyperclip==1.8.2,pyrsistent==0.18.1,pytest==7.1.3,pytest-cov==4.0.0,pytest-xdist==3.0.2,python-apt==2.0.0+ubuntu0.20.4.8,python-dateutil==2.8.2,python-dotenv==0.21.0,python-rapidjson==1.8,pytz==2022.2.1,PyYAML==5.4.1,pyzmq==24.0.0,regex==2022.9.13,requests==2.22.0,requests-oauthlib==1.3.1,requests-unixsocket==0.2.0,rsa==4.7.2,s3fs==2022.2.0,s3transfer==0.6.0,sacremoses==0.0.53,scikit-build==0.15.0,scikit-learn==1.1.2,scipy==1.8.1,seedir==0.3.0,Send2Trash==1.8.0,sentry-sdk==1.9.8,setproctitle==1.3.2,setuptools-scm==7.0.5,shortuuid==1.0.9,six==1.15.0,sklearn==0.0,smmap==5.0.0,sniffio==1.3.0,snowballstemmer==2.2.0,sortedcontainers==2.4.0,soupsieve==2.3.2.post1,Sphinx==5.3.0,sphinx-multiversion==0.2.4,sphinx-togglebutton==0.3.1,sphinx_external_toc==0.3.0,sphinxcontrib-applehelp==1.0.2,sphinxcontrib-copydirs @ git+https://github.com/mikemckiernan/sphinxcontrib-copydirs.git@bd8c5d79b3f91cf5f1bb0d6995aeca3fe84b670e,sphinxcontrib-devhelp==1.0.2,sphinxcontrib-htmlhelp==2.0.0,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-qthelp==1.0.3,sphinxcontrib-serializinghtml==1.1.5,SQLAlchemy==1.4.42,stack-data==0.5.0,starlette==0.20.4,stevedore==4.1.0,stringcase==1.2.0,supervisor==4.1.0,tabulate==0.8.10,tblib==1.7.0,tdqm==0.0.1,tenacity==8.0.1,tensorboard==2.9.1,tensorboard-data-server==0.6.1,tensorboard-plugin-wit==1.8.1,tensorflow==2.9.2,tensorflow-estimator==2.9.0,tensorflow-gpu==2.9.2,tensorflow-io-gcs-filesystem==0.27.0,tensorflow-metadata==1.10.0,termcolor==2.0.1,terminado==0.15.0,testbook==0.4.2,threadpoolctl==3.1.0,tinycss2==1.1.1,tokenizers==0.10.3,toml==0.10.2,tomli==2.0.1,toolz==0.12.0,torch==1.12.1+cu113,torchmetrics==0.3.2,tornado==6.2,tox==3.26.0,tqdm==4.64.1,traitlets==5.4.0,transformers==4.12.0,transformers4rec==0.1.12+2.gbcc939255,treelite==2.3.0,treelite-runtime==2.3.0,tritonclient==2.25.0,typing-inspect==0.8.0,typing_extensions==4.3.0,uc-micro-py==1.0.1,urllib3==1.26.12,uvicorn==0.18.3,uvloop==0.17.0,versioneer==0.20,virtualenv==20.16.5,wandb==0.13.3,watchfiles==0.17.0,wcwidth==0.2.5,webencodings==0.5.1,websocket-client==1.4.1,websockets==10.3,Werkzeug==2.2.2,widgetsnbextension==3.6.0,wrapt==1.12.1,xgboost==1.6.2,yarl==1.8.1,zict==2.2.0,zipp==3.8.1,zope.event==4.5.0,zope.interface==5.4.0
test-gpu run-test-pre: PYTHONHASHSEED='912547750'
test-gpu run-test: commands[0] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/core.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/core.git
  Cloning https://github.com/NVIDIA-Merlin/core.git to /tmp/pip-req-build-8oqjnv4n
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/core.git /tmp/pip-req-build-8oqjnv4n
  Resolved https://github.com/NVIDIA-Merlin/core.git to commit eda153c663aa864da66927c7a0a9d4e64c073120
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (7.0.0)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (4.64.1)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (1.2.5)
Requirement already satisfied: dask>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.3.0)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (0.55.1)
Requirement already satisfied: fsspec==2022.5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.5.0)
Requirement already satisfied: distributed>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.3.0)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (1.3.5)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (3.19.5)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (1.10.0)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (21.3)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.2.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (0.12.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (5.4.1)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.7.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (8.1.3)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (5.8.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.4.0)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.0.0)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (6.1)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.0.4)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (3.1.2)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (0.38.1)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (65.4.1)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (1.20.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core==0.8.0+4.geda153c) (3.0.9)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (2022.2.1)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (2.8.2)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.52.0)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.0.1)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (6.0.2)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (4.1.0)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (6.0.1)
Building wheels for collected packages: merlin-core
  Building wheel for merlin-core (pyproject.toml): started
  Building wheel for merlin-core (pyproject.toml): finished with status 'done'
  Created wheel for merlin-core: filename=merlin_core-0.8.0+4.geda153c-py3-none-any.whl size=118258 sha256=605b5f9c233ed0269a43cc4ce7c4526dcf2e9957a5ecdfea8da5e9a415b6af0b
  Stored in directory: /tmp/pip-ephem-wheel-cache-pk_sq9t1/wheels/c8/38/16/a6968787eafcec5fa772148af8408b089562f71af0752e8e84
Successfully built merlin-core
Installing collected packages: merlin-core
  Attempting uninstall: merlin-core
    Found existing installation: merlin-core 0.3.0+12.g78ecddd
    Not uninstalling merlin-core at /var/jenkins_home/.local/lib/python3.8/site-packages, outside environment /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
    Can't uninstall 'merlin-core'. No files were found to uninstall.
Successfully installed merlin-core-0.8.0+4.geda153c

[notice] A new release of pip available: 22.2.2 -> 22.3
[notice] To update, run: pip install --upgrade pip
test-gpu run-test: commands[1] | python -m pytest --cov-report term --cov merlin -rxs tests/unit
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.3, pluggy-1.0.0
cachedir: .tox/test-gpu/.pytest_cache
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: anyio-3.5.0, xdist-3.0.2, cov-4.0.0
collected 1436 items / 1 skipped

tests/unit/test_dask_nvt.py ............................................ [ 3%]
........................................................................ [ 8%]
.... [ 8%]
tests/unit/test_notebooks.py F.F. [ 8%]
tests/unit/test_tf4rec.py . [ 8%]
tests/unit/test_tools.py ...................... [ 10%]
tests/unit/test_triton_inference.py ................................ [ 12%]
tests/unit/examples/test_01-Getting-started.py . [ 12%]
tests/unit/examples/test_02-Advanced-NVTabular-workflow.py . [ 12%]
tests/unit/examples/test_03-Running-on-multiple-GPUs-or-on-CPU.py . [ 12%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 12%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 14%]
................................................... [ 18%]
tests/unit/framework_utils/test_torch_layers.py . [ 18%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 20%]
........................................s.. [ 23%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 25%]
..................................................... [ 29%]
tests/unit/ops/test_categorify.py ...................................... [ 31%]
........................................................................ [ 36%]
..................................................... [ 40%]
tests/unit/ops/test_column_similarity.py ........................ [ 42%]
tests/unit/ops/test_drop_low_cardinality.py .. [ 42%]
tests/unit/ops/test_fill.py ............................................ [ 45%]
........ [ 45%]
tests/unit/ops/test_groupyby.py ....................... [ 47%]
tests/unit/ops/test_hash_bucket.py ......................... [ 49%]
tests/unit/ops/test_join.py ............................................ [ 52%]
........................................................................ [ 57%]
.................................. [ 59%]
tests/unit/ops/test_lambda.py .......... [ 60%]
tests/unit/ops/test_normalize.py ....................................... [ 63%]
.. [ 63%]
tests/unit/ops/test_ops.py ............................................. [ 66%]
.................... [ 67%]
tests/unit/ops/test_ops_schema.py ...................................... [ 70%]
........................................................................ [ 75%]
........................................................................ [ 80%]
........................................................................ [ 85%]
....................................... [ 88%]
tests/unit/ops/test_reduce_dtype_size.py .. [ 88%]
tests/unit/ops/test_target_encode.py ..................... [ 89%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 90%]
tests/unit/workflow/test_workflow.py ................................... [ 92%]
.......................................................... [ 96%]
tests/unit/workflow/test_workflow_chaining.py ... [ 96%]
tests/unit/workflow/test_workflow_node.py ........... [ 97%]
tests/unit/workflow/test_workflow_ops.py ... [ 97%]
tests/unit/workflow/test_workflow_schemas.py ........................... [ 99%]
... [100%]

=================================== FAILURES ===================================
___________________________ test_criteo_tf_notebook ____________________________

tmpdir = local('/tmp/pytest-of-jenkins/pytest-18/test_criteo_tf_notebook0')

def test_criteo_tf_notebook(tmpdir):
    tor = pytest.importorskip("tensorflow")  # noqa
    # create a toy dataset in tmpdir, and point environment variables so the notebook
    # will read from it
    os.system("mkdir -p " + os.path.join(tmpdir, "converted/criteo"))
    for i in range(24):
        df = _get_random_criteo_data(1000)
        df.to_parquet(os.path.join(tmpdir, "converted/criteo", f"day_{i}.parquet"))
    os.environ["BASE_DIR"] = str(tmpdir)

    def _nb_modify(line):
        # Disable LocalCUDACluster
        line = line.replace("client.run(_rmm_pool)", "# client.run(_rmm_pool)")
        line = line.replace("if cluster is None:", "if False:")
        line = line.replace("client = Client(cluster)", "# client = Client(cluster)")
        line = line.replace(
            "workflow = nvt.Workflow(features, client=client)", "workflow = nvt.Workflow(features)"
        )
        line = line.replace("client", "# client")
        line = line.replace("NUM_GPUS = [0, 1, 2, 3, 4, 5, 6, 7]", "NUM_GPUS = [0]")
        line = line.replace("part_size = int(part_mem_frac * device_size)", "part_size = '128MB'")

        return line

    _run_notebook(
        tmpdir,
        os.path.join(
            dirname(TEST_PATH),
            "examples/scaling-criteo/",
            "02-ETL-with-NVTabular.ipynb",
        ),
        # disable rmm.reinitialize, seems to be causing issues
        transform=_nb_modify,
    )

    def _modify_tf_nb(line):
        return line.replace(
            # don't require graphviz/pydot
            "tf.keras.utils.plot_model(model)",
            "# tf.keras.utils.plot_model(model)",
        )
  _run_notebook(
        tmpdir,
        os.path.join(
            dirname(TEST_PATH),
            "examples/scaling-criteo/",
            "03-Training-with-TF.ipynb",
        ),
        transform=_modify_tf_nb,
    )

tests/unit/test_notebooks.py:80:


tests/unit/test_notebooks.py:223: in _run_notebook
subprocess.check_output([sys.executable, script_path])
/usr/lib/python3.8/subprocess.py:415: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,


input = None, capture_output = False, timeout = None, check = True
popenargs = (['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-18/test_criteo_tf_notebook0/notebook.py'],)
kwargs = {'stdout': -1}, process = <subprocess.Popen object at 0x7f06e0781820>
stdout = b'', stderr = None, retcode = 1

def run(*popenargs,
        input=None, capture_output=False, timeout=None, check=False, **kwargs):
    """Run command with arguments and return a CompletedProcess instance.

    The returned instance will have attributes args, returncode, stdout and
    stderr. By default, stdout and stderr are not captured, and those attributes
    will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

    If check is True and the exit code was non-zero, it raises a
    CalledProcessError. The CalledProcessError object will have the return code
    in the returncode attribute, and output & stderr attributes if those streams
    were captured.

    If timeout is given, and the process takes too long, a TimeoutExpired
    exception will be raised.

    There is an optional argument "input", allowing you to
    pass bytes or a string to the subprocess's stdin.  If you use this argument
    you may not also use the Popen constructor's "stdin" argument, as
    it will be used internally.

    By default, all communication is in bytes, and therefore any "input" should
    be bytes, and the stdout and stderr will be bytes. If in text mode, any
    "input" should be a string, and stdout and stderr will be strings decoded
    according to locale encoding, or by "encoding" if set. Text mode is
    triggered by setting any of text, encoding, errors or universal_newlines.

    The other arguments are the same as for the Popen constructor.
    """
    if input is not None:
        if kwargs.get('stdin') is not None:
            raise ValueError('stdin and input arguments may not both be used.')
        kwargs['stdin'] = PIPE

    if capture_output:
        if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
            raise ValueError('stdout and stderr arguments may not be used '
                             'with capture_output.')
        kwargs['stdout'] = PIPE
        kwargs['stderr'] = PIPE

    with Popen(*popenargs, **kwargs) as process:
        try:
            stdout, stderr = process.communicate(input, timeout=timeout)
        except TimeoutExpired as exc:
            process.kill()
            if _mswindows:
                # Windows accumulates the output in a single blocking
                # read() call run on child threads, with the timeout
                # being done in a join() on those threads.  communicate()
                # _after_ kill() is required to collect that and add it
                # to the exception.
                exc.stdout, exc.stderr = process.communicate()
            else:
                # POSIX _communicate already populated the output so
                # far into the TimeoutExpired exception.
                process.wait()
            raise
        except:  # Including KeyboardInterrupt, communicate handled that.
            process.kill()
            # We don't call process.wait() as .__exit__ does that for us.
            raise
        retcode = process.poll()
        if check and retcode:
          raise CalledProcessError(retcode, process.args,
                                     output=stdout, stderr=stderr)

E subprocess.CalledProcessError: Command '['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-18/test_criteo_tf_notebook0/notebook.py']' returned non-zero exit status 1.

/usr/lib/python3.8/subprocess.py:516: CalledProcessError
----------------------------- Captured stderr call -----------------------------
/tmp/pytest-of-jenkins/pytest-18/test_criteo_tf_notebook0/notebook.py:92: UserWarning: BEWARE - 2.606759936 GB is already occupied on device 0!
warnings.warn(f"BEWARE - {used} GB is already occupied on device {int(dev)}!")
2022-11-01 16:58:01.390434: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-11-01 16:58:04.334206: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-11-01 16:58:04.335094: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:1 with 13875 MB memory: -> device: 1, name: Tesla P100-DGXS-16GB, pci bus id: 0000:08:00.0, compute capability: 6.0
2022-11-01 16:58:04.335737: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:2 with 15149 MB memory: -> device: 2, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0e:00.0, compute capability: 6.0
2022-11-01 16:58:04.336357: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:3 with 15149 MB memory: -> device: 3, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0f:00.0, compute capability: 6.0
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.12) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
Traceback (most recent call last):
File "/tmp/pytest-of-jenkins/pytest-18/test_criteo_tf_notebook0/notebook.py", line 105, in
history = model.fit(train_dataset_tf, callbacks=[validation_callback], epochs=EPOCHS)
File "/var/jenkins_home/.local/lib/python3.8/site-packages/keras/utils/traceback_utils.py", line 67, in error_handler
raise e.with_traceback(filtered_tb) from None
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/tensorflow.py", line 144, in getitem
return LoaderBase.next(self)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 243, in next
return self._get_next_batch()
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 278, in _get_next_batch
batch = next(self._batch_itr)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 402, in
return (self._handle_tensors(batch) for batch in batches)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/tensorflow.py", line 286, in _handle_tensors
to_return = super()._handle_tensors(tensors)
File "/usr/local/lib/python3.8/dist-packages/nvtx/nvtx.py", line 101, in inner
result = func(*args, **kwargs)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 534, in _handle_tensors
tensors = self._tensor_split(tensor, len(names), axis=1)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/tensorflow.py", line 180, in _tensor_split
return tf.split(tensor, idx, axis=axis)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Number of ways to split should evenly divide the split dimension, but got split_dim 1 (size = 27) and num_split 13 [Op:Split] name: split
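The tf.split failure above can be reproduced in isolation; a minimal sketch follows (the tensor shape is hypothetical, chosen only to match the sizes reported in the error message, and is not taken from the notebook's data):

import tensorflow as tf

# A batch whose axis-1 size is 27, as reported in the error message.
tensor = tf.zeros([1024, 27])

# An integer second argument asks tf.split to cut axis 1 into that many
# equal slices; 27 is not divisible by 13, so eager execution raises
# InvalidArgumentError with the same "Number of ways to split should
# evenly divide the split dimension" message seen in the traceback.
tf.split(tensor, 13, axis=1)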
____________________________ test_movielens_example ____________________________

tmpdir = local('/tmp/pytest-of-jenkins/pytest-18/test_movielens_example0')

def test_movielens_example(tmpdir):
    _get_random_movielens_data(tmpdir, 10000, dataset="movie")
    _get_random_movielens_data(tmpdir, 10000, dataset="ratings")
    _get_random_movielens_data(tmpdir, 5000, dataset="ratings", valid=True)

    triton_model_path = os.path.join(tmpdir, "models")
    os.environ["INPUT_DATA_DIR"] = str(tmpdir)
    os.environ["MODEL_PATH"] = triton_model_path

    notebook_path = os.path.join(
        dirname(TEST_PATH),
        "examples/getting-started-movielens/",
        "02-ETL-with-NVTabular.ipynb",
    )
    _run_notebook(tmpdir, notebook_path)

    def _modify_tf_nb(line):
        return line.replace(
            # don't require graphviz/pydot
            "tf.keras.utils.plot_model(model)",
            "# tf.keras.utils.plot_model(model)",
        )

    def _modify_tf_triton(line):
        # models are already preloaded
        line = line.replace("triton_client.load_model", "# triton_client.load_model")
        line = line.replace("triton_client.unload_model", "# triton_client.unload_model")
        return line

    notebooks = []
    try:
        import torch  # noqa

        notebooks.append("03-Training-with-PyTorch.ipynb")
    except Exception:
        pass
    try:
        import nvtabular.inference.triton  # noqa
        import nvtabular.loader.tensorflow  # noqa

        notebooks.append("03-Training-with-TF.ipynb")
        has_tf = True

    except Exception:
        has_tf = False

    for notebook in notebooks:
        notebook_path = os.path.join(
            dirname(TEST_PATH),
            "examples/getting-started-movielens/",
            notebook,
        )
        if notebook == "03-Training-with-TF.ipynb":
            _run_notebook(tmpdir, notebook_path, transform=_modify_tf_nb)
        else:
          _run_notebook(tmpdir, notebook_path)

tests/unit/test_notebooks.py:169:


tests/unit/test_notebooks.py:223: in _run_notebook
subprocess.check_output([sys.executable, script_path])
/usr/lib/python3.8/subprocess.py:415: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,


input = None, capture_output = False, timeout = None, check = True
popenargs = (['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-18/test_movielens_example0/notebook.py'],)
kwargs = {'stdout': -1}, process = <subprocess.Popen object at 0x7f06e07cda00>
stdout = b'', stderr = None, retcode = 1

def run(*popenargs,
        input=None, capture_output=False, timeout=None, check=False, **kwargs):
    """Run command with arguments and return a CompletedProcess instance.

    The returned instance will have attributes args, returncode, stdout and
    stderr. By default, stdout and stderr are not captured, and those attributes
    will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

    If check is True and the exit code was non-zero, it raises a
    CalledProcessError. The CalledProcessError object will have the return code
    in the returncode attribute, and output & stderr attributes if those streams
    were captured.

    If timeout is given, and the process takes too long, a TimeoutExpired
    exception will be raised.

    There is an optional argument "input", allowing you to
    pass bytes or a string to the subprocess's stdin.  If you use this argument
    you may not also use the Popen constructor's "stdin" argument, as
    it will be used internally.

    By default, all communication is in bytes, and therefore any "input" should
    be bytes, and the stdout and stderr will be bytes. If in text mode, any
    "input" should be a string, and stdout and stderr will be strings decoded
    according to locale encoding, or by "encoding" if set. Text mode is
    triggered by setting any of text, encoding, errors or universal_newlines.

    The other arguments are the same as for the Popen constructor.
    """
    if input is not None:
        if kwargs.get('stdin') is not None:
            raise ValueError('stdin and input arguments may not both be used.')
        kwargs['stdin'] = PIPE

    if capture_output:
        if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
            raise ValueError('stdout and stderr arguments may not be used '
                             'with capture_output.')
        kwargs['stdout'] = PIPE
        kwargs['stderr'] = PIPE

    with Popen(*popenargs, **kwargs) as process:
        try:
            stdout, stderr = process.communicate(input, timeout=timeout)
        except TimeoutExpired as exc:
            process.kill()
            if _mswindows:
                # Windows accumulates the output in a single blocking
                # read() call run on child threads, with the timeout
                # being done in a join() on those threads.  communicate()
                # _after_ kill() is required to collect that and add it
                # to the exception.
                exc.stdout, exc.stderr = process.communicate()
            else:
                # POSIX _communicate already populated the output so
                # far into the TimeoutExpired exception.
                process.wait()
            raise
        except:  # Including KeyboardInterrupt, communicate handled that.
            process.kill()
            # We don't call process.wait() as .__exit__ does that for us.
            raise
        retcode = process.poll()
        if check and retcode:
          raise CalledProcessError(retcode, process.args,
                                     output=stdout, stderr=stderr)

E subprocess.CalledProcessError: Command '['/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python', '/tmp/pytest-of-jenkins/pytest-18/test_movielens_example0/notebook.py']' returned non-zero exit status 1.

/usr/lib/python3.8/subprocess.py:516: CalledProcessError
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
File "/tmp/pytest-of-jenkins/pytest-18/test_movielens_example0/notebook.py", line 115, in
batch = next(iter(train_loader))
File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/dataloader.py", line 681, in next
data = self._next_data()
File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/dataloader.py", line 721, in _next_data
data = self._dataset_fetcher.fetch(index) # may raise StopIteration
File "/usr/local/lib/python3.8/dist-packages/torch/utils/data/_utils/fetch.py", line 39, in fetch
data = next(self.dataset_iter)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 243, in next
return self._get_next_batch()
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 278, in _get_next_batch
batch = next(self._batch_itr)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 402, in
return (self._handle_tensors(batch) for batch in batches)
File "/usr/local/lib/python3.8/dist-packages/nvtx/nvtx.py", line 101, in inner
result = func(*args, **kwargs)
File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/loader_base.py", line 528, in _handle_tensors
names = self.dtype_reverse_map[np.dtype(dtype)] if dtype is not None else []
KeyError: dtype('float32')
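The KeyError above comes from looking up the batch's dtype in a reverse map keyed by numpy dtypes; a minimal sketch of that failure mode follows (the map contents and column names are hypothetical, not the loader's actual state):

import numpy as np

# Hypothetical reverse map from dtype to column names, built the way the
# loader derives it from the dataset schema. If no column was registered
# as float32, that dtype never becomes a key.
dtype_reverse_map = {
    np.dtype("int64"): ["userId", "movieId"],
    np.dtype("int32"): ["rating"],
}

# A tensor arriving as float32 then has no entry to look up, reproducing
# KeyError: dtype('float32').
names = dtype_reverse_map[np.dtype("float32")]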
=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33
/usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
DASK_VERSION = LooseVersion(dask.__version__)

.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: 34 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
other = LooseVersion(other)

tests/unit/test_dask_nvt.py: 6 warnings
tests/unit/workflow/test_workflow.py: 78 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/dask/base.py:1282: UserWarning: Running on a single-machine scheduler when a distributed client is active might lead to unexpected results.
warnings.warn(

tests/unit/test_dask_nvt.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 8 files.
warnings.warn(

tests/unit/test_dask_nvt.py::test_merlin_core_execution_managers
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/core/utils.py:431: UserWarning: Existing Dask-client object detected in the current context. New cuda cluster will not be deployed. Set force_new to True to ignore running clusters.
warnings.warn(

tests/unit/loader/test_tf_dataloader.py::test_horovod_multigpu
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 5 files.
warnings.warn(

tests/unit/loader/test_tf_dataloader.py: 2 warnings
tests/unit/loader/test_torch_dataloader.py: 12 warnings
tests/unit/workflow/test_workflow.py: 9 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 2 files.
warnings.warn(

tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops_schema.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py:148: UserWarning: Compound tags like Tags.USER_ID have been deprecated and will be removed in a future version. Please use the atomic versions of these tags, like [<Tags.USER: 'user'>, <Tags.ID: 'id'>].
warnings.warn(

tests/unit/ops/test_ops_schema.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py:148: UserWarning: Compound tags like Tags.ITEM_ID have been deprecated and will be removed in a future version. Please use the atomic versions of these tags, like [<Tags.ITEM: 'item'>, <Tags.ID: 'id'>].
warnings.warn(

tests/unit/workflow/test_cpu_workflow.py: 6 warnings
tests/unit/workflow/test_workflow.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 10 files.
warnings.warn(

tests/unit/workflow/test_workflow.py: 48 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 20 files.
warnings.warn(

tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-None]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 4 files.
warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name                                Stmts   Miss  Cover
-------------------------------------------------------
merlin/transforms/__init__.py           1      1     0%
merlin/transforms/ops/__init__.py       1      1     0%
-------------------------------------------------------
TOTAL                                   2      2     0%

=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:14: could not import 'moto': No module named 'moto'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:609: Skipping test because of horovod missing MPI
===== 2 failed, 1433 passed, 2 skipped, 258 warnings in 1193.65s (0:19:53) =====
/usr/local/lib/python3.8/dist-packages/coverage/control.py:801: CoverageWarning: No data was collected. (no-data-collected)
self._warn("No data was collected.", slug="no-data-collected")
ERROR: InvocationError for command /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python -m pytest --cov-report term --cov merlin -rxs tests/unit (exited with code 1)
___________________________________ summary ____________________________________
ERROR: test-gpu: commands failed
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins3960202685850694298.sh

@nvidia-merlin-bot
Copy link
Contributor

Click to view CI Results
GitHub pull request #1694 of commit e0cfe203dade09a0b4e6280c0193e5f5197c03b2, no merge conflicts.
Running as SYSTEM
Setting status of e0cfe203dade09a0b4e6280c0193e5f5197c03b2 to PENDING with url http://10.20.17.181:8080/job/nvtabular_tests/4778/ and message: 'Build started for merge commit.'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1694/*:refs/remotes/origin/pr/1694/* # timeout=10
 > git rev-parse e0cfe203dade09a0b4e6280c0193e5f5197c03b2^{commit} # timeout=10
Checking out Revision e0cfe203dade09a0b4e6280c0193e5f5197c03b2 (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e0cfe203dade09a0b4e6280c0193e5f5197c03b2 # timeout=10
Commit message: "use latest dataloader in tox.ini"
 > git rev-list --no-walk 6c4510c978521f604c3384cbed06bb8123e65d11 # timeout=10
[nvtabular_tests] $ /bin/bash /tmp/jenkins5492357560744453336.sh
GLOB sdist-make: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/setup.py
test-gpu create: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
test-gpu installdeps: pytest, pytest-cov
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu inst: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/.tmp/package/1/nvtabular-1.6.0+8.ge0cfe203d.zip
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu installed: absl-py==1.2.0,aiohttp==3.8.1,aiosignal==1.2.0,alabaster==0.7.12,alembic==1.8.1,anyio==3.6.1,argon2-cffi==21.3.0,argon2-cffi-bindings==21.2.0,astroid==2.5.6,asttokens==2.0.8,astunparse==1.6.3,asv==0.5.1,asvdb==0.4.2,async-timeout==4.0.2,attrs==22.1.0,autopage==0.5.1,awscli==1.26.5,Babel==2.10.3,backcall==0.2.0,beautifulsoup4==4.11.1,betterproto==1.2.5,black==22.6.0,bleach==5.0.1,boto3==1.24.75,botocore==1.28.5,Brotli==1.0.9,cachetools==5.2.0,certifi==2019.11.28,cffi==1.15.1,chardet==3.0.4,charset-normalizer==2.1.1,clang==5.0,click==8.1.3,cliff==4.0.0,cloudpickle==2.2.0,cmaes==0.8.2,cmake==3.24.1.1,cmd2==2.4.2,colorama==0.4.4,colorlog==6.7.0,contourpy==1.0.5,coverage==6.5.0,cuda-python==11.7.1,cupy-cuda117==10.6.0,cycler==0.11.0,Cython==0.29.32,dask==2022.1.1,dbus-python==1.2.16,debugpy==1.6.3,decorator==5.1.1,defusedxml==0.7.1,dill==0.3.5.1,distlib==0.3.6,distributed==2022.5.1,distro==1.7.0,dm-tree==0.1.6,docker-pycreds==0.4.0,docutils==0.16,emoji==1.7.0,entrypoints==0.4,execnet==1.9.0,executing==1.0.0,faiss==1.7.2,faiss-gpu==1.7.2,fastai==2.7.9,fastapi==0.85.0,fastavro==1.6.1,fastcore==1.5.27,fastdownload==0.0.7,fastjsonschema==2.16.1,fastprogress==1.0.3,fastrlock==0.8,feast==0.19.4,fiddle==0.2.2,filelock==3.8.0,flatbuffers==1.12,fonttools==4.37.3,frozenlist==1.3.1,fsspec==2022.5.0,gast==0.4.0,gevent==21.12.0,geventhttpclient==2.0.2,gitdb==4.0.9,GitPython==3.1.27,google==3.0.0,google-api-core==2.10.1,google-auth==2.11.1,google-auth-oauthlib==0.4.6,google-pasta==0.2.0,googleapis-common-protos==1.52.0,graphviz==0.20.1,greenlet==1.1.3,grpcio==1.41.0,grpcio-channelz==1.49.0,grpcio-reflection==1.48.1,grpclib==0.4.3,h11==0.13.0,h2==4.1.0,h5py==3.7.0,HeapDict==1.0.1,horovod==0.26.1,hpack==4.0.0,httptools==0.5.0,hugectr2onnx==0.0.0,huggingface-hub==0.9.1,hyperframe==6.0.1,idna==2.8,imagesize==1.4.1,implicit==0.6.1,importlib-metadata==4.12.0,importlib-resources==5.9.0,iniconfig==1.1.1,ipykernel==6.15.3,ipython==8.5.0,ipython-genutils==0.2.0,ipywidgets==7.7.0,jedi==0.18.1,Jinja2==3.1.2,jmespath==1.0.1,joblib==1.2.0,json5==0.9.10,jsonschema==4.16.0,jupyter-cache==0.4.3,jupyter-core==4.11.1,jupyter-server==1.18.1,jupyter-server-mathjax==0.2.5,jupyter-sphinx==0.3.2,jupyter_client==7.3.5,jupyterlab==3.4.7,jupyterlab-pygments==0.2.2,jupyterlab-widgets==1.1.0,jupyterlab_server==2.15.1,keras==2.9.0,Keras-Preprocessing==1.1.2,kiwisolver==1.4.4,lazy-object-proxy==1.8.0,libclang==14.0.6,libcst==0.4.7,lightfm==1.16,lightgbm==3.3.2,linkify-it-py==1.0.3,llvmlite==0.39.1,locket==1.0.0,lxml==4.9.1,Mako==1.2.3,Markdown==3.4.1,markdown-it-py==1.1.0,MarkupSafe==2.1.1,matplotlib==3.6.0,matplotlib-inline==0.1.6,mdit-py-plugins==0.2.8,merlin-core==0.6.0+1.g5926fcf,merlin-dataloader==0.0.2,merlin-models==0.7.0+11.g280956aa4,merlin-systems==0.5.0+4.g15074ad,mistune==2.0.4,mmh3==3.0.0,mpi4py==3.1.3,msgpack==1.0.4,multidict==6.0.2,mypy-extensions==0.4.3,myst-nb==0.13.2,myst-parser==0.15.2,natsort==8.1.0,nbclassic==0.4.3,nbclient==0.6.8,nbconvert==7.0.0,nbdime==3.1.1,nbformat==5.5.0,nest-asyncio==1.5.5,ninja==1.10.2.3,notebook==6.4.12,notebook-shim==0.1.0,numba==0.56.2,numpy==1.22.4,nvidia-pyindex==1.0.9,-e 
git+https://github.com/NVIDIA-Merlin/NVTabular.git@e0cfe203dade09a0b4e6280c0193e5f5197c03b2#egg=nvtabular,nvtx==0.2.5,oauthlib==3.2.1,oldest-supported-numpy==2022.8.16,onnx==1.12.0,onnxruntime==1.11.1,opt-einsum==3.3.0,optuna==3.0.3,packaging==21.3,pandas==1.3.5,pandavro==1.5.2,pandocfilters==1.5.0,parso==0.8.3,partd==1.3.0,pathtools==0.1.2,pbr==5.11.0,pexpect==4.8.0,pickleshare==0.7.5,Pillow==9.2.0,pkgutil_resolve_name==1.3.10,platformdirs==2.5.2,plotly==5.11.0,pluggy==1.0.0,prettytable==3.5.0,prometheus-client==0.14.1,promise==2.3,prompt-toolkit==3.0.31,proto-plus==1.19.6,protobuf==3.19.5,psutil==5.9.2,ptyprocess==0.7.0,pure-eval==0.2.2,py==1.11.0,pyarrow==7.0.0,pyasn1==0.4.8,pyasn1-modules==0.2.8,pybind11==2.10.0,pycparser==2.21,pydantic==1.10.2,pydot==1.4.2,Pygments==2.13.0,PyGObject==3.36.0,pynvml==11.4.1,pyparsing==3.0.9,pyperclip==1.8.2,pyrsistent==0.18.1,pytest==7.1.3,pytest-cov==4.0.0,pytest-xdist==3.0.2,python-apt==2.0.0+ubuntu0.20.4.8,python-dateutil==2.8.2,python-dotenv==0.21.0,python-rapidjson==1.8,pytz==2022.2.1,PyYAML==5.4.1,pyzmq==24.0.0,regex==2022.9.13,requests==2.22.0,requests-oauthlib==1.3.1,requests-unixsocket==0.2.0,rsa==4.7.2,s3fs==2022.2.0,s3transfer==0.6.0,sacremoses==0.0.53,scikit-build==0.15.0,scikit-learn==1.1.2,scipy==1.8.1,seedir==0.3.0,Send2Trash==1.8.0,sentry-sdk==1.9.8,setproctitle==1.3.2,setuptools-scm==7.0.5,shortuuid==1.0.9,six==1.15.0,sklearn==0.0,smmap==5.0.0,sniffio==1.3.0,snowballstemmer==2.2.0,sortedcontainers==2.4.0,soupsieve==2.3.2.post1,Sphinx==5.3.0,sphinx-multiversion==0.2.4,sphinx-togglebutton==0.3.1,sphinx_external_toc==0.3.0,sphinxcontrib-applehelp==1.0.2,sphinxcontrib-copydirs @ git+https://github.com/mikemckiernan/sphinxcontrib-copydirs.git@bd8c5d79b3f91cf5f1bb0d6995aeca3fe84b670e,sphinxcontrib-devhelp==1.0.2,sphinxcontrib-htmlhelp==2.0.0,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-qthelp==1.0.3,sphinxcontrib-serializinghtml==1.1.5,SQLAlchemy==1.4.42,stack-data==0.5.0,starlette==0.20.4,stevedore==4.1.0,stringcase==1.2.0,supervisor==4.1.0,tabulate==0.8.10,tblib==1.7.0,tdqm==0.0.1,tenacity==8.0.1,tensorboard==2.9.1,tensorboard-data-server==0.6.1,tensorboard-plugin-wit==1.8.1,tensorflow==2.9.2,tensorflow-estimator==2.9.0,tensorflow-gpu==2.9.2,tensorflow-io-gcs-filesystem==0.27.0,tensorflow-metadata==1.10.0,termcolor==2.0.1,terminado==0.15.0,testbook==0.4.2,threadpoolctl==3.1.0,tinycss2==1.1.1,tokenizers==0.10.3,toml==0.10.2,tomli==2.0.1,toolz==0.12.0,torch==1.12.1+cu113,torchmetrics==0.3.2,tornado==6.2,tox==3.26.0,tqdm==4.64.1,traitlets==5.4.0,transformers==4.12.0,transformers4rec==0.1.12+2.gbcc939255,treelite==2.3.0,treelite-runtime==2.3.0,tritonclient==2.25.0,typing-inspect==0.8.0,typing_extensions==4.3.0,uc-micro-py==1.0.1,urllib3==1.26.12,uvicorn==0.18.3,uvloop==0.17.0,versioneer==0.20,virtualenv==20.16.5,wandb==0.13.3,watchfiles==0.17.0,wcwidth==0.2.5,webencodings==0.5.1,websocket-client==1.4.1,websockets==10.3,Werkzeug==2.2.2,widgetsnbextension==3.6.0,wrapt==1.12.1,xgboost==1.6.2,yarl==1.8.1,zict==2.2.0,zipp==3.8.1,zope.event==4.5.0,zope.interface==5.4.0
test-gpu run-test-pre: PYTHONHASHSEED='1947863543'
test-gpu run-test: commands[0] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/core.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/core.git
  Cloning https://github.com/NVIDIA-Merlin/core.git to /tmp/pip-req-build-2popbzpy
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/core.git /tmp/pip-req-build-2popbzpy
  Resolved https://github.com/NVIDIA-Merlin/core.git to commit eda153c663aa864da66927c7a0a9d4e64c073120
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: distributed>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.3.0)
Requirement already satisfied: dask>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.3.0)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (1.2.5)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (7.0.0)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (21.3)
Requirement already satisfied: fsspec==2022.5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.5.0)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (3.19.5)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (1.3.5)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (0.55.1)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (4.64.1)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (1.10.0)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.2.0)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (0.12.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (5.4.1)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (8.1.3)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (5.8.0)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.7.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.4.0)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (3.1.2)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (6.1)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.0.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.0.4)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (0.38.1)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (65.4.1)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (1.20.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core==0.8.0+4.geda153c) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (2022.2.1)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.52.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.0.1)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (4.1.0)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (6.0.2)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.1.1)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (6.0.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (4.0.0)
Building wheels for collected packages: merlin-core
  Building wheel for merlin-core (pyproject.toml): started
  Building wheel for merlin-core (pyproject.toml): finished with status 'done'
  Created wheel for merlin-core: filename=merlin_core-0.8.0+4.geda153c-py3-none-any.whl size=118258 sha256=ea0190faf759c9a1d8e43dd6d19cc7e38151046003a6ffb9a5210d09db2e0677
  Stored in directory: /tmp/pip-ephem-wheel-cache-p3u7zrph/wheels/c8/38/16/a6968787eafcec5fa772148af8408b089562f71af0752e8e84
Successfully built merlin-core
Installing collected packages: merlin-core
  Attempting uninstall: merlin-core
    Found existing installation: merlin-core 0.3.0+12.g78ecddd
    Not uninstalling merlin-core at /var/jenkins_home/.local/lib/python3.8/site-packages, outside environment /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
    Can't uninstall 'merlin-core'. No files were found to uninstall.
Successfully installed merlin-core-0.8.0+4.geda153c

[notice] A new release of pip available: 22.2.2 -> 22.3
[notice] To update, run: pip install --upgrade pip
test-gpu run-test: commands[1] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/dataloader.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/dataloader.git
Cloning https://github.com/NVIDIA-Merlin/dataloader.git to /tmp/pip-req-build-uu7kjh4x
Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/dataloader.git /tmp/pip-req-build-uu7kjh4x
Resolved https://github.com/NVIDIA-Merlin/dataloader.git to commit 5905283777ff5ebd748a1c91b7c9fde5710ae775
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Preparing metadata (pyproject.toml): started
Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: merlin-core in ./.tox/test-gpu/lib/python3.8/site-packages (from merlin-dataloader==0.0.2+1.g5905283) (0.8.0+4.geda153c)
Requirement already satisfied: distributed>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2022.3.0)
Requirement already satisfied: dask>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2022.3.0)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.2.5)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (7.0.0)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (21.3)
Requirement already satisfied: fsspec==2022.5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2022.5.0)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (3.19.5)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.3.5)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.55.1)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (4.64.1)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.10.0)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.2.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.2.0)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.2.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.12.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (5.4.1)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (8.1.3)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (5.8.0)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.7.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.4.0)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (3.1.2)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (6.1)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.0.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.0.4)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.38.1)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (65.4.1)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.20.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2022.2.1)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.2.0)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.52.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.0.1)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (4.1.0)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (6.0.2)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.1.1)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (6.0.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (4.0.0)
Building wheels for collected packages: merlin-dataloader
Building wheel for merlin-dataloader (pyproject.toml): started
Building wheel for merlin-dataloader (pyproject.toml): finished with status 'done'
Created wheel for merlin-dataloader: filename=merlin_dataloader-0.0.2+1.g5905283-py3-none-any.whl size=31625 sha256=e3eaa9543d4448da4c833a8b5519977d8d4d943c5bb7e5384eb516c46b395825
Stored in directory: /tmp/pip-ephem-wheel-cache-wuxuy105/wheels/de/f5/d9/251909f4627d2920fb15548f5ffd6daf1bf24c3c56bb4977b1
Successfully built merlin-dataloader
Installing collected packages: merlin-dataloader
Attempting uninstall: merlin-dataloader
Found existing installation: merlin-dataloader 0.0.2
Uninstalling merlin-dataloader-0.0.2:
Successfully uninstalled merlin-dataloader-0.0.2
Successfully installed merlin-dataloader-0.0.2+1.g5905283

[notice] A new release of pip available: 22.2.2 -> 22.3
[notice] To update, run: pip install --upgrade pip
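
The wheel built above is the package this PR swaps in underneath the nvt dataloader. As a rough orientation only, a minimal sketch of driving that dataloader directly, assuming the merlin.dataloader.torch.Loader entry point and a placeholder "train.parquet" file (neither taken from this log):

    # Minimal sketch (assumed API, not part of this CI run): iterate batches
    # from a Parquet dataset with the merlin-dataloader package.
    from merlin.io import Dataset
    from merlin.dataloader.torch import Loader  # assumed module path

    dataset = Dataset("train.parquet", engine="parquet")  # placeholder file
    loader = Loader(dataset, batch_size=1024, shuffle=True)

    for features, labels in loader:
        # each batch arrives as a dict of column name -> tensor;
        # labels are None unless a target column is configured
        break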
test-gpu run-test: commands[2] | python -m pytest --cov-report term --cov merlin -rxs tests/unit
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.3, pluggy-1.0.0
cachedir: .tox/test-gpu/.pytest_cache
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: anyio-3.5.0, xdist-3.0.2, cov-4.0.0
collected 1436 items / 1 skipped

tests/unit/test_dask_nvt.py ............................................ [ 3%]
........................................................................ [ 8%]
.... [ 8%]
tests/unit/test_notebooks.py .... [ 8%]
tests/unit/test_tf4rec.py . [ 8%]
tests/unit/test_tools.py ...................... [ 10%]
tests/unit/test_triton_inference.py ................................ [ 12%]
tests/unit/examples/test_01-Getting-started.py . [ 12%]
tests/unit/examples/test_02-Advanced-NVTabular-workflow.py . [ 12%]
tests/unit/examples/test_03-Running-on-multiple-GPUs-or-on-CPU.py F [ 12%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 12%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 14%]
................................................... [ 18%]
tests/unit/framework_utils/test_torch_layers.py . [ 18%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 20%]
........................................s.. [ 23%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 25%]
..................................................... [ 29%]
tests/unit/ops/test_categorify.py ...................................... [ 31%]
........................................................................ [ 36%]
..................................................... [ 40%]
tests/unit/ops/test_column_similarity.py ........................ [ 42%]
tests/unit/ops/test_drop_low_cardinality.py .. [ 42%]
tests/unit/ops/test_fill.py ............................................ [ 45%]
........ [ 45%]
tests/unit/ops/test_groupyby.py ....................... [ 47%]
tests/unit/ops/test_hash_bucket.py ......................... [ 49%]
tests/unit/ops/test_join.py ............................................ [ 52%]
........................................................................ [ 57%]
.................................. [ 59%]
tests/unit/ops/test_lambda.py .......... [ 60%]
tests/unit/ops/test_normalize.py ....................................... [ 63%]
.. [ 63%]
tests/unit/ops/test_ops.py ............................................. [ 66%]
.................... [ 67%]
tests/unit/ops/test_ops_schema.py ...................................... [ 70%]
........................................................................ [ 75%]
........................................................................ [ 80%]
........................................................................ [ 85%]
....................................... [ 88%]
tests/unit/ops/test_reduce_dtype_size.py .. [ 88%]
tests/unit/ops/test_target_encode.py ..................... [ 89%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 90%]
tests/unit/workflow/test_workflow.py ................................... [ 92%]
.......................................................... [ 96%]
tests/unit/workflow/test_workflow_chaining.py ... [ 96%]
tests/unit/workflow/test_workflow_node.py ........... [ 97%]
tests/unit/workflow/test_workflow_ops.py ... [ 97%]
tests/unit/workflow/test_workflow_schemas.py ........................... [ 99%]
... [100%]

=================================== FAILURES ===================================
_______________________________ test_example_03 ________________________________

def test_example_03():
    with testbook(
        REPO_ROOT / "examples" / "03-Running-on-multiple-GPUs-or-on-CPU.ipynb",
        execute=False,
        timeout=180,
    ) as tb:
        tb.inject(
            """
            import os
            from unittest.mock import patch
            from merlin.datasets.synthetic import generate_data
            mock_train, mock_valid = generate_data(
                input="movielens-1m",
                num_rows=1000,
                set_sizes=(0.8, 0.2)
            )
            input_path = os.environ.get(
                "INPUT_DATA_DIR",
                os.path.expanduser("~/merlin-framework/movielens/")
            )
            from pathlib import Path
            Path(f'{input_path}ml-1m').mkdir(parents=True, exist_ok=True)
            mock_train.compute().to_parquet(f'{input_path}ml-1m/train.parquet')
            mock_train.compute().to_parquet(f'{input_path}ml-1m/valid.parquet')

            p1 = patch(
                "merlin.datasets.entertainment.get_movielens",
                return_value=[mock_train, mock_valid]
            )
            p1.start()

            """
        )
      tb.execute()

tests/unit/examples/test_03-Running-on-multiple-GPUs-or-on-CPU.py:59:


../../../.local/lib/python3.8/site-packages/testbook/client.py:147: in execute
super().execute_cell(cell, index)
../../../.local/lib/python3.8/site-packages/nbclient/util.py:84: in wrapped
return just_run(coro(*args, **kwargs))
../../../.local/lib/python3.8/site-packages/nbclient/util.py:62: in just_run
return loop.run_until_complete(coro)
/usr/local/lib/python3.8/dist-packages/nest_asyncio.py:89: in run_until_complete
return f.result()
/usr/lib/python3.8/asyncio/futures.py:178: in result
raise self._exception
/usr/lib/python3.8/asyncio/tasks.py:280: in __step
result = coro.send(None)
../../../.local/lib/python3.8/site-packages/nbclient/client.py:965: in async_execute_cell
await self._check_raise_for_error(cell, cell_index, exec_reply)


self = <testbook.client.TestbookNotebookClient object at 0x7f6680567640>
cell = {'cell_type': 'code', 'execution_count': 6, 'id': 'e02409ee', 'metadata': {'execution': {'iopub.status.busy': '2022-11...rkdir,\n dashboard_address=":" + dashboard_port,\n rmm_pool_size=(device_pool_size // 256) * 256\n )'}
cell_index = 11
exec_reply = {'buffers': [], 'content': {'ename': 'MemoryError', 'engine_info': {'engine_id': -1, 'engine_uuid': '8086baec-1e2d-453...e, 'engine': '8086baec-1e2d-453c-8803-01b178bedadd', 'started': '2022-11-01T17:45:18.403849Z', 'status': 'error'}, ...}

async def _check_raise_for_error(
    self, cell: NotebookNode, cell_index: int, exec_reply: t.Optional[t.Dict]
) -> None:

    if exec_reply is None:
        return None

    exec_reply_content = exec_reply['content']
    if exec_reply_content['status'] != 'error':
        return None

    cell_allows_errors = (not self.force_raise_errors) and (
        self.allow_errors
        or exec_reply_content.get('ename') in self.allow_error_names
        or "raises-exception" in cell.metadata.get("tags", [])
    )
    await run_hook(self.on_cell_error, cell=cell, cell_index=cell_index)
    if not cell_allows_errors:
      raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)

E nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
E ------------------
E if cluster is None:
E     cluster = LocalCUDACluster(
E         protocol=protocol,
E         n_workers=len(visible_devices.split(",")),
E         CUDA_VISIBLE_DEVICES=visible_devices,
E         device_memory_limit=device_limit,
E         local_directory=dask_workdir,
E         dashboard_address=":" + dashboard_port,
E         rmm_pool_size=(device_pool_size // 256) * 256
E     )
E ------------------
E
E ---------------------------------------------------------------------------
E MemoryError                               Traceback (most recent call last)
E Cell In [6], line 2
E       1 if cluster is None:
E ----> 2     cluster = LocalCUDACluster(
E       3         protocol=protocol,
E       4         n_workers=len(visible_devices.split(",")),
E       5         CUDA_VISIBLE_DEVICES=visible_devices,
E       6         device_memory_limit=device_limit,
E       7         local_directory=dask_workdir,
E       8         dashboard_address=":" + dashboard_port,
E       9         rmm_pool_size=(device_pool_size // 256) * 256
E      10     )
E
E File /usr/local/lib/python3.8/dist-packages/dask_cuda/local_cuda_cluster.py:367, in LocalCUDACluster.__init__(self, CUDA_VISIBLE_DEVICES, n_workers, threads_per_worker, memory_limit, device_memory_limit, data, local_directory, shared_filesystem, protocol, enable_tcp_over_ucx, enable_infiniband, enable_nvlink, enable_rdmacm, rmm_pool_size, rmm_maximum_pool_size, rmm_managed_memory, rmm_async, rmm_log_directory, rmm_track_allocations, jit_unspill, log_spilling, worker_class, pre_import, **kwargs)
E     365 self.cuda_visible_devices = CUDA_VISIBLE_DEVICES
E     366 self.scale(n_workers)
E --> 367 self.sync(self._correct_state)
E
E File ~/.local/lib/python3.8/site-packages/distributed/utils.py:309, in SyncMethodMixin.sync(self, func, asynchronous, callback_timeout, *args, **kwargs)
E     307     return future
E     308 else:
E --> 309     return sync(
E     310         self.loop, func, *args, callback_timeout=callback_timeout, **kwargs
E     311     )
E
E File ~/.local/lib/python3.8/site-packages/distributed/utils.py:376, in sync(loop, func, callback_timeout, *args, **kwargs)
E     374 if error:
E     375     typ, exc, tb = error
E --> 376     raise exc.with_traceback(tb)
E     377 else:
E     378     return result
E
E File ~/.local/lib/python3.8/site-packages/distributed/utils.py:349, in sync.<locals>.f()
E     347     future = asyncio.wait_for(future, callback_timeout)
E     348 future = asyncio.ensure_future(future)
E --> 349     result = yield future
E     350 except Exception:
E     351     error = sys.exc_info()
E
E File ~/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg/tornado/gen.py:762, in Runner.run(self)
E     759     exc_info = None
E     761 try:
E --> 762     value = future.result()
E     763 except Exception:
E     764     exc_info = sys.exc_info()
E
E File ~/.local/lib/python3.8/site-packages/distributed/deploy/spec.py:352, in SpecCluster._correct_state_internal(self)
E     350 for w in workers:
E     351     w._cluster = weakref.ref(self)
E --> 352     await w  # for tornado gen.coroutine support
E     353 self.workers.update(dict(zip(to_open, workers)))
E
E File ~/.local/lib/python3.8/site-packages/distributed/core.py:299, in Server.__await__.<locals>._()
E     293     raise TimeoutError(
E     294         "{} failed to start in {} seconds".format(
E     295             type(self).__name__, timeout
E     296         )
E     297     )
E     298 else:
E --> 299     await self.start()
E     300 self.status = Status.running
E     301 return self
E
E File ~/.local/lib/python3.8/site-packages/distributed/nanny.py:347, in Nanny.start(self)
E     344     await self.plugin_add(plugin=plugin, name=name)
E     346 logger.info("        Start Nanny at: %r", self.address)
E --> 347 response = await self.instantiate()
E     348 if response == Status.running:
E     349     assert self.worker_address
E
E File ~/.local/lib/python3.8/site-packages/distributed/nanny.py:430, in Nanny.instantiate(self)
E     428 else:
E     429     try:
E --> 430         result = await self.process.start()
E     431     except Exception:
E     432         await self.close()
E
E File ~/.local/lib/python3.8/site-packages/distributed/nanny.py:685, in WorkerProcess.start(self)
E     683     return self.status
E     684 try:
E --> 685     msg = await self._wait_until_connected(uid)
E     686 except Exception:
E     687     self.status = Status.failed
E
E File ~/.local/lib/python3.8/site-packages/distributed/nanny.py:803, in WorkerProcess._wait_until_connected(self, uid)
E     799 if "exception" in msg:
E     800     logger.error(
E     801         "Failed while trying to start worker process: %s", msg["exception"]
E     802     )
E --> 803     raise msg["exception"]
E     804 else:
E     805     return msg
E
E File ~/.local/lib/python3.8/site-packages/distributed/nanny.py:869, in run()
E     865 """
E     866 Try to start worker and inform parent of outcome.
E     867 """
E     868 try:
E --> 869     await worker
E     870 except Exception as e:
E     871     logger.exception("Failed to start worker")
E
E File ~/.local/lib/python3.8/site-packages/distributed/core.py:299, in _()
E     293     raise TimeoutError(
E     294         "{} failed to start in {} seconds".format(
E     295             type(self).__name__, timeout
E     296         )
E     297     )
E     298 else:
E --> 299     await self.start()
E     300 self.status = Status.running
E     301 return self
E
E File ~/.local/lib/python3.8/site-packages/distributed/worker.py:1372, in start()
E    1370 for exc in plugins_exceptions:
E    1371     logger.error(repr(exc))
E -> 1372 raise plugins_exceptions[0]
E    1374 self._pending_plugins = ()
E    1376 await self._register_with_scheduler()
E
E File ~/.local/lib/python3.8/site-packages/distributed/worker.py:3248, in plugin_add()
E    3246 if hasattr(plugin, "setup"):
E    3247     try:
E -> 3248         result = plugin.setup(worker=self)
E    3249         if isawaitable(result):
E    3250             result = await result
E
E File /usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py:78, in setup()
E      74 import rmm
E      76 pool_allocator = False if self.initial_pool_size is None else True
E ---> 78 rmm.reinitialize(
E      79     pool_allocator=pool_allocator,
E      80     managed_memory=self.managed_memory,
E      81     initial_pool_size=self.initial_pool_size,
E      82     maximum_pool_size=self.maximum_pool_size,
E      83     logging=self.logging,
E      84     log_file_name=get_rmm_log_file_name(
E      85         worker, self.logging, self.log_directory
E      86     ),
E      87 )
E      88 if self.rmm_track_allocations:
E      89     import rmm
E
E File /usr/local/lib/python3.8/dist-packages/rmm/rmm.py:85, in reinitialize()
E      32 def reinitialize(
E      33     pool_allocator=False,
E      34     managed_memory=False,
E     (...)
E      39     log_file_name=None,
E      40 ):
E      41     """
E      42     Finalizes and then initializes RMM using the options passed. Using memory
E      43     from a previous initialization of RMM is undefined behavior and should be
E     (...)
E      83     corresponding to each device.
E      84     """
E ---> 85     rmm.mr._initialize(
E      86         pool_allocator=pool_allocator,
E      87         managed_memory=managed_memory,
E      88         initial_pool_size=initial_pool_size,
E      89         maximum_pool_size=maximum_pool_size,
E      90         devices=devices,
E      91         logging=logging,
E      92         log_file_name=log_file_name,
E      93     )
E
E File memory_resource.pyx:823, in rmm._lib.memory_resource._initialize()
E
E File memory_resource.pyx:883, in rmm._lib.memory_resource._initialize()
E
E File memory_resource.pyx:342, in rmm._lib.memory_resource.PoolMemoryResource.__cinit__()
E
E MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
E MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded

../../../.local/lib/python3.8/site-packages/nbclient/client.py:862: CellExecutionError
----------------------------- Captured stderr call -----------------------------
2022-11-01 17:45:21,097 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-11-01 17:45:21,110 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-11-01 17:45:21,120 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-11-01 17:45:21,130 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-11-01 17:45:21,618 - distributed.utils - ERROR - std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/utils.py", line 693, in log_errors
yield
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 3248, in plugin_add
result = plugin.setup(worker=self)
File "/usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py", line 78, in setup
rmm.reinitialize(
File "/usr/local/lib/python3.8/dist-packages/rmm/rmm.py", line 85, in reinitialize
rmm.mr._initialize(
File "memory_resource.pyx", line 823, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 883, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 342, in rmm._lib.memory_resource.PoolMemoryResource.cinit
MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
2022-11-01 17:45:21,619 - distributed.nanny - ERROR - Failed to start worker
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/nanny.py", line 869, in run
await worker
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/core.py", line 299, in _
await self.start()
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 1372, in start
raise plugins_exceptions[0]
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 3248, in plugin_add
result = plugin.setup(worker=self)
File "/usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py", line 78, in setup
rmm.reinitialize(
File "/usr/local/lib/python3.8/dist-packages/rmm/rmm.py", line 85, in reinitialize
rmm.mr._initialize(
File "memory_resource.pyx", line 823, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 883, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 342, in rmm._lib.memory_resource.PoolMemoryResource.cinit
MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
2022-11-01 17:45:24,090 - distributed.diskutils - INFO - Found stale lock file and directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/test_dask/workdir/dask-worker-space/worker-owet7gry', purging
2022-11-01 17:45:24,091 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-11-01 17:45:24,352 - distributed.utils - ERROR - std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/utils.py", line 693, in log_errors
yield
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 3248, in plugin_add
result = plugin.setup(worker=self)
File "/usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py", line 78, in setup
rmm.reinitialize(
File "/usr/local/lib/python3.8/dist-packages/rmm/rmm.py", line 85, in reinitialize
rmm.mr._initialize(
File "memory_resource.pyx", line 823, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 883, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 342, in rmm._lib.memory_resource.PoolMemoryResource.cinit
MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
2022-11-01 17:45:24,353 - distributed.nanny - ERROR - Failed to start worker
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/nanny.py", line 869, in run
await worker
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/core.py", line 299, in _
await self.start()
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 1372, in start
raise plugins_exceptions[0]
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 3248, in plugin_add
result = plugin.setup(worker=self)
File "/usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py", line 78, in setup
rmm.reinitialize(
File "/usr/local/lib/python3.8/dist-packages/rmm/rmm.py", line 85, in reinitialize
rmm.mr._initialize(
File "memory_resource.pyx", line 823, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 883, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 342, in rmm._lib.memory_resource.PoolMemoryResource.cinit
MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
2022-11-01 17:45:26,946 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-11-01 17:45:26,946 - distributed.diskutils - INFO - Found stale lock file and directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/test_dask/workdir/dask-worker-space/worker-3ym4jncs', purging
2022-11-01 17:45:26,947 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-11-01 17:45:26,948 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-11-01 17:45:26,983 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-11-01 17:45:27,554 - distributed.utils - ERROR - std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/utils.py", line 693, in log_errors
yield
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 3248, in plugin_add
result = plugin.setup(worker=self)
File "/usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py", line 78, in setup
rmm.reinitialize(
File "/usr/local/lib/python3.8/dist-packages/rmm/rmm.py", line 85, in reinitialize
rmm.mr._initialize(
File "memory_resource.pyx", line 823, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 883, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 342, in rmm._lib.memory_resource.PoolMemoryResource.cinit
MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
2022-11-01 17:45:27,555 - distributed.nanny - ERROR - Failed to start worker
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/nanny.py", line 869, in run
await worker
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/core.py", line 299, in _
await self.start()
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 1372, in start
raise plugins_exceptions[0]
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 3248, in plugin_add
result = plugin.setup(worker=self)
File "/usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py", line 78, in setup
rmm.reinitialize(
File "/usr/local/lib/python3.8/dist-packages/rmm/rmm.py", line 85, in reinitialize
rmm.mr._initialize(
File "memory_resource.pyx", line 823, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 883, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 342, in rmm._lib.memory_resource.PoolMemoryResource.cinit
MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
2022-11-01 17:45:27,570 - distributed.utils - ERROR - std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/utils.py", line 693, in log_errors
yield
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 3248, in plugin_add
result = plugin.setup(worker=self)
File "/usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py", line 78, in setup
rmm.reinitialize(
File "/usr/local/lib/python3.8/dist-packages/rmm/rmm.py", line 85, in reinitialize
rmm.mr._initialize(
File "memory_resource.pyx", line 823, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 883, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 342, in rmm._lib.memory_resource.PoolMemoryResource.cinit
MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
2022-11-01 17:45:27,571 - distributed.nanny - ERROR - Failed to start worker
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/nanny.py", line 869, in run
await worker
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/core.py", line 299, in _
await self.start()
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 1372, in start
raise plugins_exceptions[0]
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 3248, in plugin_add
result = plugin.setup(worker=self)
File "/usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py", line 78, in setup
rmm.reinitialize(
File "/usr/local/lib/python3.8/dist-packages/rmm/rmm.py", line 85, in reinitialize
rmm.mr._initialize(
File "memory_resource.pyx", line 823, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 883, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 342, in rmm._lib.memory_resource.PoolMemoryResource.cinit
MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
2022-11-01 17:45:27,572 - distributed.utils - ERROR - std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/utils.py", line 693, in log_errors
yield
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 3248, in plugin_add
result = plugin.setup(worker=self)
File "/usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py", line 78, in setup
rmm.reinitialize(
File "/usr/local/lib/python3.8/dist-packages/rmm/rmm.py", line 85, in reinitialize
rmm.mr._initialize(
File "memory_resource.pyx", line 823, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 883, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 342, in rmm._lib.memory_resource.PoolMemoryResource.cinit
MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
2022-11-01 17:45:27,573 - distributed.nanny - ERROR - Failed to start worker
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/nanny.py", line 869, in run
await worker
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/core.py", line 299, in _
await self.start()
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 1372, in start
raise plugins_exceptions[0]
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 3248, in plugin_add
result = plugin.setup(worker=self)
File "/usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py", line 78, in setup
rmm.reinitialize(
File "/usr/local/lib/python3.8/dist-packages/rmm/rmm.py", line 85, in reinitialize
rmm.mr._initialize(
File "memory_resource.pyx", line 823, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 883, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 342, in rmm._lib.memory_resource.PoolMemoryResource.cinit
MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
2022-11-01 17:45:27,582 - distributed.utils - ERROR - std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/utils.py", line 693, in log_errors
yield
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 3248, in plugin_add
result = plugin.setup(worker=self)
File "/usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py", line 78, in setup
rmm.reinitialize(
File "/usr/local/lib/python3.8/dist-packages/rmm/rmm.py", line 85, in reinitialize
rmm.mr._initialize(
File "memory_resource.pyx", line 823, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 883, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 342, in rmm._lib.memory_resource.PoolMemoryResource.cinit
MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
2022-11-01 17:45:27,583 - distributed.nanny - ERROR - Failed to start worker
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/nanny.py", line 869, in run
await worker
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/core.py", line 299, in _
await self.start()
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 1372, in start
raise plugins_exceptions[0]
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 3248, in plugin_add
result = plugin.setup(worker=self)
File "/usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py", line 78, in setup
rmm.reinitialize(
File "/usr/local/lib/python3.8/dist-packages/rmm/rmm.py", line 85, in reinitialize
rmm.mr._initialize(
File "memory_resource.pyx", line 823, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 883, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 342, in rmm._lib.memory_resource.PoolMemoryResource.cinit
MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
2022-11-01 17:45:29,902 - distributed.diskutils - INFO - Found stale lock file and directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/test_dask/workdir/dask-worker-space/worker-xl_5j3ko', purging
2022-11-01 17:45:29,903 - distributed.diskutils - INFO - Found stale lock file and directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/test_dask/workdir/dask-worker-space/worker-z34pqbwj', purging
2022-11-01 17:45:29,903 - distributed.diskutils - INFO - Found stale lock file and directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/test_dask/workdir/dask-worker-space/worker-1sbum2th', purging
2022-11-01 17:45:29,904 - distributed.diskutils - INFO - Found stale lock file and directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/test_dask/workdir/dask-worker-space/worker-6stfchfp', purging
2022-11-01 17:45:29,904 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-11-01 17:45:30,200 - distributed.utils - ERROR - std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/utils.py", line 693, in log_errors
yield
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 3248, in plugin_add
result = plugin.setup(worker=self)
File "/usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py", line 78, in setup
rmm.reinitialize(
File "/usr/local/lib/python3.8/dist-packages/rmm/rmm.py", line 85, in reinitialize
rmm.mr._initialize(
File "memory_resource.pyx", line 823, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 883, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 342, in rmm._lib.memory_resource.PoolMemoryResource.cinit
MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
2022-11-01 17:45:30,201 - distributed.nanny - ERROR - Failed to start worker
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/nanny.py", line 869, in run
await worker
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/core.py", line 299, in _
await self.start()
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 1372, in start
raise plugins_exceptions[0]
File "/var/jenkins_home/.local/lib/python3.8/site-packages/distributed/worker.py", line 3248, in plugin_add
result = plugin.setup(worker=self)
File "/usr/local/lib/python3.8/dist-packages/dask_cuda/utils.py", line 78, in setup
rmm.reinitialize(
File "/usr/local/lib/python3.8/dist-packages/rmm/rmm.py", line 85, in reinitialize
rmm.mr._initialize(
File "memory_resource.pyx", line 823, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 883, in rmm._lib.memory_resource._initialize
File "memory_resource.pyx", line 342, in rmm._lib.memory_resource.PoolMemoryResource.cinit
MemoryError: std::bad_alloc: out_of_memory: RMM failure at:/opt/rapids/rmm/include/rmm/mr/device/pool_memory_resource.hpp:192: Maximum pool size exceeded
/usr/lib/python3.8/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 48 leaked semaphore objects to clean up at shutdown
warnings.warn('resource_tracker: There appear to be %d '
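
The failure above comes from the notebook's cluster-startup cell: each dask-cuda worker calls rmm.reinitialize with the requested pool, and the GPU on this CI node did not have enough free memory for rmm_pool_size=(device_pool_size // 256) * 256, hence the repeated "Maximum pool size exceeded" errors. A hedged sketch of the same pattern with a deliberately small explicit pool (the "1GB" and "8GB" values are illustrative, not taken from the notebook):

    # Sketch only: bring up a LocalCUDACluster with a modest RMM pool so the
    # workers can start on a GPU that already has memory in use.
    from dask_cuda import LocalCUDACluster
    from distributed import Client

    cluster = LocalCUDACluster(
        protocol="tcp",
        n_workers=1,                # one worker per visible GPU
        device_memory_limit="8GB",  # spill threshold, illustrative value
        rmm_pool_size="1GB",        # far below the failing request
    )
    client = Client(cluster)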
=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33
/usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
DASK_VERSION = LooseVersion(dask.__version__)

.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: 34 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
other = LooseVersion(other)

tests/unit/test_dask_nvt.py: 6 warnings
tests/unit/workflow/test_workflow.py: 78 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/dask/base.py:1282: UserWarning: Running on a single-machine scheduler when a distributed client is active might lead to unexpected results.
warnings.warn(

tests/unit/test_dask_nvt.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 8 files.
warnings.warn(

tests/unit/test_dask_nvt.py::test_merlin_core_execution_managers
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/core/utils.py:431: UserWarning: Existing Dask-client object detected in the current context. New cuda cluster will not be deployed. Set force_new to True to ignore running clusters.
warnings.warn(

tests/unit/loader/test_tf_dataloader.py::test_horovod_multigpu
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 5 files.
warnings.warn(

tests/unit/loader/test_tf_dataloader.py: 2 warnings
tests/unit/loader/test_torch_dataloader.py: 12 warnings
tests/unit/workflow/test_workflow.py: 9 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 2 files.
warnings.warn(

tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops_schema.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py:148: UserWarning: Compound tags like Tags.USER_ID have been deprecated and will be removed in a future version. Please use the atomic versions of these tags, like [<Tags.USER: 'user'>, <Tags.ID: 'id'>].
warnings.warn(

tests/unit/ops/test_ops_schema.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py:148: UserWarning: Compound tags like Tags.ITEM_ID have been deprecated and will be removed in a future version. Please use the atomic versions of these tags, like [<Tags.ITEM: 'item'>, <Tags.ID: 'id'>].
warnings.warn(
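
The compound-tag deprecation warnings above already name their replacement; a one-line sketch of tagging a column with the atomic tags instead (the column name is illustrative):

    # Sketch: use the atomic Tags.USER + Tags.ID pair instead of the
    # deprecated compound Tags.USER_ID when building a schema.
    from merlin.schema import ColumnSchema, Tags

    user_id_col = ColumnSchema("user_id", tags=[Tags.USER, Tags.ID])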

tests/unit/workflow/test_cpu_workflow.py: 6 warnings
tests/unit/workflow/test_workflow.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 10 files.
warnings.warn(

tests/unit/workflow/test_workflow.py: 48 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 20 files.
warnings.warn(

tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-None]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 4 files.
warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name                                Stmts   Miss  Cover
--------------------------------------------------------
merlin/transforms/__init__.py           1      1     0%
merlin/transforms/ops/__init__.py       1      1     0%
--------------------------------------------------------
TOTAL                                   2      2     0%

=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:14: could not import 'moto': No module named 'moto'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:609: Skipping test because of horovod missing MPI
===== 1 failed, 1434 passed, 2 skipped, 258 warnings in 1214.96s (0:20:14) =====
/usr/local/lib/python3.8/dist-packages/coverage/control.py:801: CoverageWarning: No data was collected. (no-data-collected)
self._warn("No data was collected.", slug="no-data-collected")
ERROR: InvocationError for command /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python -m pytest --cov-report term --cov merlin -rxs tests/unit (exited with code 1)
___________________________________ summary ____________________________________
ERROR: test-gpu: commands failed
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins18012770432816404158.sh

@benfred
Member Author

benfred commented Nov 1, 2022

rerun tests

@nvidia-merlin-bot
Contributor

Click to view CI Results
GitHub pull request #1694 of commit e0cfe203dade09a0b4e6280c0193e5f5197c03b2, no merge conflicts.
Running as SYSTEM
Setting status of e0cfe203dade09a0b4e6280c0193e5f5197c03b2 to PENDING with url http://10.20.17.181:8080/job/nvtabular_tests/4779/ and message: 'Build started for merge commit.'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1694/*:refs/remotes/origin/pr/1694/* # timeout=10
 > git rev-parse e0cfe203dade09a0b4e6280c0193e5f5197c03b2^{commit} # timeout=10
Checking out Revision e0cfe203dade09a0b4e6280c0193e5f5197c03b2 (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e0cfe203dade09a0b4e6280c0193e5f5197c03b2 # timeout=10
Commit message: "use latest dataloader in tox.ini"
 > git rev-list --no-walk e0cfe203dade09a0b4e6280c0193e5f5197c03b2 # timeout=10
[nvtabular_tests] $ /bin/bash /tmp/jenkins527007882008695415.sh
GLOB sdist-make: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/setup.py
test-gpu create: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
test-gpu installdeps: pytest, pytest-cov
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu inst: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/.tmp/package/1/nvtabular-1.6.0+8.ge0cfe203d.zip
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu installed: absl-py==1.2.0,aiohttp==3.8.1,aiosignal==1.2.0,alabaster==0.7.12,alembic==1.8.1,anyio==3.6.1,argon2-cffi==21.3.0,argon2-cffi-bindings==21.2.0,astroid==2.5.6,asttokens==2.0.8,astunparse==1.6.3,asv==0.5.1,asvdb==0.4.2,async-timeout==4.0.2,attrs==22.1.0,autopage==0.5.1,awscli==1.26.5,Babel==2.10.3,backcall==0.2.0,beautifulsoup4==4.11.1,betterproto==1.2.5,black==22.6.0,bleach==5.0.1,boto3==1.24.75,botocore==1.28.5,Brotli==1.0.9,cachetools==5.2.0,certifi==2019.11.28,cffi==1.15.1,chardet==3.0.4,charset-normalizer==2.1.1,clang==5.0,click==8.1.3,cliff==4.0.0,cloudpickle==2.2.0,cmaes==0.8.2,cmake==3.24.1.1,cmd2==2.4.2,colorama==0.4.4,colorlog==6.7.0,contourpy==1.0.5,coverage==6.5.0,cuda-python==11.7.1,cupy-cuda117==10.6.0,cycler==0.11.0,Cython==0.29.32,dask==2022.1.1,dbus-python==1.2.16,debugpy==1.6.3,decorator==5.1.1,defusedxml==0.7.1,dill==0.3.5.1,distlib==0.3.6,distributed==2022.5.1,distro==1.7.0,dm-tree==0.1.6,docker-pycreds==0.4.0,docutils==0.16,emoji==1.7.0,entrypoints==0.4,execnet==1.9.0,executing==1.0.0,faiss==1.7.2,faiss-gpu==1.7.2,fastai==2.7.9,fastapi==0.85.0,fastavro==1.6.1,fastcore==1.5.27,fastdownload==0.0.7,fastjsonschema==2.16.1,fastprogress==1.0.3,fastrlock==0.8,feast==0.19.4,fiddle==0.2.2,filelock==3.8.0,flatbuffers==1.12,fonttools==4.37.3,frozenlist==1.3.1,fsspec==2022.5.0,gast==0.4.0,gevent==21.12.0,geventhttpclient==2.0.2,gitdb==4.0.9,GitPython==3.1.27,google==3.0.0,google-api-core==2.10.1,google-auth==2.11.1,google-auth-oauthlib==0.4.6,google-pasta==0.2.0,googleapis-common-protos==1.52.0,graphviz==0.20.1,greenlet==1.1.3,grpcio==1.41.0,grpcio-channelz==1.49.0,grpcio-reflection==1.48.1,grpclib==0.4.3,h11==0.13.0,h2==4.1.0,h5py==3.7.0,HeapDict==1.0.1,horovod==0.26.1,hpack==4.0.0,httptools==0.5.0,hugectr2onnx==0.0.0,huggingface-hub==0.9.1,hyperframe==6.0.1,idna==2.8,imagesize==1.4.1,implicit==0.6.1,importlib-metadata==4.12.0,importlib-resources==5.9.0,iniconfig==1.1.1,ipykernel==6.15.3,ipython==8.5.0,ipython-genutils==0.2.0,ipywidgets==7.7.0,jedi==0.18.1,Jinja2==3.1.2,jmespath==1.0.1,joblib==1.2.0,json5==0.9.10,jsonschema==4.16.0,jupyter-cache==0.4.3,jupyter-core==4.11.1,jupyter-server==1.18.1,jupyter-server-mathjax==0.2.5,jupyter-sphinx==0.3.2,jupyter_client==7.3.5,jupyterlab==3.4.7,jupyterlab-pygments==0.2.2,jupyterlab-widgets==1.1.0,jupyterlab_server==2.15.1,keras==2.9.0,Keras-Preprocessing==1.1.2,kiwisolver==1.4.4,lazy-object-proxy==1.8.0,libclang==14.0.6,libcst==0.4.7,lightfm==1.16,lightgbm==3.3.2,linkify-it-py==1.0.3,llvmlite==0.39.1,locket==1.0.0,lxml==4.9.1,Mako==1.2.3,Markdown==3.4.1,markdown-it-py==1.1.0,MarkupSafe==2.1.1,matplotlib==3.6.0,matplotlib-inline==0.1.6,mdit-py-plugins==0.2.8,merlin-core==0.6.0+1.g5926fcf,merlin-dataloader==0.0.2,merlin-models==0.7.0+11.g280956aa4,merlin-systems==0.5.0+4.g15074ad,mistune==2.0.4,mmh3==3.0.0,mpi4py==3.1.3,msgpack==1.0.4,multidict==6.0.2,mypy-extensions==0.4.3,myst-nb==0.13.2,myst-parser==0.15.2,natsort==8.1.0,nbclassic==0.4.3,nbclient==0.6.8,nbconvert==7.0.0,nbdime==3.1.1,nbformat==5.5.0,nest-asyncio==1.5.5,ninja==1.10.2.3,notebook==6.4.12,notebook-shim==0.1.0,numba==0.56.2,numpy==1.22.4,nvidia-pyindex==1.0.9,-e 
git+https://github.com/NVIDIA-Merlin/NVTabular.git@e0cfe203dade09a0b4e6280c0193e5f5197c03b2#egg=nvtabular,nvtx==0.2.5,oauthlib==3.2.1,oldest-supported-numpy==2022.8.16,onnx==1.12.0,onnxruntime==1.11.1,opt-einsum==3.3.0,optuna==3.0.3,packaging==21.3,pandas==1.3.5,pandavro==1.5.2,pandocfilters==1.5.0,parso==0.8.3,partd==1.3.0,pathtools==0.1.2,pbr==5.11.0,pexpect==4.8.0,pickleshare==0.7.5,Pillow==9.2.0,pkgutil_resolve_name==1.3.10,platformdirs==2.5.2,plotly==5.11.0,pluggy==1.0.0,prettytable==3.5.0,prometheus-client==0.14.1,promise==2.3,prompt-toolkit==3.0.31,proto-plus==1.19.6,protobuf==3.19.5,psutil==5.9.2,ptyprocess==0.7.0,pure-eval==0.2.2,py==1.11.0,pyarrow==7.0.0,pyasn1==0.4.8,pyasn1-modules==0.2.8,pybind11==2.10.0,pycparser==2.21,pydantic==1.10.2,pydot==1.4.2,Pygments==2.13.0,PyGObject==3.36.0,pynvml==11.4.1,pyparsing==3.0.9,pyperclip==1.8.2,pyrsistent==0.18.1,pytest==7.1.3,pytest-cov==4.0.0,pytest-xdist==3.0.2,python-apt==2.0.0+ubuntu0.20.4.8,python-dateutil==2.8.2,python-dotenv==0.21.0,python-rapidjson==1.8,pytz==2022.2.1,PyYAML==5.4.1,pyzmq==24.0.0,regex==2022.9.13,requests==2.22.0,requests-oauthlib==1.3.1,requests-unixsocket==0.2.0,rsa==4.7.2,s3fs==2022.2.0,s3transfer==0.6.0,sacremoses==0.0.53,scikit-build==0.15.0,scikit-learn==1.1.2,scipy==1.8.1,seedir==0.3.0,Send2Trash==1.8.0,sentry-sdk==1.9.8,setproctitle==1.3.2,setuptools-scm==7.0.5,shortuuid==1.0.9,six==1.15.0,sklearn==0.0,smmap==5.0.0,sniffio==1.3.0,snowballstemmer==2.2.0,sortedcontainers==2.4.0,soupsieve==2.3.2.post1,Sphinx==5.3.0,sphinx-multiversion==0.2.4,sphinx-togglebutton==0.3.1,sphinx_external_toc==0.3.0,sphinxcontrib-applehelp==1.0.2,sphinxcontrib-copydirs @ git+https://github.com/mikemckiernan/sphinxcontrib-copydirs.git@bd8c5d79b3f91cf5f1bb0d6995aeca3fe84b670e,sphinxcontrib-devhelp==1.0.2,sphinxcontrib-htmlhelp==2.0.0,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-qthelp==1.0.3,sphinxcontrib-serializinghtml==1.1.5,SQLAlchemy==1.4.42,stack-data==0.5.0,starlette==0.20.4,stevedore==4.1.0,stringcase==1.2.0,supervisor==4.1.0,tabulate==0.8.10,tblib==1.7.0,tdqm==0.0.1,tenacity==8.0.1,tensorboard==2.9.1,tensorboard-data-server==0.6.1,tensorboard-plugin-wit==1.8.1,tensorflow==2.9.2,tensorflow-estimator==2.9.0,tensorflow-gpu==2.9.2,tensorflow-io-gcs-filesystem==0.27.0,tensorflow-metadata==1.10.0,termcolor==2.0.1,terminado==0.15.0,testbook==0.4.2,threadpoolctl==3.1.0,tinycss2==1.1.1,tokenizers==0.10.3,toml==0.10.2,tomli==2.0.1,toolz==0.12.0,torch==1.12.1+cu113,torchmetrics==0.3.2,tornado==6.2,tox==3.26.0,tqdm==4.64.1,traitlets==5.4.0,transformers==4.12.0,transformers4rec==0.1.12+2.gbcc939255,treelite==2.3.0,treelite-runtime==2.3.0,tritonclient==2.25.0,typing-inspect==0.8.0,typing_extensions==4.3.0,uc-micro-py==1.0.1,urllib3==1.26.12,uvicorn==0.18.3,uvloop==0.17.0,versioneer==0.20,virtualenv==20.16.5,wandb==0.13.3,watchfiles==0.17.0,wcwidth==0.2.5,webencodings==0.5.1,websocket-client==1.4.1,websockets==10.3,Werkzeug==2.2.2,widgetsnbextension==3.6.0,wrapt==1.12.1,xgboost==1.6.2,yarl==1.8.1,zict==2.2.0,zipp==3.8.1,zope.event==4.5.0,zope.interface==5.4.0
test-gpu run-test-pre: PYTHONHASHSEED='1222710459'
test-gpu run-test: commands[0] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/core.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/core.git
  Cloning https://github.com/NVIDIA-Merlin/core.git to /tmp/pip-req-build-gmeswhis
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/core.git /tmp/pip-req-build-gmeswhis
  Resolved https://github.com/NVIDIA-Merlin/core.git to commit eda153c663aa864da66927c7a0a9d4e64c073120
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (1.2.5)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (1.3.5)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (21.3)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (1.10.0)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (4.64.1)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (7.0.0)
Requirement already satisfied: dask>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.3.0)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (3.19.5)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (0.55.1)
Requirement already satisfied: fsspec==2022.5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.5.0)
Requirement already satisfied: distributed>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.3.0)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (0.4.3)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (0.12.0)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.2.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (5.4.1)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.7.0)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (5.8.0)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (6.1)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.0.0)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (3.1.2)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.4.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.0.4)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (8.1.3)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (65.4.1)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (0.38.1)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (1.20.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core==0.8.0+4.geda153c) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (2022.2.1)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.52.0)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.0.1)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (4.1.0)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (6.0.2)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (6.0.1)
Building wheels for collected packages: merlin-core
  Building wheel for merlin-core (pyproject.toml): started
  Building wheel for merlin-core (pyproject.toml): finished with status 'done'
  Created wheel for merlin-core: filename=merlin_core-0.8.0+4.geda153c-py3-none-any.whl size=118258 sha256=546f3a6657442304b59c952b08e6da182f135631a988a9a524005556bf853651
  Stored in directory: /tmp/pip-ephem-wheel-cache-irsln_34/wheels/c8/38/16/a6968787eafcec5fa772148af8408b089562f71af0752e8e84
Successfully built merlin-core
Installing collected packages: merlin-core
  Attempting uninstall: merlin-core
    Found existing installation: merlin-core 0.3.0+12.g78ecddd
    Not uninstalling merlin-core at /var/jenkins_home/.local/lib/python3.8/site-packages, outside environment /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
    Can't uninstall 'merlin-core'. No files were found to uninstall.
Successfully installed merlin-core-0.8.0+4.geda153c

[notice] A new release of pip available: 22.2.2 -> 22.3
[notice] To update, run: pip install --upgrade pip
test-gpu run-test: commands[1] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/dataloader.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/dataloader.git
Cloning https://github.com/NVIDIA-Merlin/dataloader.git to /tmp/pip-req-build-3dvt36c8
Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/dataloader.git /tmp/pip-req-build-3dvt36c8
Resolved https://github.com/NVIDIA-Merlin/dataloader.git to commit 5905283777ff5ebd748a1c91b7c9fde5710ae775
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Preparing metadata (pyproject.toml): started
Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: merlin-core in ./.tox/test-gpu/lib/python3.8/site-packages (from merlin-dataloader==0.0.2+1.g5905283) (0.8.0+4.geda153c)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.2.5)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.3.5)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (21.3)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.10.0)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (4.64.1)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (7.0.0)
Requirement already satisfied: dask>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2022.3.0)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (3.19.5)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.55.1)
Requirement already satisfied: fsspec==2022.5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2022.5.0)
Requirement already satisfied: distributed>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2022.3.0)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.2.0)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.4.3)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.12.0)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.2.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.2.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (5.4.1)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.7.0)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (5.8.0)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (6.1)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.0.0)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (3.1.2)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.4.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.0.4)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (8.1.3)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (65.4.1)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.38.1)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.20.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2022.2.1)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.52.0)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.2.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.0.1)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (4.1.0)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (6.0.2)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (6.0.1)
Building wheels for collected packages: merlin-dataloader
Building wheel for merlin-dataloader (pyproject.toml): started
Building wheel for merlin-dataloader (pyproject.toml): finished with status 'done'
Created wheel for merlin-dataloader: filename=merlin_dataloader-0.0.2+1.g5905283-py3-none-any.whl size=31625 sha256=7972c4e8f7e2743b9681f26cd0f5e65c3e822c4fff49a68a85fea327ab27157a
Stored in directory: /tmp/pip-ephem-wheel-cache-jhic9ubu/wheels/de/f5/d9/251909f4627d2920fb15548f5ffd6daf1bf24c3c56bb4977b1
Successfully built merlin-dataloader
Installing collected packages: merlin-dataloader
Attempting uninstall: merlin-dataloader
Found existing installation: merlin-dataloader 0.0.2
Uninstalling merlin-dataloader-0.0.2:
Successfully uninstalled merlin-dataloader-0.0.2
Successfully installed merlin-dataloader-0.0.2+1.g5905283

[notice] A new release of pip available: 22.2.2 -> 22.3
[notice] To update, run: pip install --upgrade pip
test-gpu run-test: commands[2] | python -m pytest --cov-report term --cov merlin -rxs tests/unit
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.3, pluggy-1.0.0
cachedir: .tox/test-gpu/.pytest_cache
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: anyio-3.5.0, xdist-3.0.2, cov-4.0.0
collected 1436 items / 1 skipped

tests/unit/test_dask_nvt.py ............................................ [ 3%]
........................................................................ [ 8%]
.... [ 8%]
tests/unit/test_notebooks.py .... [ 8%]
tests/unit/test_tf4rec.py . [ 8%]
tests/unit/test_tools.py ...................... [ 10%]
tests/unit/test_triton_inference.py ................................ [ 12%]
tests/unit/examples/test_01-Getting-started.py . [ 12%]
tests/unit/examples/test_02-Advanced-NVTabular-workflow.py . [ 12%]
tests/unit/examples/test_03-Running-on-multiple-GPUs-or-on-CPU.py . [ 12%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 12%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 14%]
..................................................F [ 18%]
tests/unit/framework_utils/test_torch_layers.py . [ 18%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 20%]
........................................s.. [ 23%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 25%]
..................................................... [ 29%]
tests/unit/ops/test_categorify.py ...................................... [ 31%]
........................................................................ [ 36%]
..................................................... [ 40%]
tests/unit/ops/test_column_similarity.py ........................ [ 42%]
tests/unit/ops/test_drop_low_cardinality.py .. [ 42%]
tests/unit/ops/test_fill.py ............................................ [ 45%]
........ [ 45%]
tests/unit/ops/test_groupyby.py ....................... [ 47%]
tests/unit/ops/test_hash_bucket.py ......................... [ 49%]
tests/unit/ops/test_join.py ............................................ [ 52%]
........................................................................ [ 57%]
.................................. [ 59%]
tests/unit/ops/test_lambda.py .......... [ 60%]
tests/unit/ops/test_normalize.py ....................................... [ 63%]
.. [ 63%]
tests/unit/ops/test_ops.py ............................................. [ 66%]
.................... [ 67%]
tests/unit/ops/test_ops_schema.py ...................................... [ 70%]
........................................................................ [ 75%]
........................................................................ [ 80%]
........................................................................ [ 85%]
....................................... [ 88%]
tests/unit/ops/test_reduce_dtype_size.py .. [ 88%]
tests/unit/ops/test_target_encode.py ..................... [ 89%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 90%]
tests/unit/workflow/test_workflow.py ................................... [ 92%]
.......................................................... [ 96%]
tests/unit/workflow/test_workflow_chaining.py ... [ 96%]
tests/unit/workflow/test_workflow_node.py ........... [ 97%]
tests/unit/workflow/test_workflow_ops.py ... [ 97%]
tests/unit/workflow/test_workflow_schemas.py ........................... [ 99%]
... [100%]

=================================== FAILURES ===================================
___________________________ test_multihot_empty_rows ___________________________

def test_multihot_empty_rows():
    multi_hot = tf.feature_column.categorical_column_with_identity("multihot", 5)
    multi_hot_embedding = tf.feature_column.embedding_column(multi_hot, 8, combiner="sum")

    embedding_layer = layers.DenseFeatures([multi_hot_embedding])
    inputs = {
        "multihot": (
            tf.keras.Input(name="multihot__values", shape=(1,), dtype=tf.int64),
            tf.keras.Input(name="multihot__nnzs", shape=(1,), dtype=tf.int64),
        )
    }
    output = embedding_layer(inputs)

    model = tf.keras.Model(inputs=inputs, outputs=output)
    model.compile("sgd", "binary_crossentropy")

    multi_hot_values = np.array([0, 2, 1, 4, 1, 3, 1])
    multi_hot_nnzs = np.array([1, 0, 2, 4, 0])
    x = {"multihot": (multi_hot_values[:, None], multi_hot_nnzs[:, None])}

    multi_hot_embedding_table = embedding_layer.embedding_tables["multihot"].numpy()
    multi_hot_embedding_rows = _compute_expected_multi_hot(
        multi_hot_embedding_table, multi_hot_values, multi_hot_nnzs, "sum"
    )

    y_hat = model(x).numpy()
>   np.testing.assert_allclose(y_hat, multi_hot_embedding_rows, rtol=1e-06)

E AssertionError:
E Not equal to tolerance rtol=1e-06, atol=0
E
E Mismatched elements: 1 / 40 (2.5%)
E Max absolute difference: 2.3841858e-07
E Max relative difference: 1.8716112e-06
E x: array([[-0.297748, -0.141842, -0.083815, -0.52647 , -0.371137, 0.076028,
E 0.19368 , -0.479709],
E [ 0. , 0. , 0. , 0. , 0. , 0. ,...
E y: array([[-0.297748, -0.141842, -0.083815, -0.52647 , -0.371137, 0.076028,
E 0.19368 , -0.479709],
E [ 0. , 0. , 0. , 0. , 0. , 0. ,...

tests/unit/framework_utils/test_tf_layers.py:321: AssertionError
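The failure above is a tolerance miss rather than an obvious logic error: 1 of the 40 output elements differs by 2.38e-07 absolute (about 1.87e-06 relative), which just exceeds the test's rtol=1e-06 (assert_allclose uses atol=0 by default). A minimal sketch of the same comparison, using hypothetical float32 values of the observed magnitude rather than the actual embedding outputs:

import numpy as np

# Hypothetical values with a ~2e-07 absolute / ~1.6e-06 relative gap in the
# first element, mimicking the single mismatch reported in the log above.
y_hat = np.array([0.1273810, -0.4797090], dtype=np.float32)
expected = np.array([0.1273812, -0.4797090], dtype=np.float32)

# With the test's settings (rtol=1e-06, default atol=0) the check fails,
# because |y_hat - expected| > rtol * |expected| for that element.
try:
    np.testing.assert_allclose(y_hat, expected, rtol=1e-06)
except AssertionError:
    print("rtol=1e-06: mismatch reported")

# A slightly looser relative tolerance absorbs float32 rounding error and
# accepts the same values.
np.testing.assert_allclose(y_hat, expected, rtol=1e-05)
print("rtol=1e-05: values considered equal")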
=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33
/usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
DASK_VERSION = LooseVersion(dask.__version__)

.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: 34 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
other = LooseVersion(other)

tests/unit/test_dask_nvt.py: 6 warnings
tests/unit/workflow/test_workflow.py: 78 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/dask/base.py:1282: UserWarning: Running on a single-machine scheduler when a distributed client is active might lead to unexpected results.
warnings.warn(

tests/unit/test_dask_nvt.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 8 files.
warnings.warn(

tests/unit/test_dask_nvt.py::test_merlin_core_execution_managers
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/core/utils.py:431: UserWarning: Existing Dask-client object detected in the current context. New cuda cluster will not be deployed. Set force_new to True to ignore running clusters.
warnings.warn(

tests/unit/loader/test_tf_dataloader.py::test_horovod_multigpu
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 5 files.
warnings.warn(

tests/unit/loader/test_tf_dataloader.py: 2 warnings
tests/unit/loader/test_torch_dataloader.py: 12 warnings
tests/unit/workflow/test_workflow.py: 9 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 2 files.
warnings.warn(

tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops_schema.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py:148: UserWarning: Compound tags like Tags.USER_ID have been deprecated and will be removed in a future version. Please use the atomic versions of these tags, like [<Tags.USER: 'user'>, <Tags.ID: 'id'>].
warnings.warn(

tests/unit/ops/test_ops_schema.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py:148: UserWarning: Compound tags like Tags.ITEM_ID have been deprecated and will be removed in a future version. Please use the atomic versions of these tags, like [<Tags.ITEM: 'item'>, <Tags.ID: 'id'>].
warnings.warn(

tests/unit/workflow/test_cpu_workflow.py: 6 warnings
tests/unit/workflow/test_workflow.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 10 files.
warnings.warn(

tests/unit/workflow/test_workflow.py: 48 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 20 files.
warnings.warn(

tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-None]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 4 files.
warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
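The compound-tag deprecation warnings above amount to tagging columns with the two atomic tags instead of the single compound one. A minimal sketch of the suggested replacement, assuming the merlin.schema ColumnSchema/Tags API and hypothetical column names:

from merlin.schema import ColumnSchema, Tags

# Deprecated style (triggers the UserWarning above):
#   user_col = ColumnSchema("user_id", tags=[Tags.USER_ID])

# Preferred style, using the atomic tags named in the warning message:
user_col = ColumnSchema("user_id", tags=[Tags.USER, Tags.ID])
item_col = ColumnSchema("item_id", tags=[Tags.ITEM, Tags.ID])
print(user_col.tags, item_col.tags)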

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name                                Stmts   Miss  Cover
-------------------------------------------------------
merlin/transforms/__init__.py           1      1     0%
merlin/transforms/ops/__init__.py       1      1     0%
-------------------------------------------------------
TOTAL                                   2      2     0%

=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:14: could not import 'moto': No module named 'moto'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:609: Skipping test because of horovod missing MPI
===== 1 failed, 1434 passed, 2 skipped, 258 warnings in 1139.97s (0:18:59) =====
/usr/local/lib/python3.8/dist-packages/coverage/control.py:801: CoverageWarning: No data was collected. (no-data-collected)
self._warn("No data was collected.", slug="no-data-collected")
ERROR: InvocationError for command /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/bin/python -m pytest --cov-report term --cov merlin -rxs tests/unit (exited with code 1)
___________________________________ summary ____________________________________
ERROR: test-gpu: commands failed
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins2909350137158316566.sh

@benfred
Member Author

benfred commented Nov 1, 2022

rerun tests

@nvidia-merlin-bot
Contributor

Click to view CI Results
GitHub pull request #1694 of commit e0cfe203dade09a0b4e6280c0193e5f5197c03b2, no merge conflicts.
Running as SYSTEM
Setting status of e0cfe203dade09a0b4e6280c0193e5f5197c03b2 to PENDING with url http://10.20.17.181:8080/job/nvtabular_tests/4780/ and message: 'Build started for merge commit.'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1694/*:refs/remotes/origin/pr/1694/* # timeout=10
 > git rev-parse e0cfe203dade09a0b4e6280c0193e5f5197c03b2^{commit} # timeout=10
Checking out Revision e0cfe203dade09a0b4e6280c0193e5f5197c03b2 (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e0cfe203dade09a0b4e6280c0193e5f5197c03b2 # timeout=10
Commit message: "use latest dataloader in tox.ini"
 > git rev-list --no-walk e0cfe203dade09a0b4e6280c0193e5f5197c03b2 # timeout=10
[nvtabular_tests] $ /bin/bash /tmp/jenkins8200214639953133814.sh
GLOB sdist-make: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/setup.py
test-gpu create: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
test-gpu installdeps: pytest, pytest-cov
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu inst: /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/.tmp/package/1/nvtabular-1.6.0+8.ge0cfe203d.zip
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu installed: absl-py==1.2.0,aiohttp==3.8.1,aiosignal==1.2.0,alabaster==0.7.12,alembic==1.8.1,anyio==3.6.1,argon2-cffi==21.3.0,argon2-cffi-bindings==21.2.0,astroid==2.5.6,asttokens==2.0.8,astunparse==1.6.3,asv==0.5.1,asvdb==0.4.2,async-timeout==4.0.2,attrs==22.1.0,autopage==0.5.1,awscli==1.26.5,Babel==2.10.3,backcall==0.2.0,beautifulsoup4==4.11.1,betterproto==1.2.5,black==22.6.0,bleach==5.0.1,boto3==1.24.75,botocore==1.28.5,Brotli==1.0.9,cachetools==5.2.0,certifi==2019.11.28,cffi==1.15.1,chardet==3.0.4,charset-normalizer==2.1.1,clang==5.0,click==8.1.3,cliff==4.0.0,cloudpickle==2.2.0,cmaes==0.8.2,cmake==3.24.1.1,cmd2==2.4.2,colorama==0.4.4,colorlog==6.7.0,contourpy==1.0.5,coverage==6.5.0,cuda-python==11.7.1,cupy-cuda117==10.6.0,cycler==0.11.0,Cython==0.29.32,dask==2022.1.1,dbus-python==1.2.16,debugpy==1.6.3,decorator==5.1.1,defusedxml==0.7.1,dill==0.3.5.1,distlib==0.3.6,distributed==2022.5.1,distro==1.7.0,dm-tree==0.1.6,docker-pycreds==0.4.0,docutils==0.16,emoji==1.7.0,entrypoints==0.4,execnet==1.9.0,executing==1.0.0,faiss==1.7.2,faiss-gpu==1.7.2,fastai==2.7.9,fastapi==0.85.0,fastavro==1.6.1,fastcore==1.5.27,fastdownload==0.0.7,fastjsonschema==2.16.1,fastprogress==1.0.3,fastrlock==0.8,feast==0.19.4,fiddle==0.2.2,filelock==3.8.0,flatbuffers==1.12,fonttools==4.37.3,frozenlist==1.3.1,fsspec==2022.5.0,gast==0.4.0,gevent==21.12.0,geventhttpclient==2.0.2,gitdb==4.0.9,GitPython==3.1.27,google==3.0.0,google-api-core==2.10.1,google-auth==2.11.1,google-auth-oauthlib==0.4.6,google-pasta==0.2.0,googleapis-common-protos==1.52.0,graphviz==0.20.1,greenlet==1.1.3,grpcio==1.41.0,grpcio-channelz==1.49.0,grpcio-reflection==1.48.1,grpclib==0.4.3,h11==0.13.0,h2==4.1.0,h5py==3.7.0,HeapDict==1.0.1,horovod==0.26.1,hpack==4.0.0,httptools==0.5.0,hugectr2onnx==0.0.0,huggingface-hub==0.9.1,hyperframe==6.0.1,idna==2.8,imagesize==1.4.1,implicit==0.6.1,importlib-metadata==4.12.0,importlib-resources==5.9.0,iniconfig==1.1.1,ipykernel==6.15.3,ipython==8.5.0,ipython-genutils==0.2.0,ipywidgets==7.7.0,jedi==0.18.1,Jinja2==3.1.2,jmespath==1.0.1,joblib==1.2.0,json5==0.9.10,jsonschema==4.16.0,jupyter-cache==0.4.3,jupyter-core==4.11.1,jupyter-server==1.18.1,jupyter-server-mathjax==0.2.5,jupyter-sphinx==0.3.2,jupyter_client==7.3.5,jupyterlab==3.4.7,jupyterlab-pygments==0.2.2,jupyterlab-widgets==1.1.0,jupyterlab_server==2.15.1,keras==2.9.0,Keras-Preprocessing==1.1.2,kiwisolver==1.4.4,lazy-object-proxy==1.8.0,libclang==14.0.6,libcst==0.4.7,lightfm==1.16,lightgbm==3.3.2,linkify-it-py==1.0.3,llvmlite==0.39.1,locket==1.0.0,lxml==4.9.1,Mako==1.2.3,Markdown==3.4.1,markdown-it-py==1.1.0,MarkupSafe==2.1.1,matplotlib==3.6.0,matplotlib-inline==0.1.6,mdit-py-plugins==0.2.8,merlin-core==0.6.0+1.g5926fcf,merlin-dataloader==0.0.2,merlin-models==0.7.0+11.g280956aa4,merlin-systems==0.5.0+4.g15074ad,mistune==2.0.4,mmh3==3.0.0,mpi4py==3.1.3,msgpack==1.0.4,multidict==6.0.2,mypy-extensions==0.4.3,myst-nb==0.13.2,myst-parser==0.15.2,natsort==8.1.0,nbclassic==0.4.3,nbclient==0.6.8,nbconvert==7.0.0,nbdime==3.1.1,nbformat==5.5.0,nest-asyncio==1.5.5,ninja==1.10.2.3,notebook==6.4.12,notebook-shim==0.1.0,numba==0.56.2,numpy==1.22.4,nvidia-pyindex==1.0.9,-e 
git+https://github.com/NVIDIA-Merlin/NVTabular.git@e0cfe203dade09a0b4e6280c0193e5f5197c03b2#egg=nvtabular,nvtx==0.2.5,oauthlib==3.2.1,oldest-supported-numpy==2022.8.16,onnx==1.12.0,onnxruntime==1.11.1,opt-einsum==3.3.0,optuna==3.0.3,packaging==21.3,pandas==1.3.5,pandavro==1.5.2,pandocfilters==1.5.0,parso==0.8.3,partd==1.3.0,pathtools==0.1.2,pbr==5.11.0,pexpect==4.8.0,pickleshare==0.7.5,Pillow==9.2.0,pkgutil_resolve_name==1.3.10,platformdirs==2.5.2,plotly==5.11.0,pluggy==1.0.0,prettytable==3.5.0,prometheus-client==0.14.1,promise==2.3,prompt-toolkit==3.0.31,proto-plus==1.19.6,protobuf==3.19.5,psutil==5.9.2,ptyprocess==0.7.0,pure-eval==0.2.2,py==1.11.0,pyarrow==7.0.0,pyasn1==0.4.8,pyasn1-modules==0.2.8,pybind11==2.10.0,pycparser==2.21,pydantic==1.10.2,pydot==1.4.2,Pygments==2.13.0,PyGObject==3.36.0,pynvml==11.4.1,pyparsing==3.0.9,pyperclip==1.8.2,pyrsistent==0.18.1,pytest==7.1.3,pytest-cov==4.0.0,pytest-xdist==3.0.2,python-apt==2.0.0+ubuntu0.20.4.8,python-dateutil==2.8.2,python-dotenv==0.21.0,python-rapidjson==1.8,pytz==2022.2.1,PyYAML==5.4.1,pyzmq==24.0.0,regex==2022.9.13,requests==2.22.0,requests-oauthlib==1.3.1,requests-unixsocket==0.2.0,rsa==4.7.2,s3fs==2022.2.0,s3transfer==0.6.0,sacremoses==0.0.53,scikit-build==0.15.0,scikit-learn==1.1.2,scipy==1.8.1,seedir==0.3.0,Send2Trash==1.8.0,sentry-sdk==1.9.8,setproctitle==1.3.2,setuptools-scm==7.0.5,shortuuid==1.0.9,six==1.15.0,sklearn==0.0,smmap==5.0.0,sniffio==1.3.0,snowballstemmer==2.2.0,sortedcontainers==2.4.0,soupsieve==2.3.2.post1,Sphinx==5.3.0,sphinx-multiversion==0.2.4,sphinx-togglebutton==0.3.1,sphinx_external_toc==0.3.0,sphinxcontrib-applehelp==1.0.2,sphinxcontrib-copydirs @ git+https://github.com/mikemckiernan/sphinxcontrib-copydirs.git@bd8c5d79b3f91cf5f1bb0d6995aeca3fe84b670e,sphinxcontrib-devhelp==1.0.2,sphinxcontrib-htmlhelp==2.0.0,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-qthelp==1.0.3,sphinxcontrib-serializinghtml==1.1.5,SQLAlchemy==1.4.42,stack-data==0.5.0,starlette==0.20.4,stevedore==4.1.0,stringcase==1.2.0,supervisor==4.1.0,tabulate==0.8.10,tblib==1.7.0,tdqm==0.0.1,tenacity==8.0.1,tensorboard==2.9.1,tensorboard-data-server==0.6.1,tensorboard-plugin-wit==1.8.1,tensorflow==2.9.2,tensorflow-estimator==2.9.0,tensorflow-gpu==2.9.2,tensorflow-io-gcs-filesystem==0.27.0,tensorflow-metadata==1.10.0,termcolor==2.0.1,terminado==0.15.0,testbook==0.4.2,threadpoolctl==3.1.0,tinycss2==1.1.1,tokenizers==0.10.3,toml==0.10.2,tomli==2.0.1,toolz==0.12.0,torch==1.12.1+cu113,torchmetrics==0.3.2,tornado==6.2,tox==3.26.0,tqdm==4.64.1,traitlets==5.4.0,transformers==4.12.0,transformers4rec==0.1.12+2.gbcc939255,treelite==2.3.0,treelite-runtime==2.3.0,tritonclient==2.25.0,typing-inspect==0.8.0,typing_extensions==4.3.0,uc-micro-py==1.0.1,urllib3==1.26.12,uvicorn==0.18.3,uvloop==0.17.0,versioneer==0.20,virtualenv==20.16.5,wandb==0.13.3,watchfiles==0.17.0,wcwidth==0.2.5,webencodings==0.5.1,websocket-client==1.4.1,websockets==10.3,Werkzeug==2.2.2,widgetsnbextension==3.6.0,wrapt==1.12.1,xgboost==1.6.2,yarl==1.8.1,zict==2.2.0,zipp==3.8.1,zope.event==4.5.0,zope.interface==5.4.0
test-gpu run-test-pre: PYTHONHASHSEED='90574518'
test-gpu run-test: commands[0] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/core.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/core.git
  Cloning https://github.com/NVIDIA-Merlin/core.git to /tmp/pip-req-build-u_utt79s
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/core.git /tmp/pip-req-build-u_utt79s
  Resolved https://github.com/NVIDIA-Merlin/core.git to commit eda153c663aa864da66927c7a0a9d4e64c073120
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (3.19.5)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (21.3)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (4.64.1)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (7.0.0)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (1.10.0)
Requirement already satisfied: fsspec==2022.5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.5.0)
Requirement already satisfied: distributed>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.3.0)
Requirement already satisfied: dask>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (2022.3.0)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.8.0+4.geda153c) (1.2.5)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (0.55.1)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.8.0+4.geda153c) (1.3.5)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.2.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (0.12.0)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (5.4.1)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.7.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (8.1.3)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (6.1)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.0.4)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (5.8.0)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.0.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.4.0)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (3.1.2)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (1.20.3)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (0.38.1)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.8.0+4.geda153c) (65.4.1)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core==0.8.0+4.geda153c) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (2022.2.1)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.52.0)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.2.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2022.3.0->merlin-core==0.8.0+4.geda153c) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core==0.8.0+4.geda153c) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (1.0.1)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (6.0.2)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (4.1.0)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2022.3.0->merlin-core==0.8.0+4.geda153c) (2.1.1)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (6.0.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.8.0+4.geda153c) (4.0.0)
Building wheels for collected packages: merlin-core
  Building wheel for merlin-core (pyproject.toml): started
  Building wheel for merlin-core (pyproject.toml): finished with status 'done'
  Created wheel for merlin-core: filename=merlin_core-0.8.0+4.geda153c-py3-none-any.whl size=118258 sha256=95123fdf66b5a8769d893d124ec9557ab56a92cb63f2c02f6b8198e88bbcb65a
  Stored in directory: /tmp/pip-ephem-wheel-cache-_wb1rhaj/wheels/c8/38/16/a6968787eafcec5fa772148af8408b089562f71af0752e8e84
Successfully built merlin-core
Installing collected packages: merlin-core
  Attempting uninstall: merlin-core
    Found existing installation: merlin-core 0.3.0+12.g78ecddd
    Not uninstalling merlin-core at /var/jenkins_home/.local/lib/python3.8/site-packages, outside environment /var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu
    Can't uninstall 'merlin-core'. No files were found to uninstall.
Successfully installed merlin-core-0.8.0+4.geda153c

[notice] A new release of pip available: 22.2.2 -> 22.3
[notice] To update, run: pip install --upgrade pip
test-gpu run-test: commands[1] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/dataloader.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/dataloader.git
Cloning https://github.com/NVIDIA-Merlin/dataloader.git to /tmp/pip-req-build-vmyr4anp
Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/dataloader.git /tmp/pip-req-build-vmyr4anp
Resolved https://github.com/NVIDIA-Merlin/dataloader.git to commit 5905283777ff5ebd748a1c91b7c9fde5710ae775
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Preparing metadata (pyproject.toml): started
Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: merlin-core in ./.tox/test-gpu/lib/python3.8/site-packages (from merlin-dataloader==0.0.2+1.g5905283) (0.8.0+4.geda153c)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (3.19.5)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (21.3)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (4.64.1)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (7.0.0)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.10.0)
Requirement already satisfied: fsspec==2022.5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2022.5.0)
Requirement already satisfied: distributed>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2022.3.0)
Requirement already satisfied: dask>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2022.3.0)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.2.5)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.55.1)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.3.5)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.2.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.2.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.12.0)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.2.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (5.4.1)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.7.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (8.1.3)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (6.1)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.0.4)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (5.8.0)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.0.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.4.0)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (3.1.2)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.20.3)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.38.1)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (65.4.1)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2022.2.1)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.52.0)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.2.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (1.0.1)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (6.0.2)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (4.1.0)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2022.3.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (2.1.1)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (6.0.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core->merlin-dataloader==0.0.2+1.g5905283) (4.0.0)
Building wheels for collected packages: merlin-dataloader
Building wheel for merlin-dataloader (pyproject.toml): started
Building wheel for merlin-dataloader (pyproject.toml): finished with status 'done'
Created wheel for merlin-dataloader: filename=merlin_dataloader-0.0.2+1.g5905283-py3-none-any.whl size=31625 sha256=0fe4fe8b7bcd3d84a55e939b9e8475829339b9a18db0259b3aae83f49bbe63c8
Stored in directory: /tmp/pip-ephem-wheel-cache-vqa67sv5/wheels/de/f5/d9/251909f4627d2920fb15548f5ffd6daf1bf24c3c56bb4977b1
Successfully built merlin-dataloader
Installing collected packages: merlin-dataloader
Attempting uninstall: merlin-dataloader
Found existing installation: merlin-dataloader 0.0.2
Uninstalling merlin-dataloader-0.0.2:
Successfully uninstalled merlin-dataloader-0.0.2
Successfully installed merlin-dataloader-0.0.2+1.g5905283

[notice] A new release of pip available: 22.2.2 -> 22.3
[notice] To update, run: pip install --upgrade pip
test-gpu run-test: commands[2] | python -m pytest --cov-report term --cov merlin -rxs tests/unit
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.3, pluggy-1.0.0
cachedir: .tox/test-gpu/.pytest_cache
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: anyio-3.5.0, xdist-3.0.2, cov-4.0.0
collected 1436 items / 1 skipped

tests/unit/test_dask_nvt.py ............................................ [ 3%]
........................................................................ [ 8%]
.... [ 8%]
tests/unit/test_notebooks.py .... [ 8%]
tests/unit/test_tf4rec.py . [ 8%]
tests/unit/test_tools.py ...................... [ 10%]
tests/unit/test_triton_inference.py ................................ [ 12%]
tests/unit/examples/test_01-Getting-started.py . [ 12%]
tests/unit/examples/test_02-Advanced-NVTabular-workflow.py . [ 12%]
tests/unit/examples/test_03-Running-on-multiple-GPUs-or-on-CPU.py . [ 12%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 12%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 14%]
................................................... [ 18%]
tests/unit/framework_utils/test_torch_layers.py . [ 18%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 20%]
........................................s.. [ 23%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 25%]
..................................................... [ 29%]
tests/unit/ops/test_categorify.py ...................................... [ 31%]
........................................................................ [ 36%]
..................................................... [ 40%]
tests/unit/ops/test_column_similarity.py ........................ [ 42%]
tests/unit/ops/test_drop_low_cardinality.py .. [ 42%]
tests/unit/ops/test_fill.py ............................................ [ 45%]
........ [ 45%]
tests/unit/ops/test_groupyby.py ....................... [ 47%]
tests/unit/ops/test_hash_bucket.py ......................... [ 49%]
tests/unit/ops/test_join.py ............................................ [ 52%]
........................................................................ [ 57%]
.................................. [ 59%]
tests/unit/ops/test_lambda.py .......... [ 60%]
tests/unit/ops/test_normalize.py ....................................... [ 63%]
.. [ 63%]
tests/unit/ops/test_ops.py ............................................. [ 66%]
.................... [ 67%]
tests/unit/ops/test_ops_schema.py ...................................... [ 70%]
........................................................................ [ 75%]
........................................................................ [ 80%]
........................................................................ [ 85%]
....................................... [ 88%]
tests/unit/ops/test_reduce_dtype_size.py .. [ 88%]
tests/unit/ops/test_target_encode.py ..................... [ 89%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 90%]
tests/unit/workflow/test_workflow.py ................................... [ 92%]
.......................................................... [ 96%]
tests/unit/workflow/test_workflow_chaining.py ... [ 96%]
tests/unit/workflow/test_workflow_node.py ........... [ 97%]
tests/unit/workflow/test_workflow_ops.py ... [ 97%]
tests/unit/workflow/test_workflow_schemas.py ........................... [ 99%]
... [100%]

=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33
/usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
DASK_VERSION = LooseVersion(dask.__version__)

.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: 34 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
other = LooseVersion(other)

tests/unit/test_dask_nvt.py: 6 warnings
tests/unit/workflow/test_workflow.py: 78 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/dask/base.py:1282: UserWarning: Running on a single-machine scheduler when a distributed client is active might lead to unexpected results.
warnings.warn(

tests/unit/test_dask_nvt.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 8 files.
warnings.warn(

tests/unit/test_dask_nvt.py::test_merlin_core_execution_managers
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/core/utils.py:431: UserWarning: Existing Dask-client object detected in the current context. New cuda cluster will not be deployed. Set force_new to True to ignore running clusters.
warnings.warn(

tests/unit/loader/test_tf_dataloader.py::test_horovod_multigpu
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 5 files.
warnings.warn(

tests/unit/loader/test_tf_dataloader.py: 2 warnings
tests/unit/loader/test_torch_dataloader.py: 12 warnings
tests/unit/workflow/test_workflow.py: 9 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 2 files.
warnings.warn(

tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops_schema.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py:148: UserWarning: Compound tags like Tags.USER_ID have been deprecated and will be removed in a future version. Please use the atomic versions of these tags, like [<Tags.USER: 'user'>, <Tags.ID: 'id'>].
warnings.warn(

tests/unit/ops/test_ops_schema.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py:148: UserWarning: Compound tags like Tags.ITEM_ID have been deprecated and will be removed in a future version. Please use the atomic versions of these tags, like [<Tags.ITEM: 'item'>, <Tags.ID: 'id'>].
warnings.warn(

tests/unit/workflow/test_cpu_workflow.py: 6 warnings
tests/unit/workflow/test_workflow.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 1 files did not have enough partitions to create 10 files.
warnings.warn(

tests/unit/workflow/test_workflow.py: 48 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 20 files.
warnings.warn(

tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-None]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py:862: UserWarning: Only created 2 files did not have enough partitions to create 4 files.
warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name                                Stmts   Miss  Cover
---------------------------------------------------------
merlin/transforms/__init__.py           1      1     0%
merlin/transforms/ops/__init__.py       1      1     0%
---------------------------------------------------------
TOTAL                                   2      2     0%

=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:14: could not import 'moto': No module named 'moto'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:609: Skipping test because of horovod missing MPI
========== 1435 passed, 2 skipped, 258 warnings in 1157.06s (0:19:17) ==========
/usr/local/lib/python3.8/dist-packages/coverage/control.py:801: CoverageWarning: No data was collected. (no-data-collected)
self._warn("No data was collected.", slug="no-data-collected")
___________________________________ summary ____________________________________
test-gpu: commands succeeded
congratulations :)
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins4004047581712898951.sh

Successfully merging this pull request may close these issues.

Update nvtabular to use the new dataloader package
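
For context, the linked issue tracks switching NVTabular's dataloaders to wrap the standalone merlin-dataloader package exercised by the CI run above. Below is a minimal sketch of how that package is typically used on its own; it assumes the package's framework-specific Loader API, the file name and parameters are hypothetical, and this is not code taken from the PR's diff.

```python
# Illustrative sketch only (not from this PR); assumes merlin-dataloader's
# documented TensorFlow Loader interface and a hypothetical "train.parquet" file.
from merlin.io import Dataset
from merlin.dataloader.tensorflow import Loader

dataset = Dataset("train.parquet")                       # hypothetical input file
loader = Loader(dataset, batch_size=1024, shuffle=True)  # batches built on the GPU

# Iterating yields (features, labels) tuples; labels may be None
# if no label columns are configured on the dataset's schema.
for features, labels in loader:
    pass  # feed the batch to a Keras/TF model here
```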